How updating the communication device known as a “diploma” will help students acquire the right skills and help companies hire the right talent.
Every year, millions of Americans embark on the quest to earn a four-year college degree. Many motives propel them. They go to acquire skills and knowledge from experts in their fields. They go, more generally, to learn how to learn, and to broaden their minds in ways that will help them function as autonomous adults participating fully in the civic life of their country. They go to find friends and mentors. They go because they know that in today’s highly competitive job market, many employers won’t even grant them an interview for a position as a receptionist or a file clerk unless they have a four-year degree.
College marketing literature rarely expresses this last fact so bluntly. Instead, it tends to emphasize vibrant communities of scholarship and learning, stimulating atmospheres of intellectual inquiry, enduring commitments to academic excellence.
Now, however, there is an expanding number of ways to acquire specific skills and knowledge faster and less expensively than one can through a traditional four-year degree program. And there are more opportunities and venues than ever where people can seek mentorship and develop strategic alliances.
The sole unique feature of a few thousand U.S. institutions of higher learning is their ability to grant four-year degrees. And because a diploma from a four-year program is the mechanism a majority of employers use to screen potential hires, it’s both increasingly valuable and increasingly costly to obtain.
These days, getting that sheepskin from a top-flight university can cost approximately $200,000 in tuition alone. And while many schools have begun to steeply discount their advertised tuitions as they scramble to attract new students in the current market, thousands of graduates continue to emerge from college saddled with six-figure debts. In 2010, the nation’s collective student loan debt exceeded its collective credit card debt for the first time in history.
To help temper the high cost of college, a number of high-tech start-ups have been making impressive strides in the realm of online instruction. But if we truly want to retool higher education for the 21st century in the most forward-thinking way possible, we shouldn’t confine our retooling efforts to instruction alone.
To do this, we need to apply new technologies to the primary tool of traditional certification, the diploma. We need to take what now exists as a dumb, static document and turn it into a richer, updateable, more connected record of a person’s skills, expertise, and experience. And then we need to take that record and make it part of a fully networked certification platform.
Once we make this leap, certification can play a more active role in helping the higher education system clearly convey to students what skills and competencies they should pursue if their primary objective is to optimize their economic futures.
Granted, college isn’t just for training young people for the world of work. But if we truly believe that a college education is the best path toward general prosperity and personal fulfillment, we need to do more to ensure that our college graduates are economically viable.
One way to accomplish this is to establish certification as a platform in which the roles and interests of key players in the higher education system – students, educators, and employers – are explicitly articulated and tightly integrated. Functioning as a feedback loop, certification can then help achieve a goal that is at least as crucial as controlling tuition costs: Helping individuals stay employable and competitive in a professional landscape where the desired skills and competencies change rapidly.
Diplomas: Time for an Upgrade
We sometimes call a diploma a “sheepskin.” Why? Because until around a hundred years ago, that’s what most of them were made from. Then, paper diplomas began to appear. After centuries of usage, that was the big upgrade to this technology. And there really haven’t been any since.
Typically, we don’t think of diplomas as a “technology.” But they are. Economists often speak of their “signaling” value. Equipped with a diploma, a job-seeker broadcasts numerous positive attributes to potential employers: perseverance, self-governance, and competence in at least one area.
Employers, in turn, use diplomas as screening mechanisms. If you don’t have a diploma, you don’t get an interview. According to the New York Times, even employers looking for receptionists and file clerks require a bachelor’s degree these days. “When you get 800 résumés for every job ad, you need to weed them out somehow,” an executive recruiter told the newspaper.
So a diploma is essentially a communications device that signals a person’s readiness for certain jobs.
But unfortunately it’s a dumb, static communications device with roots in the 12th century.
That needs to change.
At my alma mater, Stanford University, a bachelor’s degree currently costs more than $160,000 in tuition alone. Less than ten miles from Stanford, however, another school, Foothill College, also issues degrees. There, you can get a two-year associate’s degree for around $2,790, or less than 2 percent of what you’d pay for a Stanford degree.
The problem is that if the baseline requirement to obtain a job interview, even for positions like “receptionist” and “file clerk,” is a four-year bachelor’s degree, then in practical terms an associate’s degree is not even worth 2 percent of a Stanford degree. It’s worth zero.
So despite the fact that colleges and other education providers have established a variety of alternative programs and degree options, at a variety of different price points, employers have simply placed more and more emphasis on traditional four-year degrees.
Not that this means employers are satisfied with the system.
In March 2013, the radio show Marketplace teamed up with The Chronicle of Higher Education and asked around 700 employers to grade the nation’s colleges and universities on how well they were preparing their graduates for the workplace.
Fifty-three percent of them said they “had trouble finding recent graduates qualified to fill positions at their company or organization.” Twenty-eight percent said colleges did only a “fair job” of producing successful employees. They also said that more than grades, major, or what school a person attended, “employers viewed an internship as the single most important credential for recent grads.”
At first glance, this perspective is baffling. Employers insist that college degrees are a prerequisite for employment, even for low-skilled clerical positions. And yet what they find most telling is not how well people do in four-year-degree programs, but how well they do in settings that approximate workplaces.
Thus, there’s actually reason for hope here. The more employers realize that four-year degrees don’t necessarily guarantee the attributes they value most, the more likely they’ll be to demand a system that does.
Design Specs for a Smarter Diploma
We spend years of our lives working to obtain a diploma. We invest substantial capital in it. And yet compared to the nuanced portraits of our aptitudes and attitudes that our teachers presented to our parents on our first-grade report cards, a college diploma is an opaque and unrevealing document.
If we were building a higher education system from scratch, would our records of assessment and certification look anything like today’s diplomas? Ask a hundred people to build a better diploma, and you’ll probably end up with a hundred different solutions. None, however, would look like a traditional sheepskin.
In my opinion, these are the characteristics a 21st-century diploma should have:
- It should accommodate a completely unbundled approach to education, allowing students to easily apply credits obtained from a wide range of sources, including internships, peer-to-peer learning, online classes, and more, to the same certification.
- It should be dynamic and upgradeable, so individuals can add new credentials to it as they pursue new goals and educational opportunities and so that the underlying system itself is improvable.
- It should help reduce the costs of higher education and increase overall value.
- It should allow a person to convey the full scope of his or her skills and expertise with greater comprehensiveness and nuance, in part to enable better matching with jobs.
- It should be machine-readable and discoverable, so employers can easily evaluate it in numerous ways as part of a larger “certification platform.”
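The specs above can be sketched as a simple data model. Here is a minimal illustration in Python, where every class and field name is a hypothetical assumption, not a real standard: the record is updateable, aggregates credentials from many sources, and only accepts credentials already verified by an issuing service.

```python
# Minimal sketch of a dynamic, machine-readable diploma record.
# All names here are illustrative assumptions, not an actual standard.
from dataclasses import dataclass, field
from typing import List, Set


@dataclass
class Credential:
    title: str          # e.g. "Introduction to Statistics"
    issuer: str         # the education provider that granted it
    skills: List[str]   # machine-readable skill tags
    verified: bool      # set by the certification service, not the holder


@dataclass
class NetworkedDiploma:
    holder: str
    credentials: List[Credential] = field(default_factory=list)

    def add(self, credential: Credential) -> None:
        """Only credentials verified by the issuing service may be added."""
        if not credential.verified:
            raise ValueError("credential must be verified by the issuing service")
        self.credentials.append(credential)

    def all_skills(self) -> Set[str]:
        """Aggregate every skill tag across all credentials on the record."""
        return {s for c in self.credentials for s in c.skills}
```

The key design choice is that the holder curates the record but cannot self-certify: the `verified` flag stands in for the “master service” check described later in this essay.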
Two hundred years ago, what you learned about Latin, the Bible, and mathematics when you were 21 was just as likely to be true when you turned 70. So you spent four straight years in college lecture halls and libraries, you acquired skills and knowledge that would serve you for life, and then you were done.
Now, in today’s fast-changing world, it makes more sense to learn provisionally, opportunistically, as new challenges and necessities arise.
To make this style of learning more practical, we need certification for it that employers will grow to trust and value even more than they do traditional bachelor’s degrees, because it will signal relevant skills so much more effectively.
Imagine an online document that’s iterative like a LinkedIn profile (and might even be part of the LinkedIn profile), but is administered by some master service that verifies the authenticity of its components. While you’d be the creator and primary keeper of this profile, you wouldn’t actually be able to add certifications yourself. Instead, this master service would do so, verifying information with the certification issuers, at your request, after you successfully completed a given curriculum.
Over time, this dynamic, networked diploma will contain an increasing number of icons or badges symbolizing specific certifications. It could also link to transcripts, test scores, and work examples from these curricula, and even evaluations from instructors, classmates, internship supervisors, and others who have interacted with you in your educational pursuits.
Ultimately, the various certificates you earn could be bundled into higher-value certifications. If you earn five certificates in the realm of computer science, you might receive an icon or badge that symbolizes this higher level of experience and expertise. In this way, you could eventually assemble a portfolio that reflects a breadth of experience similar to what you get when you pursue a traditional four-year degree.
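The bundling rule just described amounts to a simple aggregation over earned certificates. A hedged sketch, where the threshold of five and the “advanced” label are assumptions taken from the example above:

```python
# Illustrative sketch: roll individual certificates up into a
# higher-level badge once a threshold is reached. The threshold and
# labels are assumptions for illustration, not a real policy.
from collections import Counter
from typing import Dict, List, Tuple

BUNDLE_THRESHOLD = 5  # e.g. five certificates in one field


def bundle_badges(certificates: List[Tuple[str, str]]) -> Dict[str, str]:
    """certificates: (title, field) pairs. Returns a badge level per field."""
    counts = Counter(area for _, area in certificates)
    return {
        area: ("advanced" if n >= BUNDLE_THRESHOLD else "individual")
        for area, n in counts.items()
    }
```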
For students, the more modularized approach to instruction embodied in such diplomas would have immediate benefits. Traditional four-year degrees maximize tuition costs, because they only award certification for lengthy courses of study that require substantial capital investments. A more modularized system would move beyond this all-or-nothing approach. Instead of taking general education classes for two years and then dropping out and ending up with little to show for their efforts except two years of debt, students could make smaller investments — in money and time — to acquire specific credentials.
This approach would also encourage students to think more strategically about specific learning paths to pursue, and make it easier to integrate internships into their education. Instead of randomly choosing courses to fulfill “general education” and “support courses” requirements, a student on a more modularized path might focus on, say, the six courses necessary to earn a certificate in “Workplace Communication Skills” or “The Future of Space Exploration,” then complete an associated internship before moving on to subsequent certificate programs.
At LinkedIn, we’ve developed a broad “Skills & Expertise” taxonomy that our members use to describe their attributes, and which then serves as the basis for endorsements from colleagues. For example, some of my skills include “Entrepreneurship,” “Project Management,” and “Viral Marketing.” In a more outsourced form of Apple University, the in-house program that Apple now uses to teach its executives to think more like Steve Jobs, companies could use this taxonomy to publicize the skills and experiences they value most, and education providers could develop curricula that lead to certification in these areas.
For champions of a traditional liberal arts education, encouraging our nation’s youth to major in “Project Management for Yahoo!” may sound like a higher education inferno even Dante himself couldn’t stomach.
But the national mandate to produce more college graduates — as expressed by President Obama and many others — doesn’t arise from our imminent shortage of Comp Lit majors. It arises from our desire to give more people access to training that can put them on a path to economic security, and to help them develop the skills that can keep America competitive on a global level.
Diplomas that get updated over time as new certificates are added, and which exist as part of a larger certification platform, could transform the ways that employers use diplomas. Traditionally, bachelor’s degrees have offered an easy way to winnow a pile of a thousand resumes into a pile of twenty resumes — but they’re also a very limited filter. Because the specific information they codify about a person is minimal, they’re far more useful for weeding than finding.
As certification gets more granular, however, and as diplomas contain more information and exist as part of a larger, networked ecosystem, new possibilities emerge. Want to find ten potential employees who have amassed at least three certificates related to brand management and have at least five positive endorsements from their instructors? A 21st-century diploma should allow you to do that.
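A query like that one could be expressed against such a platform in a few lines. A minimal sketch, assuming each candidate record carries certificate topics and an endorsement count; the record structure is a hypothetical illustration, not a real API:

```python
# Sketch of a recruiter query over a networked certification platform.
# Record fields ("name", "certificates", "topics", "endorsements") are
# assumptions for illustration only.
from typing import Dict, List


def find_candidates(records: List[Dict], topic: str,
                    min_certs: int = 3, min_endorsements: int = 5,
                    limit: int = 10) -> List[str]:
    """Return up to `limit` candidate names with at least `min_certs`
    certificates matching `topic` and at least `min_endorsements`
    positive instructor endorsements."""
    matches = [
        r["name"] for r in records
        if sum(1 for c in r["certificates"] if topic in c["topics"]) >= min_certs
        and r["endorsements"] >= min_endorsements
    ]
    return matches[:limit]
```

Here certification works as a finding mechanism rather than a weeding mechanism: the filter selects on specific, machine-readable attributes instead of the mere presence of a degree.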
Certification as a Platform
One of the main reasons the college degree persists as a technology is because it doesn’t need a user manual. We know what a traditional college degree signifies in general. We’re familiar with many of its nuances. A degree in Biochemistry & Molecular Biophysics from Caltech means one thing. A degree in Sculpture from Bennington means something else.
How, in a landscape of infinite certificates, will we determine which ones to value and trust? This is the problem that has always plagued alternative forms of certification, and it will only intensify as digital instruction becomes more full-featured and effective.
One organization trying to bring a sense of order to the imminent chaos is Mozilla, the non-profit that oversees the development of the open-source web browser, Firefox, and where I’m on the Board of Directors. In 2011, Mozilla introduced Open Badges, an initiative to develop free software and an open technical standard that any organization or individual can use to issue verified digital badges that symbolize a skill or achievement attained through either online or offline study or participation in some activity.
For example, you might earn a badge for completing a six-week “Introduction to Statistics” course, or for consistently making high-value contributions to an online message board where math students seek help on their homework.
As a person earns badges from multiple sources, they’re all stored in a private repository called the Mozilla Backpack. There, you can arrange your badges into themed groups and choose which ones to share on social networks and other sites. Each badge comes with a great deal of metadata attached to it, including information about the issuer, what the badge signifies, the criteria used to assess your achievements, and on some occasions, links to the work you did in pursuit of the badge.
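The kind of metadata described above can be pictured as a structured record attached to each badge. The sketch below is loosely modeled on the fields the Open Badges standard associates with a badge assertion, but the exact field names and values here are simplified illustrations, not a verbatim copy of the specification:

```python
# Illustrative badge metadata, loosely modeled on the kinds of fields
# Open Badges attaches to an assertion. Field names and URLs below are
# simplified, hypothetical examples, not the literal spec.
badge_assertion = {
    "recipient": "learner@example.org",
    "badge": {
        "name": "Introduction to Statistics",
        "description": "Completed a six-week introductory statistics course",
        "criteria": "https://example.org/stats-badge/criteria",
        "issuer": "https://example.org",
    },
    "issuedOn": "2013-06-01",
    # optional link to the work done in pursuit of the badge
    "evidence": "https://example.org/learner/stats-final-project",
    # how a third party can check that the badge is genuine
    "verify": {"type": "hosted", "url": "https://example.org/assertions/123"},
}
```

Because the metadata is machine-readable, an employer (or a platform like the one described below) can check the issuer and criteria programmatically rather than taking the badge image at face value.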
Already, Peer to Peer University, the YMCA of Greater New York, the Corporation for Public Broadcasting, and Disney-Pixar, to name just a few, have issued or are developing badges using Mozilla’s technical standard. Open Badges is a good initial step, but there are many important attributes – ranging from employer trust to persistent storage of a certification if its issuer goes away – that still need to be worked out.
Creating a shared standard for attaching machine-readable information to certifications is an important first step for getting employer buy-in. Another key step will involve aggregating this data. If millions of people start storing their certification information in a common repository like LinkedIn, certification will evolve from a product (i.e., a traditional diploma) into a platform that can be easily searched and analyzed.
With certification as a platform, not just a product, the feedback loops between all parties will tighten. Education providers will have more capacity to track what employers are looking for and adjust their curricula accordingly. Students will have more explicit guideposts to follow, so they can invest their tuition dollars and time into developing skills that will truly increase their chances of transitioning successfully to the workforce. Employers will be able to use certification as a finding mechanism, not just a screening mechanism.
With certification as a platform, “Weed out everyone who doesn’t have an Ivy League diploma” will evolve into “Let’s find someone who possesses these specific skills and attributes that will help our organization.” With certification as a platform, the communications device currently known as the “diploma” becomes a much richer signal that will help businesses hire better and help individuals learn and grow faster.
Making this transition won’t happen overnight. But if we truly want to use technology to transform higher education, we can’t just confine our efforts to transforming instruction. We have to transform certification too. In doing so, we have an opportunity to create a new system that makes it clear to students what skills are most relevant and in highest demand, and thus gives them a chance to pursue these skills more strategically.
But our higher education system can’t implement such changes alone. The business world has to embrace certification-as-a-platform, too. As long as it continues to depend on a 12th-century communications device, the diploma, as its preferred gateway to entry, we won’t be able to fully capitalize on 21st-century innovations in technology and education.