Last month, as part of my portfolio project, I earned three MTA (Microsoft Technology Associate) certifications: in Java, JavaScript, and HTML/CSS. I mentioned this in my project updates, but if you don’t know what those certifications are, I didn’t offer much explanation.
So, let me take this opportunity to explain what a technology certification is, why it matters, and why, if you had to choose between some certs and a college degree, you should choose the certs.
There is a wide variety of certs, offered by a large number of companies, demonstrating proficiency with a ton of disparate technologies. Each cert is accorded a certain level of respect in the tech space, based on how central the company offering the cert is to the area of technology the cert tests. For example: SQL Server, one of the most widely used SQL database systems, is made by Microsoft. As such, the MCSE (Microsoft Certified Solutions Expert) in SQL Server is one of the most highly respected SQL certs.
Most companies that offer certifications offer them in tiers. I’ll use Microsoft as an example. The MTA is the lowest level of Microsoft certification. After that, the next level is the MCP (Microsoft Certified Professional), and then the MCSA (Microsoft Certified Solutions Associate). To get an MCSA, you need to pass three or four related MCP exams. The top tier is the MCSE. To get an MCSE, you need to already have an MCSA, then pass one or two additional (much more difficult) exams.
These certifications demonstrate specific things to employers. For example, “I have an MTA in this” means “I am skilled enough at this to get an entry-level job doing it”. “I have an MCSE in this” means “I am an expert at this and have a lot of experience using it”.
Tech certs and college degrees occupy very different niches. Where tech certs demonstrate that you have a certain level of hands-on skill with a specific technology or technological area, a college degree demonstrates (hopefully) that you have a broad proficiency with technology in general. Where a certification is product-specific, company-specific, and clearly leveled, a degree is focused on a subject area and doesn’t necessarily involve any particular language or technology. And while a degree does come in levels (AS, BS, MS, PhD), those levels correlate with time spent more than with skills earned.
If I say “I have a CCIE” (Cisco Certified Internetwork Expert), you know that I have a very high level of technical knowledge about routers and networking in general, and Cisco routers specifically (Cisco being one of the main manufacturers of routers). This is incredibly useful knowledge to employers, who now know exactly what I can do for them. If, however, I say “I have a Master’s degree in computer information technology”, you only know that I’ve spent six years immersed in technology. You don’t know what kind of technology, and if you’re an employer looking to hire someone who knows how to work with Cisco routers, you’ve got no clue whether I can do that. My degree might have required one class in networking, or ten, or none at all. You have no idea.
It’s not just that degrees can be vague and not very useful to employers looking for specific skills. When someone says “I have a degree”, you don’t even know whether their knowledge is up-to-date.
Certifications stay up-to-date in one of two ways. Some certifications are only valid for a certain amount of time. For example, the CISSP (Certified Information Systems Security Professional, a cybersecurity certification) expires after three years, at which point it needs to be renewed. Other certifications are version-specific. For example, if you get an MCSE in SQL Server 2012/2014, you have it forever, but you’ll probably want to get one in SQL Server 2016 as well once the newer version becomes ubiquitous.
But a degree doesn’t work like this. Once someone is taught a thing in a class, there is no requirement that they maintain that knowledge to keep the degree up-to-date. Furthermore, the thing taught in the class may not even have been up-to-date at the time it was taught. A lot of colleges have reputations for teaching things that are out of date, not out of malice, but because tech changes faster than colleges can. It’s rare to find college classes that teach the latest and greatest, and it’s common to find colleges teaching material that’s out of fashion, unnecessary, or just plain obsolete.
There’s yet another problem with degrees: they aren’t vendor-specific. I mentioned the CCIE before: the networking certification offered by Cisco. That last part is important. The benefit of a CCIE isn’t just that it says “This person knows routers.” It’s that it says “This person knows routers. Sincerely, the guy who makes the routers.” With a degree, it’s like “This person knows how tech works. Sincerely, a college that you may or may not have ever heard of.” So it’s not just the lack of information, it’s also the lack of credibility behind that information.
It is also important to note something kind of unique about tech: many of the seasoned working professionals in the tech space don’t have degrees in tech. This is because when these professionals were starting out, the field was so new that there weren’t many degrees in it. They got degrees in other things: math, electrical engineering, underwater basket weaving, whatever. The thing that made them tech professionals was their technical knowledge, not what came after “BS in” on their resume. And nowadays, with the advent of certifications, these professionals hold MCSEs or CCIEs rather than degrees in tech.
Basically, if you have to choose between a candidate for your job who has a college degree in technology and a candidate who has no degree but has several certifications and a good portfolio, you’re going to pick the latter. Even so, there are reasons for degrees in tech to exist besides “colleges wanted to jump on the tech bandwagon”.
First, college degrees in tech demonstrate a commitment to learning technology over an extended period. Since you can earn certs over however long a stretch you want, so long as you keep them up-to-date, an employer has no idea whether the candidate with three certs spent one year or five learning that material. College, by contrast, moves at a consistent, fast pace, so an employer can infer from a college degree that the candidate is capable of maintaining a high workload for an extended period of time.
Second, college degrees are designed to give you an understanding of the fundamentals of the field you’re entering, including at least the basics across the breadth of the field, plus some significant depth in at least one area. A major downside of having only certs is that you may not have a good foundation: you may be a one-trick pony who knows only one thing, does that one thing well, but doesn’t know how it fits into the bigger picture, what it depends on, or what depends on it.
The big advantage of a degree is that you come out of it with a general understanding of every major sub-discipline. A degree in tech gives you working knowledge of networking, operating systems, electronics, math and logic, and programming. This breadth is how degrees in most fields work, and it is in fact a valuable service that colleges provide.
The theoretical ideal candidate has both a degree in technology (for the breadth of knowledge and commitment) and multiple certifications (for specific, up-to-date knowledge about vendor technology). Often in the real world, though, employers will pass up someone with a nice shiny degree in favor of someone who has the right certs and a good portfolio.