25 years of A+: The past, present, and possibly troubled future of the most popular IT cert of them all
This feature first appeared in the Summer 2018 issue of Certification Magazine.
The A+ certification from CompTIA is one of the most popular and well-recognized IT credentials in the world. Since its introduction in the early 1990s, the A+ has risen to its current status as a standard requirement for computer technicians, support specialists, and help desk personnel. It has also been the “gateway” certification program for thousands of people looking to start a career in the IT industry.
As this year marks the 25th anniversary of the A+ certification, let’s take a look back at the origins of the credential, evaluate where the A+ currently exists in the IT industry, and peer ahead to make some educated guesses about the future of the A+ certification program.
Origin story: CompTIA and A+
The organization that became CompTIA began its life in 1982 as the Association of Better Computer Dealers, giving it the memorable — and marketable — moniker ABCD. This group was created by a handful of computer dealers who wanted to create changes that would benefit everyone involved with the IT industry.
ABCD eventually set up its home base in Illinois, and changed its name to the Computing Technology Industry Association. The name change provided a more transparent description of the organization’s mandate, and better represented the group’s desire to form a trusted connection between the swiftly evolving IT industry and the greater American business community.
In the early 1990s, CompTIA’s management noticed there was a problem with existing IT industry certifications. At that time, the majority of IT certifications were for products manufactured by a specific vendor.
Technicians could get certified on computing equipment from major players like IBM, Hewlett Packard, or Compaq. There were no vendor-neutral IT certifications, however, that could represent someone’s broader skill set for computing hardware and software.
In 1993, CompTIA introduced its own training and certification program by launching the first A+ certification exam. Consisting of one exam covering an extensive range of computing technologies, A+ offered IT professionals a vendor-neutral industry credential that demonstrated their abilities to potential employers, while giving businesses a way to differentiate themselves to customers by adding A+ certified technicians to their staff.
The new credential slowly gained in popularity with IT personnel, employers, and businesses. The fact that the A+ was not tied to any specific vendor (although the exam coverage of operating systems was certainly more heavily weighted towards MS-DOS and Microsoft Windows) helped the program to win over people in the IT industry.
It’s likely, however, that even CompTIA didn’t suspect its new certification would eventually be held by more than 1 million IT professionals.
Success story: A+ hits 1,000,000
After following a modest upward trajectory over its earliest years, A+ took off like a rocket, rapidly becoming the de facto IT industry credential for PC technicians. CompTIA quickly expanded its program to include popular certifications for computer networking technologies (Network+), PC server hardware and software (Server+), and information security (Security+).
The rapid expansion of its certification program left CompTIA struggling to manage a cumbersome inventory of industry credentials during the first decade of the 2000s. Many of these certifications eventually fell by the wayside and were discontinued.
Some of these lesser-known casualties of CompTIA’s certification program included the iNet+ certification for internet technologies, the CDIA+ credential for document imaging and management, and the Storage+ certification for data storage solutions.
While some certifications were buried, the A+ remained the brightest star in the CompTIA constellation. Early on, the credential had been expanded and split across two separate exams, one weighted more toward hardware and the other more toward software.
Starting in 2003, CompTIA began to review and refresh the content of both exams every three years, which has helped the A+ to maintain its relevancy to an industry known for constant change. (The next A+ certification exam refresh is expected to stretch this schedule slightly; the updated exams are predicted to be available sometime in 2019.)
In 2007, the A+ was given official accreditation from the American National Standards Institute (ANSI). In effect, the A+ certification received a certification. The recognition from ANSI, as well as from the International Organization for Standardization (ISO), gave the A+ a globally recognized vote of support, further increasing its industry value.
Then in 2014, CompTIA announced an astonishing milestone: it had recently awarded its one millionth A+ certification. This was viewed as quite an achievement, but was made more impressive the following year when CompTIA sent out a June 2015 press release stating it had just awarded its two millionth certification overall.
By simple math, roughly one out of every two certifications awarded by CompTIA to that point was an A+, easily making it the most dominant credential in CompTIA's certification program.
Current events: Why is the A+ so popular?
How did the A+ certification reach its current status as the premier credential for computing technology technicians? There has been a combination of planning and circumstances which contributed to its present popularity.
During its debut year of 1993, CompTIA awarded fewer than 7,000 A+ certifications. Critically, however, the A+ was launched early enough to attract people who were caught up in the dot-com technology wave of the late 1990s and early 2000s. The surge in IT industry hiring during this period led many professionals to pursue accreditation in one or more of the popular certification programs offered at the time.
CompTIA wisely positioned the A+ as the top certification for people looking to enter the IT industry. The barrier to entry was deliberately modest: CompTIA stated that the recommended experience for an A+ candidate was nine months to one year of hands-on IT support work.
The credential was nevertheless aggressively promoted by technology schools — which partnered with CompTIA to offer A+ training — as the best option for people looking to start a career in the IT industry. This was very appealing to the growing number of career switchers who wanted to take advantage of the dot-com boom's employment surge.
The popularity of the A+ was helped by its status as a vendor-neutral certification. The advantage of a vendor-neutral credential is that it maintains its value in a wide range of work environments. Certifications based on specific products from IT hardware and software vendors come with some limitations as they naturally exclude a number of potential work opportunities.
Between 1995 and 2001, A+ certified technicians were largely providing support for Windows-based PCs and printers from HP, as these items held the dominant market share at the time. But the A+’s vendor-neutral status meant that IT workers with an A+ could slot into all kinds of work scenarios.
The A+ can also credit some of its immense popularity to its international availability. As of this writing, the A+ certification exams are available in English, German, Japanese, Portuguese, French, and Spanish. In addition, CompTIA partners with Pearson VUE, which operates over 5,000 test centers in 180 countries. The availability of different language options and Pearson VUE's global network of test centers opens A+ certification up to a very large audience.
Finally, the popularity of the A+ has also been elevated by CompTIA through its partnership with state school boards across the United States. By providing authorized A+ training curriculum for high school students who would like to graduate with an A+ certification next to their diploma, CompTIA is already linking arms with the next generation of technologists.
While there have been criticisms directed at CompTIA (and other certification vendors) for using high schools as certification resellers, the demand for these programs has helped push the number of awarded A+ certifications upward.
Outlook: Does the industry still need A+?
The growing A+ youth movement has, somewhat paradoxically, raised concerns about the credential's future status in the IT industry. K-12 education has changed over the last two decades. Students are being introduced to computing technology much earlier than in previous generations.
Dozens of schools across the United States have received grants from the Department of Education to fund STEM-related programs for their students. What’s more, students are trained from the time they enter school to the time they graduate to become experts at taking standardized tests.
A 2015 study reported by The Washington Post estimates that a public school student takes well over a hundred mandatory standardized tests between pre-kindergarten and grade 12. That’s an average of about eight standardized exams a year.
Between their advanced tech knowledge and familiarity with standardized testing, it’s easy to see how a growing number of high school students are successfully challenging the A+ exams before graduation. This raises a question about the future viability of the A+ program.
What if the A+ skills currently being taught in special add-on programs in K-12 schools end up as part of the standard public school curriculum? CompTIA could see its A+ certification get cannibalized by the basic high school diploma.
It can also be argued that the A+ certification has become less relevant with the growing emphasis on mobile computing, and the waning value of desktop computers. For instance, the A+ 220-901 exam has a large amount of content dedicated to traditional desktop computing hardware topics such as BIOS configuration and device monitoring, installing expansion cards, and RAM module types.
Nearly all of today’s mobile phones and tablet computers, however, are not user-serviceable. These devices are not repaired by a company’s IT department; they are sent back to the manufacturer to be repaired or replaced. The A+ 220-901 exam has partially lost touch with the average modern computing environment.
The other A+ exam, 220-902, has its issues as well. The exam has over a quarter of its content dedicated to Windows operating systems, but omits any coverage of Windows 10. The exam is also woefully low on content covering iOS and Android, the two operating systems driving more than 90 percent of all mobile devices.
These issues essentially boil down to one question: Has IT evolved to the point where the A+, a single broad foundational certification, no longer represents “the industry standard for establishing a career in IT” as CompTIA touts the credential on its website?
The A+ has served as a respectable entry-level IT certification for 25 years. It may be time for CompTIA to re-examine the A+, especially in comparison to the organization’s more specialized Network+, Server+, and Security+ certifications. The A+ may no longer be the best credential for those looking to start a career in the IT industry.