After my third semester of studying Cybersecurity at the University of Maryland, I struggled with the same question every student asks themselves: will this degree give me the skill set I need to succeed in my career? Or is it time to switch my major again?
I chose to study Cybersecurity and Information Science hoping to build concrete cybersecurity skills that I could talk about intelligently and then apply in my career. My peers pursuing non-technical degrees were jealous that I could say things like “I know how to code in Python, I can build a database with SQL, and I can perform statistical analyses with R.”
The graduating Class of 2020 is all too familiar with the Catch-22 of needing experience to get a job, but needing a job to gain experience. According to the National Academies of Sciences, Engineering, and Medicine, degree production in computer and information sciences jumped by 115 percent from 2009 to 2015. More and more students each year are choosing technical degrees over degrees in the humanities. Incoming freshmen believe that by choosing a technical degree, their classes will teach them the hard skills they need to be workforce-ready. After all, isn’t college supposed to prepare you for the real world?
But let me tell you a secret. Even though I chose a technical degree and learned some cool cybersecurity skills, I still found myself unprepared for the job market. I thought I had done everything right: I studied the material, learned cybersecurity concepts, and completed several internships. So why was I barely qualified for entry-level positions in cybersecurity?
The Cybersecurity Talent Gap
Like many other recent grads, I left university with a ton of theoretical knowledge about my field, but I lacked hands-on experience with real-world cybersecurity scenarios. This contributes significantly to the cybersecurity talent gap: according to ISC2, the global shortage of cybersecurity professionals exceeds 4 million. To make matters worse, few people currently in those positions are well-trained or qualified; most cybersecurity professionals experience their first cyber-attack while on the job. Why are so many people entering the cybersecurity workforce without proper training? By choosing a technical degree, I thought I was getting the best bang for my buck. Unfortunately, I, too, entered the workforce lacking the experience I needed.
How can university professors and curriculum writers best prepare their students for the grueling job market? How can they minimize this talent gap? Is there a way to give students real-world scenarios and experiences?
Training Through Experiential Learning Beats Theoretical Knowledge
The solution is simple, yet very few universities have taken the steps needed to boost their students’ readiness to apply their degrees in the real world. Teaching cybersecurity skills through experiential learning techniques is the best way to prepare a student for a technical cybersecurity job. Teaching the concepts and skills without simulating real-world attacks is a huge missed opportunity. I wish my professors had taught me how to apply these skills.
Following graduation, I was fortunate enough to join Cyberbit. Cyberbit offers a cybersecurity skills development platform with a robust library of labs and attack scenarios, providing hands-on experience and building muscle memory. Real-world attack scenarios usually involve time constraints, pressure, teamwork, communication, and other soft skills that Cyberbit is able to simulate on its virtual network. I was quite surprised to learn that universities across the US, including Purdue, the University of Maine at Augusta, and Columbus State, were already working with Cyberbit.
Experiential learning is an obvious solution to this enormous problem. A young surgeon follows a surgical guide and practices on 3D-printed cadavers before using a scalpel on real flesh. Pilots train in flight simulators before their first real takeoff. Firefighters encounter several controlled fires before they are trusted to enter a burning building. Soldiers train on a simulated battlefield in the Mojave Desert before entering the front line of a civilian-filled town.
Just as doctors, pilots, firefighters, and soldiers train in realistic simulations before going out into the field, cybersecurity professionals require the same type of learning to prepare for their first jobs.
Experiential learning not only prepares the student but also instills confidence that they can handle similar situations in the future. In a video summarizing the experience of students using Cyberbit at Miami Dade College, Justin Burandt, Associate Director of the Cybersecurity Center of the Americas, hit on the exact same point. He said that most trainees who go through Cyberbit training come away with a higher level of confidence because the experience from their classes can actually be applied in the real world. Professors are aware that experiential learning is the most effective way to teach cybersecurity skills, but they often lack the tools to teach this way. Universities must invest in experiential learning tools if they want to prepare their students for the workforce.
As the world continues to adapt to this unprecedented reality, people are starting to question the value of a university education. Companies like Microsoft, Google, and Apple have announced that they no longer list a university degree as a job requirement for many positions. It has become more and more apparent that colleges are not doing enough to prepare students for their jobs. I appreciate what my university has done for my own professional development, but I urge my professors and academic advisors to include hands-on cybersecurity training in their curricula. Universities need to do a better job preparing you for the real world.