Online Cybersecurity Training [Your NEED TO KNOW Fundamentals]

Understanding the fundamentals is a key part of becoming a skilled cybersecurity engineer. Our program teaches you to think like a hacker and to defend against attacks with the practical, real-world skills you’ll need on the job. In this blog, we’ll give you an overview of each module so you can see the key points you’ll learn to build a strong foundation.

What Are The Key, Job-Ready Cybersecurity Skills We Teach At NexGenT?

  • Identify & Analyze Threats: The first module of the program gives a high-level overview of the cybersecurity ecosystem. We will look at threat actors and the different types of attacks seen in this domain, as well as the various toolkits and how they fit into common security frameworks.
  • Cryptography: This module covers the fundamentals of cryptography along with practical use cases in today’s world. Cryptography is an important aspect of security and forms the basis of many protocols that keep us safe. This module will help you understand what happens behind the scenes with these algorithms and why they are useful.
  • Network Security: Network security is the basis of cybersecurity operations. In this module we will look at the common security implementations in place and their common weaknesses. We will also learn about the low-hanging fruit that is typically overlooked and that provides a great way to raise threat awareness.
  • Secure Protocols: This module covers the basic protocols and best practices needed to create a security-focused organization. The best offense is a strong defense, and we will learn the different tactics needed to raise the bar.
  • Symptoms of Compromise: By recognizing the symptoms of an attack, analysts can help stop them much sooner. Here we will cover what to expect in different scenarios so that you can diagnose the problem in an efficient manner. This analysis is key to understanding what went wrong and how to prevent it from happening again in the future.
  • Cyber toolkits: A successful analyst has a wide arsenal of tools and knows how to effectively use them. In this module, we will teach you which tools are available and how to apply them for all the various security solutions and strategies.
  • Testing the Infrastructure: In this module, we will look at the practical applications of attacking your own infrastructure in order to defend it. We will teach you how to identify your organization’s weaknesses so that you can mitigate them and define what changes need to be made.
  • And finally, Incident Response covers what to do in the event of a security breach. An organization needs its security team to have a playbook ready in times of crisis and to know how to react to specific events. We will look at how to plan for what needs to be done.
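
To make the cryptography module concrete, here is a minimal sketch of hashing and message authentication using Python’s standard library. The key, messages, and function names are invented for illustration and are not from the course itself:

```python
import hashlib
import hmac

# Hypothetical shared secret for illustration only; never hard-code real keys.
SHARED_KEY = b"a-secret-shared-key"

def fingerprint(message: bytes) -> str:
    """One-way hash: easy to compute, infeasible to reverse."""
    return hashlib.sha256(message).hexdigest()

def sign(message: bytes) -> str:
    """HMAC tag: proves the message was produced by someone holding the key."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign(message), tag)

msg = b"transfer $100 to account 42"
tag = sign(msg)
print(verify(msg, tag))                              # True
print(verify(b"transfer $9999 to account 42", tag))  # False: tampering detected
```

Protocols such as TLS combine primitives like these with key exchange and encryption, which is exactly the kind of behind-the-scenes machinery the module unpacks.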

To find out more about our Cyber Security program, click here.

The Growing Gap between Education & Employment

It’s a tired trope to cite the percentage of English majors working as Starbucks baristas, but the point is well taken. With 44% of recent college graduates working a job that doesn’t require a degree, higher education must be missing something. And while the question is simple, the answer is less so.

What is it that needs to be gleaned from the months or years between graduation and someone’s first “big-kid” job?

Coming out bright-eyed and bushy-tailed, ready for a career, only to be forced into a low-wage position just to make ends meet, or to start servicing tens or even hundreds of thousands of dollars’ worth of student loans, just doesn’t make sense. With colleges offering fast-track degrees, externships, internships, co-ops, and more, why is there still a gap between education and employment?

Download our free 5-step guide to becoming a Network Engineer

You’ll start to see the picture more clearly by looking at the most recent trends. Some previously basic truths about what employers are looking for may be neither basic nor true anymore. The very word “education” may not fully describe the assortment of knowledge and skills one needs in a professional setting in this day and age.

So what are they looking for? 

Education has long been about being well-rounded. It’s about developing critical thinking skills, researching to prove out a set of assumptions, and learning how to convey that knowledge to others. These are the soft skills that come as a byproduct rather than a direct or overtly pursued result of education. 

“To be able to teach is to be able to be taught.”

In many ways, demand for these soft skills is growing faster in the job market than demand for the particular knowledge base at the heart of someone’s major or degree. They are also a valuable measure of a prospective employee’s flexibility: to be able to teach is to be able to be taught, a mirror of one’s ability to learn new processes and technologies quickly. Because these skills are “intangible” and can only be proven over time on the job, recent grads face a longer runway to get where they want to go.

In many ways, this also offloads the cost and risk of testing the waters with new employees onto “sub-degree” employers. In short, education does not mitigate the risk that someone lacks these soft skills.

Education vs. Training

The other side of the coin is training, specifically practical rather than theoretical training. The rise in demand for soft skills is overshadowed only by the rise in demand for technical skills. In a recent study by Upwork, the addition of technical skills can fully double the number of available job offers. We see this all the time without identifying it as such: the unbundling of degrees into micro-credentials that paint a more comprehensive picture of what applicants can, and just as importantly, can’t do.

The addition of technical skills can fully double the number of available job offers.

Noted By Upwork

Take an MBA: high-level, theoretical skills that are of course invaluable in any business setting. For multiple generations, you’d be hard-pressed to find MBAs out of work or in anything less than extremely high demand. While this still holds true, the degree is not the silver bullet it used to be.

The question you ask an MBA has now become “How do you manage and optimize in a technology-driven workplace if you don’t have the technical skills?”

“Degrees are static. The job market is not.”

They may possess six years’ worth of soft skills, but without technical skills, that knowledge becomes abstract and even academic. The confluence of these new realities brings us to an inescapable truth: degrees, while valuable, are less valuable than ever before and a far cry from a job guarantee.

The fact of the matter is that degrees are static- the job market is not. Degrees often collect dust in the staid walls of their owners, but true credentials- skills that are used, augmented and scaled- require constant validation.

The pace at which any industry changes has subjected degrees to something entirely new: expiration dates. Today’s most in-demand jobs didn’t even exist a mere decade ago. Your job, duties, roles, and responsibilities are dynamic and constantly changing. A degree shows that you have the knowledge base, whereas true skills and micro-credentials show that you know how to apply it.

Now the key is how will you put together a plan for education, credentialing, and validation?

Check out our real-world, job-ready training guaranteed to land you a job in IT.

Exclusive Interview Tips from an Amazon Cybersecurity Solutions Architect

Anthony Nguyen, a Security Solutions Architect at Amazon, spoke with Sam Stuber from the NexGenT Career Services team about his career in IT and cybersecurity. At the end of the interview, students had the chance to ask him success questions. 

While the exclusive hour-long interview for NexGenT students was packed with helpful tips from Anthony’s own experience, this condensed highlight reel offers a plethora of brilliant job-searching tips.

In this insightful interview, Anthony shares the following things:

  • Triumphant steps Anthony took to get where he is at Amazon
  • How failures can be opportunities
  • Bots vs. people reviewing resumes and how to optimize for IT and cybersecurity
  • The IT hiring process, both technical and non-technical, and what you should know to prepare yourself
  • Sample interview questions
  • What to do when you don’t know the answer to an interview question
  • What hiring managers are looking for, including the T shape career development model
Interested in starting a cybersecurity career of your own? Click here to get trained in 6 months.

Additionally, here are some of the amazing resources that Anthony mentions and references:

  • How to Interview at Amazon: While this guide is written by Amazon for Amazon job seekers, it’s extremely applicable to any job you are preparing for. Included are sections on behavioral-based interviewing, the STAR answer format, and tips for great interview answers.
  • 101 Cybersecurity Slides: While Anthony took the time to share exclusive knowledge with NexGenT students, he has also mentored students in the past at places like UC Irvine. He shared with us the slides he uses to help students understand the different paths into IT and cybersecurity. Link to InfoSec 101: How to be a Cybersecurity Expert Slides Here
  • Coding Interviews & Challenges: Use websites like Leet Code to help you enhance your skills and prepare for technical IT interviews.

Ready to reach out to technical IT recruiters? Check out our step-by-step guide with email templates.

Is College Really Necessary?

NexGenT was founded on the belief that education is for everyone and that the ideal educational system should be based on real-world skills training. The company ethos includes the belief that education cannot leave people in debt with degrees that do not teach the real skills needed to succeed, and that it should prepare people to be ready for the workforce. We want our students to be field-ready after completing our program, similar to how we trained network engineers in the military.

This is how we view proper training – it should actually prepare folks for a real job and give them the tangible skills necessary to do that job (I know, crazy, right!?). However, we find that traditional college education is lacking in technical fields of study such as information technology. What is needed in IT are people who can do the job, not people with a head full of concepts and no application.

For this reason, people mostly get hired based on their skills and certifications. There are not enough college programs that teach the necessary technical skills and, to make things even worse, traditional institutions leave their students in massive amounts of debt. So, it’s important to highlight this issue of college debt and discuss alternatives to traditional academia, but alternatives that actually provide the education needed for a great career.

Last year, more than 20 million students attended college or university, and 70% graduated with a significant amount of student loan debt. The national student debt is nearly $1.5 trillion, collectively held by around 44 million Americans. This figure is truly unsustainable, and there must be change. The average student debt is around $37,000, a significant amount of money that could have been invested elsewhere.

At NexGenT, we provide real-world skills training for a fraction of the cost of college. Students graduate in just months instead of years and gain sought-after skills without the burden of large amounts of debt. This is the kind of thinking we will need in order to fill the millions of tech jobs opening in just the next couple of years.

And, don’t only take it from us – we created a short video with raw footage from some of our students who were inspired to share their stories and discuss this topic. The video starts with a question about College and then students share their authentic stories providing genuine insight into the value of alternative educational programs and the mission at NexGenT.

Launch your career in cybersecurity in just 6 months! Find out how.

Enjoy the video!

What is Edge Computing and Why You Should Care About It

The underlying concept of Edge Computing is simple: it’s the idea that pushing “intelligence” to the edge of the network will reduce the load on the core servers and on the network itself, and make the network more efficient overall.  

Edge Computing, like many other technological advancements, arose as a response to an issue. The issue? A massive overload on servers caused by a “surplus” of data generated by networked devices incapable of making decisions on their own.

If you’ve read The Hottest IT Trends Of Our Time, you’d probably recall the security camera example. In this example, I refer to a scenario in which you have a few security cameras recording information and sending it to a centralized server as long as they are turned on.

This, of course, is not an issue if you have just a few cameras. But, when you get into a situation like that of the city of London, which has over 400,000 cameras, you run into a huge issue: an overload on the main server and network that can, and probably will, break things even if you have the biggest pipe in the world.  

In a hypothetical case similar to London’s, ideally you’d want to filter the information that is being sent to the network’s main servers. This, as you might already know, would require those data-generating devices to be capable of making decisions—of identifying which information is relevant and which is irrelevant.

There are many more applications of Edge Computing than surveillance cameras. However, surveillance cameras are one of its biggest use cases because they require a lot of bandwidth at peak operation. There are also several things that can be done with data collected from surveillance cameras that weren’t realistically possible even a few years ago, when most cameras were still “dumb” devices.

For example, you could be a British government agency hunting down a criminal in London. So, you scan a picture of the criminal and create a biometric map of his/her face. Then, you can configure your security cameras to only report back to the main server once they detect someone that matches that biometric map (obviously this would require facial recognition software).

This, of course, is only possible if security cameras sitting on the edge of the network are capable of making decisions. Otherwise, the network’s main servers would go nuts processing data, most of which would be garbage, from over 400,000 cameras streaming 24/7!

The overload problem wasn’t an easy one to solve. But now that sensors and software are incorporated into “edge” devices (devices living on the edge of the network), network engineers can configure these devices to send only relevant information to the network’s main servers. In other words, these devices can now make decisions without relying on anything other than their own computational power, making things better for everybody.
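
The decision logic described above can be sketched as a predicate that runs on the device itself. The names, threshold, and `Frame` structure here are hypothetical, not a real camera API:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    camera_id: str
    motion_score: float  # hypothetical metric produced by an on-board sensor

MOTION_THRESHOLD = 0.3   # assumed tuning value

def relevant(frame: Frame) -> bool:
    """The decision made at the edge: only frames with motion matter."""
    return frame.motion_score >= MOTION_THRESHOLD

def frames_to_upload(frames):
    """Only relevant frames ever leave the device for the central server."""
    return [f for f in frames if relevant(f)]

frames = [Frame("cam-1", 0.05), Frame("cam-1", 0.9), Frame("cam-1", 0.2)]
print(len(frames_to_upload(frames)))  # prints 1
```

Only frames that pass the predicate ever touch the network; everything else is discarded (or stored locally) at the edge.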

Related: How to become a network engineer in less than a year

Edge Computing not only helps reduce network load, but also increases efficiency, device functionality, and the speed of information processing, since data doesn’t have to travel far to be analyzed. But it’s not all good news: Edge Computing introduces problems of its own, along with smaller issues that can affect business operations. Let’s take a look at the pros and cons of Edge Computing:

Pros to Edge Computing

  1. Reduces network load

Let’s go back to the London security camera example for a second. Pretend you’re running security for the entire city, and that you have decided to upgrade all of your cameras to stream in 4k definition to make better use of recordings…

A 4K stream consumes somewhere around 25 Mbps. So, if all of these cameras could only send information back, your network would have to smoothly process the streams coming from over 400,000 cameras every second. In that case, you’d have to wait for technology to catch up to your needs, my friend!
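
A quick back-of-the-envelope calculation shows why this scenario breaks a centralized design (the figures are the rough ones used above):

```python
cameras = 400_000       # approximate CCTV count cited for London
mbps_per_stream = 25    # rough bitrate of a single 4K stream

total_mbps = cameras * mbps_per_stream
total_tbps = total_mbps / 1_000_000  # 1 Tbps = 1,000,000 Mbps

print(f"{total_mbps:,} Mbps = {total_tbps:g} Tbps")  # 10,000,000 Mbps = 10 Tbps
```

Ten terabits per second of mostly idle footage, sustained around the clock, is exactly the kind of load Edge Computing is meant to avoid.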

Edge Computing makes these kinds of crazy but quite common scenarios possible. With Edge Computing, networks can scale to meet the demands of the IoT world without overconsuming network and server resources or wasting them on processing irrelevant data.

  2. Functionality

Network engineers can program Edge Computing devices to perform several different kinds of functions. I’ve covered the example of filtering data before sending it over the network, but Edge Computing devices are capable of doing many more things. Since they have their own software and can process their own data, they can be configured to handle edge data in ways that have not yet been imagined.

With the new capabilities presented by leveraging data at the edges, networks will inevitably have more functionality and will hopefully become increasingly more efficient as well.

  3. Efficiency (real-time access to analytics)

Another benefit of Edge Computing is that it enables real-time data analysis performed on the spot, which is a big deal for businesses.

For example, if you’re the manager of one of several plants in a manufacturing business, you could greatly benefit from analyzing your plant’s data as it is being recorded, rather than waiting for that data to go to a central server to be analyzed and then be sent back to your plant.

Such speed translates into immediate action, which ultimately results in cost reductions and/or revenue increases (the main things businesses try to achieve).

Imagine being the manager of a manufacturing plant with an issue in your production process, and having to wait for your data to be analyzed by the company’s main server. With Edge Computing, you could find that issue virtually immediately, saving significant resources!

Cons to Edge Computing

  1. Security

The main negative aspect of Edge Computing is security. Before edge devices became “intelligent,” they weren’t a vulnerable part of the network; they were just “dumb” devices performing very limited tasks.

However, by adding advanced software and hardware to these devices and empowering them to analyze their data locally, all of these devices become more vulnerable to malicious attacks.

With networked devices analyzing their own data on the spot, it is now more likely that any of them could be infected with malware and begin distributing it across your network.

The security problem isn’t going away anytime soon. As of right now, and for several years to come, building full-blown security into every endpoint of a network is not feasible, which makes conventional security virtually impossible to accomplish. Therefore, networks utilizing Edge Computing will have to rely more on security provided by the network itself.

  2. Increased risk of human error

With several new intelligent devices connected to a network, configuring all of them correctly becomes a challenge. Network engineers can now go into each one of those little devices and perform several configurations, which makes it easy to make a mistake.

Going back to the security camera example, imagine something as simple as setting up one of your cameras to record during the day rather than at night. This, of course, isn’t inherently caused by Edge Computing, but by human error. However, the likelihood of such misconfigurations certainly grows with the incorporation of more Edge Computing devices.
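
Catching mistakes like these is a job for automated configuration checks rather than eyeballs. Here is a hypothetical sketch; the schedule format and camera IDs are invented:

```python
# Hypothetical camera configs: flag any camera that only records by day.
cameras = {
    "cam-001": {"record_start": "08:00", "record_end": "18:00"},
    "cam-002": {"record_start": "00:00", "record_end": "23:59"},
}

def records_overnight(cfg):
    """A camera covering 00:00-23:59, or wrapping past midnight, is fine."""
    covers_full_day = cfg["record_start"] <= "00:00" and cfg["record_end"] >= "23:59"
    wraps_midnight = cfg["record_start"] > cfg["record_end"]
    return covers_full_day or wraps_midnight

misconfigured = [cid for cid, cfg in cameras.items() if not records_overnight(cfg)]
print(misconfigured)  # ['cam-001']: a daytime-only schedule is likely a mistake
```

Running a check like this across a fleet turns a thousand manual reviews into one script, which is how large edge deployments keep human error in check.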

  3. Cost of functionality

The advancement of Edge Computing devices makes it possible for businesses to modify their business models. Nowadays, for example, instead of selling you a device, they sell you the device plus the ability to use certain key functionalities (if you pay extra, of course). It is very easy to nearly double the cost of the hardware just by adding functionality.

Related: How the Network Intuitive will change the future of IT professionals

This is something to watch out for if you’re deploying Edge Computing into your network because, as you know, the IT industry revolves around money so you must keep your priorities in check. Make sure that you read the fine print every time you purchase new technology so that you’re only paying for what you’re truly going to use, and aren’t being charged for functionalities that you don’t need.

Why should you care?  

Edge Computing is enabling the Internet of Things to take over the world. According to Cisco Systems, by 2020 there will be tens of billions of devices connected to the Internet. Even if all of these devices were to send text files all day every day, we’d still need Edge Computing technologies to avoid big issues.

This means that nearly every network in the near future will use Edge Computing to operate. Hence, if you start digging into the weeds of Edge Computing now, you’ll be at the forefront of the industry, at least until the next big change comes along.

Nonetheless, you must watch out for costs, since they can skyrocket in the blink of an eye; you must go the extra mile to protect the integrity of your data, since Edge Computing could make it quite vulnerable to malicious attacks; and, you must think about better methods for configuration management and orchestration of network devices as more and more computing and intelligence is deployed to the edge of the network!

These Are The Hottest IT Trends Of Our Time

According to the CEO of Cisco Systems, soon there will be one million new connections every hour. So unless something drastic happens, we will live in a world where almost everything will be connected to a network. Gadgets ranging from an Apple Watch to a toothbrush will be generating data and communicating with a server.

This unprecedented growth of IoT connections, which is being caused by the incorporation of sensors and software in everyday tools, will eventually make people’s lives easier and prevent future problems.

However, the alarming growth of devices being connected to the internet is creating problems for the infrastructure on which information runs.

There are two main problems that come with the boom of IoT, and these are scalability and security.

Scalability, because the current way devices communicate with servers isn’t optimal when too many devices talk to the same server, especially if you require advanced functionality on those devices; and security, because the proposed solution to the scalability problem can make any network vulnerable to malicious attacks.

Related: How the Network Intuitive will change the future of IT professionals 

Solving The Scalability Problem

Back in the day, when a device was connected to a network (let’s say a security camera), there was no “intelligence” incorporated into that device. The camera could basically be either on or off, and while it was on, it would be sending information to the server the entire time.

If you had a few cameras talking to a server this wouldn’t be an issue because the server could process all of the information. But, when you add hundreds or thousands of cameras that are doing the same thing, you start to have major issues even if you have a huge pipe.  

The scalability problem is solved by Edge Computing. This is the idea that the “intelligence” of a network can be pushed to the “edge” of the network. Edge computing takes advantage of intelligent devices that are connected to a network by configuring them in such a way that they only send relevant information to the server.

In the security camera example, old security cameras had no way to determine whether the information they were sending was relevant, so they would simply send anything that was recorded as long as they were turned on.

However, with the advancement of IoT devices, these cameras got smarter. Now, using the sensors and software incorporated in them, they are able to send, for example, only video where there is motion.
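
A toy version of that motion test is simple frame differencing: compare consecutive frames and report only when they differ enough. Real cameras use far more robust detection; the threshold and frame values here are invented:

```python
def frame_delta(prev, curr):
    """Mean absolute pixel difference between two grayscale frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def has_motion(prev, curr, threshold=10.0):
    """Hypothetical threshold; a real camera would tune this per scene."""
    return frame_delta(prev, curr) > threshold

static = [100] * 64                # toy 8x8 grayscale frame, flattened
moved = [100] * 32 + [180] * 32    # half the pixels changed brightness

print(has_motion(static, static))  # False: nothing worth sending
print(has_motion(static, moved))   # True: this clip goes to the server
```

The camera runs this check locally and uploads only the clips where the answer is True, which is the entire scalability win in miniature.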

There are several benefits to Edge Computing. Among them, of course, is the capability of significantly reducing the load on the network devices and primary servers, allowing the network to operate more efficiently.

Related: How to become a network engineer in less than a year

Another not-so-obvious benefit is the increased functionality of common devices. Now that technology has made simple everyday devices “intelligent,” these devices can be configured to perform advanced tasks and make decisions “on the spot” without having to completely rely on centralized servers.

Edge Computing can solve the scalability issue, and those who take advantage of this trend will become very valuable as IoT connections grow worldwide. However, solving the scalability issue is only half the battle, which leads us to our next topic:

Solving The Security Issue

Everything is butterflies and roses until you realize that with every IoT connection you are adding one more “hackable” device to your network.

Consider the fact that (probably) it is never going to be financially feasible to build robust security into every single IoT endpoint and you’ll realize that, at least for now, true security for IoT networks will have to come from somewhere else in the stack—that place is the network itself.

The main challenge with IoT security is financial resources. Since it is not viable to build the same level of security into a sensor that helps open a door as into an endpoint where sensitive data is stored, many devices will remain vulnerable to attacks.

With this in mind, visibility into the network becomes crucial. However, even when you can look at the network holistically, knowing whether a device is distributing malware or stealing credentials can be very challenging when you have thousands and thousands of devices.

Add to this the fact that most traffic is now encrypted, which makes securing the network even more difficult. According to Cisco, over 80% of the world’s traffic will be encrypted in just a couple of years. You’d think that this increases security levels. However, Cisco also estimates that 70% of attacks will use encryption as well.

So, we find ourselves at a point where too many devices are vulnerable to attacks that can, and probably will be, camouflaged as encrypted traffic. Hence, detecting which devices have been compromised will be a very challenging task—especially since dealing with software updates in so many devices could easily compromise security as well.

The solution to this is something called Encrypted Traffic Analytics (ETA). ETA allows a network to analyze traffic metadata and, with a very high degree of certainty, determine whether traffic is malicious or not.

ETA uses machine learning to build a history of how encrypted benign traffic looks and behaves. This cutting edge technology allows the network to develop a frame of reference that is later used to infer the probability of traffic being “infected” by comparing it to previous data sets.
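
As a toy illustration of the idea (not Cisco’s actual ETA, which uses much richer features such as packet-length sequences and TLS handshake metadata), a classifier can compare a new flow’s metadata against profiles learned from history. All numbers here are invented:

```python
import math

# Toy flow metadata: (mean packet size in bytes, packets per second).
benign_history = [(900, 20), (1100, 25), (1000, 18)]
malware_history = [(120, 300), (150, 280), (100, 350)]

def centroid(points):
    """Average each feature to build a profile of known traffic."""
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def classify(flow, benign=benign_history, malicious=malware_history):
    """Label a new flow by whichever learned profile it sits closer to."""
    b, m = centroid(benign), centroid(malicious)
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return "benign" if dist(flow, b) <= dist(flow, m) else "malicious"

print(classify((950, 22)))   # benign: looks like normal encrypted traffic
print(classify((130, 320)))  # malicious: resembles the beaconing profile
```

The crucial point is that nothing here decrypts the payload; the verdict comes entirely from metadata, which is what lets ETA work on encrypted traffic at all.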

ETA, of course, walks a fine line between privacy and security, because the technology could be used for things other than detecting malicious traffic. But perhaps true privacy is something that will inevitably have to give way as IoT connections grow all over the world.

Here’s a great explanation of ETA from Cisco Systems’ Sr. Technical Security Lead:


The Bottom Line

IoT connections will only continue to grow over the next few years. With technology becoming cheaper and, as a result, more accessible to everyone in the world, it will be just a matter of time before virtually everything is connected to the internet.

This will prove to be beneficial for the world. Imagine how many tragedies could be prevented or how much easier life could be for children and elders. However, in order for IoT to truly take over, networking technology must advance to the point where this outrageous IoT growth is even possible.

Edge Computing proposes a viable solution to the scalability problem. With the implementation of Edge Computing we will have access to data that has never been leveraged before, which would lead to the simplification of human life.

But this will come at a cost—and that cost could be privacy. With so many intelligent devices that are vulnerable to encrypted attacks, it would be up to solutions such as ETA to help prevent the misuse of critical information.

Unless you’re the NSA, there is really no easy way to know whether a device in a network of hundreds of thousands of devices is distributing malware, or if due to a recent update a device has been made vulnerable to malicious attacks. It is now up to the network itself to prevent these attacks by using cutting edge technology such as ETA.

Cisco Systems is tackling this issue directly. They have announced their Network Intuitive, which uses ETA and Predictive Maintenance to “flag” potential malicious traffic and fix loopholes before they are taken advantage of by hackers.

But even with Cisco’s efforts, the imminent issues we are faced with due to exponential IoT growth are far from being solved. Hence, any IT professional who stays on top of the evolution of IoT, Edge Computing or ETA will become an invaluable asset to any organization.