The last forty years have played host to an unprecedented surge in technological integration throughout all classes of society. Devices that only a few decades ago were absurd to the point of science fiction are now staples of modern life. This trend was first recognized in 1965 by Intel co-founder Gordon Moore, who, in his seminal article, “Cramming more components onto integrated circuits,” predicted that the number of transistors within an integrated circuit would double every year; a decade later he revised that estimate to every two years. In essence, the processing power of the average CPU has maintained the same exponential growth rate, doubling roughly every two years, for the last fifty years. Only in recent years has this growth rate begun to decay, and even then the doubling period has stretched by only about half a year.
Technological advancement on this scale was something society was objectively not ready for, and as such, growing pains were felt by the generations caught in the middle. The phrase “only ‘90s kids will remember _____,” while more of a meme these days than anything else, references a turning point in American society in which virtually every aspect of modern life (appliances, infrastructure, entertainment, occupations, mass media, etc.) underwent some degree of electronic integration. The ‘90s saw the rise of the internet, the first Browser War, the first iteration of the modern Windows OS, 3D video games, and 2G cell phones. The phrase exists because of the sheer amount and variety of technology that was outmoded in the ‘90s. This schism divided the generations: Gen Y (the millennials, those ‘90s kids) became the last generation to recall a time before mass integration, and Gen Z became the first generation with no recollection of that time. Consequently, Gen Z has taken not only all the new technology, but also the rate at which that technology is being replaced, for granted. What these generations share, however, is a pervasive lack of understanding of the technology that permeates all facets of modern life.
The average consumer, whether 89, 34, or 13, has virtually no idea how his or her devices work conceptually. Much of this is due to the abstract nature of computers and even basic electronics, which lack the intuitiveness inherent in many other technologies. Take guns, for instance. Though I am by no means an expert in firearms, through observation I can note that a pulled trigger causes a hammer to strike the small explosive primer on the back of a cartridge, accelerating the bullet out of the barrel. I know this not because I sought the information out, but because it can be casually discerned. The same cannot be said for Wi-Fi: the average person can determine that there is some sort of link between the router and their device, and that data is somehow transferred between the two, but anything beyond that would likely require either research or a deliberate investigation of the process. And even if someone found the impetus to become well-versed in some aspect of modern electronics, the perceived complexity and strangeness of the field could easily drive them away.
Many assume that society’s general ignorance towards computer science is of no significance. Indeed, for most concepts that are not intuitive, such as computer science, the lack of comprehension among the general populace bears very few, if any, consequences for society. For example, the method by which airplanes generate lift is not common knowledge, yet situations in which the average person would benefit from this knowledge are extremely uncommon. Normally, for any sort of complex field, understanding can simply fall to a specialized group in society; in the case of airplanes, to pilots, aeronautical engineers, and others within the industry. Similarly, the responsibility of maintaining our electronics has been relegated to the computer science industry.
However, the issue with this field is that electronics, unlike airplanes or guns, play a part in virtually every aspect of modern life. Consider all of the electronics involved in a trip to the grocery store. If your car is less than thirty years old, it relies on an electronic ignition system. If it is less than four, there is a good chance it has some form of touch-screen interface and internet connectivity. In the future, your car could very well drive itself. Each stoplight uses circuitry and sensors to determine turn order. After shopping, you use an electronic card to transfer money to the store, and the entire time a micro-computer containing virtually all of your personal information sits in your pocket. Succinctly: computers are everywhere, and are therefore unavoidable. Computer science must be widely understood because it is a uniquely integrated field, built upon decades of rapid evolution, that virtually all of society interacts with on an hourly basis. Technical illiteracy on this scale creates a massive risk of exploitation for the average user at the hands of those who have mastered, to even a slight degree, computer science.
Be it individuals or organized teams, there are many active threats that the average person is entirely unprepared for, and many users cannot truly conceive of the level of damage that can be done to them. Consider the hacking of a smartphone, most likely perpetrated by an individual. The hacker would gain access to the user’s email account, and through it could take over logins on any number of sites via password resets, which, depending on the hacker’s motivations, could lead to hijacked social media, stolen bank account information, and/or identity theft. The hacker might not only gain access to the user’s profiles, but could lock the user out of their own accounts entirely. Additionally, if the user is unaware of the intrusion, he or she may continue to use the compromised phone while the intruder has access to its camera and/or microphone, potentially allowing illicit photos or video to be taken without the user’s consent or awareness. More advanced operations, like the TeslaCrypt and UltraCrypter ransomware campaigns, make money by gaining access to users’ systems and encrypting all of their files, then selling the victim a decryption key that will allow the victim to get their files back. While these scenarios are relatively uncommon, more generic attacks like automated data breaches are estimated to strike roughly half of adults each year. If that number seems ludicrous, it’s because it is.
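The ransomware scheme described above hinges on a simple asymmetry: files encrypted with a key only the attacker holds are effectively unrecoverable without it. A minimal sketch in Python illustrates the principle; it uses a stdlib one-time-pad XOR for clarity (real ransomware uses hybrid AES/RSA schemes, and the file contents here are invented), but the lesson is the same: whoever holds the key holds the files.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte (one-time pad)."""
    return bytes(d ^ k for d, k in zip(data, key))

# The victim's file contents (hypothetical).
plaintext = b"Family photos, tax records, thesis draft..."

# The attacker generates a random key as long as the data and keeps it secret.
key = secrets.token_bytes(len(plaintext))
ciphertext = xor_bytes(plaintext, key)

# Without the key, the ciphertext is statistically indistinguishable from
# random noise; only the attacker's key maps it back to the original.
recovered = xor_bytes(ciphertext, key)
assert recovered == plaintext
```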
These hacks are not unavoidable; rather, their numbers are the result of ignorance on the part of users. Uneducated users often engage in risky behaviors, unwittingly making themselves vulnerable to attack, and, as many attacks are now fully automated, a user who consistently maintains a vulnerability is all but guaranteed to suffer some level of infection. Furthermore, when people think of risky behaviors they tend to point to things like porn sites and pop-up ads. While those are certainly high-risk, there are many other risks that people are generally not aware of: shared USB drives, open Wi-Fi networks, unsecured hyperlinks, and, yes, opening email attachments all fall under the blanket of risk. These actions, while completely avoidable, and capable of being done safely with a bit of preparation, are common mistakes. A much easier solution than grasping the basics of computer science would be to produce a list of “don’ts” informing people about risky behaviors and how to perform them safely or avoid them altogether. However, there are two big issues with such a list. First, due to the intense rate of technological advancement, new vulnerabilities are both created and discovered constantly, so any list would be perpetually out of date. Second, hackers may be aware of vulnerabilities long before the rest of the industry is. If, on the other hand, users understood the essential concepts of computer science, they could independently determine whether or not their actions were secure, eliminating the need for a blacklist and greatly diminishing the success rate of hackers.
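Part of what understanding the essentials buys you is the ability to run such checks yourself rather than consult a list. As a purely illustrative sketch (the domain names and the specific red flags are my own assumptions, not an exhaustive ruleset), here is the kind of reasoning a security-aware user applies to a hyperlink before clicking it:

```python
from urllib.parse import urlparse

def link_warnings(url: str) -> list:
    """Return red flags for a URL, mimicking checks a cautious user makes."""
    warnings = []
    parsed = urlparse(url)
    if parsed.scheme != "https":
        warnings.append("not HTTPS: traffic can be read or altered in transit")
    host = parsed.hostname or ""
    # A bare IP address in place of a domain is a classic phishing tell.
    if host.replace(".", "").isdigit():
        warnings.append("bare IP address instead of a domain name")
    # So are lookalike misspellings of well-known brands (hypothetical examples).
    if "paypa1" in host or "g00gle" in host:
        warnings.append("domain imitates a well-known brand")
    return warnings

print(link_warnings("http://paypa1-login.example"))
```

The same habit of mind, not this particular script, is the point: a user who knows what HTTPS, a hostname, and a redirect are can improvise these checks for whatever new trick appears next year.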
I won’t claim that learning computer science is easy, or that doing so will guarantee that you won’t get hacked. Frankly, computer science is a complicated field, and even the basics take some time to learn. What it comes down to is odds. The more risky behaviors you take part in, intentionally or otherwise, the higher your chances of getting hacked, and in today’s society it’s impossible to avoid all of them. As time goes on and computers become even more integrated into our daily lives, the damage hackers can do will only grow more substantial. By taking the time to become well-versed, however, you can effectively minimize the risk of attack and, in all likelihood, repel attackers without incident.