Believe it or not, achieving the highest score in Angry Birds does not translate into technological literacy.
By Ben Zhang, Duke University
Technology is a wonderful thing.
People often take it for granted, but it is amazing to consider how far it has come during the last few centuries.
In the early nineteenth century, the advent of the steam locomotive helped to shrink the world many times over. Then, just over a century ago, humanity first took to the skies, as the Wright Brothers and others around the world powered their flying machines off the ground. Around World War II, the earliest modern computers were born.
And now? Well, the ENIAC and UNIVAC developers of just a few decades ago could never have dreamed of today’s gadgets. Everywhere you go, you can find people texting their best buddies on their smartphones or checking out the latest “NYT” Bestsellers on their Kindles. Perhaps, as a result, you occasionally stop to marvel at how comfortable everyone has become with the latest technologies. Surely, you think, this is evidence of how advanced our society has become.
But just how good are people when it comes to technology?
A 2015 Educational Testing Service study found that U.S. millennials rank among the worst in the world when it comes to applying their technical knowledge to solve problems. In 2016, the National Assessment Governing Board found that less than half of American eighth-graders could be deemed “proficient” when tested on their technological literacy. These results are bound to surprise a majority of millennials, many of whom would likely retort along the lines of, “How is that possible? My parents don’t know anything about texting! How dare you say they’re better at technology than we are!”
It is an argument you have probably heard countless times from younger generations. When it comes to tech, older people are horribly incompetent. They have no idea what a Twitter or a Facebook is. It takes a tremendous amount of patience to teach them the most basic concepts, and so on and so forth.
What is often forgotten, though, is that the main reason today’s kids seem so much better with Instagram, Snapchat and the like is because they are more familiar with the products. If you use something every day, chances are you will become decently knowledgeable about it within a few years.
When many millennials’ parents were growing up, the World Wide Web didn’t exist, “Google” was just a misspelling of the number googol and cell phones were the size of bricks. So, of course there will be growing pains as the older generations adapt to new technologies, but lack of knowledge does not imply incompetence. After all, are all modern teenagers called dumb for not knowing how to use a record player or, heaven forbid, a VCR?
If you really want a glimpse at how little people know about technology, ask them to explain how their cell phone or computer works. Can they name some of the moving parts under the hood? Can they identify the driving forces behind Wikipedia or other websites? You don’t even need to mention DDoS attacks.
Chances are, you will have already gotten a blank stare in return. Small wonder that Hollywood fails time and time again when it comes to depicting things such as hacking. Though many people claim to be masters of technology, much of their so-called “knowledge” consists of oft-repeated clichés and stereotypes, such as that of the four-hundred-pound hacker.
America’s technological illiteracy isn’t just a matter of youthful hubris; it has far-ranging consequences that are becoming more evident with each passing day. Cybersecurity has increasingly begun to dominate the headlines, especially during last year’s election cycle.
From Russian hackers allegedly aiding and abetting the Trump campaign to the San Bernardino iPhone kerfuffle, Americans have shown that they are willing to talk about complex issues involving technology and privacy. But what good is a discussion if people don’t know enough about the topic in the first place? The price of ignorance is steep in this case.
Nowadays, you routinely hear about people falling victim to phishing scams, identity theft or the latest diabolical ransomware. What did they do to end up in such dire straits? Well, they probably entered their credit card information on an insecure site, blithely downloaded and installed a virus or used “123456” or “password” as their password, despite being told a thousand times not to do so.
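Those last two entries are not hypothetical: “123456” and “password” top nearly every leaked-password list. A minimal sketch of the kind of check many sign-up forms run, with an illustrative (not exhaustive) deny-list of common passwords:

```python
# Illustrative deny-list of frequently leaked passwords; a real service
# would check against a much larger corpus of known-compromised passwords.
COMMON_PASSWORDS = {"123456", "password", "qwerty", "letmein", "111111"}

def is_weak(password: str) -> bool:
    """Return True if the password is too short or on the deny-list."""
    return len(password) < 8 or password.lower() in COMMON_PASSWORDS

print(is_weak("123456"))                        # True: the classic offender
print(is_weak("correct horse battery staple"))  # False: long and uncommon
```

Even a check this simple would stop the most common mistakes the paragraph above describes, which is exactly why so many sites now refuse those passwords outright.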
In short, even the most tech-savvy individuals slip up and do things that they later come to regret. The main culprit? A lack of education. If people knew more about the pitfalls of their actions, they would likely be more careful and try to avoid making the same mistakes. Some U.S. schools have tried to fill the gaps in students’ knowledge by allowing computer coding to count towards their foreign language requirement. This approach, however, fails to address the fundamental issues.
Instead, technology classes should be required for all young schoolchildren, taking their place amongst the likes of math and civics. No, not all students will use what they learn to become the next Bill Gates or Mark Zuckerberg, but they certainly can learn enough of the basics to remedy some of the more odious problems of the day. Maybe airports would no longer shut down because of their reliance on Windows 3.1. Perhaps website developers would learn to sanitize their database inputs, and perchance hacks of hundreds of millions of accounts would no longer become commonplace.
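“Sanitizing database inputs” in practice usually means using parameterized queries rather than gluing user input into SQL strings. A minimal sketch using Python’s built-in `sqlite3` module (the table and data are made up for illustration):

```python
import sqlite3

# A throwaway in-memory database with two illustrative rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")
conn.execute("INSERT INTO users VALUES ('bob', 'bob@example.com')")

# Vulnerable: user input is pasted directly into the SQL string, so an
# attacker can inject "' OR '1'='1" and dump every row in the table.
user_input = "' OR '1'='1"
unsafe = f"SELECT * FROM users WHERE name = '{user_input}'"
print(conn.execute(unsafe).fetchall())  # both rows leak

# Safe: a parameterized query treats the input as data, never as SQL,
# so the injection string just matches no user named "' OR '1'='1".
safe = "SELECT * FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # empty result
```

The fix is one line of discipline, which is precisely the kind of basic lesson a required technology class could teach.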
In the future, technology will only become more and more integrated into the fabric of society. Computers will be everywhere, from your garage door to your satellite navigation to your toaster, but progress always brings with it certain risks. For every good white-hat hacker out there, there exists a black-hat counterpart who would gladly take all of the money from your bank accounts without a second thought. Luckily, these people can be thwarted with some planning and patience.
Over time, members of Generation Y and beyond can be taught the fundamentals of technology, so that when they enter the workforce, they will be better prepared for what is to come. The task may seem daunting. After all, computers and the like are incredibly complex (Moore’s law, anyone?), but perhaps sometime in the near future, the United States will achieve 100 percent technological literacy. Maybe then, pop culture could get more “Mr. Robots” and fewer “Scorpions.”