Throughout history, people have been skeptical of new technology and its potential impact on human cognition. From Socrates to the Internet, many new technologies have been met with fear and concern that they would negatively affect people's ability to think, remember, and communicate effectively. We will explore the historical context of this skepticism, its basis, and the reasons behind society's hesitancy toward new technology, such as self-driving cars and artificial intelligence (AI).
Socrates was famously skeptical of the technology of writing, which was a new and emerging skill during his time. In Plato's "Phaedrus," Socrates expresses his concerns about the use of writing, stating that it would lead to the decline of memory and the ability to think deeply about ideas. He argued that writing things down would make people forgetful, as they would no longer need to rely on their memory. Instead, they would trust in the written word and lose the ability to think deeply and critically about ideas.
"[Writing] will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing."
Socrates did not fear writing as a technology in itself; rather, he held that in philosophy and politics, speech is superior to writing. In his words,
“Writing is unfortunately like painting; for the creations of the painter have the attitude of life, and yet if you ask them a question they preserve a solemn silence… And when [speeches] have been once written down they are tumbled about anywhere among those who may or may not understand them, and know not to whom they should reply, to whom not: and, if they are maltreated or abused, they have no parent to protect them; and they cannot protect or defend themselves.”
Socrates argued that dialogue is more effective at conveying the principles of justice, goodness, and nobility. True knowledge and wisdom come from internalizing these principles and cultivating them in oneself and others through dialogue and debate, rather than from reciting or reading written works without critical analysis. In essence, the value of knowledge lies not in the medium through which it is communicated but in the active engagement with and critical examination of ideas that it inspires. Socrates had a point, which is why we still rely heavily on interviews, debates, and lectures to get at the truth.
The Printing Press
The printing press offers one of the earliest examples of the fear that a new technology would cause a decline in cognition. When the press was invented in the 15th century, some worried that it would erode the quality of education and memorization skills, since people would no longer need to memorize long passages of text. Conrad Gessner, a Swiss naturalist of the mid-1500s, worried that with the invention of the printing press and the growth of libraries (both of which he supported), readers could become overwhelmed with information. In response, he created the Bibliotheca Universalis in 1545, which cataloged every book he could find written in Latin, Hebrew, and Greek. This work paved the way for the creation of encyclopedias.
The Telephone and the Internet
The invention of the telephone in the late 19th century also sparked concerns about its potential impact on cognition. Some feared that the phone would lead to a decline in face-to-face communication, in the ability to read social cues, and in writing skills, as people would no longer need to write letters to communicate with one another.
More recently, the internet has been blamed for declining cognition, particularly among young people. Some worry that the easy availability of information on the internet has made people lazy and less likely to engage in critical thinking. Others are concerned that social media has led to a decline in the ability to focus and concentrate, as people are constantly distracted by notifications and updates.
Self-Driving Cars and Artificial Intelligence
Artificial intelligence (AI) is quickly becoming part of our daily lives, from self-driving cars to language processing tools. A recent study found that most people understand AI's benefits but remain deeply concerned about failures in the technology. Concerns about self-driving cars go beyond the technology itself. Many people do not trust the technology, and that trust is difficult to earn when they see news headlines like "Self-driving car runs over a pedestrian." So the question becomes: who is to blame? In a standard vehicle, the driver is of course to blame, as they should have been more aware and cautious. But is the driver to blame when the car was supposed to drive itself? Drivers are expected to pay attention even when they are not actively driving.
The decrease in mental effort as technology does more of the work reflects what psychologists call "cognitive offloading." One study found that when people rely on a computer to navigate the streets of London, they show significantly less brain activity in the hippocampus and the prefrontal cortex than participants who actively think about the routes. The results suggest that navigation systems like GPS may decrease learning and memory while increasing reliance on technology. This study is often cited as evidence for why many are concerned about cognitive offloading in self-driving cars. Yet although people fear failures in the technology and worry about its effects on cognition, this fear does not appear to reduce their likelihood of using self-driving cars.
Similarly, many worry about the effect that AI software such as ChatGPT will have on education. When students are assigned an essay on, say, the philosophical debate over free will, they could ask ChatGPT a series of questions to quickly generate enough text for the essay. However, ChatGPT's responses lack voice and originality. Educators are starting to leverage this weakness, creating assignments that examine the AI's writing ability and how a human writer's voice brings life to the writing.
Another concern is that people will use ChatGPT to find answers to specific questions. A recent paper outlined ChatGPT's failures in areas such as critical thinking, math, coding, facts, and bias. Some of its responses were blatantly wrong, while others were self-contradictory. The author offers a list of suggestions for improving ChatGPT; one of the more impactful would be a confidence rating, whereby the AI reports how confident it is in each answer. Currently, ChatGPT makes statements that sound confident but are incorrect, and a learner who does not know the correct answer may be misled. A confidence rating might prompt the learner to inquire further before accepting an answer as true.
What Does Research Say?
The effects of increased technology use cut both ways. Consider attention spans: research in 2004 found that people would spend, on average, two and a half minutes engaged in a task, such as working on a Word document. By 2012, that average had dropped to 75 seconds; today, it is 47 seconds. In other words, people switch tasks roughly every 47 seconds. (Interestingly, average movie-scene length has also shrunk, from about 12 seconds in the 1950s to 4 seconds in the 2000s.)
This rapid task switching comes at a cost: it increases the stress response and reduces productivity. Every time you switch tasks, it takes time to re-engage with the previous one. Suppose, for example, that you are writing a blog post and receive a phone notification about a social media post. Whether or not the post pulls you in (you might glance at the message and put your phone down, or open the post and read or watch it), it will take time to get back into the flow of writing. Even if re-engaging takes only a few seconds, those seconds add up when it happens every 47 seconds.
Some research also suggests that technology can enhance cognitive skills such as problem-solving and spatial reasoning. For example, video games have been shown to improve visuospatial attention, though possibly at the cost of reduced emotional and social processing. Other research asked whether a video game could improve cognitive function in older adults (60-85 years old). The authors had participants play either the actual video game (experimental group), a sham version of the game (active control group), or no video game for six months (control group). They also had a group of 20-year-olds play the game so their scores could be compared with those of the older groups. The training significantly improved cognitive function (sustained attention and working memory), at both the behavioral and neural levels, in the experimental group but not in the two control groups. The experimental group also achieved game scores comparable to those of the 20-year-olds. Another video game developed by the same lab became the first game-based therapeutic to receive FDA approval for treating children with ADHD.
Technology has come a long way since the time of the ancient Greeks. It is not new technology itself we fear; it is how it may be misused. The research discussed here shows that technology can be helpful or harmful depending on how it is used. Socrates may have been correct that new technology affects critical thinking. Thankfully, however, Plato did not share Socrates' disdain for writing down speeches; otherwise, we would not be able to read and dissect his arguments. And, like Conrad Gessner, we need to understand the "right" ways to use technology and put it to work assisting people.
How the next innovation affects our ability to think and behave is up to us. We can let new technology take control and potentially lose critical cognitive and social skills, or we can harness its power to improve our lives: to develop discussion topics that sharpen critical thinking, to create safer driving conditions and work environments, and much more.
Author: Robert Torrence, Ph.D.