Getting Better: Is the 21st Century Really All That Different?

 
 

December 31st, 1999 was not your average New Year’s Eve. While excitement about the turn of the century was palpable, uneasiness hung in the air as the second hand of everybody’s watch made its final revolution of the 20th Century. Y2K, the catchy shorthand for the collective nationwide fear that computer systems were doomed to fail at the stroke of midnight, was on the tip of everybody’s tongue. The mere utterance of those three characters led to intense debate about the implications of our computer systems failing. Many feared that banks would fail, airplanes would lose contact with control towers, city infrastructure would collapse, and we’d essentially fall into a state of anarchy. Some people even “prepped,” hoarding food, weapons, and supplies. Yet, at 12:00:00 a.m. on January 1st, 2000, fireballs didn’t rain from the sky. In fact, not much happened at all.

Last week, I read an article that began with the image I’ve recreated below. The authors then asked, “In the 21st Century, which type of student do YOU believe will attain success?” The answer is supposed to be self-evident. But that answer is wrong.


STUDENT A
Math
English
Science
Social Studies


STUDENT B
Content Mastery
Communication
Collaboration
21st Century Skills

“21st Century Learning,” like many educational ideologies, will likely elicit five different definitions from five different people. Generally, the term refers to certain competencies that schools believe their students need to master in order to succeed in today’s world: skills like digital literacy, problem-solving, and the “4Cs” (communication, collaboration, critical thinking, and creativity). Yet one can’t help but reflect: haven’t human beings used those skills for centuries?

Humans have landed a man safely on the moon and returned him to Earth. Humans have built a railroad extending from Council Bluffs, Iowa to the San Francisco Bay. Humans have invented the printing press to allow the dissemination of a wealth of knowledge in written form. Out of rugged terrain, humans carved the Erie Canal from the Hudson River to Lake Erie, connecting the Great Lakes to the Atlantic Ocean. Humans have invented the steam engine, the automobile, and the airplane to transport other human beings around the world. Humans have discovered penicillin, developed vaccines, and devised innumerable surgical techniques to fight disease and radically alter human life expectancy. Humans have produced the cell phone, the computer, and the internet, connecting people in disparate parts of the world and forever changing the way we communicate. There are scores of other examples.

Daisy Christodoulou put it memorably in her influential work Seven Myths About Education: “it is quite patronising to suggest that nobody before the year 2000 ever needed to think critically, solve problems, communicate, collaborate, create, innovate, or read.” Humans have been exercising “21st Century Skills” for centuries; one could argue that nothing, in fact, is more human.

As an educator, I am accustomed to the constant emails and advertisements concerning technology and digital literacy. Whether it’s outfitting an entire school with iPads or equipping a technology lab with the latest 3D printer, it’s all the rage. Yet while technology, used correctly, can be a great supplemental tool for engaging and educating students, technology in and of itself is not going to cause a student to grow and achieve academically. At times, poor instructional practices from the traditional classroom are simply replicated on digital platforms.

 
 

In the frenzy to expose students to technology in the hopes that it will expand their digital literacy, we forget that 1) most of us currently have, and will continue to have, a casual relationship with technology, and 2) most of the technology we use on a daily basis is extremely easy to operate. We’ve all seen a three-year-old playing games on a cell phone and a 70-year-old posting a status update on Facebook. Cal Newport, in his book Deep Work, explains eloquently that “The complex reality of the technologies that real companies leverage to get ahead emphasizes the absurdity of the now common idea that exposure to simplistic, consumer-facing products - especially in schools - somehow prepares people to succeed in a high-tech economy. Giving students iPads or allowing them to film homework assignments on YouTube prepares them for a high-tech economy about as much as playing with Hot Wheels would prepare students to thrive as auto mechanics.”

To expand on the idea above: there’s nothing inherently “wrong” with prioritizing the skills commonly associated with 21st Century Learning. We all want to educate students and raise children who will eventually embody those competencies. Yet I would argue, based on research like this from respected cognitive scientists, that we can’t put the cart before the horse. Critical thinking is not a “general skill” that can be taught. One’s ability to think critically about a topic depends heavily on the extent of one’s background knowledge of that topic. Try it yourself; go to the New York Times and pick a topic: the impeachment inquiry, the intensifying conflict in Syria, the protests in Hong Kong, the recent election in Israel, climate change, the tense relationship between Japan and South Korea, the turbulent situation in Kashmir. Your ability to think critically about whichever topic you choose will be wholly dependent upon how much you already know about it.

It’s also very difficult to be creative without a strong knowledge base to draw on. We often perpetuate the myth of the “creative genius,” when in actuality, a tremendous amount of time and hard work leads to masterful achievement. Mozart didn’t meander through life, occasionally producing masterpieces. He was an exceptionally hard worker, constantly writing music, going to concerts, listening to others’ work, and reviewing and rewriting his own. Michelangelo didn’t just wake up one morning and carve the captivating David. He spent thousands upon thousands of hours as a teenage apprentice learning from the master artists of the Renaissance. Declaring that creativity is a “skill” that can be exercised in the absence of deep domain-specific knowledge is like saying commercial airline pilots would be ready for wartime dogfights if called upon.

 
 

To be fair, there are some “21st Century Skills” that should be incorporated into a student’s education, especially in the area of digital literacy. It is vital, particularly in light of events that have transpired in our country over the past few years, that our students be able to evaluate the trustworthiness of information they find online. Yet I would argue that a complete rethinking of our educational system is overkill; a handful of lessons on evaluating online sources, spread across multiple grades, would be sufficient. There is no need for a sledgehammer when a scalpel will do just fine.

Returning to the comparison above, I would emphatically choose Student A, so that they’d be better prepared to become Student B in the future. Providing our students with a trove of content, vocabulary, and background knowledge will allow them to express the 4Cs down the line. To be clear, there is no inherent problem with “21st Century Skills,” just with the process through which many believe they are generated. Skills like the 4Cs are the product, not the process. Providing students with a timeless, classical, broad education will prepare them for the unforeseen jobs they will eventually hold. Every single worker in today’s technology sector, by definition, grew up in an era with significantly less technology. Yet they’ve managed. We’ve been preparing our youngest generation for “jobs that don’t yet exist” since the beginning of time. The kids will be alright.

On the subject of Y2K, I later found out that the scare was actually pretty serious. To conserve expensive memory, many computer programs stored four-digit years as just two digits, and people had legitimate concerns about what would happen when the “99” rolled over to “00.” Hundreds of billions of dollars were spent worldwide to address the problem, and computer programmers worked relentlessly in the years leading up to the new millennium to fix the bug and avoid major issues. Yet, when all was said and done, the episode did not lead to any major systemic change in how computer programmers approached their work. They simply communicated, collaborated, problem-solved, and critically thought their way to a solution. In the 20th Century, no less.
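For the technically curious, here is a minimal illustrative sketch in Python of why two-digit years break date arithmetic, along with the “windowing” fix many remediation teams applied. The function names and pivot value are my own, for illustration only, not any actual legacy code.

def years_elapsed(start_yy, end_yy):
    # Naive two-digit arithmetic, the way many pre-2000 programs stored dates.
    return end_yy - start_yy

print(years_elapsed(99, 0))   # -99: an account opened in "99" and checked in "00"

def years_elapsed_windowed(start_yy, end_yy, pivot=50):
    # A common remediation: interpret 00-49 as 2000-2049 and 50-99 as 1950-1999.
    # (The exact pivot year varied from system to system.)
    def expand(yy):
        return 2000 + yy if yy < pivot else 1900 + yy
    return expand(end_yy) - expand(start_yy)

print(years_elapsed_windowed(99, 0))  # 1: the answer the bank actually wanted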

Ben Pacht is the Director of Improvement of the School Performance Institute in Columbus, Ohio. The School Performance Institute is the learning and improvement arm of the United Schools Network. Send feedback to bpacht@unitedschoolsnetwork.org.