We already know that rapid changes in information technology are changing the way we deliver education. Computers in the classroom, “Bring Your Own Device” policies, adaptive educational software, digital textbooks, instruction by podcast and a host of other innovations are already old news, and they are transforming the daily classroom routine.
In addition, we must reckon with the notion that—at a speed that will likely astonish—rapid improvements to technology are going to change both the very nature of what it means to be educated and the already fraught relationship between local taxpayers and our public schools.
It is long past time to acknowledge that any classroom experience involving the basic transmission of information—names, dates and bulk data—can be readily provided without a human teacher. To relieve the ongoing burden on taxpayers, it would be far more cost-effective to deliver this kind of “instruction by regurgitation” on a screen, at the student’s pace, with the work checked via an online test.
This type of “teaching” belongs on the pile of protractors and plastic chairs stacked by the dumpster.
However, instruction in higher-order skills—evaluation, synthesis and presentation—is increasingly valuable in our globalized workplace of ideas. Educators who can help their students learn to compose an argumentative essay based on sources, understand complex and challenging written materials, or divine the underlying meanings buried in abstruse data sets are key to building the critical thinking skills the 21st-century workplace demands.
Moreover, near-miraculous improvements in the power and reach of information technology leave us to confront significant challenges to our traditional ideas about what it means to be “educated.”
Can you “read” if a bit of 99-cent software does the reading for you? Do you know how to “research” a topic if you can type search terms into Google? Have you “learned” how to balance a chemical equation if an app on your iPhone takes care of it for you?
We are not quite at the singularity yet, but it is easy to envision a point in the not-too-distant future when to be “educated” might be synonymous with a mastery of the most up-to-date technology. Just as we now chuckle if someone brags about their expertise with MS-DOS, our children might someday be laughing at the Luddites who are not sending their holograms to school each day.
When we get to that point, will anyone really care whether your essay is typed up in proper MLA style?
Will we value an annotated bibliography or a blog? Should students slog through the snow to show up for a course on American literature when they can sit at home in their jammies and watch a topical online course on their laptop while participating in an online class discussion?
Will the very notion of a physical classroom, in fact, become a quaint artifact when we are all communicating via our custom avatars?
This technology-driven transformation is already well underway at our colleges and universities, and no matter how much institutional resistance is thrown up to prevent it, these same changes will be coming soon to a public school near you. Just imagine how disruptive—and terrifying—these changes will be to a centuries-old model of public education whose very existence is predicated on precepts as quaint as a postage stamp.
Because these issues so profoundly question a status quo that pumps out millions of paychecks and provides K-12 day care for a nation packed with overstretched parents, this fundamental inquiry is often short-circuited by a desperate yearning to cling to a time gone by.
However, just as bricks-and-mortar retailers have been supplanted by click-and-order websites, public schools may have to trade in their pom-poms and move to both a completely different model of providing education and a new paradigm for what it means to be educated.