The End of College: Creating the Future of Learning and the University of Everywhere, by Kevin Carey
From a renowned education writer comes a paradigm-shifting examination of the rapidly changing world of college that every parent, student, educator, and investor needs to understand.

Over the span of just nine months in 2011 and 2012, the world’s most famous universities and high-powered technology entrepreneurs began a race to revolutionize higher education. College courses that had been kept for centuries from all but an elite few were released to millions of students throughout the world—for free.

Exploding college prices and a flagging global economy, combined with the derring-do of a few intrepid innovators, have created a dynamic climate for a total rethinking of an industry that has remained virtually unchanged for a hundred years. In The End of College, Kevin Carey, an education researcher and writer, draws on years of in-depth reporting and cutting-edge research to paint a vivid and surprising portrait of the future of education. Carey explains how two trends—the skyrocketing cost of college and the revolution in information technology—are converging in ways that will radically alter the college experience, upend the traditional meritocracy, and emancipate hundreds of millions of people around the world. Insightful, innovative, and accessible, The End of College is a must-read, and an important contribution to the developing conversation about education in this country.
The End of College: Creating the Future of Learning and the University of Everywhere, by Kevin Carey
- Amazon Sales Rank: #129385 in Books
- Author: Kevin Carey
- Published: 2015-03-03
- Original language: English
- Number of items: 1
- Dimensions: 9.25" h x 6.33" l x 1.00" w
- Weight: 1.00 pounds
- Binding: Hardcover
- 288 pages
Review

“Carey elegantly blends policy analysis, reportage [and] memoir into a hard-charging indictment of the eggheads and ivory towers many Americans love to hate….Part alternative history and part road trip, the book is a tour guide to higher education…[A] readable and thoughtful book…[Carey has] a gift for finding fascinating characters and explaining complex ideas clearly.” —The New York Times Book Review

“Thorough ... this is thought-provoking, fascinating material.” —The Washington Post

“In The End of College, Kevin Carey delves into some of the most complicated – and important – issues facing students, parents and educators today. This is a fascinating read.” —Daniel H. Pink, author of Drive and A Whole New Mind

“Don’t even think about going to college (or paying for it) until you have read this book. Kevin Carey has changed forever how I think about the modern American university. The End of College delivers a scathing indictment of the past and present—alongside a glorious prediction for what comes next.” —Amanda Ripley, author of The Smartest Kids in the World
About the Author Kevin Carey directs the Education Policy Program at the New America Foundation. His writing has appeared in The New York Times, The New Republic, The Washington Post, and elsewhere. Carey lives with his wife and daughter on Capitol Hill.
Excerpt. © Reprinted by permission. All rights reserved.
• The Secret of Life
• A Sham, a Bauble, a Dodge
• The Absolut Rolex Plan
• Cathedrals
• Learning Like Alexander
• Thunder Lizards
• Anything for Anyone, Anywhere
• Imaginary Harvard and Virtual MIT
• Less Like a Yacht
• Open Badges
• The Weight of Large Numbers
• Your Children and the University of Everywhere
1
The Japanese television crew and excitable LA producer were the first signs that something unusual was happening at MIT.
It was a warm evening in April, barely a week after a pair of mad bombers had terrorized the city of Boston and shot a campus security guard dead in front of Building 46, the glass-and-stone complex I was standing inside. Most colleges name their structures after wealthy donors. MIT likes to keep things rational, so when the Brain and Cognitive Sciences Complex was completed in 2005, the administration just gave it the next number in line. As I walked through the building’s ninety-foot-high atrium toward a nearby lecture hall, I recognized many of the students who had, like me, spent much of the last semester learning about genetics from one of the smartest people in the world.
Introduction to Biology—The Secret of Life is a mandatory course for all MIT freshmen. It’s taught by a professor named Eric Lander, who is a walking advertisement for the triumphs of American higher education. Born in the working-class Flatlands neighborhood of Brooklyn, Lander tested into Stuyvesant High, an elite New York City public school, where he discovered an aptitude for math. From there it was a rapid climb up the ladder of academic meritocracy: International Math Olympiad, Princeton valedictorian at age twenty, Rhodes Scholarship, Harvard professorship, MacArthur “Genius” award.
But unlike some of his monastic colleagues, Eric Lander was a sociable person with eclectic intellectual tastes. Pure mathematics was beautiful and thrilling, but it was also a solitary pursuit. Lander liked to interact with other smart people and dive into whole new fields. First he switched from exploring math to teaching economics at Harvard Business School. Then a connection with an MIT biologist led him to the field that would become his calling: uncovering the mysteries of genetics. Lander led the Human Genome Project, which created the historic first complete sequence of human DNA. He went on to cochair President Obama’s Council of Advisors on Science and Technology and to found a multidisciplinary medical research center. When the Boston Globe listed the 150 most important things ever done at MIT, Lander and his work ranked second, after the guy who invented the World Wide Web.
Lander is also a very good teacher. Many great scholars are inept in the classroom. Their intense, internal focus works against them when it comes to forming connections with students. Lander is outgoing, personable, and almost as good at lecturing as he is at discovering new ways to unravel the meaning of human DNA. As I entered the lecture hall where his class would take place, a group of freshmen had already staked out seats in the front row. One, a young woman named Abbey, was standing expectantly by her desk, holding a cupcake in a plastic box. Abbey grew up in a suburb of Salt Lake City. Her father said he would only pay for her to attend two colleges other than Brigham Young: Harvard and MIT. So she aced her college boards and made the trip to Cambridge, Massachusetts. As Lander reached his lectern, she walked up to him, face flushing, and gave him the cupcake. “Happy anniversary!” she said, before quickly turning back to her seat. Because of this class, she later told me, she plans to major in chemical and biological engineering.
I settled into a seat in the back row, flipped up the desk, and pulled out my study materials. After two months of watching Lander lecture, my class notebook was almost full. We had begun with the building blocks of biochemistry before proceeding on a long voyage of intellectual discovery, through Mendelian genetics, Crick and Watson’s double helix, and the modern age of biotechnology. Lander used a storyteller’s flair for drama as he worked through complex explanations of biochemistry, genetic mutation, and RNA transcription. When his tales reached a point of crucial discovery (often involving a Nobel Prize awarded to one of his MIT colleagues), you could see sparks of enthusiasm in his eyes—even from the last row.
After each lecture, my fellow students and I would retreat to our laptops to tackle MIT’s famously challenging “problem sets,” exercises meant to test and solidify the knowledge we had gained in class. I found myself staying up late at night, trying to make sense of RNA base pair chains and a list of stop codons, restriction enzymes, and plasmids. But I could always ask other students or teaching assistants for help—MIT encourages this—and I eventually made it through the assignments. At the end of the semester, I passed the two midterms and the final with a respectable 87 percent average. In exchange, MIT gave me an official university document certifying that I had completed Introduction to Biology—The Secret of Life.
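To give a concrete taste of the molecular bookkeeping those problem sets trafficked in, here is a toy sketch of my own (not an actual 7.00x exercise): the genetic code ends translation at one of three stop codons, UAA, UAG, or UGA, and finding where a ribosome would halt in a given reading frame takes only a few lines of Python.

```python
# Toy sketch (not an actual 7.00x problem). Translation ends when the
# ribosome reaches one of the three RNA stop codons.
STOP_CODONS = {"UAA", "UAG", "UGA"}

def first_stop(rna: str, frame: int = 0) -> int | None:
    """Return the index of the first stop codon in reading frame 0, 1,
    or 2, or None if the frame contains no stop codon."""
    for i in range(frame, len(rna) - 2, 3):
        if rna[i:i + 3] in STOP_CODONS:
            return i
    return None

seq = "AUGGCCUUAGAACGUUAAGGC"  # invented sequence; begins with the start codon AUG
print(first_stop(seq))         # -> 15 (the UAA at position 15 ends translation)
```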
Yet I was not, and never have been, an MIT undergraduate. I did not fill out a college application or financial aid form or write a personal essay explaining why my life-altering experience founding a shelter for homeless marmots made me uniquely qualified to attend MIT. In fact, until that evening, I had never set foot on the MIT campus in my life. Nor did I pay MIT any tuition, which runs over $42,000 per year, plus another $15,000 for books, fees, and room and board. The entire Secret of Life course—lectures, problem sets, class forums, exams, and certificate—was totally free.
And I wasn’t the only one. At the same time, all around the world, tens of thousands of other students were taking The Secret of Life for free. There were doctors and medical students from South America, a group of high schoolers in Greece, a seventy-two-year-old retired chemist living in the Netherlands, a Sri Lankan college dropout, a full-time homemaker in India, a Ukrainian software engineer, and a nurse in the Philippines. One young woman wrote on the class message board, “My dad is letting me take this instead of my regular 8th grade science. I am 13 years old.” Most of them had never been to the United States and could not imagine experiencing, or affording, an elite American education. But they were, in most of the ways that mattered, doing exactly that.
That’s why, throughout Lander’s lecture that evening, a small team of camera operators from Japan’s NHK (Japan Broadcasting Corporation) television network prowled the sides and back of the lecture hall, shooting a news documentary under the direction of a California-based TV producer eager to get shots of Lander and his students at work. It’s also why the Japanese camera techs had to work around an entirely different production crew that was also in the room, filming Lander from a variety of angles in digital HD.
The class was being produced by a brand-new online educational organization jointly run by MIT and Harvard University. For the better part of the last 150 years, those two institutions have sat, less than two miles apart, at the zenith of global higher education. They are bitter rivals for the world’s best scholars and students. Yet they had, in this time and place, decided to put their rivalries aside.
This unlikely collaboration came about because higher education now stands at the brink of transformation by information technology. Harvard and MIT are accelerating seismic forces that threaten colleges that have stood, largely unchanged, for decades or more. These historic developments will liberate hundreds of millions of people around the world, creating ways of learning that have never existed before. They will also upend a cornerstone of the American meritocracy, fundamentally altering the way our society creates knowledge and economic opportunity.
Whether they know it or not, Harvard and MIT are helping to build a new and unprecedented institution: the University of Everywhere.
The University of Everywhere is where students of the future will go to college. Parts of it will be familiar to anyone who’s gotten a great college education, because some aspects of human learning are eternal. But in many respects, it will be like nothing that has come before.
At the University of Everywhere, educational resources that have been scarce and expensive for centuries will be abundant and free. Anything that can be digitized—books, lecture videos, images, sounds, and increasingly powerful digital learning environments—will be available to anyone in the world with an Internet connection.
The idea of “admission” to college will become an anachronism, because the University of Everywhere will be open to everyone. It won’t, in fact, be a single place or institution at all. The next generation of students will not waste their teenage years jostling for spots in a tiny number of elitist schools. Their educational experience will come from dozens of organizations, each specializing in different aspects of human learning.
The University of Everywhere will span the earth. The students will come from towns, cities, and countries in all cultures and societies, members of a growing global middle class who will transform the experience of higher education.
These students will be educated in digital learning environments of unprecedented sophistication. The University of Everywhere will solve the basic problem that has bedeviled universities since they were first invented over a millennium ago: how to provide a personalized, individual education to large numbers of people at a reasonable price. The intense tutorial education that has historically been the province of kings and princes will be available to anyone in the world.
That personalization will be driven by advances in artificial intelligence and fueled by massive amounts of educational data. Information about student learning will be used to continually adapt and improve people’s educational experience based on their unique strengths, needs, flaws, and aspirations.
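Carey doesn’t name a mechanism, but one long-studied technique suggests what “continually adapt” can mean in practice: Bayesian Knowledge Tracing, which keeps a running probability that a student has mastered each skill and revises it after every answer. A minimal sketch, with parameter values that are purely illustrative assumptions:

```python
# Minimal Bayesian Knowledge Tracing sketch. Parameter values are
# illustrative assumptions; the book does not specify any algorithm.

def bkt_update(p_know: float, correct: bool,
               slip: float = 0.1,    # P(wrong answer despite mastery)
               guess: float = 0.2,   # P(right answer without mastery)
               learn: float = 0.15   # P(acquiring the skill this step)
               ) -> float:
    """Return the updated probability that the student knows the skill."""
    if correct:
        evidence = p_know * (1 - slip)
        posterior = evidence / (evidence + (1 - p_know) * guess)
    else:
        evidence = p_know * slip
        posterior = evidence / (evidence + (1 - p_know) * (1 - guess))
    # Account for the chance the student learned the skill on this step.
    return posterior + (1 - posterior) * learn

# A tutoring system might use the running estimate to pick what to show next:
p = 0.3
for answer in [True, False, True, True]:
    p = bkt_update(p, answer)
print(f"estimated mastery: {p:.2f}")  # drives which exercise comes next
```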
The University of Everywhere will not be devoid of people, however. In fact, learners and educators will be all around us. The new digital learning environments will be designed by education engineers collaborating across organizations and cultures, sharing insights and tools in a way that far surpasses what any single college professor can accomplish alone. Students will be part of rich global communities as small as a half dozen people working intently together and as large as millions of students contending with timeless questions and monuments of human thinking at the same time.
Learning at the University of Everywhere will be challenging. There will be no more “gentleman’s Cs,” no grade inflation, no more slacking through late adolescence in a haze, confident that social connections and inertia will see you through. Standards of excellence will rise to the highest common denominator of the most talented and motivated students in the world. The new digital learning environments will be designed to stimulate the kind of sustained hard work that authentic education always requires.
Traditional college credentials, based on arbitrary amounts of time spent in obsolete institutions, will fade into memory. Instead of four-year bachelor’s degrees and two-year associate’s degrees, students will accumulate digital evidence of their learning throughout their lives, information that will be used to get jobs, access new educational opportunities, and connect with other learners. People will control their personal educational identities instead of leaving that crucial information in the hands of organizations acting from selfish interests.
Enrollment in the University of Everywhere will be lifelong, a fundamental aspect of modern living. Instead of checking into a single college for a few years on the cusp of adulthood, people will form relationships with learning organizations that last decades based on their personal preferences, circumstances, and needs. Unlike today, belonging to a learning organization will not involve massive expenses and crippling amounts of debt.
Some of those organizations will have the names of colleges and universities that we know today. Traditional institutions that move quickly and adapt to the opportunities of information technology will become centers of learning in the networked University of Everywhere. Those that cannot change will disappear. The story of higher education’s future is a tale of ancient institutions in their last days of decadence, creating the seeds of a new world to come.
—
I SIGNED UP FOR The Secret of Life because I was both intensely curious and increasingly fearful about the future of American higher education. As a child, I was immersed in traditional university life. My father was a PhD computer scientist who taught at a large public university in Connecticut. While raising three children, my mother got her doctorate in education from the same institution. There was never a question of whether I would follow them to college, only where I would enroll.
Fortunately, a middle-class family in the late 1980s could still send their children to a good public university without breaking the bank. A scholarship helped cover tuition at a selective state school in New York, and I emerged four years later with a bachelor’s degree and a clear path in front of me. Some of my peers came from well-off families while others were first-generation students from single-parent, working-class households. In all my time there I never once heard the words “student loan.”
But as I grew older and began to study America’s education system in depth, it became clear that the affordable college of my youth was a historical relic. Colleges in the United States have become, by a wide margin, the most expensive in the world. Since I first enrolled, inflation-adjusted tuition at public universities has more than tripled, rising much faster than the average family income. The only way parents and students have been able to make up the difference is debt. By 2004, Americans owed nearly $250 billion in student loans, which at the time was considered to be an alarming sum. By comparison, outstanding credit card debt then stood at $700 billion, the hangover of a ravenous consumer culture with a taste for easy credit.
Over the next eight years, student loan debt quadrupled, passing $1 trillion, leaving credit cards in the dust. The share of twenty-five-year-olds with student loans increased by 60 percent. In the early 1990s, most undergraduates were able to avoid borrowing entirely. By 2012, 71 percent of students graduated with an average debt of nearly $30,000. Leaving school with swollen loans during the worst economic crisis in generations, many students found they couldn’t afford the monthly payments. They put off buying homes because more borrowing was impossible. They defaulted on their loans in larger numbers, bringing repo men to their doors.
I also came to realize how unusual my college experience had been. Only a third of working-age American adults have a bachelor’s degree, a percentage that grew slowly, and in some years not at all, throughout the 1990s and 2000s even as other industrialized nations were achieving dramatic increases in the percentage of adults with degrees. In part, this was because many American students were falling off the path to graduation. Less than 40 percent of students enrolling for the first time at a four-year college actually graduate in four years. Even allowing an extra two years for changed majors, illnesses, and other circumstances, fewer than two-thirds graduate within six years.
And these are only the averages, which include places like Harvard and MIT, where nearly everyone graduates unless they drop out to found a multibillion-dollar software company. On the other end of the spectrum, hundreds of colleges and universities fail to graduate even a third of their students within six years. In urban centers like Chicago, Detroit, and Washington, D.C., some colleges graduate less than 10 percent of minority students on time. Attrition rates resemble those of soldiers sent over the top in World War I.
At two-year community colleges, where nearly half of all Americans start higher education, the numbers are also dire. Only 34 percent graduate or transfer to another school within three years of enrolling. Although most students starting community college say they want a bachelor’s degree, only 11.6 percent earn one within six years.
According to the U.S. Census, there are almost 35 million college dropouts in America over the age of twenty-five. Many of them have large student loans yet no access to the well-paying jobs that require a college degree.
At least, most people have assumed, those who did graduate from college had the knowledge and skills they needed to survive in an increasingly perilous economy. This, too, turns out to be wrong. In 2005 a U.S. Department of Education study of adult literacy found that the majority of college graduates couldn’t do things like compare and contrast the viewpoints in two newspaper editorials. Fourteen percent of college graduates scored at only the “basic” level of literacy: good enough to read grade-school books but not much more. The results showed a sharp decline from the same exams given a decade before. The study was written up in the New York Times—and then disappeared from the public debate without a trace.
In late 2010, I met a New York University sociologist named Richard Arum at a coffee shop near my office in downtown Washington, D.C. He had just completed a book, Academically Adrift: Limited Learning on College Campuses, that would be published a few months later. It described a four-year study of how much students at a diverse group of colleges and universities had actually learned while they were in school. The results were shocking. Arum and his coauthor, Josipa Roksa, found that 45 percent of students made no gains on a widely used test of critical thinking, analytic reasoning, and communication skills during their first two years in college. Thirty-six percent made no statistically significant gains over the entire four years. “American higher education,” they wrote, “is characterized by limited or no learning for a large proportion of students.”
The evidence continued to pile up. In 2013 the nonprofit Organisation for Economic Co-operation and Development published a groundbreaking study that compared the literacy, numeracy, and problem-solving skills of adults in different countries. Fully 38 percent of American college graduates failed to meet at least the third level on a five-level assessment of numeracy that involves solving problems with math and performing “basic analysis of data and statistics.” Only 19 percent met the fourth level, compared to the average of 25 percent in other industrialized nations. Americans have long been told that our colleges and universities are the best in the world. It turns out that when it comes to college student learning, we are decidedly mediocre.
One of the big reasons is that colleges have been demanding less and less of students over time. A study published by the National Bureau of Economic Research found that in 1961, full-time college students worked more or less full-time, devoting forty hours a week to studying and attending class. By 2003, the average had fallen to twenty-seven hours—even though, over the same time period, the percentage of all class grades given an A rose from 15 to 43 percent. Nearly 20 percent of students reported studying less than five hours a week outside of class.
These alarming trends came at a time when the abstract statistics I pored over in my job at a D.C. think tank had become very personal. A few months before I met with Richard Arum and learned about his damning findings, my wife and I had our first child. Suddenly those long-term trends of rapidly escalating tuition led to a specific, unavoidable destination: In eighteen years, my daughter would graduate from high school. Like her parents and our parents before us, I expected her to go to college. How much would it cost? The numbers were mind-boggling. Over four years, the bill would be $122,000 for an average public university, in today’s dollars, after adjusting for inflation. The typical private university would be nearly double that, $228,000. That was if she beat the odds and finished in four years. And even if she did, would it be worth it? A quarter of a million dollars for “limited or no learning”?
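The arithmetic behind a projection like that is ordinary compounding. As a rough reconstruction (the base cost and growth rate here are my assumptions, not figures from the book), take a four-year public-university bill of about $72,000 today and let real prices keep climbing roughly 3 percent a year for eighteen years:

$$C_{18} = C_0 (1+g)^{18} \approx \$72{,}000 \times (1.03)^{18} \approx \$72{,}000 \times 1.70 \approx \$122{,}000$$

Small changes in the assumed growth rate swing the answer by tens of thousands of dollars, which is part of what makes planning for a child’s college so unnerving.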
Yet, even as the tuition trends and research studies were piling up, something else was happening that gave me reason to be hopeful. When my father let me tag along on trips to his university laboratory in the 1970s, computers were housed in six-foot-high metal cabinets powered by thick electrical cables. Times have changed since then. Over the following decades, the technology revolution radically altered broad swaths of the economy. Organizations that were in the business of controlling access to information that could be digitized—sounds, words, and images—found themselves challenged by upstart competitors. People around the world who were once isolated from the centers of culture and commerce suddenly became connected through information networks and computers that are getting cheaper and more powerful by the year. Information technology became interwoven into the social, economic, and cultural fabric of modern life.
I sensed that the University of Everywhere was out there, somewhere, waiting in the near future, a blurred outline that was beginning to come into view.
So when MIT and Harvard joined a handful of other world-class universities by offering complete versions of their highly sought-after courses for free, online, I decided to find out for myself if the digital future of higher learning had finally arrived. I asked Anant Agarwal, the former director of MIT’s computer science and artificial intelligence laboratory and the recently installed CEO of edX, the joint Harvard-MIT online effort, to recommend a course to take. As an undergraduate political science and graduate public policy major who studied education policy for a living, I wanted something completely outside of my expertise, so that the experience was as close to that of a newly enrolled student as possible. I was also interested in taking a course that would show off what the edX online education system could do. Agarwal recommended a class starting in a few weeks, taught by one of MIT’s most esteemed professors: Eric Lander.
2
College admission has become a rite of passage in American life, a chance for parents to spend the last precious months before their children finally leave the nest in a state of constant bewilderment and frustration, sorting through mountains of college brochures, filling out invasively detailed financial aid applications, constructing intricate college visit itineraries, and endlessly hassling their teenagers to please, please, finish writing their “personal essay.”
Signing up for The Secret of Life, by contrast, took all of two minutes. I used my laptop computer to search for “edX,” selected the class from a list of available courses, typed in a username and password, and was done. Clicking on the course title opened up the edX “learning management system,” which consisted of several neatly arranged menus surrounding a video of Eric Lander standing in front of a whiteboard in the lecture hall that I would eventually visit in person several months later.
I was, at that point, still skeptical about the idea of taking an entire MIT-caliber course online. Like most people, I had ridden the wave of technological change as neither an early nor a late adopter. I had a love/hate relationship with my smartphone—so convenient and useful, so endlessly distracting. I liked reading detective novels on my Kindle but still loved browsing the stacks at my local bookstore and leaving with a full bag of books that would surely make me a smarter, better person. My music CDs had long since been thrown in a closet, but the sound in my earbuds still didn’t measure up to what came from my old stereo gear.
College, meanwhile, seemed to occupy a more exalted plane of existence. I knew that education was much more complicated than simply reproducing words or sounds. The Secret of Life would help me understand what could be replicated in a free class offered entirely on my laptop, and what was unavoidably left out.
The course itself was a faithful digital translation of the education that all MIT undergraduates receive. The math, science, and engineering prodigies who win admission to MIT are required to complete a core curriculum called the General Institute Requirements. It consists of six introductory courses in biology, chemistry, physics, and mathematics. There are a few options in the sense that MIT offers an extra-hard version of chemistry for students who spent their high school years knee-deep in beakers and Erlenmeyer flasks, but otherwise there is no testing out with AP credits or otherwise avoiding the core curriculum. The courses are hard enough that, even given the kind of people who are admitted to MIT, the university prohibits freshmen from taking more than four of them in one term. First-semester freshmen don’t even receive letter grades and cannot fail; the only options are “pass” or no grade at all.
The educational model for the MIT courses is straightforward: Students attend lectures and discussion sections every week, read supplementary materials, complete problem sets, and take exams.
To produce The Secret of Life, MIT installed several professional-grade high-definition video cameras in the lecture hall where Eric Lander was teaching freshmen the standard Introduction to Biology course. Because MIT also numbers its classes and majors in an orderly fashion (the Mechanical Engineering major is called “Course 2,” Biology is “Course 7,” and the first biology class is designated “7.01”), the experimental online course was designated “7.00x.” The videos were posted on edX a few weeks after the live lectures occurred. Other than the short time delay, the thousands of students taking 7.00x online proceeded through the course in exactly the same way as the select students at MIT.
The edX system allows you to download the videos or stream them onto your laptop, tablet, or smartphone. Once you press “Play” and Lander begins to talk, the text of his lecture starts to scroll down a window to the right of the video, keeping time with the lecture itself. You can pause at any time and read back through what he just said or click on the text, which rewinds the video to that point in the lecture.
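Mechanically, a synced transcript like that needs little more than a list of timestamped cues and a lookup in each direction: playback time to current line, and clicked line back to playback time. A minimal sketch of the idea in Python (my illustration, with invented sample lines; the book doesn’t describe edX’s actual implementation):

```python
import bisect

# Each cue pairs a start time (in seconds) with a line of transcript text.
cues = [
    (0.0, "Good afternoon. Welcome to 7.00x."),   # invented sample lines
    (4.2, "Today we start with the biochemistry of the cell."),
    (9.8, "Let's begin with covalent bonds."),
]
starts = [t for t, _ in cues]

def cue_at(seconds: float) -> str:
    """Highlight the transcript line for the current playback time."""
    i = bisect.bisect_right(starts, seconds) - 1
    return cues[max(i, 0)][1]

def seek_to(cue_index: int) -> float:
    """Clicking a transcript line rewinds the video to that cue's start."""
    return cues[cue_index][0]

print(cue_at(6.0))   # -> "Today we start with the biochemistry of the cell."
print(seek_to(2))    # -> 9.8 (the player would jump to this timestamp)
```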
I quickly realized that 7.00x was nothing like the AP biology course I had taken in high school. At MIT, professors assume you know all of that already and dive right into the good stuff: genetics and the underlying code of life. As a rule, I’m a reader: I usually find listening to people explain things to be frustrating, unless I can interrupt them and ask questions. But Lander’s lectures were engrossing. The course was an intricate story of discovery, during which students learn the fundamental principles and intellectual unification of genetics and biology.
We began with four one-hour lectures on biochemistry, starting with the composition of cells and moving quickly through various types of molecular bonding. From there we went into protein structures and the various interesting and complicated ways that amino acids behave and interact. Then we covered enzymes and biochemical reactions, how molecules move through transition states of various energy profiles, and how enzymes help them along. All of this came together in a lengthy explanation of biochemical pathways, with the elaborate glycolysis process as the main example.
Molecular biology is a visual subject, and the lectures primarily consisted of Lander covering whiteboard after whiteboard with diagrams and explaining what they meant. I carefully copied his diagrams into my notebook, making annotations at crucial points and pausing the video to back up, read, and listen again when a concept wasn’t clear. He was the pedagogue, the master, and I was the acolyte receiving wisdom and information.
In doing this, Lander and I were engaged in a kind of educational interaction that dates back to the very beginning of modern higher education. The university as we know it today—the institution that is failing to help so many students learn and graduate even as it charges them ever-increasing amounts of money—did not appear, fully formed, from the ether. It emerged from particular historical circumstances and evolved, often strangely, over time. And it turns out that information technology has been part of that story from the very beginning. The nature of the deeply flawed modern university has been shaped, century by century, by the way technology has mediated the balance of power between students and their masters. Only now have the scales finally begun to tip in students’ favor.
—
THE ARCHETYPAL LIBERAL ARTS major is classics, the study of how the Greeks and Romans invented much of Western civilization. Notably, the ancients accomplished this without universities themselves. We may think of Raphael’s magnificent fresco in the Vatican’s Apostolic Palace showing Plato and Aristotle surrounded by fellow philosophers, but its title, The School of Athens, was added centuries after the painting itself was completed. The classical world had teachers and students but no deans or departments or diplomas—no places of higher learning as we know them today.
The first modern university wasn’t created until 1088, more or less, in Bologna, Italy. It’s still there, and when I visited the University of Bologna’s campus in autumn 2012, it seemed entirely familiar. Throngs of young people wearing backpacks and blue jeans laughed and flirted outside bars and cafés near academic buildings clad with red roofs and orange plaster walls. There was a bookstore, paper signs with little tear-off stubs advertising apartments for rent, and bicycles locked to street-side poles. The area had a certain worn-in look that comes from having tens of thousands of students circulate through every year. A plaque on the administration building declared that it is the Alma Mater Studiorum—the “Nourishing Mother of Education.”
As a city, Bologna is like Florence without the Uffizi, Michelangelo’s David, and 95 percent of the tourists, which is to say it’s a wonderful place to live and learn. You can spend hours under the covered walkways wandering through shops, galleries, and restaurants. Local businesses sell cheap food and drinks, and articles in the city newspaper denounce student drunkenness. After visiting Bologna, Charles Dickens remarked that “there is a grave and learned air about the city, and pleasant gloom upon it.” It’s the original college town.
Everything about the University of Bologna reinforces the idea that colleges are eternal institutions, unchanged through enemy sackings, plagues, papal feuds, and world wars. This is an idea that modern universities are eager to reinforce, with their founding dates stamped in bold type on ancient-looking seals. In a society that worships youth and modernity, universities stand out for their pride in antiquity. We have always been here, they imply, very much like this, and so we will always be.
But in truth the first university was a very different beast from the ones we know today. Even the University of Bologna’s founding date is something of an approximation. All we know for sure is that in 1888 the civic leaders of Bologna decided that they wanted to celebrate the university’s eight hundredth anniversary. Still, it’s reasonably certain that in the late eleventh century a group of students in Bologna got together and decided to pool their resources—financial, intellectual, and spiritual—in order to learn.
Europe then was emerging from the Dark Ages. The wisdom that was the true subject of Raphael’s fresco—in the center, Aristotle holds a copy of his Nicomachean Ethics, Plato his dialogue Timaeus—had begun to return to Europe, after centuries in exile, transported onto the continent by the Islamic scholars of Spain. Along with philosophy came Euclidean geometry, Arabic numerals, and rediscovered Roman law. Goods and people began to circulate through the European economy, along with a desire for knowledge.
Young people eager to learn congregated in cities. In Bologna they organized themselves into associations based on national origin, first to better bargain with greedy landlords. Next the students hired teachers, on terms that would be unimaginable to today’s tenured professors. Standards for teaching were established in code and contract. Professors were never allowed to skip classes and were required to begin and finish on time. They had to post a security deposit if they left the city and were obligated to proceed through the entire curriculum over the course of a year. If fewer than five students showed up for a lecture, the professor was fined, on the theory that it must have been a bad lecture.
Professors responded by forming organizations of their own. Today, we use the words “college” and “university” somewhat interchangeably. But in the beginning, a university was a collection of scholars, or students—universitas scholarium—while a “college” was a guild of masters. The academic guilds administered tests for admission, and those who passed got a license to teach—the first “college” degrees.
The balance of power favoring the students was short-lived. The second major European university, in Paris, was organized around the professors, and the pattern has changed little since. The University of Paris grew out of the cathedral school at Notre Dame in the middle of the twelfth century. The faculty there grouped themselves by discipline: canon law, arts, theology, and medicine.
This idea of the university thrived because it had the weight of supply and demand on its side. As Europe gradually moved toward renaissance and enlightenment, knowledge and wisdom were scarce and precious resources. There were only two ways to get them in an educational setting: listen to a master speak, or read words written in a book. Universities were able to accumulate a critical mass of books and masters, usually in a city that served as a hub of communications and commerce. There were few such places, and the people who owned the intellectual capital could decide who would be allowed to access it and on what terms.
So students came to listen to the likes of Peter Abelard, the great French scholastic philosopher whose doomed love affair with Heloise became the stuff of legend and song. A prodigy and champion logician, Abelard attracted throngs to his lectures and public debates in Paris and the surrounding wilderness, where students would camp for days to hear the master speak. He was, as the medieval historian Charles Homer Haskins wrote, “bold, original, lucid, sharply polemical, always fresh and stimulating, and withal ‘able to move to laughter the minds of serious men.’” Abelard reached the height of his fame just as the University of Paris emerged, exemplifying the idea of professor-centered education.
The idea soon germinated elsewhere. In 1167, England’s Henry II forbade English students from studying at the University of Paris. Scholars, masters, and books began congregating at Oxford instead. Violent clashes between students and the local townspeople prompted the university to build dormitories, which evolved into a new meaning of “college”: independent places where students lived under the supervision of masters. By 1264, Balliol, Merton, and University colleges had been built, all of which continue to operate today along with thirty-five other Oxford University colleges. The colleges were largely autonomous, raised their own money, and were responsible for teaching and student life. The university administered exams and granted degrees. Cambridge University was founded not long after, by a charter from Henry III. It, too, adopted the college model.
The medieval university worked in this way for the next few centuries, expanding to more locations in Europe and beyond. It was not always the center of intellectual life—the great artists and scholars of the Renaissance did not work at universities—but it continued to grow, adapting to the last great information technology revolution in higher education before the current one: Johannes Gutenberg’s invention of the printing press in the mid-fifteenth century.
This was not the first time that teachers had been confronted with new learning technology. Millennia before, another invention had upset those wedded to traditional teaching methods: the written word. In the Phaedrus, Socrates tells the story of a conversation between two gods, Theuth and Thamus. Theuth was “the inventor of many arts, such as arithmetic and calculation and geometry and astronomy and draughts and dice, but his great discovery was the use of letters.” Thamus was god of all Egypt, and said this about Theuth’s prized invention:
“O most ingenious Theuth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions. . . . [Y]ou who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.”
Socrates distrusted learning from books. Students reading words, he argued, would gain only shadows of the author’s original insight and, worse, not understand the difference. Anyone who has suffered in the tiresome company of a book-read pedant and his shows of wisdom would probably concede that Socrates had a point.
He was, however, shortsighted about the long-term net benefits of books to human civilization. Without written words, complexities of human thinking are limited to what can be held, recalled, and spoken by a single mind. If Plato hadn’t recorded the wisdom of Socrates in books, it would be lost to the world today.
Books were hard to come by in medieval times, so much so that universities were given the power to regulate their production and sale in order to prevent monopoly pricing. Students generally rented books for a limited time, and since each book was hand-copied, universities employed book inspectors to spot-check for accuracy. The University of Bologna created a book supply by requiring professors to submit copies of their lectures for publication. The image of a professor reading from a lectern as a roomful of students write down the words he speaks—an experience familiar to anyone who has attended college—is rooted in this pre-Gutenberg era of higher education.
Because paper for books was scarce, scribes would omit spaces between words. Readers had to learn to recognize words in long strings of characters, a skill that was taught by reading out loud under the supervision of a master. Teachers got used to this approach and reacted with consternation when printing press technology made it obsolete. Well into the fourteenth century, professors at the University of Paris outlawed silent reading. If they didn’t know what their students were reading, how could they help them learn?
The Gutenberg revolution changed and destabilized so many dimensions of society that some have wondered why it had, relatively speaking, less effect on the structure of the medieval university. Bologna and Oxford and Cambridge are still operating (the University of Paris didn’t survive the French Revolution) in ways that their founders would easily recognize.
The answer is that the new technology complemented the university’s established business model. While people could now read alone (if they were literate) and a few wealthy individuals could amass small libraries, books were still expensive to own. Only a few people and institutions had the means to accumulate, store, and catalogue all of the books worth reading. Then there was the question of which books to read and who to talk to about them when it was hard to find more than a vague semblance of the author’s true meaning in a few characters on a page. Students also needed some kind of credential signifying what they had learned. Universities provided teachers, guidance, peers, and diplomas, things no book could offer alone.
Printed books solidified the logic of universities as scarce, expensive places. If you wanted to learn, you needed to travel to where the smart people, books, and other students were. The limitations of transportation, communication, and information storage technologies gave universities the upper hand.
—
HIGHER EDUCATION was very much on the minds of the Puritans who made the hard voyage to the New World in the seventeenth century. Their faith had grown out of the printing press revolution, which freed people to find their own meaning in the Word. Eight years after the Massachusetts Bay Colony was founded in 1628, its Great and General Court voted to create a “schoale or a colledge.” A year later they designated Newetowne as the location, and then renamed the town Cambridge, after the university that had educated many of them, in 1638.
That was also the year that a wealthy thirty-one-year-old Cambridge alumnus named John Harvard died of tuberculosis. He left the new college half his money and his personal four-hundred-book library. In gratitude, the Great and General Court named the new college after Harvard in 1639. The Reverend Henry Dunster, another Cambridge graduate, was appointed the first president of Harvard College the following year.
Dunster duly implemented a four-year bachelor’s degree program based on the residential college model at Cambridge. Of course, the college needed more than John Harvard’s initial gift to keep going. A few years later the Puritans began another practice that, like many things in higher education, would endure: They sent a letter back to England hitting up friends and alumni for money.
After God has carried us safe to New England, and we had builded our houses, provided necessaries for our livelihood, reared convenient places for God’s worship, and settled the civil government: One of the next things we longed for and looked after was to advance learning and perpetuate it to posterity; dreading to leave an illiterate ministry to the churches, when our present ministers shall lie in the dust.
Others followed in their footsteps. By the time the American Revolution came, there were nine colonial colleges: Harvard, Yale, Dartmouth, William and Mary, New Jersey, Rhode Island, King’s, Queens, and Philadelphia. The last five are now known as Princeton, Brown, Columbia, Rutgers, and the University of Pennsylvania. The core of the present-day American higher-education aristocracy predates America itself.
Organizationally, the colonial colleges strongly resembled their English ancestors, as did those that followed them. But the American approach to starting up new colleges was different. At the time, English authorities kept tight control over the founding of colleges. Oxford, Cambridge, and the University of London were the only universities that had official government charters and the authority to grant degrees. Those places enjoyed monopoly power and government subsidies, which was good for the people who ran them and for the privileged students—all men—who were allowed inside. But it also meant there were relatively few college opportunities for the population of a literate, advanced nation on the verge of an empire.
Americans, by contrast, adopted a laissez-faire approach to college creation. The federal government played almost no role, despite vigorous attempts by James Madison, Thomas Jefferson, and George Washington to create a “national” university. Building new colleges was left to the states, which were generally happy to grant charters to a profusion of religious denominations and other organizations, with the proviso that when it came to raising money, they were on their own. The country was full of enthusiasm and open spaces. By the eve of the Civil War there were nearly 250 colleges and universities in a nation stretching from Maine to Florida and Massachusetts to California.
Those institutions were very different from colleges as we know them today. The continental model of masters and students living together remained intact. But the institutions themselves were small and inconsequential. Two hundred twenty-nine years after its establishment, Harvard graduated a class of seventy-seven students. As late as 1880, only twenty-six colleges enrolled more than two hundred people. Students were taught a standard curriculum of Greek, Latin, and mathematics, with some ancient philosophy and history thrown in. There were few lectures, laboratories, seminars, or other modern teaching methods. Instead, college learning meant recitation, a tedious process of students orally regurgitating large quantities of memorized text.
The teachers who supervised the recitations were of middling social status and low pay. Rather than leave colleges in the hands of the faculty, as at Oxford and Cambridge, the Puritans entrusted their academic institutions to pious old men charged with preserving their religious character. At the end of the Civil War, 90 percent of American college presidents had been recruited from the clergy.
The goal of college then was neither to create nor distribute knowledge. Instead, America’s flinty Protestant founders believed that students needed to learn “mental discipline.” Just as hard exercise, godliness, and deprivation would hone the body and spirit, years spent memorizing long passages of ancient Greek would sharpen the faculties of young men, even if they never had occasion to read Greek again. Educational methods were also a function of available technology. Paper remained expensive, so it was cheaper to grade students in person based on oral presentation than have them submit written work.
The 1852 catalogue for the University of Pennsylvania lists the name of every undergraduate—there were eighty-one—along with a precise course of instruction for all four years. The sophomore class, for example, would study “Plane and Spherical Trigonometry . . . with applications to Surveying, Navigation, &c.,” along with logic, rhetoric, and “Livy (Second Punic War), Demosthenes, Horace (Epistles and Art of Poetry).” The schedule could be punishing. “On each day of the week, except Saturday,” the catalogue noted, “there are three recitations of one hour each for every class. On Saturday each class recites once.”
But just as America eventually grew far beyond the cultural and religious beliefs of its founders, the nation’s colleges and universities were poised to undergo a rapid transformation. The next three decades saw the most important debate in the long history of American higher learning, one in which three ideas battled for supremacy in defining what exactly college should be.
—
THE FIRST IDEA FOCUSED on how colleges could help the waking giant of the American economy. By 1862, the Southern states had seceded into the Confederacy, taking their senators and representatives with them. That altered the balance of power in Washington and created new opportunities to pass legislation. Justin Smith Morrill, a representative from Vermont, pushed through the first Morrill Land-Grant Act, which was signed into law by President Lincoln. It granted each state rights to federal land in the western territories, the income from which would be used to create
at least one college where the leading object shall be, without excluding other scientific and classical studies, and including military tactics, to teach such branches of learning as are related to agriculture and the mechanic arts . . . in order to promote the liberal and practical education of the industrial classes in the several pursuits and professions in life.
Mechanic arts and practical education for the industrial classes were a far cry from Horace and Demosthenes, and for good reason. People were spreading out toward the frontier as the nation was beginning to be knit together by railroads, steam, and telegraph. In the three decades after the Civil War, America would use its abundant natural resources to become the biggest manufacturer in the world. It needed skilled people, and colleges seemed like natural places to train them.
The universities created in the Morrill spirit would eventually become some of the nation’s largest and most productive institutions of higher learning. Some of their leaders were openly disdainful of the older colleges. In California, the robber baron Leland Stanford used a fortune made building the transcontinental railroad with exploited Chinese workers to found a university in the memory of his dead son. David Starr Jordan, the first president of Stanford, declared that colleges should not prepare students “for a holiness class which is rendered unclean by material concerns.” At the University of Nebraska, the chancellor had little use for “institutions that seem to love scholarship and erudition for their own sakes; who make these ends and not means; who hug themselves with joy because they are not as other men, and especially are not as this practical fellow.”
The second big idea about what college should be came from overseas. After the Civil War, American scholars who traveled to Europe began returning with stories of a new kind of institution: the German research university. While the medieval universities had continued to operate from their founding principles, the Enlightenment had brought new ways of thinking and centers of power. In 1810 the Prussian linguist and philosopher Friedrich Wilhelm Christian Karl Ferdinand von Humboldt petitioned his king to create a new university in Berlin. The Humboldtian model was built around the independent scholar, a learned man given broad freedoms to push the boundaries of human knowledge as he saw fit. Students would assist the masters and learn from their example. But the master and his research, not the student, would be the center of the institution.
This idea had obvious appeal to the professoriat. It was bolstered by the 1876 founding of Johns Hopkins, America’s first research university. Hopkins made clear from the outset that, unlike the land-grant institutions, it was not in the business of training people for the trades. In a founding statement, it said, “The Johns Hopkins University provides advanced instruction, not professional, to properly qualified students, in various departments of literature and science.” The reference to departments also signaled a new structure for colleges and universities. Professors would be organized into autonomous divisions, each focused on a particular academic discipline, with the primary goal of advancing knowledge in that field.
The third big idea was liberal arts education, a notion that many people favor and fewer can adequately explain. The man who proposed the most enduring definition of liberal education was the great British theologian John Henry Newman. Newman would eventually be elevated to the rank of cardinal and was beatified in 2010. But first he was sent on an educational mission to help found a new Catholic university in Dublin, Ireland. Shortly after arriving, he delivered a series of lectures that were collected in a book titled The Idea of a University.
Newman began with a definitive statement. The university, he said, “is a place of teaching universal knowledge. This implies that its object is, on the one hand, intellectual, not moral; on the other, that it is the diffusion and extension of knowledge rather than the advancement. If its object were scientific and philosophical discovery, I do not see why a University should have students.”
As a man of God, Newman saw religious and liberal education as distinct. “Knowledge is one thing; virtue is another; good sense is not conscience, refinement is not humility, nor largeness and justness of view faith. . . . Quarry the granite rock with razors, or moor the vessel with a thread of silk; then may you hope with such keen and delicate instruments as human knowledge and human reason to contend against those giants, the passion and the pride of man.”
Nonetheless, Newman was a great believer in those instruments. True liberal education, he believed, was not a matter of merely accumulating knowledge in a specific subject. The most important goal was to understand how all the different aspects of the world are connected. As the intellect is perfected, Newman said, “the mind never views any part of the extended subject-matter of Knowledge without recollecting that it is but a part. . . . It makes every thing in some sort lead to everything else.” So educated, the student “apprehends the great outlines of knowledge, the principles on which it rests, the scale of its parts, its lights and shades, its great points and its little, as he otherwise cannot apprehend them.”
Achieving this, Newman believed, takes both hard work and a well-designed educational program. “The intellect in its present state”—that is, before going to college—“does not discern truth intuitively, or as a whole.” People learn “by the employment, concentration, and joint action of many faculties and exercises of mind. Such a union and concert of the intellectual powers, such an enlargement and development, such a comprehensiveness, is necessarily a matter of training.”
Where to Download The End of College: Creating the Future of Learning and the University of Everywhere, by Kevin Carey
Most helpful customer reviews
24 of 29 people found the following review helpful. The End of College as We Currently Know It By jem
If you are a college or university professor or administrator, this book will not be a surprise; but if you are a parent of elementary-age or teenage children for whom you want to provide the best education possible, this book offers the information to make well-informed decisions.
Carey provides an excellent overview of higher education history for the layman in the first part of the book, and an enlightening analysis of the current state and potential of digital open-source courses in higher education. Those who watched radio and television arrive amid predictions of dramatic educational change, only to see minimal impact, will no doubt question Carey's predictions for the effects of digital technology on higher education. However, the impact on other information sources such as magazines and newspapers, the tremendous worldwide response to early offerings, the venture capital estimates of potential monetary rewards, and the dramatic cost savings for students all suggest this technology may indeed have a significant impact in the very near future.
Carey's personal experience enrolling in an MIT genetics course on the origins of life adds a practical touch that is as effective as his many interviews with professors, educational administrators, and online course entrepreneurs in assessing the effectiveness of online technology and its acceptance by students. He discusses at some length the next issue to be addressed: determining credit for online courses that will be acceptable in the marketplace. This is an excellent and thought-provoking consideration of a major current issue -- the cost, method, and credentials of lifelong learning experiences.
20 of 25 people found the following review helpful. Thank you Kevin Carey By Steve
I was worried initially that the book was going to be more of the disruptive, online-edutopia boosterism that we hear so much of, but The End of College is a much more nuanced, careful, and insightful piece of work than that. Carey presents an intriguing history of what he terms the "hybrid" American university, with all of its deep flaws and contradictions, along with a compelling vision of what he calls "The University of Everywhere" and what it might become. The feature of the book that I found most appealing was that he does all this with his focus squarely on what has to be the central issue: student learning.
I've followed Carey's columns, and I was pleased to find that his considerable gifts as a writer in that shorter format translate nicely into this book. I was so taken with it that I read it in just two intensely engaged sittings.
I hope that this excellent book gets the wide readership it deserves among educators and administrators in higher education -- but educators in secondary ed ought to take note as well. I'll be looking forward to hearing more from Mr. Carey.
3 of 3 people found the following review helpful. Current state of the American educational system By Helpful Advice
'The End of College' by Kevin Carey is a well-made analysis of the US college system, and even though the assessments and predictions it presents are not overly optimistic, it is a book that is certainly worth reading.
In his book, Carey manages to present, in a very comprehensive and systematic manner, the real causes of the problems American colleges face these days. The truth is not comfortable to hear, but that doesn't mean the clear picture he paints is unnecessary criticism. Quite the opposite: however discomforting his conclusions are, there is a way out of this situation, although it requires a change in the thinking and functioning of the US education system.
Given that my education is the product of a completely different system -- the European one -- it was somewhat surprising to read some of these things about US colleges, considering that the American education system, despite its high costs, is still very much appreciated in Europe, and many young people dream about studying abroad.
Carey concludes that US colleges won't change as long as Americans try so hard to keep everything as-is, remaining so resistant to change. He recounts the history, everything that led up to the point where American schools are right now: offering education that is too expensive while delivering little or too little for the money invested. As the main cause of the problem, the author identifies an impossible combination of research institute, liberal arts college, and trade school -- a mix that simply does not function.
Carey does not stop at criticism; drawing on his rich experience, he proposes some technological innovations that could catalyze change. Still, it is good to see that he avoids the trap of thinking technology will solve everything, because behind each technology there are still people, and without human perseverance and a genuine desire to change, nothing would be possible.
Overall, with 'The End of College' Kevin Carey has drawn a surgically precise picture of the current state of the American educational system as an insider, someone who knows what he is talking about, and beyond the bitter criticism his book does show that there is a path out of this situation. It is therefore certainly worth reading, especially for people professionally connected to the US college system.
See all 67 customer reviews...
The End of College: Creating the Future of Learning and the University of Everywhere, by Kevin Carey PDF
The End of College: Creating the Future of Learning and the University of Everywhere, by Kevin Carey iBooks
The End of College: Creating the Future of Learning and the University of Everywhere, by Kevin Carey ePub
The End of College: Creating the Future of Learning and the University of Everywhere, by Kevin Carey rtf
The End of College: Creating the Future of Learning and the University of Everywhere, by Kevin Carey AZW
The End of College: Creating the Future of Learning and the University of Everywhere, by Kevin Carey Kindle