College helps you decide what to learn and who from
As a non-economist at the fringes of academia, I’m probably both too close to and too far from it to understand the economics of universities, MOOCs, and books. But I thought it was worth responding to Matthew Yglesias’s declaration that he is “skeptical that we really understand all that much about why people go to college or what it is exactly that videos on the Internet could possibly substitute for,” and that it would have made more sense for the availability of books to destroy the university long ago than for internet-based learning to destroy it now.
I agree: if you want to learn something about a specific subject, it’s faster and cheaper to read about it than to pay a knowledgeable person to read you a lecture. But this doesn’t imply that going to college is all about signaling and conformity and not seeming like a weirdo.
One thing the university experience gives you is a map of the enormous quantity of human knowledge already captured in books. Course catalogs, distribution requirements, and majors give you a sense of what has been studied in the past and which fields were prestigious when. If you study Russian, you learn that linguistics, literature, and “area studies” are considered distinct tracks valuable enough to major in, but that learning a language is just a tool to be used in the service of something else. When I was an undergrad in the 1990s, administrative classifications kept changing in the growing field of biology, as traditional botany and zoology gave way to courses on genetics or ethical dilemmas. There were a lot of political science, economics, and biology majors around, and not as many people studying Ancient Greek as there used to be.
Meeting the professors and majors in a department gives you an idea of who goes into that field and why. In class you get not only the facts and arguments you could read faster in a book, but offhand remarks and informal evaluations of scholars and schools of thought. Professors who would be cautious about criticizing their colleagues in print will let you know — in digressions or improvised answers to questions or responses to your mistakes during a presentation — who they respect, who they think is outdated or overrated, what they think is related to what. It takes practice to get this kind of meta-knowledge from the footnotes or bibliography of an academic article. You find out a lot just from a one-page syllabus; even people whose stated aim is questioning and tearing down canons have to decide what not to talk about. And this meta-knowledge multiplies as you compare the perspectives of different professors and disciplines.
If you’re an autodidact diving into the internet or a library, you’ll never run out of interesting things to read. You’ll certainly discover books you’d miss if you let a college draw you a map of intellectual history. It’s also a safe bet that you’ll go down more blind alleys and miss books that would have helped you.
Take the internet: most of us think we can tell the difference between serious academic historians, honest but unoriginal popular history, poorly done popular history, and crackpot conspiracy theories. But we usually do this without specialized knowledge of our own, and I suspect the heuristics I personally use owe as much to sifting skills picked up through the college experience as to anything else.
I have no idea whether informal meta-instruction about what to read and who to trust can help save U.S. universities’ business model. And the phenomenon of famous universities putting their name to online courses and content allows the “map of prestigious knowledge” function to go off-campus for free. To some extent it’s the word “Yale” that lets non-historians know David Blight’s lectures on the U.S. Civil War are well worth their time and not just some guy ranting on YouTube. But there’s more to university education than books being inefficiently paraphrased out loud.