When I wrote Should You Go To College, I was trying to figure out what the actual value was for most people these days, considering skyrocketing tuition costs and stagnant wages. Sometimes I think back to my three degrees too and wonder what I was thinking.
But there's a question that came up towards the end: what should an alternate system look like? After all, colleges exist the way they do today because of four interrelated reasons that reinforce each other.
They provide a place to meet like-minded others and socialise
They provide a space for learning
They provide the ability to get jobs
They provide space for faculty to do research
Without this bundle, we get situations like the one below.
Granted, it's mostly for philosophy PhDs, who I suppose have the benefit of looking at things philosophically. But it does showcase the problem. There are high-priced areas that people study because they like them, but that leave them high and dry in the 'earning a living' department. And since both these things are combined in one space-time location, there's no easy way out.
Universities are bundled goods. We all get some of what we want from it, treating it like a buffet, but like most buffets it sucks at providing most folks with what they actually want.
College doesn't just teach you "finance". It's supposed to say you're a very smart person (a Wharton grad) who studied finance, and therefore have some aptitude for it. That's why firms train their incoming analyst classes, Harvard grads included, like the Marine Corps, since it's reasonably assumed they barely know anything. It's selecting for more things than what you actually learnt in class.
And so, it's time to unbundle the university.
The reason for their incredible staying power, which makes them some of the longest-surviving organisations in the world, is that these pillars are intertwined.
Without providing a space for research, the space for learning or networking will be less valuable, and the credential will be less valuable.
Without providing the ability to get jobs, the credential and learning gets less valuable.
Without providing proper learning, faculty won't be interested in doing research and the credential won't be valuable.
And so on and on.
Tyler Cowen recently asked people to reimagine what a university might look like in his Bloomberg column, and gave the blueprint for an online-offline hybrid that would probably work far better than any today. But even he doesn't go too far beyond providing online access to newer courses and creating a choose-your-own-adventure journey for the student. The idea of a marketplace for professors is pretty interesting, but it's also implicit if students don't have to take certain courses, since professors would then have to compete for them.
This doesn't seem like enough.
Neither do other commonly offered pieces of advice, like Noah's focus on STEM or a push toward trade schools. Presenting a false choice between a liberal arts education and trade schools, or prioritising STEM without taking students' preferences into account, doesn't actually help the students.
If something new has to replace, or at least enhance, the current setup, then it has to provide a solution to all of the pillars. Ideally, it also has to solve the problems of cost and access at the same time.
And for that it has to provide an aura of excellence that attracts both students and faculty.
It would probably be better (and faster) if Harvard or Google decided to provide an alternate path to get a Harvard certificate, say by creating something actually worthy: contacting the top contributors on Stack Overflow and giving them a CS Masters degree that's undeniably well earned. But that seems unlikely to happen, so we're left with the guerrilla option. Instead of getting one big credential once every few years, let's split that into a whole bunch of different credentials that can come together Captain Planet style.
Now what stops this from happening today? For one thing, universities have extremely high switching costs. You kind of have to make a decision to stay in one place for close to half a decade.
That's a big decision. No wonder 17-year-olds agonise over it. Two decades after the fact, I still don't know if I made the right decisions. That's why the decision becomes fraught, and folks minimise risk by choosing the best brand. If you had to choose one handbag to keep for the rest of your life, you'd probably pick an expensive brand name too if you could.
But even with that, there's still the problem of making a few big decisions and optimising for the long term. The fix for our pillars:
For learning - provide the best of any topic through online and offline classes
For network and socialisation - provide some degree of co-living at least for a couple years
For mentorship - provide regular connections to those at the forefront of their topics
For jobs - provide regular "live" projects so that every student graduates with a portfolio
But if they do all this while locked into the same university structure, we're just reinforcing the age-old institutional pattern of set courses, exams and graduation.
For instance, exams become less relevant for most subjects compared to an actual output that showcases the benefits of what students learnt. Exams test one's understanding of a predetermined curriculum, which is a focus on broad knowledge. But as the canonical example from computer science classes has taught us, actual programming ability correlates rather weakly with taking OOP classes. A few weeks of self-directed, real-life project work is worth far, far more.
So while making the focus of education the actual degree attained, and the prestige thereof, results in the Harvard-ification of education, breaking this down into its components would help. This future university would be student-led, in that students would choose what courses to study and how. It would focus on providing as many opportunities as possible for students to create job-level outputs (papers, posts, articles, analyses, code, trades, films etc.), on providing space to interact with lions in the field, and ideally on some level of colocation so that the essential college experience is somewhat maintained.
The teaching and mentorship provisions would be split between the offline and online worlds, between physically present faculty and the best of what's available online. The projects would be underwritten and marked/published by those in the industry, and open to the world.
Universities shouldn't just graduate students into the world; they should also act as knowledge repositories that they share with the world and have their students contribute to. There is zero reason the only intellectual output of a university should be faculty research papers rather than work from undergraduate and graduate students, even if it is in the form of a blog or a Substack.
This idea of focusing on output isn't something unknown to the world either. After all, if you wanted to break into, say, British film, Cambridge Footlights would've been a far better bet than some specialist degree. We need to move the margins drastically in this direction.
To do this, we need to break the dependence of the student on an individual institution or degree program. We should make it a smarter version of a self-directed Khan Academy marathon, complete with class participation.
Currently the education system sees its atomic unit as the degree, the culmination of years of learning a subject. That's a four-year bundle of topics and subjects all shoved together in the hope that it helps the median student. One step lower is perhaps the difference between electives and core courses, an aggregation level of around a year or so.
And yet another step lower are the specialisations - where you might take a few accounting courses to get a minor in the subject - surely the closest to the platonic ideal of what a student ought to do, if self-directed and trying to optimise their learning.
But what if we take this to its logical conclusion? To try and make all selections self directed, and focus them on outputs (not exams)? We should start to see the ability for students to self select courses of study to optimise both skills for future jobs and actual learning, including the much maligned "liberal arts" learning.
There's no reason someone shouldn't be able to study French philosophy, should they choose, with coding alongside, and do so easily. If they want to demonstrate their ability in graphic design, they should be able to create a portfolio.
Funnily enough, this is kind of what the more go-getter students do today, just under the constraints of the existing education system. How great would it be if they could actually pick and choose!
And even in courses where book learning is arguably incredibly valuable, say philosophy, it would a) definitely help students to construct more arguments and write more papers instead of answering exam questions, and b) simultaneously let them get credit for other, more practical, education which hopefully makes them more employable. And this isn't the same as taking a few coding classes in school, because those, as seen above, are pretty much useless!
So what should happen? We need to split out two things: outcome-focused classes, which can gradually increase in difficulty, and learning-focused courses, which students can pick and choose.
For instance, if you want to study filmmaking, you'd need to actually make films (the practical bit resulting in a portfolio) and choose study subjects (presumably things you find interesting or useful like the ... I don't know ... history of photography or something?).
As the Wall Street Journal (@WSJ) put it: "Columbia and other wealthy universities steer master's students to federal loans that can exceed $250,000. After graduation, many learn the debt is well beyond their means." https://t.co/VO2VrQVK7g
This won't work for everything. If you want to become a doctor, for instance, med school as it exists seems a good, time-tested idea. But even there, diagnosis seems like something that can be learnt through doing; otherwise the doctors I've met in my life have been a sorry lot! Another key lesson for doctors is the importance of lifelong learning, since the information they learn has a half-life of a few decades. If you want to become an economist, you can decide which parts of econ to study, each of which can have a couple of prerequisites, but preparing to produce a couple of papers will matter far more than exam marks.
To make this come about, we'll need to radically modularise education. We'll have to look at the value provided by each course in a granular sense, rather than only at the aggregate level at graduation. Provenance doesn't prove capability. This is what's needed to let students mix and match the education they'd prefer. It would also make the costs more manageable, since with online classes and certifications people only pay for what they actually want.
Not to mention the fact that by separating the job hunting aspect of colleges, and by bringing in those who actually provide those jobs and showcasing the actual job done by the students, you'd create a far better marketplace.
There are barriers to this of course. For one thing, accreditation is an enormous challenge. Administrators and lawmakers have had an easier time dealing with entire institutions and overall curricula, and change is hard.
For another, prestige from the final degree is much harder to make fungible. Harvard would prefer to award its Harvard degrees to those who spend four years or more at Harvard. That's how the alumni networks are forged and donations are sought.
But there's a wealth of institutions outside Harvard who don't have this problem. Whose raison d'être doesn't revolve around multi-generational institutional prestige. Who actually have the freedom to experiment.
Can you trust the students to choose wisely?
If you let the students choose their course of study, the thought goes, they wouldn't know what to learn. There needs to be some curation and continuation of a syllabus, if only to result in a sufficiently broad base and depth in something.
Which is true to some extent. But the fact that we offer a choose-your-own-adventure as an option in no way diminishes the availability of a roadmap someone else has laid out.
What it does do is provide kids with agency. If they realise that they want to learn specific subjects from specific universities, right now there's no way to make that happen. Sure, you could learn from MIT Online, but it's purely for your own cred. It doesn't impact your actual university experience in any way except tangentially.
So having an open source mentality to creating your own learning adventure, where you could choose either to a) learn everything in a subject and go for further studies, b) focus on studying that which gives you a job and create the best resume/ portfolio, or c) study that which interests you purely because it interests you while also making yourself not unemployable, that would be a great compromise.
The implications of having this would be phenomenal.
Will they continue to get high-pedigree faculty?
There's been an uneasy relationship between the professionals who give jobs and the academics who provide education, oftentimes confrontational. In fact calling a problem academic is a great way to dismiss it.
But a) even in academia most education is done by poorly paid adjuncts and graduate students anyway, and b) including professionals in the judging of portfolio bit can only help!
They already do the latter in MBA schools, in film schools, in finance classes. It's just more ad hoc and less common than it should be.
And after all, top notch faculty still need space, salary and resources to do their work. That doesn't go away. The halo effect, even if it only exists as a historical curio, from the major educational brands will continue to serve them well. What changes is the pressure they (or, more likely, their grad students and adjuncts) will feel to provide an actual education to their students.
Will the employers be okay with it?
More and more, we're seeing elite employers recognise that the university you went to has little correlation with job performance. They're trying all sorts of new recruitment tools, desperate to sift signal from noise.
This can only help.
Will universities continue to exist or have utility?
None of this will obviate the need for universities. There still need to be physical spaces to put everyone together. I wonder if the other accoutrements will continue to exist, like the multiple sports facilities or the a cappella groups galore who take trips to Paris, but that's kind of the true benefit of unbundling. You get to stop paying for stuff you never use.