Today the marketplace is swamped, and continues to be swamped, with graduates who have completed not just a typical three-year degree but often a master's as well, leaving university with five years of education under their belts. That is before counting longer vocational degrees such as dentistry, medicine and veterinary science.
It also seems increasingly acknowledged, by employers and graduates alike, that most degrees do not equip graduates with the vital skills needed in today's workplace, such as experience analysing company data with specific software. See the LinkedIn discussion with Isaac Faber Ph.D. for more.
Take me, for example. I am currently studying a BSc in Economics, with aspirations of going into business analysis or business analytics. I say aspirations because who knows what I'll be doing in five years; if you'd told me I'd be studying economics a few years after graduating in dentistry, I wouldn't have believed you.
Typical requirements for business analytics roles (aside from a quantitative degree such as mathematics or economics) usually include proficiency in SQL, Excel, and data visualisation software such as Tableau or Microsoft Power BI, amongst others. Some verge into data science territory by asking for programming experience in R, Python or equivalent. But even taking a module aimed squarely at this career path (titled Business Analytics Applied Modelling and Prediction), does the content equip the student well enough for the workplace? Well, yes and no.
I'm in a good position to compare my current degree with a Business Analytics course from a well-established Massive Open Online Course (MOOC) provider that I took in 2018. It was assignment-based with real-world data, and gave direct training and experience in SQL, Tableau, Excel and a variety of analytical approaches to problem solving, including presenting projects.
The university course has a piece of coursework that involves producing a basic dashboard with Tableau (Desktop); learning the software itself is left to the vendor's own tutorials. The exam is based on screenshots from Excel, and it is essentially up to the student how far to delve into the recommended textbook. You could easily pass this module without any decent Excel skills (or Tableau skills, for that matter). Compare this with the MOOC and the two could not be more disparate.
With people now expecting to change careers an average of 5–7 times over their working life, job mobility not only gives workers enhanced freedom but also comes with a string attached: they must remain agile by keeping their knowledge, skills and technology up to date. So does a degree help or hinder?
Google have recently announced they are disrupting education with low-cost certification courses that they will value the same as a four-year degree. Combined with the ever-growing catalogue of MOOCs, surely an investment in one of these up-to-date offerings is more affordable, accessible and applicable to the workplace than a traditional degree?
Can Spence's signalling model from the '70s really still apply, given the often extortionate fees for courses? I was lucky enough to study dentistry before the ridiculous hikes in UK tuition fees (but after the era of free higher education). A major factor in choosing my current subject, and how I studied it, was affordability and accessibility. I wasn't willing to go back to being a full-time student and give up work, especially when dentistry allowed me to choose my days and work part time. I decided I'd rather get my quantitative degree and then take short courses and do my own projects to showcase my ability with tools such as SQL, rather than pursue a master's degree that guaranteed neither.
I can safely say that the rigour of the mathematics and statistics modules at the university I examined with lived up to its reputation. You couldn't fluff this stuff. Not to mention the impossible exam timelines: even the professors who set the papers admitted they wouldn't have had time to complete them to the required standard with full working. Go figure. Sadly, the course handbook and video content leaned too heavily on the institution's reverence and turned out to be poorly digestible from a student's perspective. This wasn't just my opinion: scouring student forums reveals it is a widespread criticism. I therefore picked up most of the concepts from YouTube searches.
This detail separates most degrees from MOOCs (I can't speak for all of them; I doubt anyone can, there are so many out there). Esoteric statistics questions don't prepare the student for the many useful, real-life applications of the discipline. They provide understanding, which is obviously important, but with the cost of degrees now, the return on investment needs to be more justified, more tangible. Would your employer care that you can regurgitate a statistical proof, or would they rather you apply method-of-moments estimation to an important business problem?
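To make that contrast concrete, here is a minimal sketch of what applying method-of-moments estimation to data actually looks like. The waiting-time numbers are entirely made up for illustration, and fitting a Gamma distribution is just one common choice; the point is that the estimator falls out of matching the first two sample moments rather than reproducing a proof.

```python
import statistics

def gamma_mom(data):
    """Method-of-moments fit for a Gamma(shape, scale) distribution.

    Matches the first two sample moments to the theoretical ones:
    mean = shape * scale, variance = shape * scale**2.
    """
    m = statistics.fmean(data)      # sample mean
    v = statistics.variance(data)   # sample variance (n - 1 denominator)
    shape = m ** 2 / v
    scale = v / m
    return shape, scale

# Hypothetical data: waiting times (minutes) between customer arrivals
waits = [2.1, 0.8, 3.5, 1.2, 4.0, 2.7, 0.5, 1.9, 3.1, 2.4]
k, theta = gamma_mom(waits)
```

A business analyst could then use the fitted parameters to answer practical questions, such as how often a wait longer than five minutes should be expected, which is exactly the kind of applied exercise the MOOC assessed and the university exams did not.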
In my opinion, university courses should still provide the foundation for understanding, but test the application of that knowledge in ways relevant to the modern business world. Some will argue that the exams set by their universities do this already, and that may be the case for some. But in my experience there is evidence that at least some universities still assess students in an archaic way: my statistics examinations in particular were mainly about proofs rather than applying techniques to data.
You could argue there is nothing wrong with doing as I have done: studying for a degree, then supplementing it with MOOCs and separate certifications and software training. But doesn't that mean we'd be spending even more on our human capital than we already are? Or is that the whole point, keeping the money machine churning? With college education suffering from Baumol's cost disease, one could argue a serious overhaul is required, and not only on the institutional side (as long as the incentive exists, institutions will churn out courses and some will continue to flock to them) but on the consumer (employer) side too.
For those less able to afford, in terms of both time and money, the fees of prestigious (and maybe not so prestigious) university courses, employers and recruiters should be more willing to recognise MOOC qualifications. However, this is at odds with the applicant tracking system software the industry uses to filter through sometimes overwhelming numbers of candidates for job posts.
One small step towards creating this incentive would be to remove 'degree in… or PhD in…' from position descriptions unless absolutely required, focusing instead on demonstrated skill sets. At the moment, though, there is an increasing counterculture championing MOOCs and portfolio work over traditional qualifications and the ecosystem supporting them. Giving the benefit of the doubt, perhaps an incipient change is under way in the form of career programmes from Google and the like. Societal norms have never had a reputation for evolving quickly, so perhaps we're just impatient while the rest of society catches up and the kinks get ironed out.