
There seems to be a growing trend to deconstruct the traditional university degree in favor of an amalgam of separate pieces. This trend further complicates the current debate about what a university degree should represent.

Traditionally, US universities have been quite generous in incorporating prior learning toward a degree, accepting AP exams and international qualifications such as the International Baccalaureate, the German Abiturprüfung, and the British A-levels for advanced standing in an undergraduate degree program. These pre-university programs are generally recognized as providing the equivalent of first-year introductory courses at the university level, and students who succeed in these examinations are rewarded with college credit and advanced standing. This allows students to skip over classes that might prove redundant for them, move more quickly toward their degree, and save tuition fees. Additionally, most American universities welcome transfer students and will make every effort to incorporate previous study into the student's progress toward degree completion.

The practice of absorbing study completed elsewhere into the degree has not been common outside the US. In fact, in many Latin American universities, study completed successfully in one department is not likely to be accepted toward a degree in another department within the same university, should a student decide to change majors or career tracks. If a student wants a degree, then he or she must complete the entire program precisely as it was designed by the professors of that program.

I find it difficult to endorse the rigidity that will not count an introductory physics course taught by the science faculty of a university toward an engineering degree at the same university. On the other hand, I have serious doubts about assembling a lot of pieces, collected in several different places, and calling the result a university degree.

Increasingly, US institutions are granting credit toward the degree for other types of "pre" or "extra" learning in lieu of campus-based, campus-designed curricula. This includes awarding credit for a growing array of experiences: knowledge acquired through life experience, coursework completed online, study completed in pathway programs, and job-oriented training provided in boot camps.

In theory this makes sense. After all, why make someone endure (and pay for) a class when they already possess the knowledge being covered? Why not offer students different kinds of learning opportunities? Does it make a difference where and how knowledge and skills were acquired? A degree should simply represent that the holder possesses a certain body and level of knowledge and a set of competencies. Isn't that more or less what "qualification frameworks" are all about?

It's "outsourcing" that makes me particularly uneasy. Pathway programs were one of the earlier forms of "outsourcing" part of the degree program: universities partner with (often) for-profit corporations that provide transitional programs to international students, with part of the program carrying "guaranteed" university credit. Boot camps seem to be the latest twist in the tradition of validating work done elsewhere. Boot camps are programs designed and provided by unaccredited, non-academic institutions for the purpose of developing employment-oriented skills. Several colleges already outsource part of their degree program to pathway program providers; others are experimenting with "outsourcing" sections of their curriculum to boot camp providers.

As universities incorporate more and more work completed elsewhere, and now with a growing trend to validate study provided by third-party partners outside the academy, the degree begins to look like a patchwork quilt.

I can already sense some readers accusing me of being stuck in the past, biased toward an elite and expensive model of higher education, and so on. But as the criticism of higher education has grown to a din, we may be making things even worse. Expectations of universities are much higher than even a few decades back—knowledge, hard skills, soft skills, job skills—and society and government are demanding greater accountability. How can a university be truly accountable when it allows a significant part of the degree program to be completed elsewhere?

When a college advances students based on learning (and the evaluation of that learning by others) done off campus, to what extent can that college be responsible for the personal development of those students? When a university offers credit for work completed elsewhere, what does it really know about the learning experience: who taught those students, with what qualifications, what was the quality of the learning experience offered, and how was the students' work evaluated? Sure, universities can monitor course content, but it's not easy to monitor the rest.

If we put this trend into international context, it gets even more complicated. US institutions want (and expect) their degrees to be recognized internationally. I suggest that the credibility of the US degree abroad is weakened when significant parts of the academic program have been outsourced to off-campus partners.

So, it comes down to this—what is a college degree? Is it a compilation of credits, or a coherent program designed by scholars, researchers, and administrators with multiple objectives to foster the personal and intellectual development of the individuals who receive that degree? If it is meant to be a coherent program, then how many external pieces can be assembled before the whole is nothing more than a collection of parts?
