When I first floated the idea of writing a weekly column from my perch as director of institutional research and assessment at my college, everyone in the dean’s office seemed to be on board. But when I proposed calling it “Delicious Ambiguity,” I got more than a few funny looks.
Although these looks could have been a mere byproduct of the low-grade bewilderment that I normally inspire, let’s just say for the sake of argument that they were largely triggered by the apparent paradox of a column written by the measurement guy that seems to advocate winging it. But strange as it may seem, I think the phrase “Delicious Ambiguity” embodies the real purpose of Institutional Research and Assessment. Let me explain why.
This particular phrase is part of a longer quote from Gilda Radner – a brilliant improvisational comedian and one of the early stars of “Saturday Night Live.” The line goes like this:
“Life is about not knowing, having to change, taking the moment and making the best of it, without knowing what’s going to happen next. Delicious Ambiguity.”
For those of you who chose a career in academia specifically to reduce ambiguity – to use scholarly research methods to discover truths and uncover new knowledge – this statement probably inspires a measure of discomfort. And there is a part of me that admittedly finds some solace in the task of isolating statistically significant “truths.” I suppose I could have decided to name my column “Bland Certainty,” but – in addition to single-handedly squelching reader interest – such a title would suggest that my only role is to provide final answers – nuggets of fact that function like the period at the end of a sentence.
Radner’s view of life is even more intriguing because she wrote this sentence as her body succumbed to cancer. For me, her words exemplify intentional – if not stubborn – optimism in the face of darkly discouraging odds. I have seen this trait repeatedly demonstrated by many of the faculty and staff members I have known over the last several years, as they have committed themselves to helping a particular student even when that student seems entirely uninterested in learning.
Some have asserted that a college education is a black box: some good can happen, some good does happen – we just don’t know how it happens. On the contrary, we actually know a lot about how student learning and development happen – it’s just that student learning doesn’t work like an assembly line.
Instead, student learning is like a budding organism that depends on the conduciveness of its environment – a condition that emerges through the interaction between the learner and the learning context. And because both of these factors perpetually influence each other, we are most successful in our work to the degree that we know which educational ingredients to introduce, how to introduce them, and when to stir them into the mix. The exact sequence of the student learning process is, by its very nature, ambiguous because it is unique to each individual learner.
In my mind, the act of educating is deeply satisfying precisely because of its unpredictability. Knowing that we can make a profound difference in a young person’s life – a difference that will ripple forward and touch the lives of many more long after a student graduates – has driven many of us to extraordinary effort and sacrifice even as the ultimate outcome remains admittedly unknown. What’s more, we look forward to that moment when our perseverance suddenly sparks a flicker of unexpected light that we know increases the likelihood – no matter how small – that this person will blossom into the lifelong student we believe they can be.
The purpose of collecting educational data should be to propel us – the teacher and the student – through this unpredictability, to help us navigate the uncertainty that comes with a process that is so utterly dependent upon the perpetually reconstituted synergy between teacher and student. The primary role of institutional research and assessment is to help us figure out the very best ways to cultivate – and in just the right ways – manipulate this process.
The evidence of our success isn’t a result at the end of this process. The evidence of our success is the process. And pooling our collective expertise, if we focus on cultivating the quality, depth, and inclusiveness of that process, it isn’t outlandish at all to believe that our efforts can put our students on a path that someday just might change the world.
To me, this is delicious ambiguity.
Mark Salisbury is director of institutional research and assessment at Augustana College, in Illinois. This essay is adapted from the first post on his new blog.
The Higher Education Opportunity Act of 2008 required the Department of Education to publish "College Affordability and Transparency Lists" of colleges and universities with the highest and lowest published prices and the highest and lowest net prices of attendance. To get on the list a college needs to be in the top 5 percent or bottom 10 percent of the cost scale. The new lists are now out. On the Education Department website, Secretary of Education Arne Duncan says, "These lists are a helpful tool for students and families as they determine what college or university is the best fit for them." We wish they really were.
A casual glance at the shame list of the most expensive private colleges pulls up names like the Culinary Institute of America, the Art Institute of Chicago, Cornish College of the Arts, and Berklee College of Music. At the other end of the scale, the list of the lowest-cost programs includes quite a number of tiny Talmudic institutions and Bible colleges. These are all fine programs, but to paraphrase former President Clinton, this isn't a list that looks like where America goes to college.
The Department of Education's definition of the net price of attendance paints a false picture of what most students have to pay, and the list of colleges and universities held up for public rebuke clearly reflects the weakness of this measure. In addition, the way net cost is calculated by the Department of Education may induce some institutions to change their behavior in truly unhelpful ways as they attempt to get off the list or to make sure they don’t get on it.
Here is the federal definition of the net price of attendance: "Average net price is for full-time beginning undergraduate students who received grant or scholarship aid from federal, state or local governments, or the institution." For starters, this price of attendance includes room and board charges, which arguably should not be a cost, since the student must eat and sleep whether or not they are in college.
But the key problem with the definition is that the net cost measure for the whole institution is based only on those students who actually get aid. Here is an example of the type of problem that this measure can create.
College X has 5 students and a list price of $40,000. Suppose that one of the five students at this college gets a full $40,000 scholarship and the remaining four are full-pay students whose families fork out $40,000. For this institution the Education Department would report an average net price of $0, since the only student getting aid received a completely free ride. Now consider College Y, which also has 5 students and the same list price of $40,000. College Y has a scholarship budget of $40,000, just like College X. But instead of concentrating its grants on one student, College Y gives each student an $8,000 scholarship. The Department of Education would tell us that the average net price for College Y is $32,000.
College X would show up as the world’s greatest bargain, while College Y might find itself on the shame list. This disparity of rankings happens even though both colleges have the same number of students, the same list price, and spend the same amount on scholarships. College X has found a way to game the system. By concentrating its aid, it decreases the net price for students receiving aid. But it has very few of those students.
Individuals and groups tend to respond to incentives. Colleges and universities are no different. When policy makers set up the rules of the game, colleges will respond to the incentives if they can. Since the shame list is a fixed percentage of the total number of colleges in the group, any college near the margin has a strong reason to try to game its way off the list. As our example makes clear, with the current rules one way to avoid being on the shame list is to concentrate grants and scholarships among as few students as possible.
We are sure that this is not what the Department of Education intends. But let's look at the socially perverse ways that colleges could game the system to get off the shame list. One way to concentrate aid is to avoid recruiting students who are likely to qualify for federal student aid. This is diametrically opposed to the thrust of federal financial aid policy ever since the passage of the Higher Education Act of 1965. Federal financial aid has been aimed at students with demonstrated financial need in order to improve access to higher education.
Additionally, if an institution does have a student who qualifies for federal need-based financial aid, the incentives are to pile on institutional grants. As a result, this poor student would get a better deal, but slightly less-poor students will be left out, and the middle-class students, unless they are targets for merit-based grants, will get a worse deal. In fact, middle-class students become somewhat poisonous in this system because they are a numerous group and because they require smaller dollops of aid than poorer students. Having a lot of middle-class students who receive small grants is a guaranteed way to see your net price go up in the federal calculation.
We’re sure the federal government does not want colleges to concentrate their grants and scholarships on a smaller number of students, but that is the new incentive for institutions that might be able to reallocate their grant monies.
The reasonable alternative way to measure the net cost of attendance is to compute what the average student actually pays, not what the average student who gets aid winds up paying. This method adds the full-pay students to the calculation. Many colleges, however, will favor the current federal definition of net price because the federal formula produces a lower number for net price than a formula that includes all students. But the colleges' preference for a lower number should not stand in the way of producing a measure that is more game-proof.
This way of calculating net price has the advantage that the number it gives you for net price is independent of how grant aid is distributed among the students who receive aid. If our hypothetical five-student college has $40,000 in aid to distribute, this measure will be unchanged whether that college gives all the money to one student or an equal amount to each. Using this measure of net price, if the institution wants to reduce its average net price then it will either have to find a way to reduce its list price or it will have to increase its scholarship budget. We think this is what policymakers want.
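The arithmetic behind the two competing measures can be sketched in a few lines of Python. The figures are the hypothetical five-student colleges from the example above; the function names are ours, not the Education Department's:

```python
LIST_PRICE = 40_000  # published list price for both hypothetical colleges

# Grant aid awarded to each of the five students.
college_x = [40_000, 0, 0, 0, 0]  # aid concentrated on a single student
college_y = [8_000] * 5           # the same $40,000 budget spread equally

def federal_net_price(grants, list_price):
    """Average net price among students who receive any aid
    (the federal definition criticized above)."""
    aided = [list_price - g for g in grants if g > 0]
    return sum(aided) / len(aided)

def all_students_net_price(grants, list_price):
    """Average net price across all students, full-pay students included
    (the alternative measure proposed above)."""
    return sum(list_price - g for g in grants) / len(grants)

print(federal_net_price(college_x, LIST_PRICE))       # 0.0
print(federal_net_price(college_y, LIST_PRICE))       # 32000.0
print(all_students_net_price(college_x, LIST_PRICE))  # 32000.0
print(all_students_net_price(college_y, LIST_PRICE))  # 32000.0
```

Under the federal formula the two colleges look wildly different; under the all-students formula they come out identical, which is exactly the game-proofing property described above.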
Our example is not a crazy hypothetical. For the colleges on the current shame list the percentage of their students getting aid is a very high 82.5 percent. But spreading this aid so widely as a tool to craft the freshman class now comes with consequences. It gets you smacked with a higher calculated net cost of attendance than if you had recruited differently.
We have recalculated net price to include all students and produced a very different "expensive 5 percent." On the alternative list, institutions such as Northwestern, Brown, Georgetown, and Villanova Universities and Boston College now appear. This list looks more like America’s major private colleges and universities. And on this list, aid is more concentrated. In the alternative top 5 percent, the percentage of students receiving aid is only 55 percent. The rest are full-pay students.
For individual colleges the effect is substantial. To take one example, Oberlin College is on the Education Department's shame list, holding down the 31st spot. Oberlin gives aid to 85 percent of its students. If we calculate net price for all students, Oberlin falls out of the shameful 5 percent.
Yet no matter the methodology, we do not think ranking colleges by net price is particularly useful. A single number for each institution tells a family almost nothing. The whole shame list ranking is a political exercise of dubious social value. The Education Department website does send students to each institution's net price calculator, but these calculators tend not to be standardized and they are often quite complex.
There is mileage, however, in reporting the size of the average grant aid package students receive at each college, broken down by rough categories of family income. A family should be able to find their income level and see that, on average, a student at university X who comes from a family like theirs received $5,682 of federal, state, and institutional grant aid (not loans). Then the family could decide for itself whether or not a particular college offered value for money. There is no shame in having an expensive program and no particular virtue in being cheap.
We wish we could share Secretary Duncan’s optimism about the current lists, but we cannot. These lists will not help students and families find the college that best fits them. The process of matching students to colleges is complex, and the information contained in the net price rankings actually is misleading. Lastly, as institutions respond to the incentives inherent in the current methodology for calculating net price, institutional aid may be distributed in increasingly unfair and socially inefficient ways.
Robert B. Archibald and David H. Feldman are professors of economics at the College of William and Mary and are the co-authors of Why Does College Cost So Much? (Oxford University Press).
Newton’s First Law of Motion states that an object at rest tends to stay at rest and an object in motion tends to stay in motion unless acted upon by an external force. Once in motion, an object develops momentum.
Elucidated by Newton in 1687, the first law of motion can also be applied to the study of student completion: like objects, students at rest tend to stay at rest and students in motion tend to stay in motion. Once they gain momentum (that is, acquire more degree credits), they are more likely to stay in motion unless acted upon by an external force.
Gaining and maintaining momentum is key to student completion. Students who progress more quickly through the curriculum are considerably more likely to complete their degrees than those who do not.
This is but one reason why a number of states have begun to focus on the importance of student momentum to completion. The Washington State Board for Community and Technical Colleges, for instance, analyzed the transcripts of more than 87,000 first-time community and technical college students who entered the Washington system in the 2001-2 academic year to identify key points in the curriculum – referred to as momentum points or milestones – whose timely attainment was associated with student progress to degree completion.
For most institutions, these intermediate points of attainment include the successful completion of developmental coursework, the timely declaration of a major, and the earning, within a particular time period, of a number of degree credit hours. These momentum points were then folded into the state’s funding formula such that institutions are now rewarded when they improve the number of students attaining those points of intermediate achievement. Other states have or will soon follow suit with similar models of funding that center on the importance of student momentum to completion.
Identifying intermediate points of attainment is one thing. Helping students gain momentum and attain them in a timely fashion is another. Unfortunately, not all students are able to do so. Take the case of students who begin college academically under-prepared. Too many spend too much time on coursework for which they earn no college credit. In some cases it may take students two or more years to complete basic skill requirements, if they are able to do so at all.
This is but one reason why an increasing number of colleges, such as the Community College of Baltimore County, are turning to accelerated learning programs for those students who begin just one level below college-level work. In this case, rather than being placed in a stand-alone basic skills course for which they earn no college credit, students are placed directly in the college-level course to which the basic skills course would have provided entry, together with a companion study skills course tied to it. In this manner, students earn college credit while acquiring needed basic skills.
Similarly, colleges such as the Community College of Denver have condensed what would otherwise be a two-semester sequence of either developmental math or developmental English into one semester in their FastStart program. By adopting interactive teaching and learning strategies, contextualization of developmental coursework, and cohort-based models, they have been able to substantially increase the percentage of students who complete their developmental coursework and continue in college.
A number of other institutions have taken a different approach to speeding up student progress through developmental coursework by revising the way students’ skill levels are assessed at entry. Tarrant County Community College, for instance, employs ALEKS and MyMathLab not only to assess student math skills but also to provide students an online vehicle to address those skills that require improvement.
Rather than categorizing students into three math levels, each of which requires an individual course to address, Tarrant officials identify 15 math skill modules and ask students to take only the specific modules in which they need help. Using computer-assisted instruction, they have greatly accelerated students’ movement through developmental math and in turn reduced institutional costs. Other institutions, such as Capital Community College and Kapi’olani Community College, have successfully employed summer bridge programs that enable underprepared students to get a head start on their first year of college and therefore move more quickly to earning college credits.
Gaining momentum toward degree completion requires that students not only earn college credits but also do so in ways that lead to degree completion. Yet many students begin college undecided or change their majors, sometimes several times. This is but one reason for the growing emphasis on intrusive first-year advising merged with career counseling. In addition to the front-loading of such advising and the use of first-year student success courses in which advising and counseling are embedded, as they are in Florida, a number of institutions have employed web-based solutions to help students establish career and educational goals in a timely manner.
Programs such as Valencia Community College’s LifeMap and Century College’s GPS LifePlan, now widely used in Minnesota, have increased goal setting and in turn retention and completion. Other institutions, such as Saddleback College, utilize predictive analytics to construct real-time online advising systems that respond directly to student advising needs as students progress through the institution.
Unfortunately, student progress is frequently constrained if not halted by the incoherent array of courses that typifies most college offerings. Lacking any clear structure, students tend to wander through the curriculum in ways that undermine their ability to make timely progress. Some leave in frustration, and others amass more credits than they need for program completion – that is, if they are ever able to complete at all. It is for this reason that a number of colleges seeking to improve rates of completion have turned their attention to curricular structure and coherence. Under the auspices of the Bill & Melinda Gates Foundation’s Completion By Design initiative, consortia of community colleges in four states -- Florida, North Carolina, Ohio, and Texas -- are working to develop coherent course pathways whose structure enables, if not requires, students to move more quickly through the curriculum to certificate or degree completion.
In these and other ways, institutions and states are coming to recognize the wisdom of Newton’s First Law of Motion and the importance of student momentum to college completion. Hopefully these and other efforts will take on a life of their own and gain sufficient momentum to transform how institutions approach the task of improving college completion.
Vincent Tinto is Distinguished University Professor at Syracuse University.