Earlier this summer, the U.S. Department of Education announced it would eliminate a student’s opportunity to list in rank order the colleges and universities to which he or she had submitted the Free Application for Federal Student Aid. Many in higher education, and most involved in college counseling, applauded the decision.
Then, this month, the National Association for College Admission Counseling amended its ethical guidelines to memorialize the department’s action, and it now discourages colleges from asking applicants to list in rank order the colleges they are considering.
These recent changes will force many of us who work at colleges and universities to ask students more directly about their level of interest in our institutions. Because we will no longer be able to rely on our ranked position on the FAFSA, which was highly predictive of a student’s likelihood of enrolling, we now will have to do the asking. This will be new territory for many of us and for students, but I believe such directness can be good for colleges, admissions offices, families and students.
I suspect this shift in communication may have been unintentional on the parts of both the Education Department and NACAC. I also think their actions were the result of a “parade of horribles” -- what-ifs and speculations -- and that they will undoubtedly bring scrutiny to other important strategies and tools used by many colleges in the contemporary practice of admissions.
Oft mentioned among the parade of horribles are:
the potential for admissions offices to use (“misuse” is a better term) information, like rank order, to influence admissions and financial aid decisions;
the pressure on students to develop a strategy in developing their list order to make sure to maximize their options;
the potential that first-generation students and those from underserved or underresourced areas will not understand the process.
These sound pretty awful, while the actions of the Department of Education and NACAC, designed to protect students, seem sensible. So why in the world would admissions and enrollment professionals, also presumably interested in serving and recruiting students, engage in such practices?
Let’s start with two premises.
First, there are three types of colleges: superselective institutions that have the luxury of “crafting a class,” open-access colleges that accept everyone who applies and colleges that work tirelessly all year just to make each class.
Second, one of the primary responsibilities of today’s enrollment manager or senior admissions leader is to predict who will enroll.
While my institution may be positioned between the superselective and the just-make-the-class types, my sympathies are more closely aligned with the latter, given the realities of demographic shifts, changes in ability and willingness of students and their families to pay, and the affordability advocates who tout cutbacks to areas such as marketing, administration and recruitment.
At Augustana College, where I work in admissions, one of my primary responsibilities is to offer the president and the Board of Trustees a data-informed prediction about who will enroll each year. This prediction sets in motion a budget and planning process that impacts the quality of education we offer our students and the livelihoods of the people who serve our students. Therefore, I want to have as many resources as possible to help inform that prediction.
We don’t ask students to rank order the institutions to which they’ve applied, but we do ask admitted students whether Augustana ranks first or in the top three or top five choices. We’ve done this for years, postadmission, and have found it to be very helpful in prioritizing our outreach to students and making the best use of our time as admissions professionals. We’ve used this information along with FAFSA position to help predict who will show up on our campus in the fall.
So, let me offer a few reasons -- not in any rank order -- why an admissions office might want to have a good idea about our relative standing with students in an effort to be efficient and make credible predictions.
Limited human resources. For most college admissions offices, especially at institutions that must work very hard to make the class, human resources have to be deployed carefully, thoughtfully and with the greatest good in mind. Given the size of applicant pools, it is usually impossible to develop relationships with everyone who applies. Many admissions offices try to learn where to focus their efforts to make the most meaningful connections. Information like the ranking of colleges, along with many other signals of students’ interest, can help an admissions counselor prioritize work and concentrate on the students most likely to enroll. At institutions like mine, which need 20 to 25 percent of admitted students to enroll, being able to connect with those most likely to choose our college is quite useful.
The need to work smarter. A constant chorus on college campuses today is to “work smarter, not harder.” Data equip an admissions office to do that. I am aware of very few admissions offices that are increasing staff sizes, which means we are expected to work smarter every year in an environment of heavier workloads and shrinking resources. Without more staff, we need data, tools and processes that streamline and focus our attention and allow us to be smart in our work.
Vital volunteer engagement. When it takes a village to make the class, ensuring that your village of volunteers has meaningful engagements with prospective students is crucial to long-term recruitment and admissions success. Most admissions offices rely on campus partners to supplement the recruitment effort and ultimately be effective. If there’s one thing I know about volunteers, it is that one bad experience can turn an enthusiastic volunteer away forever. Many admissions offices need to do an internal sort to make sure volunteers have good experiences. Data that inform an internal sort are important to maintaining valuable relationships with our volunteers, too.
Efficiency and access. Most important, good use of time means we can focus more on first-generation or underresourced students and families. One of the reasons we must prioritize is so we can spend more hours on creating access -- working with populations who are not as familiar with the college search process or our type of college. Understanding that one student is clear about choosing your college can free you up to counsel others who need more information to make a comfortable and informed decision.
Most people would agree this list sounds nothing like a “parade of horribles.” In the end, it may just come down to the fact that communication patterns and predictions keep changing. Perhaps in a couple of years, students, becoming more savvy by the minute, will decide once they’re admitted to tell each college or university that it is number one on their list -- hoping to get more attention. To get to the real truth, we will again have to change how we ask them.
Because, ultimately, we should do all we can to communicate honestly and in depth with our accepted students, and that begins with directness and an effort to truly know what they are thinking. It’s the kind of communication that should precede any commitment of this magnitude.
W. Kent Barnds is vice president of enrollment, communications and planning at Augustana College.
As director of admissions at the College of the Holy Cross, one of the 80 colleges and universities that have joined to launch the coalition, I am delighted -- and encouraged -- to be part of this effort to improve and reform the college admissions process for all students.
In my 36 years in college admissions, I have seen the stress and angst of students during the college search grow exponentially each year. Many students are under enormous pressure (some of it self-imposed, much of it driven by a marketplace focused on rankings and test scores) to get into the “right” college. Too often, students don’t devote time and energy to truly thinking about who they are, who they want to become and how their choice of college can help them achieve their goals. In addition, too many talented students are opting out of or severely limiting their college search because of the perception that a college is out of their and their family’s financial reach.
The coalition’s online tools promise to alleviate both of those obstacles. That will benefit not only high school students in navigating their search, but also colleges like mine in recruiting and enrolling our classes. The tools will drive students to start the college search much earlier and help them find a diverse set of colleges and universities that will invest in them financially and academically.
Providing a way to start building a digital portfolio early in their high school career will, I hope, encourage more and more students to give more time and thought to what they want out of college. I also am excited that the application process promises to be a resource for first-generation college students and those from underrepresented groups or low-income households. For example, a student from a low-income background can now use the collaborative platform to invite mentors, advisers, a parent and others to engage in a dialogue. They can provide feedback directly on the platform and let the student know if what he or she is producing is on the right track. I see enormous possibilities for students in these groups to be empowered by the options and flexibility this platform will provide. I also hope that starting earlier in the process will give them a college mind-set.
At Holy Cross, we use a holistic admissions process and evaluate every aspect of an applying student’s background, experience and achievement in order to work toward the diversity of a class and the campus community as a whole. Currently the admissions office evaluates students based on the four-year story they tell us through their transcripts, essays and interviews -- a file that is typically put together in a few months. The coalition tool will allow students to spend even more time and reflection on their applications. The new tool will give high schools across the country a free and sophisticated system that has not been available to them in the past.
At Holy Cross, we are committed to building a campus community that represents diversity in all respects, including cultural, ethnic, racial, socioeconomic and geographic. For us, diversity is a constant work in progress, and we seek students who will thrive in and contribute their talents and perspectives to our community. The coalition’s direction and tools will help us get even better at meeting these goals. These tools -- across the board -- will encourage students to think about college earlier in the process and also help them to find an alternative way to represent themselves beyond essays and SAT scores.
Holy Cross became SAT optional in 2006. Almost 10 years later, I can say with confidence that becoming SAT optional has brought our college very positive results. The first classes to be admitted under the new policy -- beginning with the Class of 2010 -- have been more geographically and ethnically diverse than previous classes. The percentage of ALANA (African-American, Latin American, Asian-American and Native American) students went from 17 percent in 2006 to 21 percent in 2010 to 24 percent this year.
As a Jesuit institution, Holy Cross places a high value on the unique combination of background, experience and personal qualities in each individual and the opportunity to learn from many life situations. As an alternative to the Common Application, I expect that the coalition’s application will work with our current admissions process in choosing future classes. That being said, it won't be without challenges for our staffing and processes. But we will use those challenges to create opportunities and adapt to changing admissions needs. I eagerly look forward to reading the applications from students applying to Holy Cross who opt to use the new platform.
Ann McDermott is director of admissions at the College of the Holy Cross.
While the U.S. Department of Education’s College Scorecard website may be a scaled-back version of what President Obama first announced on the State University of New York’s own Buffalo campus in 2013, it will be a useful tool for providing the information students and their families need to make decisions about college costs and return on investment.
We agree with President Obama: it’s not a moment too soon for colleges and universities across the nation to be held to a standard of transparency and accountability. The bottom line is that if we really want to take a bite out of student debt, we have to help students understand the true cost of college and what it is they’re paying for. The College Scorecard, which provides new measures of student outcomes at specific colleges and universities -- including graduation rates, median salaries and loan repayment rates -- is an important step in the right direction. Increasing college completion ought to be the next.
While SUNY is proud to offer fair and predictable tuition that is the most affordable among public colleges in the Northeast, we know that controlling tuition alone will not solve the debt crisis. There must also be a strong commitment to ensuring that students finish their degrees as quickly as possible, without taking unnecessary courses and thus ringing up additional costs.
SUNY has committed to increasing the number of degrees awarded annually from 93,000 to 150,000 by 2020. We’re going to ensure that more students complete on time at lower cost. And in doing so, we will expand access to what we know is one of the most valuable commodities in today’s society: a high-quality college degree and an educational experience that has prepared each graduate for workforce success.
SUNY is already a leader when it comes to student completion and achievement, in part because we have created and expanded programs that help students get their degree. Our four-, five- and six-year graduation rates for baccalaureate students surpass those of our national public peers, and the same is true of our two- and three-year graduation rates at the associate level. We launched our own financial literacy tool, SUNY Smart Track, which ensures that students and families understand their borrowing options and responsibilities; we adopted the nation's most comprehensive seamless transfer policy; and we are significantly expanding online course offerings through Open SUNY.
However, we know that until every student completes, we have more work to do.
We recognize the need to continuously improve and welcome effective ways to do so. I am pleased to see that the metrics included in the Scorecard mirror those used to ensure quality through SUNY’s own performance management system, SUNY Excels. In fact, the 64 campuses of SUNY are currently at work fine-tuning performance plans for how they will answer a systemwide call for improved retention and graduation rates, greater financial literacy among students, expanded applied learning and research opportunities, and more. The College Scorecard could help us measure our progress on some of those goals, both within SUNY and in comparison to others nationally. It will allow us to identify the programs and interventions that really move the dial on student completion so we can take them to scale across our university system.
I am especially encouraged by the administration’s commitment to adding Student Achievement Measure (SAM) data, which accounts for the outcomes of transfer students, to the Scorecard. A large number of students move in and out of institutions or transfer without a degree, which means that many colleges and universities have a majority of students that the federal system would otherwise not count. Throughout this process, I have stressed the importance of SAM, joining my colleagues in the Association of Public and Land-grant Universities just recently in a final push to use this data because it is so vital in providing students and their families with the complete picture on degree attainment. At SUNY alone, nearly 30,000 students transfer annually among our institutions, and last year, 35 percent of all our undergraduate degrees were awarded to transfer students.
In pivoting from the original proposal to rate colleges and universities -- many of which have significantly different missions and serve vastly different student bodies -- and ultimately adding in SAM data, the Scorecard will also account for the diversity of institutions and the students they serve. As a public institution with a founding commitment to access for New Yorkers, we see transparency and accountability as fundamental to helping parents and students understand opportunities and challenges as they navigate an increasingly complex cradle-to-career pipeline.
I applaud President Obama and Education Secretary Arne Duncan for recognizing, as SUNY has, that data must be a driving factor as higher education works toward continued improvement. I look forward to working with my colleagues in higher education and with our federal partners in a continuing effort to bring to light the most comprehensive and accurate data available to help students make informed choices.
Nancy Zimpher is the chancellor of the State University of New York.
Here’s the good news about the new College Scorecard: no rankings. In dropping its proposed plan, the Obama administration showed recognition of the difficulty -- indeed, the impossibility -- of providing students and their families with measurements that could determine which colleges offer “best value” and “worst value.”
The administration had hoped to come up with an easy-to-understand website for measuring value that would help students and families understand the return on investment they could expect for the cost of a college education. That effort proved entirely unworkable to most people who studied the proposal closely. So we all owe the U.S. Department of Education thanks for scrapping that proposal.
To begin with, measuring the value of an education in monetary terms fundamentally mischaracterizes the nature of higher education. The highest learning is no more a commodity than one's life is a commodity. Students need an education that will help them earn a living, but they also wish for a fulfilling life, one that goes beyond any economic measure.
Students are looking for many different things from their time in college, and there is enormous diversity among colleges in what they offer a student. What is valuable to one student may be of little value to another. Needs and results will vary from student to student even within an institution, let alone across different institutions.
The Obama administration’s aim is to provide transparency about factors to consider in choosing a college. This is a laudable goal. Yet the new College Scorecard continues to place a premium on economic rather than noneconomic valuations. That is regrettable.
My initial impression is that the website is largely about price, cost, financial aid, success in securing a job, ability to pay off debt, courses of study offered and the size of the student body. There is also one indicator about whether the school is located in an urban, suburban or rural area.
But what about the rigor and breadth of instruction? Don’t students and parents want to know whether a student can expect to graduate with a firm grasp on how to think about the world, and how to communicate their thought in speech and writing -- all skills that will help them much more in life than any specialized career training? And what about meaningful faculty-student interaction? Surely students would like to know whether they will be likely to end up in small discussion classes or large, anonymous lecture classes. And what about the richness of student life, religious affiliation, cultural resources in the surrounding community, demographic diversity and other factors that may indicate the quality of life that a student can expect during his or her studies? It does not seem that the College Scorecard provides any guidance about such matters, which can be crucial in deciding which college to attend.
There are also serious questions about the sources and relevance of the data, which seem to be several years out of date. Will families reviewing Scorecard information realize that many colleges and universities -- including my own -- have added large amounts of funding for financial aid during the past few years? Or will they simply reject some institutions based on inaccurate data? The U-CAN website (University and College Accountability Network), designed by the National Association of Independent Colleges and Universities after extensive testing with families about the kinds of information that would be useful, has much more current and comprehensive data on this topic.
The new College Scorecard website also tries to provide information about the average salaries that graduates of each college earn six years after enrolling. It is difficult to know where this information comes from. Does it take into account how many students continue on to graduate school? If not, the averages will appear lower than they should.
The Scorecard apparently tries to compensate for graduate students in its salary data by not counting former students whose loans are in deferment. But not all of a college’s graduates borrow money, and those who did not borrow have no loans to defer while in graduate school. Those graduate students will turn up as having no income or minimal income -- even if graduate school was their intended goal, and their undergraduate institution prepared them well for graduate study.
And in any case, earnings are more related to the choice of occupation than to the choice of college. And what about controlling salary data for location? If large numbers of a college’s graduates settle in relatively low-cost areas of the country, the average salaries for its graduates will be lower than at other colleges.
Also, this emphasis on earning is at odds with President Obama’s emphasis on the need for more teachers, social workers, geriatric care workers and child care workers. None of these professions is what one would call highly compensated.
And why does the site use the six-year graduation rate as a measure of success? Why not use the four-year graduation rate? Surely families don’t really want to know how many students take six years to go through a four-year program.
But one of the most disturbing aspects of this scorecard is its reliance on income data retrieved from the IRS and matched with student loan information held by the Department of Education. One had the notion that IRS data would be protected from use by any other federal agency. And given the consequences this scorecard may have, how can the earnings data ever be verified if the public has no access to the underlying information?
On top of all this, as American Council on Education President Molly Corbett Broad has already pointed out, no external review was performed before the site went live. Now that college officials are getting a chance to look at it, many are seriously concerned about the reliability of the data.
The good news on that score is that there are already valuable tools in place. One of the best is the U-CAN website already mentioned. This resource provides comparable information about each participating college, with links to more detailed information if desired. I think that this resource would be far more helpful to students and their families than the new Scorecard.
The Obama administration has more support than it might realize from the very colleges and universities it is trying to evaluate. We too wish to be as transparent as possible about what is going on at our schools, about what it costs to attend and about what our alumni do with their lives after graduation. Together we want to help prospective students make responsible choices that suit their circumstances and their dreams for the future.
The new College Scorecard, however, seems to have a great potential to mislead, misinform and discourage students and their families. Talk about defeating the purpose.
Christopher B. Nelson is president of St. John's College, in Annapolis, Md.
The Center for Community Alternatives’ report on the use of prospective students’ high school disciplinary records in the college admissions review process exposes the Wild West of high school disciplinary policies. Both the school-by-school variations in the reasons for suspending or expelling students and the differing methods of reporting such information understandably raise concerns about the negative implications of collecting and using it.
Particularly troubling is the impact that differing disciplinary policies and practices have had on students, primarily underrepresented students, beyond high school. However, CCA’s recommendation that colleges cease any consideration of student discipline as part of the application review process is an irresponsible solution to a problem that requires a more judicious approach. Disciplinary behavior information is important for legal and public safety reasons and is often obtained and used without harming campus diversity.
As an admissions officer at a public four-year institution that serves an urban population, I am always concerned that our admission policies not create barriers for minority students. At my institution, high school applicants are required to provide transcripts and test scores. They also have to indicate whether they have been subject to disciplinary action at their secondary institution and/or have a misdemeanor or felony conviction.
I know the admissions process is often mysterious and daunting, even without requiring supplemental information such as personal essays and recommendation forms, especially for underrepresented students. Requiring criminal history information adds to the fear some applicants have about how they will be viewed during the decision review process. I have spoken with students and parents who are concerned with how disciplinary and/or criminal disclosure information is used in the admissions process, and whether it is necessary, especially if the person has already paid their so-called dues to society. I also have seen differences in the punishments imposed on applicants, and have heard applicants express frustration with biases they have experienced based on race.
Thus the CCA’s concerns over how this information could come into play in the review are important, but they do not warrant abandoning an often carefully considered process that serves a valid purpose in higher education admissions. Checking a box stating that the student faced disciplinary action while in high school does not have to be the end of a student’s dream to obtain a college degree.
As a matter of both policy and process, the collection of disciplinary information during the admission process serves an important function in higher education for at least two reasons. First, part of assessing admissibility involves making a determination about character. Students involved in cheating, for instance, may not stack up as favorably as students who have earned their grades honestly.
Second, while some disciplinary disclosures are now required for campus life purposes (and therefore not necessarily used as a factor in admission decisions), there are institutions where the admission process and the enrollment/matriculation process are one and the same. So banning any consideration of disciplinary information in admission presents a procedural obstacle to fulfilling requirements many campuses must meet under state laws and universitywide policies. For instance, changes were made to Indiana law in 2014 restricting the use of expunged criminal history records in the hiring and academic admissions process. This prompted Indiana University to adopt a universal criminal history policy for all campuses.
For reasons such as these, the CCA’s recommendations fall far short of a solution to the problem they rightly identify. I would prefer that higher education focus on CCA’s point about the assessment of disciplinary information by “untrained” professionals, which is something that admissions professionals and their professional associations are well poised to address.
Each institution should adopt its own uniform policy for all applicants requiring the disclosure of any disciplinary action taken against them at another school or college. A collaboration of personnel from admissions, other enrollment services offices and the dean of students/student affairs and legal counsel could be required to write, monitor and review a comprehensive policy, and thereby address concerns related to balancing legal and public safety concerns with diversity recruitment initiatives. Having the same staff responsible for reviewing the disclosures would address the arbitrary decision making by “untrained” staff that CCA notes as a limitation to the review process.
Under well-developed and researched policies, institutional admissions staff could be trained on how to differentiate between those behaviors that would be considered normal teenage behavior and those actions that, if repeated, would be a potential threat to campus safety. It would be important to emphasize during training that the disciplinary review process is not an opportunity for the campus to readjudicate the student for past behavior. Such training would almost assuredly be the subject of ongoing discussion in the professional community that groups like CCA could strongly influence.
Having a policy under which students are asked to disclose information about past behavior and using it in the review process does not automatically guarantee a safer campus. However, the legal ramifications of not collecting information, or receiving it involuntarily and not using it to make an informed decision, should be compelling enough to persuade any institution of the wisdom of an unbiased, uniform and nonjudgmental collection of information about high school disciplinary behavior.
Pamela Brown is associate director of undergraduate admissions at Indiana University-Purdue University Indianapolis.