Encouraging Colleges to Look Within

In a challenge to the rankings business, new data from the National Survey of Student Engagement show that variation in educational quality is more prevalent within institutions than between them.
November 10, 2008

The National Survey of Student Engagement -- an annual report providing comparative data on student experiences at four-year institutions nationwide -- is entering its 10th year. Now that the survey is reaching what some consider a critical mass of participants, this year’s report finds that variations in educational quality are more prevalent within institutions than among them. As a result, NSSE officials argue that holistic assessments, such as theirs, provide a more accurate comparison than do those using institution-wide averages.

“Quality is multi-dimensional,” said Alexander C. McCormick, NSSE's director and professor at Indiana University’s School of Education, where the survey is based. “You can’t distill everything down to one number. Quality is lumpy within institutions. You don’t always have the same experience as your peers in college.”

NSSE measures five areas of education performance: level of academic challenge, active and collaborative learning, student-faculty interaction, enriching educational experiences and supportive campus environment. While McCormick acknowledges that the institution-wide averages found in other assessments are worthy of note, he argues that such figures give institutions little guidance on how to improve.

He also warns that prospective college students and their parents need to realize that a generous ranking by one or more sources taking broad institutional views -- U.S. News & World Report's America’s Best Colleges being among them -- does not always indicate a high quality “throughout the undergraduate experience.”

In each of NSSE's five areas of education performance, there are two parts to the total variation in scores. The within-institution variation -- the differences among individuals within an institution -- is far greater than the gaps in averages comparing institutions to one another.
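That decomposition is the standard law of total variance: the overall variance in scores equals the average within-institution variance plus the variance of institution averages. A minimal sketch, using hypothetical scores (not NSSE data) and institution names invented for illustration:

```python
# Illustrative only: hypothetical engagement scores, not NSSE data.
# The law of total variance splits overall score variance into a
# within-institution component and a between-institution component.
from statistics import mean, pvariance

scores = {
    "College A": [45, 62, 58, 71, 50],
    "College B": [48, 66, 55, 74, 52],
}

all_scores = [s for group in scores.values() for s in group]
grand_mean = mean(all_scores)
n = len(all_scores)

# Within: size-weighted average of each institution's internal variance.
within = sum(len(g) * pvariance(g) for g in scores.values()) / n

# Between: size-weighted variance of institution means around the grand mean.
between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in scores.values()) / n

# The two components sum exactly to the total variance.
assert abs((within + between) - pvariance(all_scores)) < 1e-9
```

With these made-up numbers the within-institution component dwarfs the between-institution one, which is the pattern the report describes: institutions have similar averages, but individual experiences inside each one vary widely.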

The internal variations become more noticeable when the data for an individual institution are disaggregated. Because NSSE does not disclose data for institutions without their permission, this year's report includes case studies using real data from pseudonymous institutions. At one of these shielded institutions, for example, the report finds a more “supportive campus environment” for students in either the honors program or the Educational Opportunity Program -- for underrepresented students -- than for the other 71 percent of students. Another case study shows that students in different disciplines report different levels of engagement: business students, for instance, have fewer “enriching educational experiences” -- study abroad, internships and the like -- than do their counterparts in engineering.

This year’s NSSE report was compiled from information on almost 380,000 first-year and senior students at 727 public and private four-year institutions around the country. Also, among this year’s first-time participants were some major players in for-profit education: University of Phoenix Online, University of Phoenix Southern California, and Kaplan University.

While individual institutions are provided with their own results, based on the same set of engagement questions asked of first-year students and seniors, only aggregated data are provided to the public. Institutions receive customized reports comparing their data to aggregates from three default peer groups: institutions in the same geographic region and sector (public or private), institutions sharing their basic classification under the Carnegie Foundation’s 2005 revision, and all current-year NSSE participants.

Institutions are free to publish their data if they wish. And while some institutions do release information, many only provide snapshots of data that portray their information in a positive light.

NSSE made waves last year by cooperating with USA Today and publishing individual data from a number of institutions from which it received permission. To coincide with the release of this year’s study, the newspaper’s Web site has updated its feature showing selected institutions’ data. This relatively new relationship has led some in higher education to believe that NSSE was looking to get into the business of ranking institutions and nearing the point at which it would fully disclose individual institutional data to the public.

A year after the initial experiment, McCormick said he has put those concerns to rest, noting that NSSE does not support the use of its data for ranking institutions. USA Today has also not done so.

“I don’t think by cooperating with USA Today we’re pressuring institutions in any way,” McCormick said. “There was a lot of suspicion in the first year. Certainly some were saying this is the camel’s nose under the tent and soon NSSE will be publishing our data. Still, there’s no carrot and stick associated with it. It’s more about providing an opportunity. Public reporting is a decision for the colleges to make.”

This year’s report highlights a number of “promising” and “disappointing” findings. One of the more positive discoveries was that, according to the report, two-thirds of first-year students and three-fourths of seniors “at least sometimes discussed ideas from their readings or classes with faculty members outside of class.” The report also notes that writing-heavy courses engage students in deeper learning techniques such as the analysis and synthesis of concepts from multiple sources. In addition, students in writing-heavy courses reported more “personal, social, practical and academic learning and development.”

Online courses are also given relatively positive reviews by the new report. In comparison to classroom-based learners, online learners were more likely to “participate in course activities that challenged them intellectually,” “participate in discussions that enhanced their understanding of different cultures,” and “discuss topics of importance to their major.”

McCormick said he could not speculate as to why online learners report these advantages over classroom-based learners. Still, he did note that many of these learners were more likely to be older, transfer or first-generation students who have a greater interest in their education. Figures like these, he said, demand further investigation.

On the negative side, the report notes that only about half of first-generation students participate in co-curricular activities. It also shows that just 57 percent of first-year students and half of seniors are encouraged by their institutions to interact with students of “different economic, social, and racial or ethnic backgrounds.”

Most troubling of this year’s negative data points, McCormick said, was that almost 25 percent of first-year students and 20 percent of seniors report that they “frequently came to class without completing readings or assignments.” Those figures fall well short of the expectations many professors set. Faculty who participated in NSSE’s companion Faculty Survey of Student Engagement this year, however, reported even less engagement among their students.

Thomas F. Nelson Laird, FSSE project manager and an education professor at Indiana University, said 45 percent of faculty reported that typical first-year students “frequently (often or very often) came to class unprepared.” Additionally, he said 36 percent of faculty reported that typical seniors did the same.

“Over all, I’d say that faculty have a realistic picture of the preparation of their students,” Laird wrote in an e-mail. “Many faculty recognize that a portion -- but certainly not all -- of their students are frequently coming to class underprepared.”

Looking toward the future, NSSE hopes to incorporate more study of shifts in its data over time. Jillian Kinzie, NSSE's associate director, wrote in an e-mail that NSSE is getting to the point where it can analyze multi-year data from individual institutions. As a result, she said, it is just now identifying institutions that have improved their scores. NSSE also hopes to report more of the ways individual institutions are using their data to improve the quality of their programs.

While McCormick said it is premature to say how exactly NSSE 2.0 will look, he said the survey is already beginning to make a difference in the higher education community.

“What I’m aiming for is very simple,” McCormick said. “I want people to look beyond their average score and U.S. News [rankings] and show that there are better ways to focus on individual measures of success. A major part of this is calling attention to this internal variation within institutions.”

