The bad news: Earlier in October, the Brazilian press announced the sad news that the University of São Paulo (USP), usually considered the best university south of the Rio Grande, had disappeared from the top 200 of the Times Higher Education rankings, together with the prestigious State University of Campinas (UNICAMP). USP fell from 158th place into the 226-250 band, while UNICAMP disappeared from the top 300 completely. During the following days, many articles appeared in newspapers, magazines and blogs trying to explain this sudden fall. The fact that USP remained among the top 150 in the equally prestigious ranking of the Shanghai Jiao Tong University was no consolation.
The good news: A few days later, the Ministry announced the results of ENADE, an assessment of higher education carried out by the government. The assessment is based on a test applied periodically to all students completing higher education in different fields; in 2012 it was applied to students in 7,200 degree programs in the social sciences, humanities, management and related areas. According to the Minister of Education Aloísio Mercadante, an economist with a doctoral degree from UNICAMP, the percentage of course programs with satisfactory results went up from 48.5% to 68.3%. He proclaimed, "We have to celebrate the progress of the higher education system towards better quality. Our methods of assessment, follow-up and inspection are allowing this to happen." Unfortunately, the data also showed that 30% of the course programs had insufficient results, which was likewise picked up by the press and subjected to all kinds of interpretations.
The actual news: The actual news is that there is no news. The University of São Paulo may or may not be getting worse, but its sudden drop in the international rankings is explained mostly by new entrants on the Times list, and perhaps by changes in the methodology. Rankings tell us the relative position of an institution with regard to others, but anyone who passed statistics 101 knows that this is an ordinal scale, which tells us nothing about the distance between institutions or their quality. The question of why no Latin American university comes up higher in these rankings remains open, and these recent results do not change it.
The statements of the Ministry of Education are even more startling, because the results of the student assessments done by the Ministry are placed on a normal distribution curve, which means that, by definition, half are above and half below the mean: there are no external standards establishing what counts as an acceptable result in each field. Actually, the way the data crunchers at the Ministry of Education process the information is more complicated than that: they calculate the mean results of the students for each course, and then place these averages on a five-point scale starting from the lowest, so that the estimated average may move up and down around a truly normalized mean. But such variations are just arithmetic artifacts and do not carry any meaning in terms of changing quality.
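The "arithmetic artifact" point can be seen with a toy computation. The sketch below is not the Ministry's actual procedure; the function `normalized_bands`, the equal-width five-band split, and the sample score lists are all assumptions made for illustration. It shows the core property of any norm-referenced scale: once course averages are standardized against their own mean and spread, a uniform improvement in raw scores leaves every course in exactly the same band.

```python
import statistics

def normalized_bands(course_means, n_bands=5):
    """Standardize course averages (z-scores) and bin them into
    n_bands equal-width bands -- a simplified, hypothetical stand-in
    for a norm-referenced five-point scale."""
    mu = statistics.mean(course_means)
    sigma = statistics.stdev(course_means)
    z = [(m - mu) / sigma for m in course_means]
    lo, hi = min(z), max(z)
    width = (hi - lo) / n_bands
    # Band 1 is the lowest; clamp the maximum z-score into band n_bands.
    return [min(n_bands, int((score - lo) / width) + 1) for score in z]

# Two hypothetical cohorts: the second scores 10 raw points higher
# across the board than the first.
cohort_a = [40, 45, 50, 55, 60, 65, 70]
cohort_b = [m + 10 for m in cohort_a]

# After standardization the band assignments are identical: a genuine,
# uniform gain in raw scores is invisible to the normalized scale.
assert normalized_bands(cohort_a) == normalized_bands(cohort_b)
```

Conversely, the shares falling into the upper or lower bands can shift from one year to the next simply because the shape of the distribution changed, with no change in absolute quality at all.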
One could speculate about the reasons why even the best newspapers cannot interpret the information they receive, or why the Ministry of Education is unable to understand the data produced by its own people. One possible explanation is that the figures appear easy to grasp and to display, while questions of normal distributions, ranking methodology, scales and standards are difficult to understand, let alone to communicate. The problem arises when policies are implemented based on these simplified constructs, and not on what is really happening.