
A Blog from GradHacker and MATRIX: The Center for Humane Arts, Letters and Social Sciences Online

Analyzing Analytics at the University
February 14, 2016 - 10:08pm

Heather VanMouwerik is a doctoral candidate in Russian History at the University of California, Riverside. Follow her @hvanmouwerik or check out her website.

“He uses statistics as a drunken man uses lamp-posts—for support rather than illumination.” (Andrew Lang)

This week at GradHacker, we will be focusing on teaching techniques and pedagogical tools. To start us off, a couple of questions: What are analytics? And how are they being used to understand us and our students?

Over the last twenty years, there has been a noticeable spike in the use of the word “analytics.” Oftentimes used as a synonym for the collection of data, analytics actually refers to the evaluation of that data with the purpose of establishing meaning, guiding action, and ensuring improvement.

From what I understand, our modern use of the word began in smoke-filled, Mad Men-esque advertising agencies, but, with the help of modern technology, analytics have infiltrated almost every aspect of modern life. For example, I am currently wearing a Fitbit around my wrist, which is collecting data about my activity level, sleep cycle, and caloric use. With a swipe of my finger, my phone accesses this information and assesses my health in a variety of ways.

Learning analytics, or analytics within the context of higher education, are equally ubiquitous. Every time a student uses a learning management system like Blackboard or submits a paper through an online plagiarism program, their data is being filed away by the university to be mined for information on student learning, engagement, and achievement.

My purpose in writing this post is rather selfish: to seriously think about how analytics are being used to evaluate my work, my studies, and my students. After examining this topic, I am still unsure how these tools will affect my university experience and those of my students. Although there is great potential for this technology to transform student learning and evaluation, there is also a profound need for transparency, self-evaluation, and experimentation.

As graduate students, analytics affect us in three ways:

1)   As a University Member: The students that populate our classrooms are now admitted based upon an analytical approach to their applications with minimal human intervention. Items like past performance markers, standardized test scores, and personal background information are analyzed in an attempt to predict future success. When applied to the student body, as Karen D. Mattingly et al. argue, learning analytics could potentially provide universities with constant feedback on student performance, which would enable them to identify and counsel struggling students earlier. Just this week I went through TA training with our on-campus psychological services group to learn how to identify and help students in crisis. I was impressed with how the counselors are using analytics drawn from counseling visits, teacher and resident director feedback, and grades to ensure that struggling students do not slip through the cracks and that they get the help they need.

2)   As a Student: Like it or not, learning analytics are also central to graduate education. Although our applications are much less likely to be evaluated by analytics, our performance is still subject to analytical evaluation. For example, at universities like mine, graduate students must complete their Ph.D. program within normative time, the amount of time the graduate division determines is normal for the student’s department. Analytics are not only used to calculate normative time but also to decide whether or not a student will be allowed to exceed this deadline. In addition, the grant and fellowship applications you send off are increasingly subject to analytical evaluation. Although peer review still dominates, some suggest that analytics might help negate the biases inherent in the grant peer-review process.

3)   As an Instructor: Teacher evaluations have been a source of data for learning analytics before learning analytics were even a thing. When I was an undergraduate in the early 2000s, for example, we filled out teacher evaluations on a Scantron. Although the technology has developed substantially since then, the same principle applies to the increasingly sophisticated and digitized teacher evaluation process. Already, analytics have resulted in an interesting discussion about what it means to be female and an instructor in an undergraduate classroom, and I hope this meta-evaluation of evaluation continues.

Ideally, the university’s focus on these tools is to ensure student success and anticipate student failure, enriching the overall experience. However, I remain concerned. First, the technology that collects and analyzes data is new and unevenly distributed. In order for this type of analysis to work in a long-term, meaningful way, data need to be collected from as many angles as possible to ensure an accurate basis for evaluation. Imagine buying a car based solely on gas mileage and trunk size. Chances are you won’t be particularly happy with that car. Instead, you need to know a lot of different information (horsepower, color, style, make, year, etc.) in order to make a good decision. Likewise, if you only evaluate a student on their grades and family background, then your assessment of the whole person will be profoundly limited. The more diverse the data, the more accurate the analysis.

Second, because the technology is so new, no one yet knows exactly what its limitations are. How does a university evaluate whether or not its analysis is accurate? Who within the institution is responsible for organizing, compiling, and using analytic information? Who has access to this information? And who is responsible for protecting it? Until a universal protocol is established for analytical best practices (Hey! I can dream), institutions should ensure transparency in collecting and using student data.

Third, I worry that the increasing importance placed on analytics has a disproportionately negative effect on students who do not have equal access to technology or who are first-generation college students.

Now, before you start calling me Chicken Little, I do think that, despite my concerns, analytics are a positive and powerful tool for teaching. If you are looking for small ways of incorporating analytics into your classroom, here are a few ideas.

1)   Lesson Preparation: When I TA in the history department, my main role is to lead weekly discussion sections based upon a shared text. Some weeks I find the students are very well prepared and eager to talk; other weeks students actively avoid eye contact when I ask a question or babble an incoherent and vague response. At first, I found making a lesson plan difficult and frustrating, because I did not know what to expect from week to week. Then I discovered my favorite analytic tool: track views. When you post an item on the course’s learning management system, you can see how many times someone has looked at a particular document. An hour or so before a discussion section, I check this statistic. If most students have seen it, then I plan for an engaged audience. If few students have seen it, then I either plan a lot of group work with opportunities to read along or I put together a pop quiz. It takes a lot of the guesswork out of lesson planning for participation-heavy courses by predicting student engagement.
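For readers who like to see the rule written down, the heuristic above can be sketched in a few lines. This is a hypothetical illustration only: the `plan_lesson` function, the 50% threshold, and the idea of a numeric view count are my own assumptions, not features of any particular learning management system.

```python
def plan_lesson(view_count, enrolled, threshold=0.5):
    """Pick a discussion-section strategy from LMS track views.

    view_count -- times the week's reading was opened (from the LMS)
    enrolled   -- number of students in the section
    threshold  -- assumed fraction of views-to-students that counts
                  as a "prepared" class (hypothetical cutoff)
    """
    if enrolled == 0:
        return "no students enrolled"
    if view_count / enrolled >= threshold:
        # Most students have at least opened the reading.
        return "open discussion"
    # Few students have seen it: plan to read along in class.
    return "group work or pop quiz"

# Example: 8 views in a section of 20 suggests low preparation.
print(plan_lesson(8, 20))   # group work or pop quiz
print(plan_lesson(15, 20))  # open discussion
```

In practice I eyeball the number rather than compute anything, but the decision rule is exactly this simple.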

2)   Syllabus Revision: At the end of the quarter, I quickly review data available through the learning management system, like track views, percent matches on plagiarism reports, response times for online assignments, and engagement levels on online forums. Because I am usually doing this as I am fleeing from campus for a few days of vacation, I store this information in my teaching journal until I am ready to revisit a course’s syllabus. If an assignment took longer than I thought it should, I rewrite it. If students universally did not read a particular document, then I decide whether or not I want to keep it. All of this is stuff that instructors do every time they evaluate their course, but I find it so much easier to do when I am armed with this type of data.
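The end-of-quarter review above can also be reduced to a simple filter. Again, this is a sketch under assumptions: the field names, the sample readings, and the 25% cutoff for "universally unread" are all hypothetical, standing in for whatever your learning management system actually exports.

```python
# Hypothetical end-of-quarter review: flag syllabus items worth
# revisiting, based on an assumed LMS export of track views.
readings = [
    {"title": "Week 3 primary source", "views": 4, "enrolled": 20},
    {"title": "Week 5 article", "views": 18, "enrolled": 20},
]

# Flag anything fewer than a quarter of the class ever opened.
flagged = [r["title"] for r in readings
           if r["views"] / r["enrolled"] < 0.25]
print(flagged)  # ['Week 3 primary source']
```

A flagged reading is not automatically cut; it just goes into the teaching journal as something to reconsider when I next revise the syllabus.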

3)   Evaluating Hidden Discussions: I have talked before about how important it is in online classes to provide ample opportunities for students to communicate with each other, because it is in these hidden discussions, or peer-to-peer communication that occurs without the direct intervention of the instructor, that deep learning occurs. Encouraging students to participate like this, however, is difficult, since these discussions necessarily exist outside of direct instructor oversight and cannot be easily graded. So, how can these hidden discussions be evaluated? Well, A. F. Wise et al. argue that analytics provide an opportunity to do just that. I recommend that you read their report, because they offer some great guidance on fostering and evaluating hidden discussions.

Up until this point, I do not think that I fully appreciated how often analytics are used in higher education. This type of evaluation is potentially revolutionary for students, professors, and administrators; however, as with most new technologies, the long-term ramifications of analytics are unclear. While all of this is going on at the university level, I will continue to experiment with using analytics at the smaller, classroom level.

Interested in continuing this discussion of teaching techniques and pedagogical tools? Then check back later this week as Maddy and Travis share their ideas!

I really want to know what you think about the prominence of analytics at your university. What sort of uses have you found for these tools in your classrooms? Have analytics changed the way you teach? If so, how? Please tell us about it in the comments!

[Image from Flickr user dirkcuys, used under Creative Commons license]



