Assessment is quickly becoming the new black. It’s one of the themes of the Secretary of Education’s Commission on the Future of Higher Education. More and more institutions, some prodded by accreditors, are looking for rigorous ways -- often online -- to compile course data.
Now Blackboard, a leading provider of course management software, is making plans to enter the assessment field.
Blackboard already offers the capability to do course evaluations, and for over a year and a half the company has been researching more comprehensive assessment practices.
The prospect of online evaluations and assessments, for many faculty members, conjures images of RateMyProfessors.com, the unrestricted free-for-all where over 700,000 professors are rated -- often to their dismay -- by anonymous reviewers. Blackboard -- along with some other companies looking to enter the evaluation field -- is planning very different, more educationally oriented models. Blackboard's approach is oriented more toward evaluating the course than the professor.
Blackboard has generally enjoyed a good reputation among faculty members, dating to its beginnings as a small startup. One of the things that has endeared Blackboard to academics is the ability institutions have had to customize the company's products, and Blackboard, though it’s no longer small, will seek to keep important controls in the hands of institutions.
With institutions looking to do evaluations and assessment online, Debra Humphreys, a spokeswoman with the Association of American Colleges and Universities, said that Blackboard’s outcomes assessment program “could make trends that are already underway easier for schools.”
David Yaskin, vice president for product marketing at Blackboard, said that a key component of Blackboard’s system -- which is in development -- will likely be online portfolios that can be tracked in accordance with learning outcomes that are determined by faculty members, departments or institutions.
Yaskin said he’d like to see a system with “established outcomes, and a student has to provide evidence” of progress toward those outcomes, whether in the form of papers, photography collections or other relevant measures. Yaskin added that faculty members could create test questions as well, if they are so inclined, but that, for Blackboard’s part, the “current plan is not to use centralized testing in version 1.0, because higher ed is focused on higher orders of learning.”
One of the most powerful aspects of the program, Yaskin said, will likely be its ability to compile data and slice it in different ways. Institutions can create core sets of questions they want, for a course evaluation, for example, but individual departments and instructors can tailor other questions, and each level of the hierarchy can look at its own data. Yaskin said that it's important to allow each level of that hierarchy to remain autonomous. He added that there should be a way for “faculty members to opt out” of providing the data they got from tailored questions to their superiors if they want. Otherwise, he said, faculty members might be reluctant to make full use of the system to find out how courses can be improved.
Yaskin added that, if certain core outcomes are defined by a department, the department can use the system to track the progress of students as they move from lower to upper level courses.
Because Blackboard, which bought WebCT, has 3,650 clients, any service it can sell to its base could spread very quickly. While details on pricing aren't available, the assessment services will be sold separately from the company's course management software.
The idea of online evaluation is not new. Blackboard has been looking to colleges already using online course evaluations and assessments for ideas.
Washington University in St. Louis -- which was not among the institutions Blackboard said it consulted -- took over five years to develop an internal online course evaluation system. A faculty member in the anthropology department developed templates, and other faculty members can add specific questions. Students then have access to loads of numerical data, including average scores by department, but the comments are reserved for professors. Henry Biggs, associate dean of Washington University’s College of Arts and Sciences, was involved with the creation of the system, and said that too much flexibility can take away from the reliability of an evaluation or assessment system.
Washington University professors have to petition if they want their ratings withheld. “If faculty members can decide what to make public, there can be credibility issues,” Biggs said. “It’s great for faculty members to have a lot of options, but, essentially, by giving a lot of options you can create a very un-level playing field.”
Biggs said that the Blackboard system could be great for institutions that don’t have the resources to create their own system, but that a lot of time is required of faculty members and administrators to manage an assessment system even if the fundamental technology is in place. “The only way it can really work is if there are staff that are either hired, or redirected to focus entirely on getting that set up,” Biggs said. “I don’t think you will find professors with time to do that.”
Humphreys added that “the real time is the labor” from faculty members, and that technology often doesn’t make things so much easier, but may make something like assessments better. “People think of technology as saving time and money,” Humphreys said. “It rarely is that, but it usually adds value,” like the ability to manipulate data extensively.
Some third-party course evaluation systems already offer tons of data services. OnlineCourseEvaluations.com has been working with institutions -- about two dozen clients currently -- for around three years doing online evaluations.
Online Course Evaluations, according to president Larry Piegza, also allows an institution to develop follow-up questions to evaluation questions. If an evaluation asks, for example, if an instructor spoke audibly and clearly, Piegza said, a follow-up question asking what could be done -- use a microphone; face the students -- to improve the situation can be set to pop up automatically. Additionally, faculty members can sort data by ratings, so they can see comments from all the students who ripped them, or who praised them, and check for a theme. “We want teachers to be able to answer the question, ‘how can I teach better tomorrow?’” Piegza said.
Daily Jolt, a site that hosts a separate student-run information and networking page for each of about 100 institutions, is getting into the evaluation game as well -- but the student-run evaluation game.
Mark Miller and Steve Bayle, the president and chief operating officer of Daily Jolt, hope to provide a more credible alternative to RateMyProfessors.com. Like RateMyProfessors, Daily Jolt’s evaluations, which should be fully unveiled next fall, do not verify .edu e-mail addresses, but they do allow users to rate commenters, much as eBay does with buyers and sellers, and readers can see all of the posts by a particular reviewer to get a sense of that reviewer.
Biggs acknowledged that student-run evaluation sites are here to stay, but said that, given the limited number of courses any single student evaluates, it’s unlikely that rating commenters will add much credibility. Miller said that faculty members will be able to pose questions in forums that students can respond to.
“A lot of faculty members want to put this concept [of student run evaluations] in a box and make it go away,” Miller said. “That’s not going to happen, so we might as well see if we can do it in a respectful way.”