Many students these days prefer instant messaging to phone calls, and music downloads (legal or otherwise) to music purchases. But students' agility with those technologies doesn't necessarily mean they can tell a quality online source from an advertisement. Or that they know how to use e-mail to communicate effectively.
Measuring those skills -- and helping colleges plan curricular and library offerings accordingly -- is the goal of a new standardized test that the Educational Testing Service is now opening up to widespread use, with the first such administration scheduled for January. The exam -- which is designed for placement and evaluation, not admissions -- has been in a testing period with a small group of colleges.
The Information and Communication Technology Assessment, as the test is known, is scored individually, and colleges can also receive aggregate scores. The test was first announced last year, but a number of changes have been made based on early administrations of the exam.
Terry Egan, project manager for the test for ETS, said that the exam grew out of a sense among educators that there is not just a "digital divide" but a "proficiency divide," in which students "have access to technology, but don't know how to use it."
The test was originally envisioned as one that colleges might give rising juniors, but ETS is now exploring the possibility of offering multiple versions of the test: a short version that might be given to entering freshmen and a longer version for later in students' college careers. The questions on the short version would take three to four minutes each to complete, and would test students' ability to use e-mail technology, download attachments, combine different technology forms in e-mail, describe findings, and compare reliable and unreliable sources of information. Sample questions are available on the ETS Web site.
The short version takes 45 minutes, and the longer version, which also features more complicated problems that take 10-15 minutes each, takes 90 minutes. Students would be told if their skills were in the high, middle or low range, and a college could find out what percentage of its own test takers were in various categories. Students who take the longer form could also find out how they did in different subcategories, such as integrating information or evaluating information.
Test takers will use a simulated Web environment to take the test. On the simulated Web site, ETS will include both reliable and unreliable sources of information. Egan said that there will be no single right answer for any of the questions, and that students would be evaluated with the understanding that there is a range of answers, reflecting various levels of Web savvy.
A sample question on an ETS research paper about the project tells a student that his sister fell during a tennis match and has been diagnosed with a rupture of her anterior cruciate ligament. The student needs to identify and describe reliable sources of information about treatment and rehabilitation options.
Egan stressed that the test should not be thought of strictly as a technology test. "This is really an assessment of cognitive skills. Many students have technical skills, but not the cognitive skills for using technology. We are trying to see if a student knows how to legally and ethically use information," she said.
Colleges can use the test, she said, to place students in appropriate programs or to measure the general skills of students so that faculty members and librarians can know the skill levels of the student population. Because the test is being largely designed as a diagnostic tool, she said, colleges will pay for the test, not students. Currently, the price is $25 a test, but that may change. Egan said that a version of the test may also be developed as a work-force competency test that students might take (and pay for themselves).
The University of California at Los Angeles is one of the pilot colleges using the test. Stephanie Brasley, information literacy coordinator for the main undergraduate library at UCLA, said she saw the test as a way to measure what students know and don't know about how to use technology. "If this was just a technology test, we wouldn't have been interested. What's important is that it's a test of students' skills and information problem solving and working in a digital environment."
Brasley said that she finds many UCLA students know how to use technology for entertainment, but not much more. She hopes to use aggregate scores from the test to map out a strategy for adding to students' skills.
"Students can IM all day and they can game really well, but if you think of the academic environment and the professional environment, those aren't the skills," she added.