
San Jose State University on Wednesday quietly released the full research report on the for-credit online courses it offered this spring through the online education company Udacity. The report, which describes a research effort marked by delays and procedural setbacks, suggests it may be difficult for the university to deliver online education in this format to the students who need it most.

The report's low-key release stands in stark contrast to the hype generated in January, when university officials, flanked by the Udacity CEO Sebastian Thrun and California Governor Jerry Brown, unveiled the project during a 45-minute press conference. The pilot project, featuring two math courses and one statistics course, aimed to bring high-quality education to students for a fraction of the cost of the university's normal tuition. Wednesday's report went live on the university’s website sometime before noon Pacific time, appearing with little fanfare on the research page of the project's principal investigator, Elaine D. Collins, who serves as associate dean in the College of Science.

The report provides a long-awaited look into how the pilot project fared. The initial results from the spring pilot led the university to put its partnership with Udacity on “pause” for the fall semester. Last month, the university released results from the summer pilot, showing increased retention and student pass rates. However, those reports barely scratched the surface of the data the university collected during the project.

The report, funded by the National Science Foundation, details the setbacks the research team encountered as it began to evaluate results from the spring pilot project. In particular, it took months to obtain usable data from Udacity that tracked how students used instructional resources and accessed support services. The research team then had to spend several weeks awaiting clarifications and corrections to resolve accuracy questions.

“The result ... is that the research lagged behind the implementation whereas, ideally, it would be running alongside, providing just-in-time information about what works and where improvements can be made,” the report reads.

The Udacity team contested the research team's findings in a blog post. They said they received the first data request on May 31, which was modified on June 3. Udacity submitted the data on June 28. The RP Group, part of the research team, asked Udacity to reformat the data on July 25, a request Udacity fulfilled the next day.

"[T]he reason for this is it’s the first time we’ve collaborated with an external entity," Thrun said. "Whatever picture is being drawn here, I don’t understand why this is being said."

Ellen Junn, provost at San Jose State, declined to comment on Wednesday.

Another data source, student responses to three surveys, also proved less useful than anticipated. The spring pilot produced just 213 students whose results could be used for statistical purposes -- the remaining 61 received an incomplete grade, dropped a course or were removed after data were pruned for inconsistencies. Survey response rates ranged from 32 to 34 percent, and the research team found “significant differences” between those who responded and the general student population.

“Most importantly, successful students were overrepresented among the survey population and almost no students from the partner high school completed the surveys,” the report reads.

The surveys were further complicated by internal delays. The spring pilot began before San Jose State’s institutional review board could approve the project, which meant the first survey, billed as an entry survey, was not conducted until the fifth week of classes.

The research team consisted of members from the Research and Planning Group for California Community Colleges and Sutee Sujitparapitaya, associate vice president for institutional research at San Jose State. Despite the complications, the report concludes the results provide pointers to how students enrolled in SJSU Plus courses learn.

“[M]easures of student effort eclipse all other variables examined in the study, including demographic descriptions of the students, course subject matter and student use of support services,” the report reads. In other words, students who took charge of their own education, submitting more problem sets, logging in more often and watching more videos than the course average, were more likely to succeed than their peers.

The importance of student effort highlights the pilot project’s difficulties in targeting disadvantaged students, who Udacity's online support providers felt early on “lacked adequate preparation for the courses and were very unlikely to succeed.”

Results from the first survey showed 39 percent of students had never before taken an online course. That unfamiliarity with the platform meant fewer than half “partially understood” the online support services available to them, including video conferencing with faculty members and discussion forums.

By the end of the semester, four in every five students said they wanted more help with the content -- yet few scheduled appointments with faculty members during office hours. Instead, one faculty member said she answered “hundreds” of e-mails with questions that were answered in the syllabus. Another instructor “noted that she had out of necessity learned to write colorful boldfaced e-mails to draw students’ attention.”

During focus group sessions, students reported they were confused by having to interact with both San Jose State’s and Udacity’s websites, and that important e-mails either arrived too late or were flagged as spam.

Pass rates among students outside San Jose State in the introductory statistics course were more than double those in the two math courses; the report suggests that course’s weekly assignments “helped this group of students overcome, to some degree, their lack of online preparation.”

Research has shown that at-risk students tend to struggle in online classes, said the education consultants Michael Feldstein and Phil Hill. That disadvantaged students enrolled in SJSU Plus courses posted similarly poor pass rates suggests the spring pilot was rushed, they said.

"We have to be careful that our sense of altruism doesn’t overcome our sense of common sense," Hill said. "If we know that at-risk students don’t tend to do well in online courses, you can’t just wish away that problem. "

San Jose State and Udacity attempted to address many of the issues identified in the report during their summer pilot. Instead of being inundated with e-mails, students received more notifications while working through course content online. The summer courses, which expanded to include psychology and computer programming, also featured orientation sessions.

Student pass rates from the summer pilot were superior to those in the spring pilot, with two-thirds of students receiving a C or better in four of the five courses. Yet results in the remedial math course still lagged, with fewer than one-third of students receiving a passing grade.

The summer pilot also featured a vastly different student population: 53 percent of students had completed a postsecondary degree, including some doctoral degree holders. Only 15 percent were active high school students, compared to about half of the spring pilot’s students.

The shifts in student demographics, coupled with the improved results, led some, including Feldstein and Hill, to criticize the SJSU Plus initiative for a lack of transparency. But the two consultants also applauded San Jose State and Udacity for their disclosure.

"San Jose State, to its credit, is exposing the data, warts and all," Hill said. "Boy, this would be wonderful if we could apply the same type of rigor to face-to-face courses and have public accountability."
