Tucked away on page 1,020 of the 1,582-page spending bill winding its way through Congress, Section 527 of the "Consolidated Appropriations Act of 2014" would make taxpayer-funded research publicly available within 12 months of publication.
According to the bill, federal agencies must develop public access policies that provide a "machine-readable version of the author’s final peer-reviewed manuscripts that have been accepted for publication in peer-reviewed journal." The policy applies to all federal agencies with research and development expenditures exceeding $100 million a year. The proposal resembles the one introduced last year by the White House Office of Science and Technology Policy, although it is not clear how Congress's involvement would affect the rollout of those policies.
The $1.1 trillion bill, which passed the U.S. House of Representatives on Wednesday and the Senate on Thursday, is expected to be signed into law.
The U.S. Education Department plans to provide guidance counselors and state agencies with more information about students who are filling out the Free Application for Federal Student Aid, known as the FAFSA, Obama administration officials said Wednesday at the department's so-called "Datapalooza" event.
The goal is to boost the rate at which students, especially those from low-income backgrounds, complete the FAFSA.
The department will “responsibly” share data with high school guidance counselors on which of their students have begun the FAFSA so they can work with those students on actually completing the forms, according to a department fact sheet. Department officials will also share that information with state student aid agencies “early” this year, the White House announced separately on Thursday.
The department is also eyeing the development of a FAFSA application programming interface (API), a set of web protocols that would allow developers to build third-party services and applications that work with the complicated form, which is currently available only through the government’s website, FAFSA.gov.
Officials will soon issue a formal request for information and feedback on how the department might develop feeds (known as application programming interfaces, or APIs) of “key education data, programs and frequently used forms,” including the FAFSA, the department said.
James H. Shelton, the department’s assistant deputy secretary for innovation and improvement, told attendees that a FAFSA API would be useful in expanding the way students, families and others use the form. For instance, he said, the department has talked with KIPP charter schools about how they might be able to submit all of their students’ FAFSA forms at once.
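No such API yet exists, and the department has published no specification; but as a purely hypothetical illustration of the bulk-submission idea Shelton described, a school might assemble its students' forms into a single batch request. Every field, value, and identifier below is invented for illustration:

```python
import json

# Hypothetical sketch only: the Education Department has not published a FAFSA
# API, so the payload shape and all field names here are assumptions.
def build_fafsa_batch(school_id, students):
    """Assemble a single bulk-submission payload for a school's students."""
    return json.dumps({
        "school_id": school_id,
        "submissions": [
            {
                "student_name": s["name"],
                "dependency_status": s["dependency_status"],
                # Adjusted gross income; null when not yet reported.
                "parent_agi": s.get("parent_agi"),
            }
            for s in students
        ],
    })

# Example: batching two students' forms into one request body,
# rather than each student filing individually through FAFSA.gov.
payload = build_fafsa_batch("SCHOOL-0001", [
    {"name": "Student A", "dependency_status": "dependent", "parent_agi": 31000},
    {"name": "Student B", "dependency_status": "independent"},
])
print(payload)
```

The point of such a feed would be exactly what Shelton suggested: letting an intermediary, such as a school or counselor, handle many forms at once instead of routing every student through the single government website.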
The announcement came at a symposium Wednesday where Obama administration officials highlighted companies, nonprofits and academic groups that have used government data to build products and services aimed at helping students prepare for, apply to and select a college. The event attracted more than 500 entrepreneurs, software developers and education technology experts.
Education Secretary Arne Duncan told attendees that his goal was to reduce the amount of time it takes to obtain a degree and to promote competency-based education. The administration is currently soliciting ideas on how it should waive federal student aid rules for certain colleges that want to experiment with innovations that would reduce the cost of higher education.
Open SUNY -- through which the State University of New York plans to take existing online programs in the 64-campus system and to build on them, making them available for students throughout the system -- has its first degree programs. In her annual address on the state of the university, Chancellor Nancy Zimpher announced the first degree programs and the campuses that are producing them. The offerings include associate, bachelor's and master's degrees. Two SUNY institutions -- Empire State College and SUNY Oswego -- are each offering two programs. The others are being offered by Broome Community College, Finger Lakes Community College, SUNY Delhi and SUNY Stony Brook.
The news that Purdue University likely overstated the impact of its early warning system, Course Signals, has cast doubt on the efficacy of a host of technology products intended to improve student retention and completion. In a commentary published in Inside Higher Ed, Mark Milliron responded by arguing that “next-generation” early warning systems use more robust analytics and are likely to get better results.
We contend that even with extremely robust and appropriate analytics, programs like Course Signals may still fall short if their adoption ignores the most pressing piece of electronic advising systems — their use on the front end, by advisers, faculty and students. Until more attention is paid to the messy, human side of educational technology, Course Signals — and other programs like it — will continue to show anemic impacts on student retention and graduation.
Over the past year, we have worked with colleges in the process of implementing Integrated Planning and Advising Systems (which include early warning systems like Course Signals). The adoption of early warning systems requires advisers, faculty and students to approach college success differently and should, in theory, refocus attention on how they engage with advising and support services. In practice, however, we have found that colleges consistently underestimate the challenge of ensuring that such systems are adopted effectively by end-users.
The concept of an early alert is far from new. In interviews, instructors and advisers have consistently reminded us that for years, students have received “early alert” feedback in the form of grades and midterm reports. Early warning systems may streamline this process, and provide the reports in a new format (a red light instead of a warning note, for example), but the warning itself isn’t terribly different.
What is potentially different about products like Course Signals is their ability to connect these course-level warnings to the broader student support services offered by the college. If early warning signals are shared across college personnel, and if those warnings serve to trigger new behaviors on their part, then we are likely to see changed student behavior and success. In other words, sending up a red light isn’t likely to influence retention. But if that red light leads to advisers or tutors reaching out to students and providing targeted support, we might see bigger impacts on student outcomes.
Milliron says, for example, that with predictive analytics, “student[s] might be advised away from a combination of courses that could be toxic for him or her.” But such advising doesn’t happen spontaneously: it requires advisers to be more proactive in preparing for and conducting each advising session. They must examine a student’s early warning profile, program plan and case file prior to the session; they must reframe how they present course choices to students; and they have to rethink what the best course combinations are for students with varying educational and career goals, as well as learning styles and abilities. Finally, they may have to link students to additional resources on campus — such as tutoring — and colleges need to ensure these services exist and are of high quality.
For this process to occur, advisers need to be well-versed in how to use the analytics, and be encouraged to move past registering students for the most common set of courses to courses that make sense for the individual. But because most colleges remain uncertain about the process changes that should occur when they adopt early warning systems, they are unable to provide the training that would help faculty and advisers make potentially transformative adjustments in their practice.
Even if colleges do adequately prepare faculty and advisers for this transition, there is much we still don’t know about how students will perceive and use the data and messages they receive from early warning systems. These unknowns may influence the extent to which the systems impact student outcomes.
For example, if students perceive early warnings as a reprimand rather than an opportunity to get help, they may ignore the signals or avoid efforts of college personnel to contact them. To anticipate and mitigate these kinds of potentially negative responses, it is important to understand how all students, not just those who use and enjoy early alert systems, experience and react to such signals. As Milliron notes, we need to figure out how to send the right message to the right people in the right way.
Early warning systems are only tools, and colleges will have to pay closer attention to changing end-user culture in order to maximize their effectiveness. Currently, colleges are skipping this step. At the end of the day, even the best system and the best data depend on people to translate them into actions and behaviors that can influence student retention and completion.
Melinda Mechur Karp is a senior research associate at the Community College Research Center at Columbia University's Teachers College. Also contributing to the essay were Jeff Fletcher, a senior research assistant, Hoori Santikian Kalamkarian, a research associate, and Serena Klempin, a research associate.
Cengage Learning, the second-largest higher education publisher in the U.S., on Tuesday announced it has formed a partnership with Knewton to provide adaptive learning technology in a handful of its products. Cengage will use Knewton technology in the company's MindTap platform, an interactive textbook reader. The technology will first appear in the management and sociology disciplines, a Knewton spokesman said.