Pearson will expand its partnership with the adaptive learning technology company Knewton to offer MyLab and Mastering products in six new subject areas this fall, the education company announced on Thursday. MyLab and Mastering, e-tutoring products that "continuously [assess] student performance and activity in real time," have been available since fall 2012 for students in math, economics, reading and writing. With the addition of biology, anatomy and physiology, chemistry, physics, finance and accounting, Pearson estimates the products will reach about 400,000 students.
While some observers say academe is already moving to a post-MOOC era or one dominated by MOOC-like offerings that aren't really massive open online courses, the MOOC itself has a new symbol of recognition. Oxford Dictionaries, published by Oxford University Press, has now added MOOC as an official word.
Definition: "a course of study made available over the Internet without charge to a very large number of people."
Origin: "early 21st century: from massive open online course, probably influenced by MMOG and MMORPG."
There are no easy solutions to these problems. Nevertheless, we think that a more public-facing academy is a necessary, if insufficient, response. Public engagement helps to demonstrate the value of research. It also helps to generate a larger audience for scholarly research and therefore potentially more revenue for publishers. We are not suggesting that research intended for a broader audience can or should supplant research targeted at the scholarly community. But we think there is room for more scholars to demonstrate that their expertise is important outside their subfield.
We have a new book on the 2012 presidential election, The Gamble, that provides one model for public engagement. The book was designed to be an accessible academic account of the election, written in real time and published within a year of the election itself — standard timing for books focused on the general public, but an unusually short time frame for a scholarly book. Together with our publisher, Princeton University Press, we structured the project so that we could enter into the ongoing public discussion about the election alongside pundits and journalists — via continuous analysis and writing, serializing the process of peer review, and accelerating the final mechanics of publication.
Our experience writing this book suggests to us that there are underutilized opportunities for both scholars and their publishers to innovate on traditional modes of academic writing and thereby bring scholarly research to a much larger audience. We joked over the past two years that part of "the gamble" was simply writing the book itself. We believe that this gamble has paid off, and we offer our story in hopes that it might encourage others to roll the dice. We think this sort of project can benefit scholars, publishers, and the broader public alike.
Why We Wrote the Book
The book was motivated by two goals. The first was simply to tell the story of what promised to be a lively and competitive election. The second goal was to amplify the voice of political science in the conversation about the election—from events on the campaign trail to explanations and interpretations of the election after it was over.
Journalists typically write the history of American presidential elections, a history built on their access to decision-makers in the campaigns. We believed that the social scientific study of campaigns, with its emphasis on systematic data and statistical analyses, adds something important. Whereas journalistic accounts effectively capture why campaign principals made the decisions they did, a political science account can better determine whether those decisions mattered.
The problem, however, is that political scientists — like most academics — usually work too slowly to have much influence. Science takes time, and so the first academic articles might appear about 18 months after an election. Academic books may take two to three years or even longer. By this point it is too late. The conventional wisdom about the election has congealed — whether it is correct or not — and journalists, commentators, and voters are already thinking ahead to the next election. After the 2004 election, for example, the misinterpretation of a single question on the exit poll led some commentators to attribute President George W. Bush’s victory to his appeal among "values voters."
We wanted to be different. We wanted to write an academic book, but with a journalist’s faster metabolism.
How We Wrote the Book
In August 2011, we pitched the book to several different presses. In February 2012, we signed a contract with Princeton University Press. In August 2012, the first two e-chapters of the book were made available for free by Princeton Press. In January 2013, a third e-chapter was released. In April 2013, a fourth and final e-chapter debuted. In September 2013, the print edition of the book will be published — including revised versions of the e-chapters as well as four additional chapters. Looking back at our drafts of the initial and final chapters, we wrote the entire book in about a calendar year. How were we able to do this?
Like Lennon and McCartney, we got by with a little help from our friends. Their help was most evident in the data we were able to obtain at no cost — weekly survey data from the firm YouGov, daily data on media coverage from the firm General Sentiment, data on candidate advertising courtesy of The Washington Post, and multiple other datasets from generous colleagues. These data were necessary to make our book stand apart from other accounts of the campaign. Most importantly, we received these data promptly and continuously, allowing us to do analysis while the campaign was under way.
Second, we wrote about our findings in public forums during the campaign itself. This writing had several benefits. It helped ensure that the book would be completed in time. It allowed us to elicit responses to our argument that, at times, led to revisions and corrections — a sort of crowdsourced peer review. And it put our perspective into the conversation happening in the moment. We found blogs to be the ideal venue for doing this because they allowed us to write and publish with minimal editorial delay and to get feedback in comments threads under each blog post. We contributed to The Monkey Cage, YouGov’s Model Politics, Campaign Stops and FiveThirtyEight at The New York Times, and Wonkblog at The Washington Post.
Finally, and perhaps most important to the successful completion of the book, was the innovative plan devised by Princeton University Press (PUP), which certainly took a gamble as well. Our editor at the project’s inception, Charles Myers, convinced us that the book would be more accessible to a non-academic audience if it had a chronological narrative at its core, rather than the thematic structure that academics often favor. Then, as we completed drafts of individual chapters, PUP sent them out for peer review, rather than waiting until we had finished the entire manuscript. PUP had secured reviewers in advance and requested a tight turnaround. PUP also produced the multiple e-chapters that allowed the book to be partially serialized. In their view, having these e-chapters — and giving them away for free — would help build interest in the book. Over 2,000 copies of these chapters were downloaded from Amazon, in addition to an untold number of PDF copies downloaded from the PUP website or The Monkey Cage. Several colleagues assigned these chapters to their students, circulating them further.
PUP also accelerated the process of producing a print volume — giving us stringent deadlines that we had to meet. We managed to do this with modest success, although we created delays by adding a new chapter at the 11th hour and by fine-tuning analyses for weeks on end. But ultimately, we finished the manuscript in time to produce a book that would be published alongside, or even before, the journalistic accounts. PUP deserves credit here as well, as it is taking them only three months to turn that final manuscript into a book available for purchase.
What impressed us throughout this process was the press’s flexibility and willingness to innovate. The press showed how to take the existing model of scholarly publishing — one centered on peer review — and modify that model to produce a book that was still rigorous but also timely and, we hope, lively.
Did “The Gamble” Pay Off?
We believe that it did. We sought to tell the story of this election, and we believe that our account provides a novel perspective that challenges much conventional wisdom. More than a few commentators argued that the underlying economic and political fundamentals were not in Obama’s favor. We show that this was untrue: the economy was growing fast enough for the incumbent to be favored. Many commentators also saw the Republican primary as a search for "anybody but Romney." We show that this was also untrue. The many anybodies — Rick Perry, Herman Cain, Newt Gingrich, Rick Santorum, etc. — surged largely because of temporary increases in media coverage of them, and not because Republican voters had any underlying hostility toward Romney himself.
After the election, commentators were quick to attribute Obama’s victory to his superior campaign. We show that the effects of things like campaign advertising and field organizing were likely not large enough to account for Obama’s victory. We also call into question many prevailing interpretations of the election — that it augured a Democratic realignment, that it suggested a profoundly "Liberal America," that it suggested the Republican Party needed a complete overhaul. On the whole, the 2012 election was very much what extant political science research led us to expect, which suggested that a book building on and elaborating that research could make a useful contribution.
We also sought to be part of the conversation among journalists and commentators, and we felt included in that conversation. This was reflected in opportunities and invitations to contribute to media outlets — such as our collaboration with Ezra Klein to develop a forecasting model for Wonkblog. It was reflected in the willingness of high-profile journalists and commentators to endorse the book. It was reflected in ways in which commentators chose to engage with political science in their own writing. Even when they disagreed with us, with other political scientists, or with their conception of what "political scientists say," it was better than being ignored.
Of course, we will have a better sense of whether our book has any particular impact after it is out. But regardless, we believe — although it is difficult to measure — that political science ideas and findings are much more in the bloodstream of campaign journalism and punditry than they once were.
John Sides is an associate professor of political science at George Washington University. Lynn Vavreck is an associate professor of political science and communication studies at the University of California at Los Angeles.
Blackboard announced last week that Ray Henderson is leaving his position as president of academic platforms at the company and is joining its Board of Directors. On his blog, Henderson characterized the shift as one that could expand his influence. "It means I’ll no longer manage day-to-day operations for our Academic Platforms group. In handing off the day to day, I’ll take a new role that will provide me a perch with broader purview across the whole of Bb. I’ve been enlisted to think about the whole of Bb and its pieces, and how they might come together to produce the most coherent and effective global education company that we can design," he wrote.
Henderson's shift is likely to be closely watched (and it already is). He came to Blackboard from Angel, when Blackboard purchased that company in 2009. Henderson was seen by many as more communicative and more open to ideas than other Blackboard leaders at the time -- and his presence has reassured not only customers of Angel but many other Blackboard customers. The e-Literate blog noted that Henderson's move comes at a time of a number of prominent job changes in the learning management system industry.
Despite the praise heaped on California Senate Bill 520 by Phil Hill and Dean Florez in a recent panegyric published in Inside Higher Ed, the bill was not the right answer for California’s higher education access woes, and it is a poor model for other states to emulate.
A bill that would open the door to for-profit companies -- including unaccredited “fly-by-night” ones -- to offer courses in the name of a state’s colleges and universities is fraught with danger. A bill that would require a state’s colleges and universities to outsource their core educational function is truly misguided, however well-intentioned the idea may have been.
That’s the real reason for the huge uproar and the rare universal opposition to California’s SB 520 from those close to higher education -- both faculty groups and the universities themselves.
Let’s be clear about one thing that’s not acknowledged in Hill and Florez’s piece: colleges and universities around the country already allow transfer credit from other universities as long as those courses meet the quality control standards of the home institution.
That tradition has been in place for a long time precisely to balance the needs of students who often take courses at more than one institution with the needs of the public to ensure quality control and the integrity of degrees from its taxpayer-funded institutions. The people of California (including employers) need to know that a degree from the University of California, the California State University, or a state community college is just that -- and not something offered by an unknown entity.
By mandating that state public colleges and universities begin a process of outsourcing their courses, SB 520 would have seriously weakened transparency and accountability in the state's institutions of higher learning. That’s one reason why the provosts of major universities in the Midwest have argued against similar schemes at their institutions. Alumni and trustees at Thunderbird Business School have also expressed serious concerns about how such a proposed relationship would threaten the reputation of that school and the value of its degrees for all students.
There is good reason for such concern, for cautionary tales about relying on for-profit companies to offer a college’s courses are unfolding right now around the country. In a December 2012 court settlement, for instance, the New York Institute of Technology accepted legal and financial liability for the actions of its for-profit partner. More recently, Tiffin University has seen its accreditation threatened because of over-reliance on unaccredited for-profit companies to offer its courses.
If SB 520 had passed, it would not have expanded meaningful access to quality higher education in the state. But it would have thrown open the door to massive profits for edu-businesses, which are accountable not to the people of California, but to investors and stockholders. No wonder so many CEOs were there to praise SB 520.
Florez and Hill labor mightily to make SB 520 sound bold and innovative, an effort to “wake up [California’s] higher education community,” they say. What everyone, including the state’s elected leaders, really needs to wake up to are the fundamental facts about higher education funding in California.
According to a report published in February 2013 by Postsecondary Opportunity: The Pell Institute for the Study of Opportunity in Higher Education and titled “State Disinvestment in Higher Education FY1961 to FY2013,” California’s state fiscal support for higher education as a percentage of state personal income dropped by 58.2 percent (adjusted for inflation) between 1980 and 2013. The trajectory is clear: if the current long-term trend continues, California will reach zero in state funding for higher education in the year 2054.
Unfortunately, as Postsecondary Opportunity’s research demonstrates, many other states are also in a “Race to Zero.”
SB 520 was no “wake-up call” for anyone. It was, in fact, a dangerous diversion from the reality that there is simply no substitute for public investment in higher education, and there is no single cheaper teaching modality or low-cost “magic bullet” that will meet our need for qualified college graduates.
With all that is at stake for the futures of millions of students and for our country, we need to take a harder look at so-called “innovative” solutions that make the old promise of “something for nothing.”
In today’s Academic Minute, Professor Nancy Kim of the California Western School of Law explores the nature of Internet-based contracts we often agree to without careful consideration. Learn more about the Academic Minute here.
MOOC provider Coursera on Tuesday announced Lila Ibrahim will become its first president. Ibrahim, who will continue to serve as an operating partner for the venture capital firm Kleiner Perkins Caufield & Byers, previously spent 18 years with the chip maker Intel. "Lila has worked closely with the company founders over the past year," Coursera founders Andrew Ng and Daphne Koller wrote in a blog post. "She has been consistently passionate about education and brings the experience to help us turbo charge Coursera’s growth." Ibrahim will join Koller and Ng to form Coursera's executive team.
As early as 2006, 70 percent of Americans searched online for answers to science-related questions, noted Dominique Brossard, a professor of life sciences communication at the University of Wisconsin at Madison -- but search engine algorithms could mean that public opinion about controversial topics, like climate change or stem cell research, is shaped by how those engines rank the results of a query.
Researchers themselves are increasingly turning to social media to keep up with the most recent scientific developments, according to a report from the National Science Foundation. In 2010, one-fifth of neuroscientists and a quarter of physicians surveyed said they read blogs or used social media one or more times a day. "Science as an institution is, more than ever, in need of public support as federal funding is shrinking and scientific issues become more and more entangled with social and ethical considerations," the article reads. "A theoretical understanding of the processes at play in online environments will have to be achieved at a faster rate if science wants to leverage the online revolution for successful public engagement."
A long walk through the English countryside and the current flap over the government surveillance of cell phone records touched off my deeply held and unreasoned Luddite reaction to "big data." Like most over-hyped trends, the surge of interest in big data and its application provokes ennui among those of us with some mileage on our sneakers. Gary King of Harvard says that with all the available "big data" students in their freshman year can be given a personalized plan to achieve their lifetime career goals. Harvard Business Review claims that data science is the sexiest new profession. Every day brings us the media hyperbole of the application of big data to commercial, political, and scientific enterprises. While some skeptics have surfaced, the mainstream press continues its love affair with big data.
The long walk I recently took through the English countryside (200 miles in two weeks) reminded me of the value of limited information and gave me unencumbered space to think about my oddly blinkered view of big data. Collecting and analyzing data is, after all, how I have made a living for 30 years. Data remain to me the only icon of science left largely unsullied by politics, ego, and money. Perhaps I am just jealous; as HBR suggested, the old guard of statisticians, survey methodologists, and data analysts are not equipped to join the brave new world of big data.
What convinced me otherwise was the way my husband and I recently managed to mostly not get lost on the famous yet poorly marked coast-to-coast walk through the English Lake District and Yorkshire Moors. We used a $1.50 plastic compass, Ordnance Survey maps, a highly schematic guidebook and each other. No GPS, no Google Maps, no iPad or iPhone, no turn-by-turn directions. The simple tools of "compass, map, and thou" are based on substantial abstractions of geographic reality subject to errors of judgment and interpretation. More detailed information would have overwhelmed us as we walked while trying to avoid deep bogs, animal excrement, and slippery precipices in the fog and rain. Decisions made with paper maps, trust, and a little visual triangulation kept us true to our course 90 percent of the time.
And so to big data… The history of science is actually one of reverse engineering. In the beginning, our measurement tools for the physical and social world were so crude that the combination of substantial abstraction and painstaking taxonomic description were the only choices. The grand theories of natural selection and relativity emerged at a time when the data were very sparse and poorly collected. To have any reasoned explanation of the world, scientists of earlier eras had to accept that the empirical world they could observe was quite limited and distorted. Improvements in our tools have allowed us over time to anchor and refine those grand abstractions with a reality closer to what is observed. Still, the world comes to us through a glass, darkly. Until very recently, we have continued to use substantial abstraction to see and understand natural and social phenomena.
The problem with big data is that it is like trying to take a sip of water from a fire hose. "Big" data is really a euphemism for all of the data thrown off by the digital engines that drive our economic and social transactions. Electronic medical records, arrest and conviction records, loyalty card data from the grocery store, all of the stuff you tell OkCupid and Match.com, Google search histories, insurance claims, cell phone calls and even the digital things we create like tweets and blog posts.
Any transaction, business process, or social engagement that uses a machine that records, counts and stores stuff in a digital format generates data. Now people and institutions leave digital footprints everywhere. We used to have to ask questions or collect paper records. Now, it is like slapping a universal bar code on the back of every person and business in the world. Every time they do something, the big barcode scanner in the sky records it and stores it. Data are no longer representing reality but rather are the reality.
The problem of course is that we have almost come full circle. Rather than too little data, poorly measured, we now have too much data, precisely measured. Our ability to use data effectively to make decisions or understand the world depends on our ability to see patterns and abstract from those patterns. Big data is, in many ways, an exact replica of reality. Using big data to make decisions is like using every square inch of soil, landscape, and sky in my 200-mile walk across England to figure out how to get around the corner in the next small village. It feels to me as if we need to return to the time of Linnaeus, the famous Swedish botanist whose pioneering classification of the natural world gave us the concept of the "species," to classify the intersecting and complexly nuanced world thrown off by our digital engines before we start making decisions using this unknown commodity. We need to rebuild those high level abstractions from the ground up to make sense of this new reality.
My difficulty with at least the political and commercial applications of big data is that our tools of abstraction and decision-making are decidedly underdeveloped when faced with this type of data. As long as Netflix doesn’t understand that I share my account with my early 20-something daughters, its big data application will continue to recommend "Buffy the Vampire Slayer" and "Gossip Girl" to me when my real preferences run to "Masterpiece Theater" and subtitled films. On a more serious note, our real fear of the use of cell phone transaction data to understand the social networks of individuals is not necessarily about the invasion of privacy but the possibility that the wrong person will be identified as a threat because his or her data are taken out of context. It is no longer whether our data are adequate to support our theories but rather whether we have developed adequate theories to explain our highly nuanced data.
Or maybe I am just jealous that Google hasn’t come looking for me…. yet.
Felicia B. LeClere is a senior fellow with NORC at the University of Chicago, where she works as research coordinator on multiple projects. She has 20 years of experience in survey design and practice, with particular interest in data dissemination and the support of scientific research through the development of scientific infrastructure.
California’s controversial bill to allow third-party, online courses to count for credit at the three public systems of higher education has met an ignoble end. Or has it? On July 31, we learned that Senate Bill 520 (SB 520), authored by Senate President Pro Tempore Darrell Steinberg, is being moved to the two-year file, and will remain dormant for at least a year.
Is this a telling defeat for powerful state politicians who went too far in trying to advance online education options, or did the process of introducing the bill and debating it in public actually create the same goals and opportunities that drove the bill in the first place?
We believe that despite the tremendous and dramatic opposition and perceived defeat of SB 520, quite a lot has been accomplished as a direct result of the initial bill language. Despite spectacular headlines, the bill itself is not dead, but rather has simply been moved to the two-year file where it will be revived as needed.
How Did We Get Here?
As described in a position paper written by Phil Hill and Michael Feldstein for the 20 Million Minds Foundation, when the California Master Plan was adopted in 1960, the basic premise was to guarantee students a place within one of the three public systems based on their high school record. It was assumed that by having a place in a public institution, the student would have access to needed courses. As the state budget has crumbled, unemployment rates have skyrocketed and enrollment demand has surged without the resources to accommodate it, this assumption is no longer valid. Across the state, literally hundreds of thousands of students have been turned away from needed courses at the California Community Colleges (CCC), the California State University (CSU), and the University of California (UC).
In January of 2013, in an effort to address the growing public education access problem facing California, the 20 Million Minds Foundation brought together students, faculty, administrators, state leaders, and ed-tech pioneers for a one-day symposium. The "Re:Boot California Higher Education" conference promoted a robust discussion that examined not only the challenges, but also the potential technological solutions to the major issues facing California’s three segments of higher education. During his opening address to the Re:Boot participants, Senator Steinberg indicated:
[Online education] is a […] revolution and possibilities abound using technology in ways that not only equal or enhance quality but also reduce the cost of higher education for struggling students and their families.
In March of this year, during an online press conference, Senator Steinberg unveiled Senate Bill 520, announcing legislation that “would reshape higher education, in partnership with technology we already use, to break bottlenecks that prevent students from completing education.”
The newfound involvement of state government officials in this level of higher education, and the nature of the bill itself, which proposed the heretofore unheard-of use of controversial, potentially disruptive, large-scale solutions such as MOOCs for credit, generated significant resistance from faculty groups and the systems themselves. In particular, a New York Times article "broke the news" that a powerful senate leader was going to challenge the status quo without getting agreement from faculty groups first, and this publicity helped rally vocal opposition to the bill. Of course, this level of resistance should not have surprised anyone involved in higher education in California.
The nature of government is that real legislative movement most often occurs for two reasons – bad press or a crisis. Senator Steinberg sees course access as a crisis for public higher education, and he introduced a bill designed to wake up the higher education community. The bill essentially sent the message that "we need to solve these problems of access whether our colleges and universities do it themselves or whether we need outside help." This challenge to go beyond the ordinary thoughts and discussions in public policy pushed the boundaries and made many groups quite uncomfortable.
In parallel, Governor Jerry Brown added fuel to the fire by proposing additional funding to the CCC, CSU, and UC with the caveat that certain conditions be tied to the funding. The language in the proposed budget obligated the funds "to increase the number of courses available to matriculated undergraduates through the use of technology, specifically those courses that have the highest demand, fill quickly, and are prerequisites for many different degrees.” This language was interpreted as telling the systems how to do their job. After CSU and UC indicated they would follow the same guidelines, but execute the solution their own way, Governor Brown used a line-item veto to remove his own proposed earmarks, which had created the conditions for the additional funding.
The result of the intense opposition and debate during the legislative process led to significant amendments to SB 520. Originally envisioned as the gateway to public-private partnerships with a common pool of courses, the bill has been transformed into a grant program for each system to implement individually. Even with the passage of the amended bill in the Senate, the bill is currently on hold.
Movement From Systems
All three systems have proposed new programs that broadly meet the same goals outlined by SB 520, largely based on the additional funding for online initiatives, with the new emphasis being the introduction or expansion of online courses with cross-enrollment across each system.
The California Community Colleges currently enroll as much as 17 percent of their students in various types of online or distance education. The system is poised to continue to advance and expand its online programs with a strong focus on career technical education as well as workforce development programs as outlined in the CCC System Strategic Plan, updated in June of this year.
In July, the CSU introduced a new Intrasystem Concurrent Enrollment program, allowing students at each campus to sign up for one of 30 online courses offered in the program from other "host" campuses. Under the current plan, students will be limited to one course per semester.
In January, UC introduced the Innovative Learning Technology Initiative, updated in May, as "a direct response to the governor’s plan to earmark $10 million from UC’s FY 14 core budget to use technology to increase access to high-demand courses for UC matriculated students.”
Despite the welcome news of these programs, we are already hearing widespread concerns over the pace and scale of implementation. Following the online program presentation at the July meeting of the UC Regents, Lieutenant Governor Gavin Newsom, a noted supporter of more effective use of technology, stated, "I don’t think we’re running at full speed here. We’re moving extraordinarily slowly…. Californians are looking to us" for progress in online education.
What to Expect Next
Students enrolled in California public colleges and universities should be guaranteed timely access to the core courses that they are required to take in order to graduate. Given that there are a variety of ways in which the institutions could meet this obligation, the state should avoid being overly prescriptive about the method. Rather, it should supply the mandate for educational access, support institutions in meeting this mandate, and provide a safety valve to ensure the mandate’s right is preserved.
--From our position paper
The focus should remain on finding effective solutions to the course-access issue -- providing students with high-quality courses they need while reducing costs. Before this year, this was not happening for a variety of reasons, and it remains to be seen just how much the institutions will do without the pressure of earmarked funding in the state budget or pending legislation such as SB 520.
We believe the best outcomes for online education occur when faculty and institutions are motivated and supported to design high-quality options for students. Ideally, colleges and universities would craft solutions, but use third-party courses as safety valves to ensure students have access to necessary classes. The hope is that the three public systems will continue their progress, find real solutions to the course access problem, and not fall into the trap of doing the same old thing again, just with online options.
At this point, one might actually suggest that a welcome policy outcome has indeed been accomplished as a direct result of the initial language in SB 520. The bill is certainly not dead. The bill itself could now be thought of as a safety valve, providing an option in case the three systems fail to show real progress in meeting the challenge of course access. We are, however, cautiously optimistic that viable and effective change is, at least for now, in the formative stages.
Phil Hill is a consultant and industry analyst covering the educational technology market primarily for higher education. He is co-publisher of the e-Literate blog and co-founder of MindWires Consulting. Follow him on Twitter at @PhilOnEdTech.
Dean Florez is the former Senate majority leader of the California State Senate and the current president and CEO of the 20 Million Minds Foundation (@20MillionMinds). Follow him on Twitter at @DeanFlorez.