Information Technology

Gonick essay predicting higher ed IT developments in 2012

This series of annual Year Ahead articles on technology and education began on the eve of what we now know to be one of the most profound downturns in modern capitalism. When history is written, the deep economic recession of 2008-2012 will be seen as pivotal in the shifting balance of economic and political power around the world. Clear, too, is the reality that innovation and technology, as applied to education, are moving rapidly from their Anglo-American-centered roots to a globally distributed dynamic generating disruptive activities that affect learners and institutions the world over.

Seventy years ago, the Austrian-born Harvard lecturer and conservative political economist Joseph Schumpeter popularized, in Capitalism, Socialism, and Democracy, his now famous description of the logic of capitalism:

The opening of new markets, foreign or domestic … illustrate(s) the same process of industrial mutation – if I may use that biological term – that incessantly revolutionizes the economic structure from within, incessantly destroying the old one, incessantly creating a new one. This process of Creative Destruction is the essential fact about capitalism.

Our colleges and universities, especially those in the United States, are among the most conservative institutions in the world. The rollback of public investment in, pressure for access to, and indeterminate impact of globalization on postsecondary education all contribute to significant disorientation in our thinking about the future of the university. And then there are the disruptive impacts of information technology that only exacerbate the general set of contradictions that we associate with higher education.

The faculty are autonomous and constrained, powerful and vulnerable, innovative at the margins yet conservative at the core, dedicated to education while demeaning teaching, devoted to the liberal arts and yet powerfully vocational, nonprofit in their sensibilities and at the same time opportunistically commercial -- constituting what Clark Kerr, in The Uses of the University, called an "aristocracy of intellect" in a populist society. And while reports of the death of the American university are greatly exaggerated, there is an ineluctable force at play that continues to exert growing pressure against the membranes of the higher education ecosystem. The uneven and unequal dynamics of the global economy and information technology are major forces leading to growing pressure for universities to adapt through the process of creative destruction. The emergent trends I note below include disruptive forces that, if history is a guide, will lead future students of the history of technology to mark the period ahead as the beginning of the next great tech bubble.

The year ahead may be among the most difficult ever for the economics of postsecondary education in much of the world. At the same time, I believe we will see major new developments from the world of information technology that will, over time, lead the university to adapt, enabling the familiar institution not only to persist but to maintain its relevance amid the disruptive forces of society and economy all around it.

Here are the 2012 top 10 IT trends impacting the future of higher education:

1.  Open Learning Initiatives Become an Institutional Imperative

Each year for the past three, I have noted in this annual column the rise of open learning and open educational resources enabled through information and communication technology. This past year's much-publicized Stanford experiment, a global offering to tens of thousands of learners around the world, followed by MIT's MITx initiative, will quickly make open learning a table-stakes conversation for most top universities and colleges the world over. The range of subjects, the variety of delivery modalities, and the extension of learning opportunities around the world are approaching an inflection point. No one can or should ignore the most important and explosive opportunity in postsecondary learning in over half a century. As new massive open online learning environments (MOOLEs) move from a nascent state along the maturity curve, new economic models, new entrants and laggards, winners and losers, and new centers of knowledge will follow.

2. The United States Launches Next Generation Network Infrastructure and Applications in Partnership with Neighbors and Cities

Boundary-spanning activities are not limited to online learning environments. In 2012, at least two major national next generation network initiatives will be launched.  The goal is to create a comparative advantage for the United States in the network-enabled 21st-century economy.  The tactical approach is to partner with those prepared to invest, build and operate new gigabit networks in neighborhoods around our universities and colleges as well as offer “above the network” services to our neighbors. The premise is that advanced network infrastructure to the environs around the university will catalyze new, never-before-seen applications and services that will improve the quality of life of millions of Americans who live around our major universities. 

Gig.U is a national initiative led by U.S. National Broadband Plan architect Blair Levin, designed to create a national partnership among universities, telecommunication providers, and technology companies that leverages blazing-speed wired and wireless networks to build a network of testbed facilities in neighborhoods around our universities. The project draws inspiration from Google's gigabit community initiative, which led to Google's engagement with Kansas City, and from earlier prototyping in Cleveland, where OneCommunity and the Case Connection Zone built gigabit fiber-to-the-home networks and applications.

The second initiative is US Ignite, a multifaceted effort led by the White House Office of Science and Technology Policy, the National Science Foundation, and a new 501(c)(3). US Ignite seeks to catalyze and choreograph the development of a new generation of applications that can run on and leverage the next generation networks being deployed by NSF, Google, Internet2, NLR, Gig.U and others.

The opportunity to extend unprecedented network access and services to neighborhoods around our universities will unleash new possibilities for innovation and collaboration, for both our postsecondary institutions and the cities and towns within which we live, work, and study.

3. Big Data is Here: Getting Beyond the Campus View

Zettabyte-scale data sets (a zettabyte is 1 billion terabytes) will be here in 2012, if we can achieve three preconditions. The prospects of an emergent zettabyte-scale big-science world of proteomic data are rate-limited by existing network capacity, by the immaturity of visualization tools for analysis, and by the encrusted logic of university funding and our IT organizations. Network and storage innovation and visualization and analytical tools will continue to evolve at cloud scale and speed. Creating a vector in which the technology and our analytical tools meet the needs of our research science community will require unprecedented collaboration among U.S. funding agencies and our universities. The most exciting development on this front is the set of initiatives led by Internet2 under its NET+ effort. Led by former MIT CIO Jerry Grochow, NET+ is our single best opportunity to support big science and to position the United States to compete on the increasingly competitive international big-science playing field.
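
To see why network capacity is the rate limiter, consider a back-of-envelope calculation; this is a minimal sketch with assumed link speeds and a 70 percent sustained-efficiency figure, not numbers from the essay:

```python
# How long does it take to move a petabyte -- one millionth of a zettabyte --
# across links of various speeds? (Illustrative assumptions throughout.)

def transfer_days(data_bytes: float, link_bps: float, efficiency: float = 0.7) -> float:
    """Days to move data_bytes over a link of link_bps, assuming the link
    sustains only `efficiency` of its nominal rate."""
    seconds = (data_bytes * 8) / (link_bps * efficiency)
    return seconds / 86_400

PETABYTE = 1e15
for label, bps in [("1 Gbps campus uplink", 1e9),
                   ("10 Gbps research link", 1e10),
                   ("100 Gbps next-generation link", 1e11)]:
    print(f"{label}: {transfer_days(PETABYTE, bps):.1f} days per petabyte")
```

At gigabit speeds a single petabyte ties up a link for months, which is why zettabyte-scale science is unthinkable without next generation networks.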

NET+ is tackling the thorny problem associated with Schumpeter's creative destruction proposition. We can spend the next 25 years in "business as usual" mode, attempting to build the infrastructure for big science on each of our university campuses and reinforcing the familiar patterns of securing funding, building platforms, and supporting analytical services. If we do, we will miss the train. There is simply no way we can afford to create redundant infrastructure to support the next generation of science, discovery, and innovation. Three-letter federal funding agencies, state economic development and education organizations, research and education networks, research scientists, and of course our higher education leaders, including our CIO community, should join and challenge NET+ to quickly set its sights on developing an unprecedented collaborative set of platform technologies. The race for big science is on. The stakes are too high to be left to solutions from single campuses or even small groups of campuses.

4. Big Data Applied to Creating a New Learning Genome

The promise of massively scalable capacity to enable, track, and assess learning outcomes based on personalized learning needs is both compelling and ready for prime time. Notwithstanding internal debates on learning styles (Pashler, Harold, et al. "Learning Styles: Concepts and Evidence." Psychological Science in the Public Interest 9.3 (2009): 105-119), the methods, modeling techniques, and rigor applied to big data science are well positioned to advance our understanding and application of the learning sciences. The higher education marketplace is poised to begin marshaling the growing tsunami of data points, applying first-generation algorithms to provide predictive models of learning success and, over time, refining those algorithms to align different learning styles with learning successes.

The framework of a new learning genome begins in earnest in 2012. Look for a wide range of players, including Blackboard Analytics, the University of Phoenix, Kaplan University, Pearson Education, and perhaps the ERP vendors, to make a run at an "Enterprise Education Platform." Startups with secret sauce are ready to scatter their pixie dust on colleges and universities to magically solve everything that ails us, from predictive modeling for retention to career counseling.

Growing interest in big data for college success has many CIOs salivating at the opportunity to build new platforms and realign their organizations to respond to the heightened interest in data-driven decision support. But the science of learning as applied to a new learning genome is in a nascent state. We should be wary of the unbounded enthusiasm that the hype curve will generate in the next year and instead work on the foundations of campus readiness, governance, and partnerships, focusing on requirements and advancing our ability to contribute our loaf of bread to the emergent marketplace of "solutions."
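
As a concrete illustration of what a "first-generation algorithm" might look like, here is a minimal retention-risk sketch; the features, synthetic data, and threshold are entirely hypothetical, and a production system would demand far richer data, validation, and the governance discussed above:

```python
# A toy retention-risk model: logistic regression over a few invented
# engagement features. Nothing here reflects a real product or campus data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Hypothetical features: LMS logins/week, assignment submission rate, midterm GPA
X = np.column_stack([
    rng.poisson(5, n),           # logins per week
    rng.uniform(0, 1, n),        # share of assignments submitted
    rng.normal(2.8, 0.6, n),     # midterm GPA
])
# Synthetic labels: retention loosely tied to engagement and grades
logits = 0.2 * X[:, 0] + 2.0 * X[:, 1] + 1.0 * X[:, 2] - 4.5
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)

# Flag students whose predicted retention probability falls below a threshold
at_risk = model.predict_proba(X)[:, 1] < 0.5
print(f"{at_risk.sum()} of {n} students flagged for outreach")
```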

5. SmartPads and New Learning Content

Circa 1993, the most provocative concepts in educational technology were CD-ROM and laserdisc multimedia tools, like Macromedia Director, and Bob Stein's Voyager multimedia publishing ventures, which produced works like Who Built America?. Then came the World Wide Web. Multimedia education content innovation remained largely frozen in place for nearly 20 years. The emergence of SmartPad technologies has led to a revival of interest in multimedia education content. To date we have seen well-financed platform players take traditional textbooks and port them over to Kindle, iOS, and/or Android environments.

The SmartPad is the experience platform of choice for many students. Value-added functionality for textbooks, like highlighting, clipping services, and collaboration tools, will continue to extend the value of existing textbook content and the role of the traditional publishing industry. In 2012 a new class of learning content projects that combines advanced multimedia tools and hybrid interactions enabled over the Web will find its way to the mainstream. Look for gaming platforms on SmartPads (with integration on the Web) to create quest adventures for disciplines as diverse as history and the physical sciences. Traditional research journals in disciplines such as law and medicine will start piloting the integration of multimedia content, well beyond nesting video or hyperlinked content. Areas as diverse as nursing and workforce development will integrate artificial intelligence engines and advanced multimedia learning content to create simulation experiences that facilitate meaningful use and practice. This emergent market will likely see several large venture-backed startups this year, along with interest from a handful of forward-thinking traditional publishers.

6. ERP is dead! Long Live ERP 3.0

You may or may not be old enough to remember ERP 1.0 and the famous green-screen interface. Over the past decade, we've come to "enjoy" the web interface despite persistent lag in the transaction experience for many of our university back-office tasks like registration, grades, time entry, and applying for a job. Universities have spent billions of dollars on ERP in the past 20 years -- and that doesn't even address getting intelligence, or even basic reporting, out of those back-office systems. From the vendors' vantage point, those initial investments of billions of dollars are now -- through maintenance, upgrades, licensing, and the like -- an annuity check worth 5-10 times the initial payout over the life of the installed product. Collective action among some of our leading universities and colleges led to the creation of homegrown ERPs, still big, slow, and largely focused on the back office, and not much better on the customer experience.

2012 can mark the beginning of the end of ERP as we know it. To be sure, dinosaurs die slowly and will continue to roam the face of the college campus for the foreseeable future. ERP 3.0 is here in the form of modern technical architecture and services that provide users with better functionality and a much better user interface experience. Real-time reporting, built-in analytics, and predictive and scenario-based modeling run embedded in the new services architecture. Universities are lining up to kick the tires, and the first-wave adopter group will have real-life stories to tell this year. Incumbent providers all see the handwriting on the wall and are scrambling to respond with multi-tenant, software-as-a-service offerings.

The campus community has come to expect, indeed demand, the consumer-like experience it enjoys in personal banking, travel, and shopping. ERP as we know it simply won't get us from here to there. A promise of a mobile app sometime in our lifetimes is not good enough. The underlying technical architecture of existing ERP systems is at a dead end. With real alternative models moving into production in 2012, expect a groundswell of interest in adopting the new technology and services model. To be sure, there will be bumps along the road, but there is simply no turning back.

7. From OneCard to Mobile Payment on Campus

Everyone has seen and experienced campus OneCard technology -- the integration of the campus ID for gym access or parking, combined with having mom and dad send money that can be added to the card to pay for campus-related purchases at the bookstore, cafeteria, coffee shop, or campus retail facilities. The world of mobile payment is undergoing significant disruption. From Kenya (banking) to Europe (home health, museum services, event ticketing, and airline frequent flyer programs), mobile payment is moving from cards to mobile devices. The integration of mobile payments with "presence technology" (knowing my location) and "preference technology" (my profile or gathered patterns of behavior) through the smartphone makes the college and university marketplace a logical early proving ground for near field communication (NFC) platform technology in the United States and Canada. Services like Google Wallet are just the beginning. Entrepreneurs, student groups, device manufacturers, service integrators, college stores, and services from laundry to cafeterias should be ready to pilot and innovate in the next year as mobile payment hits a campus near you.
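
To make the presence-plus-preference idea concrete, here is a deliberately toy sketch; every name, data structure, and discount rule is invented for illustration, and no real wallet or campus API is implied:

```python
# Hypothetical campus mobile-payment sketch: presence (where the student is)
# meets preference (where they habitually spend), layered on a simple wallet.
from dataclasses import dataclass, field

@dataclass
class Student:
    name: str
    favorite_venues: set = field(default_factory=set)  # preference profile
    wallet_balance: float = 0.0                        # funds loaded onto the wallet

def nearby_offer(student: Student, current_venue: str):
    """Presence meets preference: surface an offer at a habitual venue."""
    if current_venue in student.favorite_venues:
        return f"Welcome back to {current_venue}! Tap your phone for 10% off."
    return None

def pay(student: Student, amount: float) -> bool:
    """Debit the mobile wallet, replacing the old OneCard swipe."""
    if student.wallet_balance >= amount:
        student.wallet_balance -= amount
        return True
    return False

alex = Student("Alex", {"campus coffee shop"}, 25.00)
print(nearby_offer(alex, "campus coffee shop"))
print("Paid" if pay(alex, 3.75) else "Declined", f"-- balance ${alex.wallet_balance:.2f}")
```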

8. The Khan Academy Meets TED Talks and the Birth of a New Remediation Strategy for Higher Education

The systemic challenge of college prep and remediation is well-known. As many as a third of students come to college without adequate preparation. The cost of remediation in the United States stands at $5.6 billion annually. The challenge of remediation at the college level will continue to demand demonstrably effective interventions.

The success of the TED Talk format has begun to permeate the sacred sanctums of academic life. Professional societies, like the American Political Science Association, have experimented with TED Talks at their annual meetings. The best of these 18-minute (or as short as 6-minute) presentations are well-choreographed theatrical soliloquies, often paired with effective visualizations, that together engage, provoke, and communicate with audiences of all backgrounds. More recently, university types have begun to take notice of the disruptive impact of Sal Khan's remedial-education organization. Currently targeted largely at high school learners and teachers, the Khan Academy's "chalkboard" exercise videos have grown from a handful to more than 2,500 instructional pieces and are now being integrated into district-level assessment and outcomes analysis that supports individual teachers, schools, and even larger administrative and curriculum design efforts.

Organizations as diverse as the American Council on Education and the Bill & Melinda Gates Foundation are focused on this space and are interested in investments and accreditation. This year, the market is ready for several entrants producing TED-Talks-meet-Khan-Academy content aimed at the college remediation space. Done right, well-produced treatments of discrete topics will be joined with carefully integrated tutoring, competency demonstrations, and even demonstrations of knowledge acquisition that can be shared within the university or college setting.

9. Active Learning Collaboratories Begin to Define the New Classroom Experience

Classroom facility planning, and the attendant faculty development needed to leverage new active learning collaboratories, will come into focus in calendar 2012. New gold standards for this category of learning spaces are emerging on campuses like the University of Southern California. Rhetorical commitments to bring learning experiences into the 21st century are made real through an alignment of academic strategic planning, academic technology leadership, and focused project planning and partnerships with facilities management -- no small challenge on most higher education campuses.

Traditional lecture halls are being replaced by active learning studios. Dialogue cafes with telepresence video conferencing capabilities augment and support experiential learning curriculum focused on cross-cultural and cross-national understanding. Integrated "magic" touch screen panels are enabling geospatial and GIS explorations in a wide range of disciplines from statistics to poverty studies, from history to astrophysics. Scientific visualization walls are working their way into learning spaces to support learning of technique, active discovery and "lab work," presentations, and collaborative explorations.

Many campuses have a showcase learning space. Relatively few have a systematic approach to building and supporting learning spaces across the campus. While others on campus will continue to romance the value of the nailed-to-the-floor student desk ("that is how I learned"), the time has come to create a new "standards" orientation to different levels of technology-enabled learning spaces, informed by a catalog of different teaching and learning approaches. Technology, governance, and focus on student success in the classroom have all matured over the past two decades from the era of the wild, wild west. In 2012, I believe we may see a draft taxonomy for such a standards orientation developed by a coalition of architects, technologists, instructional designers, students, and yes, even a couple of instructors.

10. Whither the Campus IT Organization?

In the beginning we cast ourselves as high priests. We had others build us grand temples, modern mausoleums at the center of which resided the sacred mainframe computer. All of us old enough to remember recall the special wizard-like roles of those who tended to the machine. Those who led the wizards, plastic pocket protectors in place, were viewed with reverence. Then the advent of the personal computer smashed efforts to preserve the hereditary line of the high priesthood.

The emergence of an era of possibility and plenty recast our role into those of the chosen people. Unabashed idealism combined with charismatic leadership and a healthy dose of rhetoric gave rise to the audacious idea of transforming the enterprise of higher education. The chosen people, themselves led by charismatic technology visionaries, would lead the academy, apparently lost and aimless for centuries in the wilderness of the desert of pre-personal computers, into a new promised land. The advent — and powerful appeal — of networks connecting computers and people from around the campus and around the world represented prophetic leadership. These prophets envisioned a world with as many blinking lights around network routers and switches as there were stars in the skies or grains of sand in the desert. The torch of scientific discovery and historical evolution naturally culminated in the digitally networked campus. Compelling indeed were those who invented a leadership role to advance this emergent information technology ecosystem and convinced the powers that be that every president needed a new commander-in-chief for technology.

And then a funny thing happened. The promise of productivity and efficiency of information technology combined with the centrifugal logic of the networks came to pass. Globalization with all of its disruptive impulses in the economic, cultural, and education domains would not have materialized in the accelerated fashion we are witnessing without the compounding impact of our computing and networking power.  The foundations of much received wisdom are now in flux. Through the success of our networks, the economies of scale associated with computing and storage capacity, and the innovations and economics of nomadic and mobile experiences, what was once solid is melting into thin air.

In 2012, consider three broad IT leadership scenarios:

  • Embrace the assumption that technology is now a utility and generally does not provide strategic advantage. In this scenario leadership becomes managing sourcing strategies for the utility and internal customer relationships, and squeezing capacity to support new unfunded mandates associated with the new high priests, chosen people, and prophets on the campus planning horizon, whoever and whatever they might be.
  • Align the organization and its capacity to genuinely support strategic activity. If the institution embraces any form of strategic direction, IT can become an innovative enabler as well as a transformational agent in achieving strategic work. Maintaining focus on strategic differentiators is not easy in the best run organizations, and it is even more challenging in many institutional settings in higher education. IT leadership can provide a consistent and credible voice for the value of having strategies with which others, including IT, can align.
  • Dare to artfully challenge the institution and its leadership to continue to see a vital role for innovation and creative work in IT as an ineluctable part of the university’s strategic leadership portfolio. Learning to thrive in the ambiguity of where and how innovation and creativity dynamically render on campus is an existential identity question for IT, not merely a strategic concern of the institution. Ceding a modicum of control, celebrating the innovation of others, and partnering to co-produce and co-enable others to take the institution to the edge of the possible are the objectives of all 21st-century university leadership, including information technology leaders.

The year ahead will undoubtedly be filled with many challenges and strains, celebrations and awe. The logic of creative destruction will be more apparent in the period ahead than ever before in our lifetimes. Watching for the emerging technology trends and how they intersect with the future of the academy provides an interesting mapping strategy for charting the terrain to be discovered and sojourned in 2012.

Lev S. Gonick is vice president for information technology services and chief information officer at Case Western Reserve University. He blogs about technology at Bytes From Lev.

Freeing the LMS

Vying to reshape dynamics of e-learning market, Pearson announces cloud-based learning management system that is "absolutely free" — hosting and support included.

Integrated Solutions

Citing a need to save money and rethink information access, several liberal arts colleges have placed libraries and IT in the same administrative unit.

The Pulse: Lou Pugliese of Moodlerooms

The new edition of The Pulse features a conversation with Lou Pugliese, chairman and CEO of Moodlerooms Inc., which provides hosting and support for Moodle, the open source course management system.

'XXX' Marks the Spot

The exercise of figuring out one’s “porn star name” is probably more familiar to college students than to college administrators.

Fair Use Face-Off, Canadian Edition

A defection from a major copyright clearinghouse by Canadian research universities echoes concerns in the U.S.

Making Clouds Less Ominous

Research universities try to negotiate a standard contract with commercial e-mail providers that would ease costs and fears about moving sensitive data into the cloud.

Secondhand Rights

Two companies that help facilitate credit transfers battle in court over who owns their clients' course catalog content.

Between What's Right and What's Easy

Sometimes our tools are our politics, and that’s not always a good thing. Last week, the Copyright Clearance Center announced that it would integrate a “Copyright Permissions Building Block” function directly into Blackboard’s course management tools. The service automates the process of clearing copyright for course materials by incorporating it directly into the Blackboard tool kit; instructors post materials into their course space, and then tell the application to send information about those materials to CCC for clearance.

For many, this move offers welcome relief to the confusion currently surrounding the issue of copyright. Getting clearance for the materials you provide to your students, despite the help of organizations like CCC, is still a complicated and opaque chore. Instructors either struggle through the clumsy legal and financial details or furtively dodge the process altogether and hope they don’t get caught. With the centralization offered by CCC and now the automation offered by this new Blackboard add-on, the process will be more user-friendly, comprehensive, and close at hand. As Tracey Armstrong, executive vice president for CCC, put it, “This integration is yet another success in making the ‘right thing’ become the ‘easy thing.’”

Certainly, anything that helps get intellectual resources into the hands of students in the format they find most useful is a good thing. I have no doubt that both the CCC and Blackboard genuinely want the practical details of getting course materials together, cleared, and to the student to be less and less an obstacle to actually teaching with those materials. But I’m skeptical of whether this “easy thing” actually leads to the “right thing.” Making copyright clearance work smoothly overlooks the question of whether we should be seeking clearance at all -- and what should instead be protected by the copyright exception we’ve come to know as “fair use.”

Fair use has been the most important exception to the rules of copyright since long before it was codified into law in 1976, especially for educators. For those uses of copyrighted materials that would otherwise be considered an infringement, the fair use doctrine offers us some leeway when making limited use for socially beneficial ends.

What ends are protected can vary, but the law explicitly includes education and criticism -- including a specific reference to “multiple copies for classroom use.” It’s what lets us quote other research in our own without seeking permission, or put an image we found online in our PowerPoint presentations, or play a film clip in class. All of these actions are copyright violations, but would enjoy fair use protection were they ever to go to court.

But there is a dispute, among those who dispute these kinds of things, about exactly why it is we need fair use in such circumstances. Some have argued that fair use is a practical solution for the complex process of clearing permission. If I had to clear permission every single time I quoted someone else’s research or Xeroxed a newspaper article for my students -- figuring out who owns the copyright and how to contact them, then gaining permission and (undoubtedly) negotiating a fee -- I might be discouraged from doing so simply because it’s difficult and time-consuming. In the absence of an easy way to clear copyright, we have fair use as a way to “let it slide” when the economic impact is minimal and the social value is great. 

Others argue that fair use is an affirmative protection designed to ensure that copyright owners don’t exploit their legal power to squelch the reuse of their work, especially when it might be critical of their ideas. If I want to include a quote in my classroom slides in order to demonstrate how derivative, how racist, or maybe just how incompetent the writer is, and copyright law compelled me to ask the writer’s permission to do it, he could simply say no, limiting my ability to powerfully critique the work. Since copyright veers dangerously close to a regulation of speech, fair use is a kind of First Amendment safety valve, such that speakers aren’t restricted by those they criticize by way of copyright. 

This distinction was largely theoretical until organizations like CCC came along. With the help of new database technologies and the Internet, the CCC has made it much easier for people to clear copyright, solving some of the difficulty of locating owners and negotiating a fair price by doing it for us. The automatic mechanism being built into Blackboard goes one step further, making the process smooth, user-friendly, and automatic. So, if fair use is merely a way to account for how difficult clearing copyright can be, then the protection is growing less and less necessary. Fair use can finally be replaced by what Tom Bell called “fared use” -- clear everything easily for a reasonable price. 

If, on the other hand, fair use is a protection of free speech and academic freedom that deliberately allows certain uses without permission, then the CCC/Blackboard plan raises a significant problem.

The fact that the fair use doctrine explicitly refers to criticism and parody suggests that it is not just for when permission is difficult to achieve, but when we shouldn’t have to ask permission at all. The Supreme Court said as much in Campbell v. Acuff-Rose (1994), when Justice Kennedy in a concurring decision noted that fair use “protects works we have reason to fear will not be licensed by copyright holders who wish to shield their works from criticism.” Even in a case in which permission was requested and denied, the court did not take this as a sign that the use was presumptively unfair. Fair use is much more than a salve for the difficulty of gaining permission.

Faculty and their universities should be at the forefront of the push for a more robust fair use, one that affirmatively protects “multiple copies for classroom use” when their distribution is noncommercial, especially as getting electronic readings to students is becoming ever cheaper and more practical. 

Automating the clearance process forecloses the possibility of exercising fair use and, more important, of challenging its slow disintegration. Even if the Blackboard mechanism allows instructors simply not to send their information to CCC for clearance (and it is unclear whether it is, or eventually could become, a compulsory mechanism), the simple fact that clearance is becoming a technical default means that more and more instructors will default to it rather than invoking fair use.

The power of defaults is that they demarcate the “norm”; the protection of pedagogy and criticism envisioned in fair use will increasingly deteriorate as automatic clearance is made easier, more obvious, and automatic. This concern is only intensified as Blackboard, recently merged with WebCT, continues to become the single, dominant provider of course management software for universities in the United States.

Technologies have politics, in that they make certain arrangements easier and more commonplace. But technologies also have the tendency to erase politics, rendering invisible the very interests and efforts currently working to establish “more copyright protection is better” as the accepted truth, when it is far from it. 

As educators, scholars, librarians, and universities, we are in a rarefied position to fight for a more robust protection of fair use in the digital realm, demanding that making “multiple copies for classroom use” means posting materials into Blackboard without needing to seek the permission of the copyright owners to do so.

The automation of copyright clearance now being deployed will work against this, continuing to shoehorn scholarship into the commercial model of information distribution and erasing the very question of what fair use was for -- not by squelching it, but simply by making it easier not to fight for it and harder to even ask whether there is an alternative.

Tarleton Gillespie

Tarleton Gillespie is an assistant professor in the Department of Communication at Cornell University, and a Fellow with the Stanford Law School Center for Internet and Society.

The Shift Away From Print

For most scholarly journals, the transition away from the print format and to an exclusive reliance on the electronic version seems all but inevitable, driven by user preferences for electronic journals and concerns about collecting the same information in two formats. But this shift away from print, in the absence of strategic planning by a higher proportion of libraries and publishers, may endanger the viability of certain journals and even the journal literature more broadly -- while not even reducing costs in the ways that have long been assumed. 

Although the opportunities before us are significant, a smooth transition away from print and to electronic versions of journals requires concerted action, most of it by individual libraries and publishers.

In reaching this conclusion, we rely largely on a series of studies, of both publishers and libraries, in which we examined some of the incentives for a transition and some of the opportunities and challenges that present themselves. Complete findings of our library study, on which we partnered with Don King and Ann Okerson, were published as The Nonsubscription Side of Periodicals. We also recently completed a study of the operations of 10 journal publishers, in conjunction with Mary Waltham, an independent publishing consultant. 

Taken together, these studies suggest that an electronic-only environment would be more cost-effective than print-only for most journals, with cost savings for both libraries and publishers. But this systemwide perspective must also be balanced against a more textured examination of libraries and publishers.

On the publisher side, the transition to online journals has been facilitated by some of the largest publishers, commercial and nonprofit. These publishers have already invested in and embraced a dual-format mode of publishing; they have diversified their revenue streams with separately identifiable income from both print and, increasingly, electronic formats. Although the decreasing number of print subscriptions may have a negative impact on revenues, these publishers’ pricing has evolved alongside the economics of online-only delivery to mitigate the effects of print cancellations on the bottom line.

The trend has been to adopt value-based pricing that recognizes the convenience of a single license serving an entire campus (rather than multiple subscriptions), with price varying by institutional size, intensity of research activity, and/or number of online users. By “flipping” their pricing to be driven primarily by the electronic version, with print effectively an add-on, these publishers have been able to manage the inevitable decline of their print business without sacrificing net earnings. They are today largely agnostic to format and, when faced with price complaints, are now positioned to recommend that libraries consider canceling their print subscriptions in favor of electronic-only access.
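
To illustrate the “flip,” here is a stylized pricing sketch; the base price, multipliers, and add-on are invented and reflect no particular publisher’s terms:

```python
# Value-based, electronic-first pricing: the e-license drives the price,
# scaled by institution size and research intensity; print is a small add-on.
BASE_E_LICENSE = 5_000.0   # hypothetical base price for a campus-wide e-license

SIZE_MULTIPLIER = {"small": 0.6, "medium": 1.0, "large": 1.8}
RESEARCH_MULTIPLIER = {"teaching": 0.8, "comprehensive": 1.0, "research": 1.5}
PRINT_ADDON = 400.0        # print is now incidental to the license

def license_price(size: str, intensity: str, keep_print: bool) -> float:
    price = BASE_E_LICENSE * SIZE_MULTIPLIER[size] * RESEARCH_MULTIPLIER[intensity]
    return price + (PRINT_ADDON if keep_print else 0.0)

# Canceling print barely moves the publisher's revenue per institution:
print(license_price("large", "research", keep_print=True))    # 13900.0
print(license_price("large", "research", keep_print=False))   # 13500.0
```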

Other journal publishers, especially smaller nonprofit scholarly societies in the humanities and social sciences and some university presses, are only beginning to make this transition. Even when they publish electronic versions in addition to print, these publishers have generally been slower to reconceive their business models to accommodate a dual-format environment that might rapidly become electronic-only. Their business models depend on revenues received from print, in some cases with significant contributions from advertising, and are often unable to accommodate significant print cancellations in favor of electronic access. 

Until recently, this has perhaps not been unreasonable, as demand for electronic journals has been slower to build in the humanities and some social science disciplines. But the business models of these publishers are now not sufficiently durable to sustain the journals business in the event that libraries move aggressively away from the print format. 

Many American academic libraries have sought to provide journals in both print and electronic formats for the past 5 to 10 years. The advantages of the electronic format have been clear, so electronic versions were licensed as rapidly as possible, but it has taken time for some faculty members to grow comfortable with an exclusive dependence on the electronic format. In addition, librarians were concerned about the absence of an acceptable electronic-archiving solution, given that their cancellation of print editions would prevent higher education from depending on print as the archival format.

In the past year or two, the movement away from print by users in higher education has expanded and accelerated. No longer is widespread migration away from print restricted to early adopters like Drexel and Suffolk Universities; it has become the norm at a broad range of academic institutions, from liberal arts colleges to the largest research universities. Ongoing budget shortfalls in academe have probably been the underlying motivation. The strategic pricing models offered by some of the largest publishers, which offer a price reduction for the cancellation of print, have provided a financial incentive for libraries to contemplate completing the transition. 

Faced with resource constraints, librarians have been required to make hard choices, electing not to purchase the print version but only to license electronic access to many journals -- a step more easily made in light of growing faculty acceptance of the electronic format. Consequently, especially in the sciences, but increasingly even in the humanities, library demand for print has begun to fall. As demand for print journals continues to decline and economies of scale of print collections are lost, there is likely to be a tipping point at which continued collecting of print no longer makes sense and libraries begin to rely only upon journals that are available electronically.  
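
The economics behind that tipping point can be sketched in a few lines; the figures below are invented, but they show how fixed print costs spread over a shrinking subscriber base eventually overwhelm the price a library will pay:

```python
# Stylized print economics: fixed costs (press setup, print-specific
# production) divided across ever-fewer subscriptions.
FIXED_PRINT_COST = 60_000.0   # per year, independent of circulation
VARIABLE_COST = 25.0          # paper, printing, postage per subscription
SUBSCRIPTION_PRICE = 150.0    # what a library will pay for the print edition

for subs in (2000, 1000, 500, 250):
    unit_cost = FIXED_PRINT_COST / subs + VARIABLE_COST
    status = "viable" if unit_cost < SUBSCRIPTION_PRICE else "tipped"
    print(f"{subs} subscriptions: cost per copy ${unit_cost:.0f} ({status})")
```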

As this tipping point approaches, at unknown speed, libraries and publishers need to evaluate how they can best manage it. We offer several specific recommendations.

  • First, for those publishers that have not yet developed a strategy for an electronic-only journals environment and the transition to it, the future is now. Today’s dual-format system can only be managed effectively with a rigorous accounting of the costs and revenues of print and electronic publishing and how these break down by format. Because some costs are incurred irrespective of format and are difficult to allocate, this accounting is complicated. It is also critical, allowing publishers to understand the performance of each format as currently priced and, as a result, to project how the transition to an electronic-only environment would affect them (a simple allocation sketch follows this list). Publishers that do not immediately undertake these analyses and, if necessary, adjust their business models accordingly may suffer dramatically as the transition accelerates and libraries reach a tipping point.
  • Second, in this transition, libraries and higher education more broadly should consider how they can support the publishers that are faced with a difficult transition. A disconcerting number of nonprofit publishers, especially scholarly societies and university presses that have the greatest presence in the humanities and social sciences fields, have a particularly complicated transition to make. The university presses and scholarly societies have been traditionally strong allies of academic libraries. They may have priced their electronic journals generously (and unrealistically). Consequently, a business model revamped to accommodate the transition may often result in a significant price increase for the electronic format. In cases where price increases are not predatory but rather adjustments for earlier unrealistic prices, libraries should act with empathy. If libraries cancel journals based on large percentage price increases (even when, measured in dollars, the increases are trivial), they may unintentionally punish lower-price publishers struggling to make the transition as efficiently as possible.
  • Third, this same set of publishers is particularly vulnerable, because their strategic planning must take place without the working capital and the economies of scale on which larger publishers have relied. As a result, some humanities journals published by small societies are not yet even available electronically. The community needs collaborative solutions like Project Muse or HighWire (initiatives that provide the infrastructure to create and distribute electronic journals) for the scholarly societies that publish the smaller journals in the humanities and social sciences. But if such solutions are not developed or cannot succeed in relatively short order on a broader scale, the alternative may be the replacement of many of these journals with blogs, repositories, or other less formal distribution models.
  • Fourth, although libraries today face difficult questions about whether and when to proceed with electronic-only access to traditionally print journals, they should try to manage this transition strategically and, in doing so, deserve support from all members of the higher education community. It has been unusual thus far for libraries to undertake a strategic, all-encompassing format review process, since it is often far more politically palatable to cancel print versions as a tactical retreat in the face of budgetary pressures. But a chaotic retreat from print will almost certainly not allow libraries to realize the maximum potential cost savings, whereas a managed strategic format review can permit far more effective planning and cost savings.
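
As promised in the first recommendation, here is a toy sketch of per-format accounting; all figures are invented, and revenue share is only one of several plausible drivers for allocating format-independent costs:

```python
# Attribute direct costs to each format, allocate shared (format-independent)
# costs by revenue share, and compare margins to see how cancellations bite.
revenue = {"print": 300_000.0, "electronic": 500_000.0}
direct_costs = {"print": 220_000.0, "electronic": 150_000.0}  # printing, hosting, etc.
shared_costs = 180_000.0  # editorial, peer review: incurred irrespective of format

total_revenue = sum(revenue.values())
for fmt in revenue:
    allocated = shared_costs * revenue[fmt] / total_revenue
    margin = revenue[fmt] - direct_costs[fmt] - allocated
    print(f"{fmt}: margin ${margin:,.0f}")
# print: $12,500 -- thin and vulnerable; electronic: $237,500
```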

Beyond a focus on local costs and benefits, there are a number of broader issues that many libraries will want to consider in such a strategic format review. The widespread migration from print to electronic seems likely to eliminate library ownership of new accessions, with licensing taking the place of purchase. In cases where ownership led to certain expectations or practices, these will have to be rethought in a licensing-only environment.

From our perspective, the safeguarding of materials for future generations is among the most pressing practices deserving reconsideration. Questions about the necessity of developing or deploying electronic archiving solutions, and the adequacy of the existing solutions, deserve serious consideration by all libraries contemplating a migration away from print resources. In addition, the transition to electronic journals begins to raise questions about how to ensure the preservation of existing print collections. Many observers have concluded that a paper repository framework is the optimal solution, but although individual repositories have been created at the University of California, the Five Colleges, and elsewhere, the organizational work to develop a comprehensive framework for them has yet to begin.

The implications of licensing for archiving, and the fate of existing print collections, are addressable as part of any library’s strategic planning for the transition to an electronic-only environment -- but all too often they are being forgotten under the pressure of the budgetary axe.

These challenges appear to us to be some of the most urgent facing libraries and publishers in the nearly inevitable transition to an electronic-only journals environment. Both libraries and publishers should proceed under the assumption that the transition may take place fairly rapidly, as either side may reach a tipping point when it is no longer cost-effective to publish or purchase any print versions. It is not impossible for this transition to occur gracefully, but to do so will require the concerted efforts of individual libraries and individual publishers.

Eileen Gifford Fenton and Roger C. Schonfeld

Eileen Gifford Fenton is executive director of Portico, whose mission is to preserve scholarly literature published in electronic form and to ensure that these materials remain accessible. Portico was launched by JSTOR and is being incubated by Ithaka, with support from the Andrew W. Mellon Foundation. Roger C. Schonfeld is coordinator of research for Ithaka, a nonprofit organization formed to accelerate the productive uses of information technologies for the benefit of academia. He is the author of JSTOR: A History (Princeton University Press, 2003). 
