Earlier this summer, the U.S. Department of Education announced it would eliminate a student’s opportunity to list in rank order the colleges and universities to which he or she had submitted the Free Application for Federal Student Aid. Many in higher education, and most involved in college counseling, applauded the decision.
Then, this month, the National Association for College Admission Counseling amended its ethical guidelines to memorialize the department’s action, and it now discourages colleges from asking applicants to list in rank order the colleges they are considering.
These recent changes will force many of us who work at colleges and universities to ask students more directly about their level of interest in our institution. Because we will no longer be able to rely on our ranked position on the FAFSA, which strongly predicted a student’s likelihood of enrolling, we now will have to do the asking. This will be new territory for many of us and for students, but I believe such directness can be good for colleges, admissions offices, families and students.
I suspect this shift in communication may have been unintentional on the parts of both the Education Department and NACAC. I also think their actions were the result of a “parade of horribles” -- what-ifs and speculations -- that undoubtedly will bring focus to other important strategies and tools used by many colleges in the contemporary practice of admissions.
Oft mentioned among the parade of horribles are:
the potential for admissions offices to use (“misuse” is a better term) information, like rank order, to influence admissions and financial aid decisions;
the pressure on students to strategize the order of their list to make sure they maximize their options;
the potential that first-generation students and those from underserved or underresourced areas will not understand the process.
These sound pretty awful, while the actions of the Department of Education and NACAC, designed to protect students, seem sensible. So why in the world would admissions and enrollment professionals, also presumably interested in serving and recruiting students, engage in such practices?
Let’s start with two premises.
First, there are three types of colleges: superselective institutions that have the luxury of “crafting a class,” open-access colleges that accept everyone who applies and colleges that work tirelessly all year just to make each class.
Second, one of the primary responsibilities of today’s enrollment manager or senior admissions leader is to predict who will enroll.
While my institution may be positioned between the superselective and the just-make-the-class types, my sympathies are more closely aligned with the latter, given the realities of demographic shifts, changes in ability and willingness of students and their families to pay, and the affordability advocates who tout cutbacks to areas such as marketing, administration and recruitment.
At Augustana College, where I work in admissions, one of my primary responsibilities is to offer the president and the Board of Trustees a data-informed prediction about who will enroll each year. This prediction sets in motion a budget and planning process that impacts the quality of education we offer our students and the livelihoods of the people who serve our students. Therefore, I want to have as many resources as possible to help inform that prediction.
We don’t ask students to rank order the institutions to which they’ve applied, but we do ask admitted students whether Augustana ranks first or in the top three or top five choices. We’ve done this for years, postadmission, and have found it to be very helpful in prioritizing our outreach to students and making the best use of our time as admissions professionals. We’ve used this information along with FAFSA position to help predict who will show up on our campus in the fall.
So, let me offer a few reasons -- not in any rank order -- why an admissions office might want to have a good idea of its relative standing with students in an effort to be efficient and make credible predictions.
Limited human resources. For most college admissions offices, especially at those institutions that need to work very hard to make the class, human resources must be deployed carefully, thoughtfully and with the greatest good in mind. Given the size of applicant pools, it is usually impossible to develop relationships with everyone who applies. Many admissions offices try to learn where to focus their efforts to make the most meaningful connections. Information like the ranking of colleges, and many other things that demonstrate students’ interest, can help an admissions counselor prioritize work and concentrate on the students most likely to enroll. At institutions like mine that need 20 to 25 percent of admitted students to enroll, being able to connect with those most likely to choose our college is quite useful.
The need to work smarter. A constant chorus on college campuses today is to “work smarter, not harder.” Data equip an admissions office to do that. I am aware of very few admissions offices that are increasing staff sizes, which means we are expected to work smarter every year in an environment of heavier workloads and shrinking resources. Lacking human resources, we need data, tools and processes that streamline and focus attention and allow us to be smart in our work.
Vital volunteer engagement. When it takes a village to make the class, ensuring that your village of volunteers has meaningful engagements with prospective students is crucial to long-term recruitment and admissions success. Most admissions offices rely on campus partners to supplement and strengthen their recruitment efforts. If there’s one thing I know about volunteers, it is that one bad experience can turn an enthusiastic volunteer away forever. Many admissions offices need to do an internal sort to make sure volunteers have good experiences. Data that inform that internal sort are important to maintaining valuable relationships with our volunteers, too.
Efficiency and access. Most important, good use of time means we can focus more on first-generation or underresourced students and families. One of the reasons we must prioritize is so we can spend more hours on creating access -- working with populations who are not as familiar with the college search process or our type of college. Understanding that one student is clear about choosing your college can free you up to counsel others who need more information to make a comfortable and informed decision.
Most people would agree this list does not in any way sound related to a “parade of horribles.” In the end, it may just come down to the fact that communication patterns and predictions keep changing. Perhaps in a couple of years, students, becoming more savvy by the minute, will decide once they’re admitted to tell each college or university that it is number one on their list -- thus hoping to get more attention. To get to the real truth, we will again have to change our approach to how we ask them.
Because, ultimately, we should do all we can to communicate honestly and in depth with our accepted students, and that begins with directness and an effort to truly know what they are thinking. It’s the kind of communication that should precede any commitment of this magnitude.
W. Kent Barnds is vice president of enrollment, communications and planning at Augustana College.
In a recent letter to the Higher Learning Commission, the largest of the regional accreditors, the Office of the Inspector General offered a scathing review of the commission’s approvals for direct-assessment competency-based education programs. The review highlighted the fundamental challenges facing a movement that has been washing like a wave over higher education. The OIG’s more rigid reading of the rules for faculty interaction with students may have a chilling effect on accreditors, who could become more concerned about running afoul of the OIG than of heeding calls to be supportive of much-needed innovation in higher education.
In just two years, we have gone from a handful of CBE programs and almost none offering direct assessment -- the unwieldy name for CBE programs not tied to the credit hour -- to more than 600 institutions working on such offerings. Those institutions include community colleges, independent colleges and universities, and public institutions like the University of Michigan and the University of Texas. In contrast to the rapid expansion of for-profit online education a decade ago, the primary providers today are nonprofit institutions. While most programs are still being designed within traditional credit-hour frameworks and thus Title IV rules of financial aid disbursement, an increasing number seek to be untethered from the credit hour and its elevation of time over actual learning. They have the support of leaders in the Education Department, the White House and both parties in Congress.
Enter the OIG, which operates independently within the Education Department, auditing and investigating department programs. The recent letter to the Higher Learning Commission reasserts the use of the “regular and substantive interaction between faculty and students” rule to distinguish between conventional Title IV-eligible programs and correspondence programs, which have greater restrictions on aid eligibility and ruinous stigma attached to them. The OIG, acutely aware of the abuses in correspondence programs in the 2000s, takes a very conservative interpretation of the rule and posits a traditional faculty instructional role.
However, many of the most innovative CBE programs unbundle that role, using faculty members in various ways, such as subject matter experts, reviewers and for learning support, while relying on “coaches” for some of the advising and mentoring roles often associated with faculty. Such programs are also introducing breakthrough technologies that can offer personalized learning and robust support not possible just 10 years ago.
The Education Department’s own guidance to institutions tacitly acknowledged such an unbundling process in its December 2014 dear colleague letter when it talked about interactions between students and "institutional staff," and it offered more explicit guidance this September in its Competency-Based Education Experiment Reference Guide. That detailed and much-awaited guidance reaffirms the need for students to have “access to qualified faculty,” but it allows for the unbundling of faculty roles, for “regular and substantive” interaction to be “broadly interpreted,” and for “periodic” interaction to be “event driven.” It shares the OIG’s basic concern when it asserts that “it is incumbent on the institution to demonstrate that students are not left to educate themselves, a chief characteristic of correspondence programs.” But it also understands that there are now many exciting alternatives to “self-learning” that do not look like traditional classrooms.
A lot of the innovation underway in CBE rests on adaptive learning technologies, powerful analytics and customer relationship management tools, learning science, and improved practices in everything from advising to learning design. But those advances -- all emerging after the correspondence program abuses of 20 years ago -- are unacknowledged in the OIG’s report. And the report’s authors continue to use time as a proxy for learning, as when they use phrases like, “even though the applications described the proposed programs as self-paced ….” Pacing is largely irrelevant in a direct-assessment world where outcomes, not seat time, matter.
The report rightly points out a need for clarity of approval processes and better communications. Institutions have long been frustrated by the opaque nature of both the Education Department’s and at least some of the accreditors’ approval processes, including the Higher Learning Commission’s. Yet, ironically, just as the accreditors and the department have improved their guidance -- witness the Council of Regional Accrediting Commissions' guidance in June and the department’s expanded CBE guidance in September -- the OIG report will very likely make things worse again as both parties scramble to respond and alter their processes in whatever ways they feel necessary.
New Regulatory Frameworks
Congress can fix this mess (come on now, hold back that snickering). It can create a demonstration project that allows non-credit-hour CBE programs -- let’s please drop “direct assessment,” as all CBE programs directly assess student mastery of competencies -- the latitude to deliver the functions that faculty have traditionally provided, while not reifying their roles. It can use the occasion to also provide for subscription models of disbursing Title IV aid, rethinking time-based measures like Satisfactory Academic Progress, tying aid disbursement to mastery of competencies and, finally, getting Title IV rules to align with the legislative intent of an alternative to the credit hour.
It can then use that demonstration project and what we learn from it to inform the reauthorization of the Higher Education Act. Given the bipartisan support for CBE, the demonstration project could be easily created, and it would be a useful mechanism for informing the more complicated process of reauthorization. Republican Congressman John Kline of Minnesota, chair of the House Education and Workforce Committee, can immediately address the need by reintroducing last year’s widely supported CBE bill. Former Senator Tom Harkin, a Democrat from Iowa and then chair of the Senate Health, Education, Labor and Pensions Committee, would not take up the bill -- a missed opportunity. His successor as chair, Republican Senator Lamar Alexander of Tennessee, should consider a demonstration project as a useful step toward reauthorization and a source of learning to inform better policy making, and as a way to support the innovation he has rightly called for in HELP Committee hearings.
In the end, the OIG is simply enforcing the law and rules that Congress and the Department of Education have created. While its lawyers, auditors and investigators are by nature and training biased toward a more conservative, even rigid, reading of the rules, it is not their job to make the rules. They will enforce what Congress creates.
So the onus is on policy makers to create new regulatory frameworks with enough latitude to better provide for innovation and the learning still underway, enough quality assurance to discourage shoddily designed programs, and enough regulatory oversight to prevent the abuses that still inform the OIG’s concerns with CBE programs.
Paul LeBlanc is president of Southern New Hampshire University. He worked as a senior policy advisor to Under Secretary Ted Mitchell in the U.S. Department of Education from March to June 2015, focusing on CBE programs and innovation.
A board member recently asked me, “If you can do two-minute elevator speeches on why someone should give a million dollars to the college, why don’t you give me an elevator speech on the biggest challenges facing higher education? Skip the nuance. No laundry list. Just the top six. You have 120 seconds.”
Here is what I said.
Raising graduation rates. While America is among the most well-educated countries in the Organisation for Economic Co-operation and Development (OECD), it has fallen to 14th place in the world in the percentage of 25- to 34-year-olds with higher education. What’s wrong? The national six-year -- again, six-year -- graduation rate is 59 percent. If we graduated even half of those who drop out, we’d be number one in the world again. Plus, low graduation rates mean we spend billions on students who quit and then more billions to recruit and teach more students who quit. Higher graduation rates would save us money, restore people’s dreams and help the United States compete.
Setting high standards for our higher education institutions. And we need to do that while still taking account of the widely varying missions and resources of each college and university. Right now, we have a plethora of criteria -- recruiting low-income and first-generation young people, graduation rates, jobs for grads, student satisfaction, a good bond rating, praise from accreditors, low default rates on student loans, research quality, strong financial statements -- many of which conflict. Which ones can be fairly applied to which institutions?
Improving the training of academic administrative leadership. We take smart academics who know virtually nothing about administration and put them in charge of multimillion-dollar operations with almost no preparation. We should be able to do better than learning by making mistakes.
Fostering responsible board governance. The situations that occurred at the Pennsylvania State University and the University of Virginia are the most glaring examples. How do you educate trustees about a sector that they’re unfamiliar with, while helping them steer between micromanaging and laissez-faire?
Meeting the expectations of Title IX, the Clery Act, VAWA, ADA, FERPA, etc. The public needs to trust us and sees federal standards as a guide. If we’re going to keep getting guaranteed student loans, Pell money and federal grants, this is the price we pay. But even if one agrees with its goals, federal compliance is often burdensome, unwieldy, ambiguous and certainly not cheap.
Financing renovation. How can we pay for all the needed upgrades without taking on debt that unhinges our bond rating? Budgets strain just to cover maintenance. Donors dislike giving for fixer-uppers. Even with record low interest rates and construction firms hungry for work, borrowing looks risky.
What about student debt? Students who graduate from a nonprofit institution have just made, for the price of a new car, the wisest investment of their lives. Leaving aside the fact that for-profits account for almost half of all defaults, fixing number one would help students repay loans because they’d graduate. The right measures (number two) and better leadership (numbers three and four) would help us perform better and, like number five, restore public confidence.
I don’t know if my trustee was impressed, but my thinking became more focused. Try it yourself, and let me know what your top six are. The elevator doors are about to close ….
Carl Strikwerda is president of Elizabethtown College.
Much has been made in recent months by higher education and political leaders of how the costs that federal regulation imposes on institutions are driving up the price of attending colleges and universities.
A new report from Vanderbilt University and the Boston Consulting Group claims that institutions spend about $3 billion for regional accreditation. This figure -- based on approximating accreditation costs as a percentage of the overall costs of regulation at 13 institutions and other industrywide data -- seems designed to reach a similar conclusion as an earlier report released by an influential Senate committee: that the cost of accreditation is unusually high.
The previous report claimed that Vanderbilt University’s College of Arts and Science devotes more than 5,000 hours annually, at a cost of about $2.92 million, to reporting to its regional accreditor. The earlier study also included data gathered by Duke University asserting that the cost of accreditation in faculty and staff time over the last few years has been about $1.5 million. This is in addition to the $500,000 the university spends each year to manage required reporting related to academic assessment and other matters.
These costs are significantly inflated and irresponsibly misleading. As multiple news outlets have reported, the vast majority of the costs identified in the previous Vanderbilt study stemmed from regulations governing federal research grants, not from accreditation.
In addition, suggesting, as this report does, that the cost of accreditation includes significant faculty costs ignores the reality that accreditation activities are part of regular faculty service and committee work and contribute to the overall improvement of the institution.
Unlike the many regulatory requirements that amount to little more than reporting, the process of peer review creates significant benefits for institutions, helping them study themselves with expert colleagues, plan for the future, and discover and address their blind spots.
There is no doubt that colleges, universities and pre-K-12 institutions suffer from overregulation, but accreditation -- a process that reinforces continuous improvement of institutions -- isn’t the primary culprit. Accrediting agencies in both pre-K-12 and higher education are working to streamline their processes, to lower costs and become more cost-effective.
They are also seeking to become more transparent about decision making, and ensure that the broadest range of stakeholders is included in discussions of academic standards and quality -- quality that translates into real improvements at institutions, not just checklist compliance.
So what is the real cost of accreditation?
As two leaders of accreditation agencies -- one that assures quality in over 34,000 pre-K-12 institutions around the globe and the other that provides regional accreditation for over 800 Southern higher education institutions (including Duke and Vanderbilt) -- we estimate the cost of accreditation to be significantly lower than those reported in the Senate report.
A 2012 doctoral dissertation project conducted by Paul Woolston Jr. at the University of Southern California looked at the average cost for institutions seeking to reaffirm their accreditation through three of the six regional accreditation agencies. Woolston found that the average cost -- including both direct and indirect costs -- was $327,254 over a seven- to 10-year cycle. This means the annual cost to institutions ranges from roughly $33,000 to $47,000, depending on the length of the accreditation cycle.
Even at doctorate-granting research universities, the average combined direct and indirect cost over this entire span was about $415,000 for these institutions across three accreditation regions. Hardly the millions of dollars being bandied about by those who would like to see the accreditation process dismantled, and certainly an amount that is manageable enough to budget for over the long term.
Moreover, the cost of accreditation is significantly lower than what those unfamiliar with the process might expect because of the volunteer nature of the work. AdvancED, for example, recently brought a 40-member team to examine evidence of school quality in Hillsborough County, Fla., that included significant on-site time to review information and meet with district and community representatives -- a process similar to that undertaken during accreditation of higher education institutions.
If the cost of one day of that single visit had been billed as an outright business charge, at an average rate of $2,500 per consultant, the cost for the four-day visit could have reached $400,000. However, AdvancED charged the district $4,000 for the accreditation effort plus individual travel and expenses. Institutions are also charged an annual accreditation fee of $750. For Hillsborough, a district of 245 institutions, that fee comes to $183,750 per year. The process resulted in the development of the district’s new five-year master plan that has received significant buy-in from teachers, staff, students, parents and other stakeholders, whose views were surveyed and taken into account to develop the plan.
The reality is that, typically, accreditation costs are only 5 to 10 percent of the costs of overall investment in institutional research and continuing improvement. This, we believe, is a small cost to pay for a vital service that ensures quality and spurs continuous improvement in all areas that affect student learning.
Why the huge disparities between our view and what some of the recent studies have found? For the most part, they reflect some of the confusion about what accreditors actually do, what is required of institutions and what institutions must do to improve themselves.
Accreditation sets standards for quality in pre-K-12 and higher education institutions, and measures and monitors institutional and student performance through a carefully orchestrated multiyear process of peer review. Teams of experts -- including academic deans, presidents and faculty from peer institutions, and nonacademic experts in particular disciplines -- work closely with institutional officials to investigate every aspect of what the institution does.
The process provides an in-depth view into the vital systems of the institution: the effectiveness of instruction, the availability and strength of student support, how the institution is led and governed, its financial management as well as how it uses data in decision making. Accreditors provide their seal of approval after institutions make needed changes and work closely with college and university leaders to develop an ongoing improvement strategy and demonstrate that they have achieved key standards according to clear indicators of performance, including measures of what students learn.
We believe this work represents a significant investment in the future of colleges and schools. A recent survey by the Southern Association of Colleges and Schools Commission on Colleges suggests that this is so. Only about 16 percent of responding institutions said that accreditation is only or primarily an expense to the institution, while the vast majority (over 80 percent) saw accreditation as primarily an investment (42.5 percent) or as both an investment and an expense (41.5 percent).
The survey also showed that institutional leaders believe that about two-thirds of the cost is spent on what the institution needs to invest in to improve, while only one-third of the cost is seen as stemming from the specific requirements of accreditation. Likewise, AdvancED conducts biannual surveys of its institutions. Results consistently show that over 90 percent of institutions find significant value in the accreditation process as a primary driver of continuous improvement and accountability. The cost of accreditation is viewed as minimal in comparison to the benefits and impact.
The cost of improving an institution is the singular responsibility of the institution, and part of its daily work. Exemplar colleges -- and even corporations like IBM -- do not consider the work of improvement to be an onerous task or something that an accrediting body forces them to do but an essential part of their management.
We hope that policy makers will come to recognize that the cost of accreditation is what is required to help guide institutions on their improvement journey; it is not the cost of the journey itself.
Belle S. Wheelan is president of the Southern Association of Colleges and Schools Commission on Colleges. Mark A. Elgart serves as the founding president and chief executive officer for Advance Education (AdvancED).
Among my earliest memories is scribbling lines across a sheet of paper and handing it to someone (this was around age four, so presumably one of my parents) and asking them to read it. Whether precocious or clueless, what matters is that the aspiration formed early. I’ve learned a little more about the process since then. But that’s still what it comes to: scribbling lines across a sheet of paper and handing it to someone to read -- following a detour through the keyboard, since one thing that hasn’t improved is my handwriting.
Paper is magnetic: it draws more paper to itself, chiefly by means of the activity we call “research.” Today, in what seems to be an incipient norm, it’s possible to conduct every stage of the writing process -- from preliminary reading to final revision -- without ever touching a sheet of paper. To me it sounds like a relentless hell of perfect efficiency. Half of thinking, let alone writing, is getting a feel for the material, and that always means going through a literally prehensile phase: grasping a pen, holding an archival document, marking up laser prints with a sharply hued yellow highlighter.
That need may be anachronistic, but it’s wired into my nervous system, down deep and for good. And it makes my current effort to “go paperless” -- something I rejected and avoided as long as possible, but which is now irrevocably underway -- both difficult and ultimately paradoxical. I am ambivalent but resigned. There is plenty of time for both second-guessing and steeling of the will during the long hours needed to feed thousands of pages into a scanner by hand, one by one.
What is at issue, ultimately, is the problem of space. The roughly 70 square feet of floor space in my study holds two four-drawer filing cabinets, plus bookshelves and a couple of spots to accommodate a laptop and some legal pads. (I recall seeing a coffee table book of photographs of writers’ desks in which Alain de Botton referred to his, a trifle portentously, as a “sacred plinth of creativity.” When one of the cats sprawls across my papers, I am reduced to saying, “Get off my sacred plinth of creativity!” This never works.) Several years ago it became necessary to supplement the filing cabinets with a sturdy cardboard box or two.
It did not stop there, of course: paper attracts paper. Around 2005, I was deep into research on a topic involving a large number of writers from the early 20th century -- some well-known then and a few still remembered, but most of them fairly minor, so that any photograph, archival trace or WorldCat listing was potentially crucial. The “inside the book” search features at Google and Amazon provided many leads worth following up. The easiest and most logical course at the time was to print them out. Ditto for appropriate web pages and scholarly papers. Meanwhile, more paper was accruing around other projects -- not to mention the small forest consumed in writing this column each week, in addition to the seasonal flux of galleys for new books.
As of this summer, the boxes were stacking up three and four deep, and occupied so much ground that getting from the door to my chair was a short but challenging obstacle course. And there is no end in sight, even with regular winnowing. Very often a document had annotations and cross-references that represented an investment of time and attention, and were still of value to me even though the item itself (an article from JSTOR, for example) would be easily replaceable.
Anyone whose methods and habits were shaped in the past few years might well consider my predicament inexplicable, if not ridiculous. And fair enough. The hundreds of pages that I mentioned having printed out 10 years ago would now be practically effortless to collect using Evernote, and possible to store without consuming an inch of space. Evernote also makes it easy to “clean up” the material so gathered -- stripping out advertisements, design elements and anything else besides whatever kernels of substance you want to preserve -- while also recording the online location of the original.
The option of highlighting and annotating material in PDF, rather than on a printout, has been around for a while -- although those of us stuck with old software (or lacking technical guidance) could not actually do so until fairly recently. Now there are numerous cheap or free applications that make marking up a PDF possible no matter what the user’s circumstances. (That’s another advantage of Evernote: besides being able to store and retrieve a PDF, the user can annotate it.)
Some of the files that now threaten to trip or suffocate me go back more than 25 years. They are the product of hundreds of trips to libraries and archives, and thousands of hours of grappling with long-term projects as well as numerous flurried episodes of obsessive fascination (e.g.: Was the political scientist who wrote a standard text on psychological warfare while also publishing science fiction under the pseudonym Cordwainer Smith also the patient that psychiatrist Robert Lindner wrote about in the memorable case study “The Jet-Propelled Couch”? I have notes). The grounds for preserving all of it are well established, or at least unshakably well rationalized, as earlier comments here may have suggested.
And yet the fact remains: crossing the floor of my study has become hazardous, and stacks of notebooks make it hard to use the sacred plinth of creativity, even for a catnap. The situation being in all ways unstable, I started taking extreme measures earlier this month. There’s bound to be someone else in a comparable situation, so the details might be useful to record.
The first step -- obvious but difficult -- was a general turnover: identifying anything that, on quick inspection, I could feel reasonably certain was disposable, and carting it off to the recycling bin. Judging by the size of the piles relative to a ream of typing paper, this meant discarding around 10,000 pages in two days.
Any sense of relief was short-lived. At least as much remained, and each later stage of culling would be more labor-intensive. As noted, the ideal place to store all the material I gathered 10 years ago from library catalogs, Google Books search results, etc., was Evernote. The difficulties involved in time travel meant that my only option was a do-over. That is, I had to go through the printouts, relocate (as much as possible) the citations, blog entries, historical society websites and so forth, and store them in Evernote. This at least rendered the information easily searchable, but the effort was tedious and often frustrating, since finding everything was not possible even using the Internet Archive. My files contained documents and information that have since vanished from the web -- which made the decision to print them out years ago seem at least somewhat worthwhile.
Such items went into a folder to revisit. My next step was to go through the printed copies of articles and conference papers from JSTOR and other databases to weed out the duds (usually identified as such by exasperated and occasionally insulting comments scribbled on the cover page) and then compare what remained against what I had in PDFs in my digital archive. Again, it was slow work. (Plus there was the temptation to read, just a little ….) But there was much encouragement in seeing the growing pile of duplicates -- paper that I could discard without really losing the content.
Then came what felt like the crossing of a threshold: the moment when I took the scanner out of its box, installed the software and began creating PDFs of photocopied documents from archival research, material from now-vanished web pages and printouts of articles from databases I could no longer access (or, in some cases, remember). Eventually, I noticed that the scanning device (brand name ScanSnap) could handle color -- meaning that all those laboriously highlighted and annotated pages I could never bear to part with could be rendered as PDFs. The bar started getting higher for what I feel compelled to preserve on paper.
The scanner requires a user to feed each page in one by one -- and carefully, since otherwise the image can turn out wavy or unreadable. The process is only slightly less monotonous than an Andy Warhol film, and I have at least two or three long days of it ahead. Then it will be necessary to give the PDFs descriptive labels and organize them in files.
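That labeling-and-filing step lends itself to a bit of automation. As a minimal sketch in Python -- assuming a made-up `topic--detail.pdf` naming convention of my own devising, not anything the scanner software prescribes -- one could sort a folder of freshly scanned PDFs into subfolders by topic:

```python
import pathlib
import shutil

def organize_pdfs(source: pathlib.Path, dest: pathlib.Path) -> int:
    """Move PDFs named 'topic--detail.pdf' into a 'topic' subfolder of dest.

    The 'topic--detail.pdf' scheme is a hypothetical convention for
    illustration; files lacking the '--' separator land in 'unsorted'.
    Returns the number of files moved.
    """
    moved = 0
    for pdf in source.glob("*.pdf"):
        # Everything before the first "--" in the filename becomes the folder.
        topic = pdf.stem.split("--", 1)[0] if "--" in pdf.stem else "unsorted"
        folder = dest / topic
        folder.mkdir(parents=True, exist_ok=True)
        shutil.move(str(pdf), str(folder / pdf.name))
        moved += 1
    return moved
```

Of course, no script can supply the descriptive labels themselves; that part remains hand work, one file at a time.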
When you are your own intern, the mind does wander. And mine keeps coming back to a few things, each simple enough in itself yet worth revisiting from different directions. One is that the process cannot be undone. The boxes full of old research materials -- and not a few drafts of things I’ve published and long since forgotten -- will be pulped, and what remains is an image entrusted to a ghostly medium. That means placing more trust in the electrical grid than I can quite justify.
Another recurring thought concerns an exchange between space and time: every five hours I spend on the work equals X cubic centimeters of room that will not be occupied by a cardboard box. At the same time, each cubic centimeter of paper represents a fraction of the time consumed, thus far, in writing (or, in the case of research, in reading and hoarding the traces of others’ lives and thoughts).
And finally, there is the impossibility of filing what’s going on here under the banner of “going paperless.” Because I can’t, won’t and wouldn’t if I could. Now that my files are being rendered digital, they can travel with me anywhere, and I will be able to read them in the only way I know how: with an open notebook, pen in hand.
(No reference here to an application or device should be taken as an endorsement, nor have I received compensation or any other incentive for mentioning them. In most cases, other tools are available that perform the same or similar functions. Those named are simply the ones that, over time, I’ve tried and found useful.)