Not long ago, this column took up the perennial issue of academic prose and how it gets that way. On hand, fortunately, was Michael Billig’s Learn to Write Badly, a smart and shrewd volume that avoids mere complaint or satirical overkill.
Bad scholarly writing is, after all, something like Chevy Chase’s movie career. People think that making fun of it is like shooting fish in a barrel. But it’s not as easy as shooting fish in a barrel: to borrow Todd Barry’s assessment of his comedic colleague, “It’s as easy as looking at fish in a barrel. It’s as easy as being somewhere near a barrel.” Besides, it’s gone on for at least 500 years (the mockery began with Rabelais, if not before), so it’s not as if there are many new jokes on the subject.
But Billig did make an original and telling point in his critique of pure unreadability – one I neglected to emphasize in that earlier column. It has come into clearer view since then thanks to a new book by Carl H. Klaus called A Self Made of Words: Crafting a Distinctive Persona in Nonfiction Writing (University of Iowa Press).
Klaus is professor emeritus of English at the University of Iowa and founder, there, of the Nonfiction Writing Program. He is also a practitioner and critic of the genre of the personal essay, and A Self Made of Words seems largely addressed to the students, formal or otherwise, who want to learn the craft. Scholarly discourse rarely assumes the guise of the personal essay, of course. But Klaus’s insights and advice are not restricted to that literary form, and his book should have a tonic effect on anyone who wants his or her writing to do more than paint gray on gray.
To put it another way, A Self Made of Words doesn't stress writing in the personal voice, but rather the persona that always operates in writing, of whatever variety, whether formal or informal, autobiographical or otherwise.
Klaus wrote an earlier book called The Made-Up Self: Impersonation in the Personal Essay (Iowa), which I have not had a chance to read, but I assume he there goes into the original use of the word persona, meaning, in Latin, a mask, of the stylized kind ancient actors wore on stage to project a character. The author of even the flattest and most objective or empirically minded paper creates or displays a persona while writing: one that is self-effacing and indistinct, yes, but that manifests its authority through self-effacement and the absence of first- and second-person communication.
Impersonality, in other words, implies a persona. So does the introspective voice and intimate tone of a memoirist, with countless shades of formality and casualness, of candor and disguise, possible in between. The persona is not something that stands behind or apart from the written work, though it may seem to do so. The raw material of the persona is language itself -- not just the vocabulary or syntax an author uses, but the differences in intonation that come from using contractions or avoiding them, from the mixture of concrete and abstract terms, and from the balance of long and short words.
Klaus devotes most of the new book to how those elements, among others, combine to create effective writing -- which is, in his words “the result of a complex interaction between our private intentions and the public circumstances of our communication.” It is not a style guide but a course of instruction on the options available to the writer who might otherwise be unable to craft a persona fit to purpose.
Which, alas, is often the case. Michael Billig did not discuss the academic author’s persona in his book on how to write badly and influence tenure committees – at least, not as such. But it is implicit in his argument about how apprentice scholars orient themselves within the peculiar, restricted language-worlds their elders have created while fighting to stake out their own disciplinary claims.
In effect, they learn how to write by wearing the personae they’ve been given. And there’s nothing wrong with that, in itself; the experience can be instructive. But the pressure to publish (and in quantity!) makes it more economical to rely on a prefabricated writerly persona, stamped out in plastic on an assembly line, rather than to shape one, as Klaus encourages the reader to do.
No reader of The Sociological Imagination (1959) will soon forget C. Wright Mills's “translations” of a few passages from The Social System by Talcott Parsons, one of the most eminent American social scientists of the day. Here's a representative selection from The Social System, in the original Parsonian idiom:
“Attachment to common values means, motivationally considered, that the actors have common ‘sentiments’ in support of the value patterns, which may be defined as meaning that conformity with the relevant expectations is treated as a ‘good thing’ relatively independently of any specific instrumental ‘advantage’ to be gained from such conformity, e.g. in the avoidance of negative sanctions. Furthermore, this attachment to common values, while it may fit the immediate gratificational needs of the actor, always has a ‘moral’ aspect in that to some degree this conformity defines the ‘responsibility’ of the actor in the wider, that is, social action systems in which he participates.”
And here is how Mills put the same thoughts into demotic English:
“When people share the same values, they tend to behave in accordance with the way they expect one another to behave. Moreover, they often treat such conformity as a very good thing – even when it seems to go against their immediate interests.”
To get the full effect, you have to see Mills perform the operation upon much larger chunks of ore – a solid page of Parsons, massy and leaden, followed by its rendering into three or four spry statements of the relatively obvious. “I do not pretend that my translation is excellent,” Mills writes, “but only that in the translation no meaning is lost.” He later quotes a suggestion by Edmund Wilson that social scientists get help from their colleagues in the English department.
That advice dates the book considerably, of course. Michael Billig, the author of Learn to Write Badly: How to Succeed in the Social Sciences (Cambridge University Press) is a professor of social sciences at Loughborough University, in Leicestershire, and the examples he cites come chiefly from sociology and psychology. But the techniques and strategies he describes work just as well in humanities and education departments, among others.
Billig’s title is sardonic, but the text itself, for the most part, is not. I half expected an annotated scrapbook of scholarly bloviation -- and it does give you a feel for the state of the art. But description and complaint are secondary to Billig’s much more interesting effort to understand the purpose and enabling conditions of successful bad writing. For despite the note of sarcasm, even the book’s title is serious: people do not come into the world knowing how to be verbose and evasive, or to prop up a shaky idea with resonant jargon. It has to be learned, and there must be incentives to learn it.
In the 1890s, William James complained that trendy psychological jargon of his day, such as “apperception,” served little purpose beyond, as Billig puts it, “enabl[ing] professors to be professorial” so as “to impress the impressionable.” The exotic word was assumed to be exact and rigorous, but apperception, James said, meant “nothing more than the act of taking a thing into the mind” -- an act more precisely characterized in already available terms such as “assimilation,” “elaboration,” or “interpretation,” among others. James was ambivalent about the then-emerging tendency toward ever-narrower academic specialization. But he seemed to think (in some moods anyway) that the need to communicate outside one’s professional peer group might limit the linguistic damage.
What he could not foresee, as Billig says, is the explosive and continuing growth of higher education as a whole (“the numbers of tertiary education teachers across the world rose from just under 6.5 million in 1999 to over 9.5 million in 2007”) and the paradoxical effects of disciplines becoming “too big to control and too powerful to avoid.” Within a given field of study are “communities or subdisciplinary tribes” using their niche vocabularies not just to communicate research but to signal affiliations and establish institutional power.
“For most journals in the social sciences,” Billig writes, and the point can be generalized further, “there will be some sets of terminology that will identify the author as belonging to an approved approach, discipline, or subdiscipline. This means that many journal editors are likely to practice, without conscious intention, a restriction upon free use of language…. Some words will have to pass stringent tests before they can gain admittance. Others will be protected currency, circulating untaxed between authors and readers.”
The hint of protectionism here is not accidental. A terminology signals an approach -- and an approach implies a social and professional network. Becoming comfortable and proficient within a subdiscipline’s semantic field is the prerequisite for disciplinary socialization. (Billig has some amusing and revealing pages on the expression “semantic field,” while “socialization” is a boilerplate example of the ubiquitous reliance on “-ization” and “-ification” to create words of a pleasing vagueness. The author considers the latter tendency a form of reification, then discusses how the very term “reification” is itself an example of the problem.)
One standard explanation of the value of a theoretically informed and narrowly circulating vocabulary is that it avoids the assumptions and restrictions of ordinary language. And it very well may, though Billig has some sharp points to make about the simple-mindedness of treating “ordinary language” as some homogenous and uniformly contaminated medium.
But his more important point, I think, is that apprentice scholars don’t typically “find that their research meets an impasse which they can only overcome by seeking out different words or phrases, either because they are confronting new problems, which cannot be expressed in the old ways, or because they have been discovering new phenomena, for which there are no existing names.” Instead, they assimilate “this odd way of writing and speaking as a sign that they are entering into the world of research, thereby leaving behind their ordinary ways of talking and writing.” Otherwise, Billig says, your peers won’t know that you aren’t just somebody who’s just wandered in out of the rain.
So in a way Billig is confirming what Talcott Parsons said in that passage quoted earlier:
“Attachment to common values means, motivationally considered, that the actors have common ‘sentiments’ in support of the value patterns, which may be defined as meaning that conformity with the relevant expectations is treated as a ‘good thing’ relatively independently of any specific instrumental ‘advantage’ to be gained from such conformity, e.g. in the avoidance of negative sanctions.”
Very seldom in writing about scholarly publishing have I had an occasion to use the word “fun” -- actually, this may be the first time -- but with a couple of recent titles, nothing else will do.
They are not frivolous books by any means. Sober and learned reviews by experts may appear in specialist journals two or three years from now, and they will be instructive. But the books in question should generate some interest outside the field of J. Redding Ware studies.
Nobody who appreciates the history and color of the English language can fail to enjoy the University of Oxford Bodleian Library’s new facsimile edition of Ware’s masterpiece, the invaluable Passing English of the Victorian Era: A Dictionary of Heterodox English, Slang, and Phrase. First published in 1909, shortly after Ware’s death, it is now available as Ware’s Victorian Dictionary of Slang and Phrase -- a title that is more marketable, perhaps, but decidedly wanting in precision. Most of the lingo it covers is Victorian, but the dictionary itself, appearing as it did late in Edward VII’s reign, is Edwardian. (A pedant’s work is never done.) The new edition contains a valuable introduction by John Simpson, who retires this month as chief editor of the Oxford English Dictionary.
It covers almost everything currently known about Ware (i.e., not much) and assesses his contribution to lexicography, which was considerable. Ware’s dictionary is cited in the OED more than 150 times, including “almost 50 times for the first recorded evidence for a word or meaning.” My earlier reference to Ware studies was a bit of a stretch, since nobody has ever written a book about him, nor a dissertation -- nor even, it seems, a journal article devoted solely to his life or works. A couple of papers by Joyce scholars identify his dictionary as a source the novelist consulted while writing Ulysses. Simpson’s introduction is the landmark in a nearly barren field.
Ware was born in 1832 and his first name was James, after his father, who was a grocer. In his teens the future author served a short jail sentence following a violent family quarrel, during which he grabbed a bacon knife and threatened to kill James the elder. He worked as a mercantile clerk while writing his first novel, published in 1860. From that point on Ware seems to have eked out a hardscrabble existence along the lines George Gissing depicts in New Grub Street, cranking out fiction, journalism, and quite a few plays, along with a handbook on playing whist and a tourist’s guide to the Isle of Wight.
Simpson unfortunately neglects to mention one other documented occasion when Ware went to court, seeking relief from a downstairs neighbor who played the piano at all hours. The outcome remains unclear but it seems no bacon-knife was involved.
Not quite a gentleman, then, nor by profession a scholar. Ware’s dictionary was commissioned by Routledge as a supplement to another volume on colloquial English. “His lexicographical method is arguably modern,” Simpson notes, “as he based his selection largely on printed evidence that he (and, one imagines, other interested parties) had collected principally from newspapers and other ephemeral sources… The latest quotations date from 1902, though the majority date from the 1880s and 1890s.”
No don would have come up with what Simpson calls Ware’s “idiosyncratic labeling system,” which identifies the provenance of a given piece of slang through categories such as “Slums,” “Italian Organ-Grinders,” “Music Hall,” or “Colloquial Imbecile.” He clearly spent a good bit of time hanging around theaters and pubs, and “was painfully aware of changes in hairstyles and fashion generally over the decades, and can with the help of his printed evidence place the introduction of a new ‘look’ to a precise year.”
Some of the expressions Ware includes have passed into accepted use. He identifies opportunism as a piece of political slang from the 1860s, explaining: “Used rather in contempt, as subserving conscience to convenience, or to personal advantage.” It turns out that flappers -- young women of an immodest sort -- were on the scene well before the 1920s. And while Susan Sontag doesn't mention it in her notes, Ware identified camp as an adjective applying to “actions and gestures of exaggerated emphasis," noting that it was “used chiefly by persons of exceptional want of character.” By that Ware undoubtedly means sissies, “effeminate men in society,” a term he indicates (citing an American newspaper) caught on in the 1890s.
I would have assumed the slang word narc -- pertaining to an informer, and used as both noun and verb -- derived from narcotics. Apparently not: copper’s nark is defined as thieves’ argot, also from the 1890s, meaning “a policeman’s civilian spy.” Ware indicates that police were called coppers from 1868 on, and you’d have found a copper-slosher (“individual with the mania for ‘going for’ policemen”) hanging around a copper’s shanty, as the station house was known. A working-class person thought to be blustering risked the taunt “Copper! Copper!” – implying that he was “giving himself the airs of police authority.”
But where did cop itself come from? “There has been more discussion over this widely applied word than any other in the kingdom of phrase,” writes Ware in one of his longer entries. It is incredibly polysemic, meaning “taken, seized, thrashed, struck, caught by disease, well-scolded, discovered in cheating,” and could also be used as a verb meaning “to take too much to drink” (hence copping the brewery). Thunderous applause for an especially good show at a music hall would cop the curtain, so that the performer could take a bow.
The vast majority of Ware’s 4,000 entries define expressions that vanished without a trace. Hence his original title: "passing English" comes and goes. It's the vigor of language that drives the vitality of the book.
One craze of the 1880s was corpse-worship – “the extreme use of flowers at funerals” – which got so bad that by the ‘90s “many death notices in the press were followed by the legend ‘No flowers.’ ” Slang words often come from the contraction of common expressions; for example, damirish, “damned Irish,” and damfino, “I am damned if I know.”
Nobody still describes an egg gone bad as suitable for electioneering purposes (derived from “the exercise of projecting them at antagonistic candidates”) and the culture is all the poorer for it. Then again, it's a relief that suggestionize -- an old bit of legal jargon meaning “to prompt,” as with a witness – never caught on. Now if we could just euthanize “finalize.”
Decades before the birth of Jerry Garcia, deadhead was the entertainment-industry label for an audience member who got in without buying a ticket. It applied to critics and “’theatrical people’… [who] never pay to enter a theatre.” Ware, as a playwright, resented them, and the dictionary vents his frustration in an amusing manner:
“The experienced eye can always divide the deadheads from the ‘plank-downers’ in a theatre. The deadheads are always dressed badly, and give themselves airs when looking at the inferior parts of the house. The plank-downers never give themselves airs, mean business, and only look at the stage. Deadheads are very emphatically described by a theatrical official: ‘Here come two more deadheads; look at their boots.’”
Many entries are no doubt the only record of a term or catchphrase, and in some cases the lexicographer can just guess what they originally signified. Who stole the goose? is an “interjection of contempt, which appears to have some hidden meaning, probably of an erotic nature.” In the case of Who took it out of you?, Ware doesn't even try. The meaning is “wholly unknown to people not absolutely of lower class.”
Of comparable subaltern origins, but easier to understand, is the slang term label for sausage: bags o’ mystery. That one should come back into circulation. Its use could be extended to the hot dog.
Speaking of mystery, another book recently reissued in facsimile is Andrew Forrester, Jr.’s The Female Detective, a collection of short fiction from 1864, reprinted by the British Library and distributed in the U.S. by the University of Chicago Press. A digital edition is available from Amazon.
You can find the book online in PDF for free -- which is also true of Ware’s dictionary, although Simpson’s introduction is not to be missed. With The Female Detective, the new material consists of a foreword by Alexander McCall Smith (best known for his No. 1 Ladies Detective Agency novels, but also the author of What W.H. Auden Can Do For You, just out from Princeton University Press) and an introduction by Mike Ashley, whose books include a biography of Algernon Blackwood.
The Female Detective is offered as a collection of reports from the files of the elusive Miss G---- (she is nothing if not discreet) as edited by Forrester. The detective’s “casebook” was a very popular genre at the time, part of the “railroad literature” that sprang up to meet the demand of commuters. Forrester later wrote at least two more such collections, but his place in the history of the genre comes from having created Miss G---- (a.k.a. Mrs. Gladden), the first female professional detective in fiction. Kathleen Gregory Klein devoted several pages to the book in The Woman Detective: Gender & Genre (University of Illinois Press, 1988) and puzzlement over Forrester's identity – was it a pseudonym? – has echoed down the scholarship ever since.
It now seems very likely that the author was, in fact, J. Redding Ware. Simpson accepts it as credible in his introduction to the dictionary, as does Mike Ashley in the opening pages of the short-story collection. The identification was proposed by Kate Summerscale in the notes to her nonfiction novel The Suspicions of Mr. Whicher: A Shocking Murder and the Undoing of a Great Victorian Detective (2008). In researching her book, Summerscale noticed that Ware published a pamphlet about the crime in question: a child murder that occurred in a country house in 1860. (The circumstances are oddly reminiscent of the JonBenet Ramsey case.) He seems to have incorporated the text into a chapter of The Female Detective.
Mrs. Gladden had keen powers of observation and deduction, and a reader can’t help thinking of the much better-remembered private eye who came on the scene later in Victoria’s reign. Ware must have felt that Arthur Conan Doyle had stolen his thunder – though that seems like a rather peculiar phrase, come to think of it.
Ware lists it in his dictionary, explaining that it means “annexing another man’s idea, or work, without remunerating him, and to your own advantage.” It was first used, he writes, by one John Dennis, “a play-writer of the 17th century, who invented stage thunder for a piece of his own which failed.” The theater manager incorporated the technique in the production of someone else’s play, prompting the enraged Dennis to yell out, “They won’t act my piece, but they steal my thunder.”
I hope J. Redding Ware studies comes into its own, or at least that others discover him. How often is it worth reading a dictionary just for the stories?
University presses -- like other publishers -- know that not all reviews will be favorable, and generally don't respond to most critiques of their books. But the debate over a new book published by Harvard University Press has led its director to issue a defense of the decision to publish. The book in question is The Collaboration: Hollywood's Pact With Hitler, by Ben Urwand, a fellow at Harvard. The book has been praised by some for revealing the extent to which Hollywood avoided offending the Nazis, but has been harshly criticized by others for oversimplifying the history. The New Yorker has been particularly critical, with David Denby first publishing a negative review and then following up with a piece called "How Could Harvard Have Published Ben Urwand's The Collaboration?" In that piece Denby outlines what he considers to be numerous "omissions and blunders."
A statement from Harvard University Press says in part: "We stand by the integrity of our refereeing and editorial procedures. A thorough review process is standard at Harvard, where we take very seriously the imprimatur of the university’s name. Though not all reviewers agree with Urwand’s interpretation of the actions he describes, nearly 60 pages of notes and documentation enable readers to judge for themselves the strength and validity of his presentation. Via his agent Urwand has responded to Denby and the New Yorker, but as yet we have no indication that his response has been published."
Last week Pope Francis, who is on something of a roll, assured atheists that they could get into heaven. As one of the unchurched and the disbelieving, I appreciate this expression of good will without finding the news especially consequential. There’s enough to worry about as it is, this side of death.
But the pontiff’s timing is impressive. Ronald Dworkin’s Religion Without God, the philosopher’s first posthumous work, appeared in bookstores a few days before Francis made his statement -- even though Harvard University Press listed it as an October book. (When he succumbed to leukemia in February, Dworkin was a professor of law and philosophy at New York University and an emeritus professor of jurisprudence at University College, London.) Surely it’s a matter of providence at work, or at least of synchronicity, depending on which way you’ve staked that existential wager.
I call it Dworkin’s first posthumous book, not on the basis of inside information, but from the certainty somebody is bound to raid the Nachlass of any figure so prominent in Anglo-American discussions of the philosophy of law across four decades.
Even a fairly stringent assessment of him as someone more esteemed outside his discipline than in it -- ever the complaint when someone is just too visible as a public intellectual -- ends up conceding that he did play a catalytic role, at times. Much of the commentary since his death seems to echo Dworkin’s own recollection of serving as Learned Hand’s clerk: “I disagreed with everything he said, but he was a very good person to have to argue with.” (By the way, a book bringing the philosopher’s and the judge’s ideas together for comparison seems like a project full of interesting possibilities.)
Religion Without God is based on the three Einstein Lectures that Dworkin gave at the University of Bern in Switzerland in December 2011. The lecture series began in 2009. The speakers rotate, from year to year, between a physicist, a mathematician, and a philosopher. Einstein’s occasional remarks about God (the things he actually wrote and said, not the kudzu-like apocrypha) are the seed crystals for the lectures, rather than their topic.
According to the publisher’s note, Dworkin “planned greatly to extend his treatment of the subject over the next few years” but “had time only to complete some revisions of the original text,” although the volume closes with a fourth piece, “Death and Immortality,” shorter than the lectures, which bears no indication of when it was written. It begins on a mordant note, as if in reply to the Pope: “When Woody Allen was told that he would live on in his work, he replied that he would rather live on in his apartment.”
For a while I suspected that Religion Without God might be a very late installment in the New Atheism saga, and on that basis gave it wide berth. All the polemical gunpowder has run out on both sides. The very prospect of another battle -- Dworkin v. Dawkins! -- sounded as appealing as a sawdust burrito or an afternoon in line at the Department of Motor Vehicles. Life is too short.
Happily, the lectures are nothing of the kind. Arguments for or against the existence of God (or gods, if you prefer) form no part of Dworkin’s project. He takes it as a given that the dispute will continue, as it must, at varying degrees of heat and lucidity. But he also takes it as important and meaningful that some forms of atheism are as deeply shaped by the numinous as any religious faith.
“Numinous” is the term Rudolf Otto coined in The Idea of the Holy (1917) to name an overwhelming experience of the grandeur, power, order, significance, and strangeness (“otherness”) of the universe, or of being itself. It can be blissful, and it can be terrifying. Religious mystics have no monopoly on the numinous. Physicists and mathematicians have written about it, for example, and one of the passages from Einstein quoted by Dworkin expresses it in a forceful manner:
“To know that what is impenetrable to us really exists, manifesting itself as the highest wisdom and the most radiant beauty which our dull faculties can comprehend only in the most primitive forms – this knowledge, this feeling, is at the center of true religiousness. In this sense, and in this sense only, I belong to the ranks of devoutly religious men.”
On another occasion, Einstein said, “He to whom this emotion is a stranger, who can no longer pause to wonder and stand rapt in awe, is as good as dead; his eyes are closed.” Dworkin stresses that while monotheists may understand numinosity as a revelation of the power and awe-full reality of the Creator, it does not, as such, compel belief in a personal deity (what Dworkin refers to, from time to time, as “the Sistine god,” in honor of Michelangelo’s rendition).
Einstein, for one, dismissed the idea of such a Supreme Being existing prior to, and apart from, the universe. He said so repeatedly, although believers kept construing his remarks about “belong[ing] to the ranks of devoutly religious men” to the contrary. The physicist thought of himself as a kind of pantheist, along Spinoza’s lines. The difference between pantheism and atheism is arguably one of shading -- and Dworkin subsumes Einstein’s perspective under the rubric “religious atheism,” which would also apply to beliefs such as Ethical Culture and some kinds of pacifism.
“Religious atheism” is not meant to be an ironic label; the author shows no interest in it as paradoxical. Dworkin’s point is that a sense of “life’s intrinsic meaning and nature’s intrinsic beauty” runs deeper than one’s judgment of the source or intelligibility of that meaning and beauty. Values “are real and fundamental, not just manifestations of something else; they are as real as trees or pain.” The theist understands meaning, beauty, goodness, and other values to be the intentional creation or the commandment of a higher being, who thus merits our worship, or at least our very close attention. To live a good and meaningful life means living in accord with the divine purpose.
But for the religious atheist (which is to say, for the author himself) that is getting things more or less backward. Dworkin seems to have reached the same conclusion as Descartes on a matter that bothered the earlier thinker in his final years, as mentioned in Steven Nadler’s The Philosopher, the Priest, and the Painter (discussed in this column).
In short: Is something good (or true, beautiful, just, etc.) because God wills it? Or is it the other way around? What if the bearded man on the ceiling of the Sistine Chapel decided that theft, murder, and cannibalism were totally fine, and even to be encouraged? Would that make them good? If not, then in some sense we have accepted that right has priority even over divine might.
Thus concluded the theist Descartes, as did the religious atheist Dworkin. There is much that I am scanting in Dworkin’s book here, in the interest of time, but that should provoke enough thought, and elicit enough invective, for now. Let me end this column, as it began, with a look to the afterlife. In a symposium at the Boston University School of Law a few years ago, Dworkin announced that he’d had a glimpse of paradise:
“Lots of people, including among them the most distinguished philosophers and lawyers in the world, have come together to discuss a book of mine. As if that weren’t good enough, they discuss it before I’ve actually finished writing it so I can benefit from what they say. That isn’t the best part. The best part is that I don’t even have to die.”
The implication, by contrast, is that hell is all about the deadlines.
CourseSmart, the digital publishing company founded by higher education publishers, today announced options to make renting and purchasing educational materials more flexible. Previously, the company only allowed customers to rent e-textbooks for 180 days -- a window that is now being expanded to half a dozen options ranging from a 60-day rental to purchasing the book outright. CourseSmart also introduced Subscription Packs, which allow students to fill six slots in a "digital bookshelf" for a flat fee of $200.
"There’s a lot that’s to be said about how digital can save students money," CourseSmart CEO Sean Devine said. "Instead of going out and spending hundreds of dollars on textbooks ..., you can come to one place."
CourseSmart is also working with its publishing partners to add more interactive elements, like embedded videos and multiple choice tests, to its e-textbooks, Devine said.
"One of the criticisms of e-textbooks to date has been that they don’t add a lot of value -- except perhaps saving students money," Devine said. "There’s a fair amount of convergence going on between what was previously a flat textbook and the more interactive, digital products. It’s our belief that digital products in the future will look more like this."
Cambridge University Press will power its learning management system with technology from Knewton to teach English to students around the globe, the two companies announced on Thursday. Knewton will work with the publisher to build a series of English Language Training (ELT) products for the Cambridge LMS platform, which serves about 250,000 students. As part of its expansion plans, Knewton will also open an office in London that will coordinate the company's work in Africa, Europe and the Middle East.
Introductory college biology textbooks prepare students – even those who don’t plan to become doctors – to take medical school admission tests, while devoting little attention to such topics as evolution, a new study shows.
Steven Rissing, professor of evolution, ecology and organismal biology at Ohio State University, analyzed eight commonly used introductory biology textbooks and found that all closely followed the curriculum suggested for pre-med students by the Medical College Admission Test (MCAT). All texts included at least 50 percent of the primary MCAT biology content specifications within the first 30 percent of text.
Over all, they put a heavy emphasis on molecular and cellular biology while underemphasizing “big issues,” such as personalized medicine, evolution and climate change, that have more relevance to students who don’t plan on being medical doctors, Rissing said in a news release. “We need to have biology education for citizens and voters, not just for future doctors.”