From H.G. Wells to "Bill and Ted's Excellent Adventure," the literary and cinematic history of time travel offers two lessons of overriding importance. The first: Watch your step, especially when going backward in time. Everything you do, or don't do, will have unintended consequences. You could end up killing your grandfather in childhood by accident. Twist cause and effect into a pretzel of paradox and you'll probably wish you hadn't.
Lesson two: Be wary of visitors from the future. This advice will be superfluous in the case of evil Schwarzeneggerian robots programmed to kill, but it holds good more generally. Even with the best possible intentions, whatever time-travelers from the future say will mess with your sense of possessing free will. Without that, you might as well stay in bed in the morning.
Heedless of all this hard-won wisdom, Robert J. Nemiroff, a professor of physics at Michigan Technological University, spent a couple of months in late 2013 looking for signs of chrononauts among us. His paper "Searching the Internet for evidence of time travelers" (coauthored with Marcia Goodrich, an editor of two Michigan Tech magazines) was posted at the scientific preprint repository arXiv on the day after Christmas. Its findings -- not to leave anyone in suspense -- were that chrononauts seem not to have left a digital footprint.
A reader pointed out the link a few days after the article appeared, and I set out to interview the author. The effort was complicated by the fact that Nemiroff was in transit to Washington to attend the American Astronomical Society meeting. We were able to talk by phone on Sunday morning -- a day before he and his students discussed their search at a poster presentation.
The design and execution of Nemiroff's project are easily explained, but first a word about the state of time-travel research. It is focused, at this point, on speculative viability rather than engineering. Stephen Hawking is probably the best-known exponent of an argument against the possibility of time travel. But some of the more phantasmagoric entities in particle physics behave in ways suggesting that they move backward in time, albeit in unimaginably small fractions of a second. It is, in short, an open question. Two entries in online philosophical encyclopedias (here and here) provide rich overviews of the current state of the discussion.
With time travel, most experiments are thought experiments, but Nemiroff went in search of empirical evidence. "The question of time travel was bouncing around in my head," he told me. "If it were possible and had happened, how would you know?"
The topic came up this past summer during the weekly poker game among Nemiroff and some of his students. They started kicking around ideas, and an approach took shape. If time travelers had visited us, the best evidence would be references to events or developments well before they occurred. A book from 1967 mentioning President Obama, for example, would be pretty hard to explain on any other basis.
The next step was combing through enormous masses of text in search of the "informational traces" (as the paper calls them) left by presumed chrononauts. Nemiroff and his students came up with a number of events and names -- "Pope Francis," for one, since the current pontiff is the first ever to use that name -- and went looking for anachronistic references. The task would be impossible without search engines, of course, while hashtags and Google Trends made it easier to find needles in a haystack.
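The logic of the search can be illustrated with a small sketch. This is not the authors' actual method (they queried live search engines and Google Trends, as the article describes); it is a toy illustration over a hypothetical corpus of timestamped posts, checking whether a term appears before the date on which it could first have been known.

```python
from datetime import date

# Hypothetical mini-corpus of (date, text) posts. The real study searched
# the live web; this toy list just demonstrates the test being applied.
POSTS = [
    (date(2012, 5, 1), "Looking forward to the conclave next year."),
    (date(2013, 3, 14), "Pope Francis greets the crowd in St. Peter's Square."),
]

def prescient_mentions(posts, term, origin):
    """Return posts that mention `term` strictly before its origin date --
    the kind of anachronism a time traveler might leave behind."""
    return [(d, text) for d, text in posts
            if d < origin and term.lower() in text.lower()]

# "Pope Francis" was unknown before the papal election of March 13, 2013.
hits = prescient_mentions(POSTS, "Pope Francis", date(2013, 3, 13))
print(hits)  # prints [] -- no anachronistic mentions in this toy corpus
```

Any non-empty result here would be the "informational trace" the paper describes; as with the real search, the toy corpus turns up nothing.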
Or not find them, as it happened. It turns out Doctor Who has not been passing through, or at least not posting on Twitter.
Some commentators have responded, paraphrasing broadly, "Well, duh." But the paper itself points out that the project's design also covered another possibility: that "information itself could be sent back in time," rather than people. Indeed, the retro-transmission of data seems at least somewhat more credible than the idea of a human time-jumper. It "would be a type of time travel that might not directly involve the backwards transport of a significant amount of energy or momentum," the paper notes.
"This might be considered, by some, a more palatable mode of backwards time travel than transferring significant amounts of matter or energy back in time, as the latter might break, quite coarsely, local conservation of energy and momentum. For example, were the same person at different epochs to stand next to themselves, the energy tied into their own rest mass seems not to have been conserved. Similarly, instantaneous time travel to the same place on Earth might violate conservation of momentum, as the motion of the Earth around the Sun (etc.) might dictate a significant change in momentum for a corporeal object even over a time scale of minutes."
Passages like that make it difficult to calibrate how much tongue Nemiroff had in cheek when undertaking the project. So I asked him outright.
"The whole thing was somewhat whimsical," he said. At the same time, he considered it "a real research project," driven by the primal scientific feeling of curiosity. And the brainstorming required also had pedagogical value: "Students learned a lot about classical physics, about how time and special relativity works, while I learned more about social media. I’m 53. I don’t use hashtags that much and didn't know about Google Trends. So it was a matter of the history of physics and the hypermodern world colliding in a cool way." It also exemplified a basic principle Nemiroff learned from his mother: "She said that philosophers used to talk about how many teeth a horse had. When somebody counted them, science was born."
Nemiroff submitted the paper to three journals, each of which rejected it without even sending it out for review, so he decided to make it available through arXiv. The online repository, while not running a full peer-review process, does screen submissions to keep out the alchemists, perpetual-motion engineers, and suchlike. Acceptance of the paper by arXiv, like the poster session at the astronomers' meeting, is a sign that time travel remains a topic for serious scientific consideration. "It's not likely," he told me, "but you can’t point to laws that preclude it."
For that matter, his reported findings don't rule out the possibility of time-travelers among us. They might be very discreet about what they know. Besides, as the old saying goes, the absence of evidence is not evidence of absence.
Because of my experience as former CEO of the Seagram Corporation, young business students and aspiring entrepreneurs often seek my advice on the best way to navigate the complex and daunting world of business. As college students begin to think about selecting their majors, they may be influenced by the many reports coming out this time of year that tell them which majors provide the highest post-college earning potential. Last month, PayScale released its 2013-2014 report, lauding math, science and business courses as the most profitable college majors.
My advice, however, is simple but well-considered: Get a liberal arts degree. In my experience, a liberal arts degree is the most important factor in forming individuals into interesting and interested people who can determine their own paths through the future.
For all of the decisions young business leaders will be asked to make based on facts and figures, needs and wants, numbers and speculation, those choices will require one common skill: the ability to evaluate raw information, be it from people or a spreadsheet, and make reasoned, critical decisions. The ability to think clearly and critically -- to understand what people mean rather than what they say -- cannot be monetized, and in life should not be undervalued. Of all the people who have worked for me over the years, the ones who stood out most were those who could see beyond the facts and figures before them and understand what they meant in a larger context.
Since the financial crisis of 2008, there has been a decline in liberal arts disciplines and a rise in pragmatically oriented majors. Over the same period, employment among college graduates rose 9 percent, while employment among high school graduates fell 9 percent. What this demonstrates, in my mind, is that the workplace of the future requires specialized skills that will need not only educated minds, but adaptable ones.
That adaptability is where a liberal arts degree comes in. Nothing makes the mind more elastic and expandable than discovering how the world works. Developing and rewarding curiosity will be where innovation finds its future. In 2011, Steve Jobs, the co-founder of Apple, attributed his company’s success to being a place where “technology married with liberal arts, married with the humanities … yields us the results that makes our heart sing.”
Is that reflected in our current thinking about education as a return on investment? Chemistry-for-the-non-scientist classes abound in universities, but why not poetry for business students? As our society becomes increasingly focused on technology and we build better, faster and more remarkable machines, where can technology not replicate human thinking? In being creative, nuanced and understanding of human needs, wants and desires. Think about the things you love most in your life, and you will likely see that you value them because of how they make you feel, think and understand the world around you.
That does not mean forsaking practical knowledge or financial security, but in our haste to make everyone technically capable, we risk losing sight of creating well-rounded individuals who know how to do more than write computer programs.
We must push ourselves as a society to make math and science education innovative and engaging, and to value teachers and education. In doing so, we will ensure that America continues to innovate and lead, and provides more jobs and economic opportunities for everyone. We must remember, however, that what is seen as cutting-edge practical or technological knowledge at the moment is ever-evolving. What is seen as the most innovative thinking today will likely be seen as passé in ten years. Critical to remaining adaptable to those changes is a mind that has a life beyond work, one that can track the course of human progress because it has learned how much we have changed in the past.
I also believe that business leaders ought to do more to encourage students to take a second look at the liberal arts degree. To move the conversation beyond rhetoric, it is important that students see the merits of a liberal arts degree affirmed both in the hiring process and in the public statements of today’s business leaders.
In my own life, after studying history at Williams College and McGill University, I spent my entire career in business, and was fortunate to experience success. Essential to my success, however, was the fact that I was engaged in the larger world around me as a curious person who wanted to learn. I did not rely only on business perspectives. In fact, it was a drive to understand and enjoy life -- and be connected to something larger than myself in my love of reading, learning, and in my case, studying and learning about Judaism -- that allows me, at 84, to see my life as fully rounded.
Curiosity and openness to new ways of thinking -- developed by learning about the world around you, by analyzing situations critically, and nurtured every time we open a new book or engage with the abstract in art, music or theater -- ensure future success more than any other quality. Learn, read, question, think. In developing the ability to exercise those traits, you will be successful not only in business, but in the business of life.
Edgar M. Bronfman was chief executive officer of the Seagram Company Ltd. and is president of the Samuel Bronfman Foundation, which seeks to inspire a renaissance of Jewish life.
The most famous of us all are not real. True, scholars such as Albert Einstein and J. Robert Oppenheimer were once recognized across almost every sector of the American public. In fact, they were so well-recognized that Einstein’s hair and Oppenheimer’s pork pie hat alone sufficed to signal their celebrity.
A theoretical physicist, an astrophysicist, an applied physicist, and an engineer are now arguably as well recognized as the Einsteins and Oppenheimers of days past. The problem is that these men, Sheldon Cooper, Rajesh Koothrappali, Leonard Hofstadter, and Howard Wolowitz, are not real. They are, in fact, the stars of CBS’s "The Big Bang Theory."
Just how popular are the show and its stars? "The Big Bang Theory" begins its seventh season tonight, and in past years it frequently rode atop Nielsen’s weekly sitcom ratings. Beyond sheer volume of viewers, "The Big Bang Theory" has also garnered a wide variety of awards. This year alone, for example, the show was nominated for eight Emmys and took home top honors for Outstanding Lead Actor in a Comedy Series (Jim Parsons, a.k.a. Sheldon Cooper) and Outstanding Guest Actor in a Comedy Series (Bob Newhart, a.k.a. Arthur Jeffries/Professor Proton).
As in a number of current sitcoms, the male protagonists are portrayed as afflicted by variant strains of perpetual adolescence. If they are not working, they are playing online role-playing games, hanging out at a comic book store, or ingesting successive waves of takeout. Of course, a sitcom must include a subplot of ongoing sexual frustration, and "The Big Bang Theory" does not disappoint. The lone exception is the theoretical physicist, who views "coitus" – as he calls it – as a mere distraction from his work.
Given the show’s appeal, what, if anything, does it tell us about the American public’s views of the academic vocation? Speaking for the American public is risky, but I fear we may already know the answer: viewers find the show humorous in part because it corresponds to opinions they already hold.
For example, in one of the final episodes from last season, entitled "Tenure Turbulence," a tenured slot comes open in the physics department when a colleague dies. When discussing the possibility, the theoretical physicist with coitus avoidus, Sheldon, claims "a guaranteed job for life only encourages the faculty to become complacent." The astrophysicist, Rajesh, argues "people do their best work when they feel safe and secure." Regardless, they all initially agree that whoever among them receives tenure should do so because of his ability to do the work, not because of faculty politics.
Events then spin out of control as each one of them seeks to one-up the other in an arms race of university politics. The target for their outlandish behavior is Mrs. Davies, a member of the human resources office serving on the tenure committee. Leonard risks being placed on a stalker watchlist by making his way into the previously unexplored territory of the wellness center simply to “schmooze” Mrs. Davies while she exercises. Raj sends her a self-made video touting his academic abilities dating back to his early childhood. Not to be outdone, Sheldon provides Mrs. Davies, an African-American, with the DVD box set of "Roots."
Just when you think you have seen it all, the most outlandish behavior comes shortly before the deceased colleague’s funeral. Standing in the hallway, each one of them becomes aware of the depraved lengths to which the others will go in this political game. Sheldon asks his girlfriend, Amy, to remind him that an appropriate emotional response to a funeral is sadness. Perennially incapable of speaking to women, Raj is left to rely on alcohol to help him be more assertive.
Despite their antics, Mrs. Davies recommends all three candidates for further review as a result of their considerable credentials. In a mere half-hour, however, a number of possible cultural stereotypes of the life of university faculty members are brought to light. One possible stereotype held by the larger public has to do with skepticism over the possibility of someone having access to a job for life. The second has to do with how such a job is earned.
Unfortunately, the best available data confirm the existence of both forms of skepticism. Though somewhat dated, arguably the most authoritative study of its kind is the survey of "Americans’ Views of Political Bias in the Academy and Academic Freedom" that Neil Gross and Solon Simmons conducted for the AAUP back in 2006. A more recent iteration of this line of work is found in Neil Gross’s Why Are Professors Liberal and Why Do Conservatives Care? (Harvard University Press, 2013).
In general, Gross and Simmons found "Americans are generally supportive of the tenure system.... At the same time, about 80.7 percent think that tenure sometimes protects incompetent faculty, while 57.9 percent believe that giving professors tenure takes away their incentive to work hard." As a result, "only about 17.9 percent of respondents say the tenure system should remain as it is.”
Over six-going-on-seven seasons, tenure is but one of the important issues portrayed in episodes of "The Big Bang Theory." Part of the reason we laugh, though, is the way the show mirrors views held by the American public -- and possibly by some of us. Tenure and practices like it are too critical to the work we do for such portrayals to go unquestioned.
The challenge facing those of us who are real is to make sure our efforts persistently challenge such perceptions. Perhaps one day a sitcom will climb the Nielsen ratings by portraying tenure as a practice so revered that it inspires nothing but the highest devotion to teaching, research, and service.
Todd C. Ream is professor of higher education at Taylor University and a research fellow with the Institute for Studies of Religion at Baylor University. He (along with Drew Moser) is currently working on a cultural biography of Ernest L. Boyer.