
This week I remember a ten-hour neurosurgery I had in Pittsburgh on November 6, 2008. From radiology that suggested a schwannoma on my right trigeminal nerve, the doctors took an innovative endoscopic approach through my sinuses, down to the bottom of my brain, where they drilled a little hole in my skull to access the top of the spinal cord and a mass of nerve ganglia. Once there, a bright white light reflecting off the scope surprised the neurosurgeons. What should have been a fatty little tumor turned out to be glistening tissue in the nerve capsule. Eager to do something given the work it took to get there, the surgeons scraped and scraped at the nerve in hopes of fixing it.

Turns out scraping a nerve is not a good idea. I awoke in horrible pain. (My ex-wife, who was with me through the three weeks in Pittsburgh, and who is British and stiff upper-lipped, tried to explain to the doctors emphatically that I was “11 out of 10” on the pain scale.) While not as excruciating as when I first awoke, I have had pain in my right trigeminal nerve every day ever since. The pathology came back as amyloid, something I had never heard of before but have since learned is a protein deposit. Alzheimer’s, football concussions and Parkinson’s have made the term more common; they are not all the same proteins, however. In my case, a plasma cell disorder explains the specific type of protein that collected in the nerve. Tingling, twinges and numbness were the symptoms that brought me to the surgeon’s table. I still have those symptoms, but they are less evident than the burn and ache on the right side of my tongue, gums, teeth and cheek that cycles throughout the day from about 1 or 2 on the pain scale when I awake to 4 to 6 by the afternoon. Cocktail hour has taken on a whole new definition as a measure of pain management.

I was already thinking about this anniversary of sorts when I read the NYT article “Battle Heats Up Over Exports of Surveillance Technology.” U.S. Internet companies are at odds with the U.S. government over proposed legislation that, they claim, is overly broad in how it defines and targets the sale of technologies that unsavory foreign governments can use to further their military and political ends. I have not read the legislation, but I have no doubt that I, too, will find the language overly broad. I base this presumption on experience with the unintended consequences of other legislation. For example, you can’t get Gmail in Iran because Google transmits encrypted data, and that transmission can literally be viewed as a violation of export control laws; likewise, section 1201 of the DMCA, the anti-circumvention provision that prohibits breaking encryption on software, might have contributed to why no one knew about Volkswagen’s deceptive emissions software practices.

As an English major, I coined a paraphrase of Hamlet, “technology is neither good nor bad, only thinking makes it so,” to help students evaluate their use of the Internet for everything from file sharing to social media. That phrase popped back up when I read this article, as I was already thinking about my “anniversary.” Obviously, millions of lives are saved every day by medical science and the technologies used in the service of it. In my case, that availability did not serve its intended purpose. Had I lived in an era before modern surgery, I would have continued to have the symptoms of tingling, numbness and twinges for the rest of my life. I would prefer those sensations to the burning and ache.

The same is true with servers and algorithms. As the article noted, “many of the same tools that repressive governments seek from Western companies are vital for social media and other communications by political protesters and grass-roots organizers throughout the world.” Herein lies the rationale for why the study of technology must be coupled with the humanities and the arts, the psychological and social sciences, business, law and ethics. Historically, technology is never without context.

James Barrat, among others, makes the case in Our Final Invention: machine intelligence (commonly known as A.I., or artificial intelligence) has developed to such a degree that it may have the capacity to enslave humanity. As compelling an argument as he makes, I have never been too taken by these concerns. Reflexively, I think, “pull the plug!” But to entertain the argument: machines take over because we have pursued technology at the expense of ethics, an understanding of human foibles, or even common sense. More profound is the recognition that the development of technologies is the process by which we enslave ourselves. Fascinating and a challenge par excellence, machine intelligence also reflects our narcissism and possibly, if Barrat’s thesis is correct, even our tragedy.

From a higher education perspective, this recognition places the humanities in bold relief. In light of such concerns, the humanities are eminently useful in reminding us of that which remains the greatest mystery: ourselves. No matter how captivating man’s control over nature is, the most intriguing pursuit of science and technology is not what we can fabricate but why we do it. Is it a surgeon using new tools and surgical approaches too aggressively, without thinking about the potential consequences for the patient? Is it computer scientists and engineers who have devoted their lives to refining machine intelligence to such a degree that they have neglected their own essential humanity? To keep a focus on these foundational inquiries assumes a discipline greater than intellect. It is one of human spirit. And that, ultimately, is what I will celebrate this week.

