Algorithm of a Salesman
A theoretical physicist named Eugene Wigner once referred to “the unreasonable effectiveness of mathematics” -- a phrase that, on first hearing, sounds paradoxical. Math seems like rationality itself, at its most efficient and severe. But even someone with an extremely limited grasp of the higher realms of mathematics (your correspondent, for one) can occasionally glimpse what Wigner had in mind. His comment expresses a mood, more than an idea. It manifests a kind of awe.
For example, in the 1920s Paul Dirac came up with an equation that permitted two possible solutions, one of which applied to the electron. The other corresponded to nothing that physicists had ever come across. Some years later, physicists discovered a subatomic particle that did: the positron. The manipulation of mathematical symbols unveiled an aspect of the physical universe that had been previously unknown (even unsuspected). To adapt a line from the Insane Clown Posse’s foul-mouthed appreciation of the wonders of the universe, “This [stuff] will blow your mother[loving] mind.”
True, that. I’ve even felt it when trying to imagine the moment when Descartes first understood that algebra and geometry could be fused into something more powerful than either was separately. (Cartesian grids – how do they work?) But there’s a flipside to Wigner’s phrase that’s no less mind-boggling to contemplate: the existence of “simple” problems that resist solution, driving one generation of mathematicians after another to extremes of creativity. Fermat’s last theorem (formulated 1637, solved 1994) is the most famous example. The four-color map conundrum (formulated 1852, solved 1976) looked deceptively uncomplicated.
And then there's the great, bewildering problem surveyed in William J. Cook’s In Pursuit of the Traveling Salesman: Mathematics at the Limits of Computation (Princeton University Press). The challenge has been around in one form or another since 1934. It looks so straightforward that it’s hard to believe no one has cracked it – or will, probably, any time soon.
Here’s the scenario: A traveling salesman has to visit a certain number of cities on a business trip and wants to get back home as efficiently as possible. He wants the shortest possible route that will spare him from going through the same city more than once. If he leaves home with just two stops, no planning is necessary: his route is a triangle. With three stops, it’s still something that he might work out just by looking at the map.
But his company, pinched by the economy, needs to “do more with less,” like that ever works. It keeps adding stops to the list. By the time he has five or six calls to make, planning an itinerary has gotten complicated. Suppose he's leaving home (A) to visit five cities (B, C, D, E, F). He starts out with five possibilities for his first destination, which means four for his second stop. And so on -- one fewer, each time. That means the total number of possible routes is 5 x 4 x 3 x 2 x 1 = 120. Our salesman may be relieved to discover that it's really only half of that, since traveling in the sequence ACDBFEA covers exactly the same distance as doing it the other way around, as AEFBDCA.
Still, picking the shortest of sixty possible routes is a hassle. Let the number of cities grow to 7, and it's up to 2,520. Which is crazy. The salesman needs a way to find the shortest trip, come what may -- even if the home office doubled the number of stops. There must be an app.
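That arithmetic is easy to check. A few lines of Python (a sketch; `n` is the number of stops, not counting home) reproduce the counts above:

```python
from math import factorial

def distinct_tours(n):
    """Round trips through n stops, counting a route and its reverse as one."""
    return factorial(n) // 2

print(distinct_tours(5))  # 60 routes for five stops
print(distinct_tours(7))  # 2,520 for seven
```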
Except, there isn't. I don’t mean with respect to available software, but at the level of a method that could solve the traveling salesman’s dilemma no matter how many cities are involved. A computer can tackle problems on a case-by-case basis, using brute force to calculate the distances covered by every possible route, then selecting the shortest. But that’s a far cry from having an elegant, powerful formula valid for any given number of cities. And even the most unrelenting brute-force attack on the traveling salesman problem (TSP) might not be enough. Finding the shortest way around a 33-city route would require calculating the distances covered by an unimaginably vast number of possible tours. I’m not up to typing out the figure in question, but it runs to 36 digits. And that's for a trip with a merely two-digit number of cities.
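The brute-force approach itself is simple enough to sketch in Python. The coordinates below are made up for illustration; the point is that `permutations` walks every possible route, which is exactly what kills the approach as the list of stops grows:

```python
from itertools import permutations
from math import dist

home = (0, 0)  # hypothetical home base
stops = [(2, 3), (5, 1), (6, 4), (1, 5), (4, 6)]  # hypothetical cities

def tour_length(order):
    """Total distance of home -> stops in the given order -> home."""
    path = [home, *order, home]
    return sum(dist(a, b) for a, b in zip(path, path[1:]))

# Try all 120 orderings and keep the shortest.
best = min(permutations(stops), key=tour_length)
print(best, round(tour_length(best), 2))
```

Add a few more stops and the running time multiplies accordingly; at 33 cities, this loop is the one that would keep a supercomputer busy for trillions of years.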
Cook explains what would happen if we tried to compile and compare every possible sequence for a 33-city route using the $133 million IBM Roadrunner Cluster at the Department of Energy, which “topped the 2009 ranking of the world’s 500 fastest supercomputers.” The Roadrunner can do 1,457 trillion arithmetic operations per second. Finding the shortest route would take about 28 trillion years – “an uncomfortable amount of time,” Cook notes, “given that the universe is estimated to be only 14 billion years old.”
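Cook's estimate can be reproduced on the back of an envelope. The per-tour cost below is my assumption (scoring a 33-city tour takes more than one arithmetic operation); with it, the answer lands in the same tens-of-trillions-of-years ballpark:

```python
from math import factorial

tours = factorial(32) // 2      # distinct 33-city tours, home city fixed
ops_per_tour = 10               # assumed cost of scoring one tour
ops_per_second = 1.457e15       # Roadrunner: 1,457 trillion ops/sec
seconds_per_year = 365 * 24 * 3600

years = tours * ops_per_tour / ops_per_second / seconds_per_year
print(f"roughly {years:.0e} years")  # tens of trillions of years
```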
That does not mean any given problem is insoluble, even with an extraordinarily high number of cities. In 1954, a group of mathematicians in California solved a 49-city problem by hand in a few weeks, using linear programming -- which seems appropriate, since linear programming (LP) was developed to help with business decisions on how best to use available resources. In Pursuit of the Traveling Salesman devotes a chapter to the history of LP and the development of a multipurpose tool called the simplex algorithm. (Cook’s treatment of LP is introductory, rather than technical, though it’s not exactly for the faint of heart.)
Other tour-finding algorithms find clusters of short routes, then link them as neatly as possible. If having absolutely the shortest path isn’t the top priority, various methods can generate an itinerary that might be close enough for practical use. And practical applications do exist, even with fewer traveling salesmen now than in Willy Loman’s day. TSP applies to problems that come up in mapping the genome, designing circuit boards and microchips, and making the best use of fragile telescopes in old observatories.
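The simplest of those "close enough" methods is the nearest-neighbor rule: from wherever you are, ride to the closest city you haven't visited yet. A sketch with made-up coordinates (it returns a reasonable tour quickly, with no guarantee it's the shortest):

```python
from math import dist

def nearest_neighbor_tour(home, cities):
    """Greedy tour: repeatedly hop to the closest unvisited city, then go home."""
    tour, remaining = [home], list(cities)
    while remaining:
        nearest = min(remaining, key=lambda c: dist(tour[-1], c))
        tour.append(nearest)
        remaining.remove(nearest)
    tour.append(home)
    return tour

print(nearest_neighbor_tour((0, 0), [(2, 3), (5, 1), (6, 4), (1, 5), (4, 6)]))
```

The cluster-then-link algorithms Cook describes are fancier, but this is the same idea in miniature: greedy local choices, patched up afterward if need be.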
But no efficient, all-purpose, let-N-equal-whatever algorithm exists. It's the sort of cosmic untidiness that some people can’t bear. Cook quotes a couple of mathematicians who say that TSP is not a problem so much as an addiction. Combining brute-force supercomputer processing with an array of tour-finding concepts and shortcuts means that it’s possible to handle really enormous problems involving hundreds of thousands of cities. But that also makes TSP a boundary-pushing test of computational power and accuracy. Finding the shortest route is an optimization problem, but so, in a way, is figuring out how to solve it using the tools at hand.
Since 2000, the Clay Mathematics Institute has offered a prize of a million dollars for the definitive solution to a problem (known as P versus NP) of which TSP is the exemplary case. “Study of the salesman is a rite of passage in many university programs,” Cook writes, “and short descriptions have even worked their way into recent texts for middle-school students.” But he also suggests that the smart money would not bet on anyone ever finding the ultimate TSP algorithm, though you’re more than welcome to try.
Another possibility, of course, is that the right kind of mathematics just hasn’t been discovered yet – but one day it will, proving its “unreasonable effectiveness” by solving TSP as well as problems we can’t even imagine at this point. As for our hypothetical traveler, he’d probably feel envy at one of Cook’s endnotes. The author “purchased 50 years of annual diaries written by a salesman” via eBay, he writes, “only to learn that his tours consisted of trips through five or six cities around Syracuse, New York.” He didn't need an algorithm, and got to stay home on weekends.