Algorithms, Optimization, and The Traveling Salesman Problem

This is the fifth article in a seven-part series on Algorithms and Computation, which explores how we use simple binary numbers to power our world. The first article, How Algorithms Run the World We Live In, can be found here.

The essential job of a theoretical computer scientist is to find efficient algorithms for problems, and the most difficult of these problems aren't just academic; they sit at the very core of some of the most challenging real-world scenarios that play out every day. The most critical of these is the problem of optimization: how do we find the best solution to a problem when the number of possible solutions is effectively infinite?


For now, the best we can do is take a heuristic approach and settle for a good-enough solution, but the inefficiencies this leaves behind add up over time and drain finite resources that could be better used elsewhere. The classic formulation of this challenge is the Traveling Salesman Problem, and it isn't an overstatement to say that a solution to it could save our economy trillions of dollars.
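To make "heuristic approach" concrete, here is a minimal sketch of one of the simplest such heuristics, nearest-neighbor tour construction: always travel to the closest city you haven't visited yet. The city coordinates are invented for illustration; the tour it produces is usually decent but rarely optimal.

```python
import math

def nearest_neighbor_tour(cities):
    """Greedily visit the closest unvisited city, then return home."""
    unvisited = set(range(1, len(cities)))
    tour = [0]  # start (and end) at city 0
    while unvisited:
        last = tour[-1]
        # pick the unvisited city closest to wherever we are now
        nxt = min(unvisited, key=lambda c: math.dist(cities[last], cities[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    tour.append(0)  # return to the starting city
    return tour

def tour_length(cities, tour):
    return sum(math.dist(cities[a], cities[b]) for a, b in zip(tour, tour[1:]))

cities = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 1)]
tour = nearest_neighbor_tour(cities)
print(tour, round(tour_length(cities, tour), 2))
```

This runs in roughly n^2 steps rather than exponential time, which is exactly the trade the article describes: speed in exchange for giving up any guarantee of the best route.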

The Traveling Salesman Problem, Exponential Time Complexity, and Beyond

The Traveling Salesman Problem is described like this: a company requires one of its traveling salesmen to visit every city on a list of n cities, where the distance between each city and every other city on the list is known. Each city can only be visited once, and the salesman must finish in the city he started from. What is the shortest route he can take to accomplish this?

The most efficient algorithm we know for this problem runs in exponential time, which is pretty brutal as we've seen. Unlike RSA encryption, though, the Traveling Salesman Problem offers no modular arithmetic to exploit and no trick like turning factorization into period finding, as Shor's algorithm does. It is an optimization problem, and there are no shortcuts we know of that get us under exponential time complexity.
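The exponential-time exact method alluded to here is the classic Held-Karp dynamic-programming algorithm, which runs in on the order of n^2 * 2^n steps, still exponential, but far better than checking all (n-1)! routes. A sketch, with an invented toy distance matrix:

```python
from itertools import combinations

def held_karp(dist):
    """Exact shortest-tour length via Held-Karp dynamic programming."""
    n = len(dist)
    # best[(S, j)] = shortest path from city 0 through the set S, ending at j
    best = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in subset:
                best[(S, j)] = min(
                    best[(S - {j}, k)] + dist[k][j] for k in subset if k != j
                )
    full = frozenset(range(1, n))
    # close the tour by returning to city 0
    return min(best[(full, j)] + dist[j][0] for j in range(1, n))

# Symmetric toy distances between 4 cities
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]
print(held_karp(dist))  # shortest round trip for this matrix: 23
```

The table `best` is what eats the memory: it has an entry for every (subset of cities, endpoint) pair, which is why even this "efficient" exact algorithm collapses once n climbs into the dozens.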

Just to reinforce why this is such an awful situation, let's use a very common illustration of how insane exponential growth can get. Suppose you could fold a piece of paper in half over and over, as many times as you want, and that the sheet is always long enough to allow the next fold. Measuring the thickness of the folded stack after each fold, this is what you can expect:

* 0 folds: 1/250th inch thick.
* 10 folds: ~2.05 inches thick.
* 25 folds: ~1 mile thick.
* 43 folds: The surface of the moon.
* 52 folds: Inside the sun.
* 57 folds: Passing Ultima Thule.
* 67 folds: Takes light 1.5 years to travel from one end to the other.
* 82 folds: As wide as the Milky Way Galaxy.
* 93 folds: Within astronomical throwing distance of the supermassive black hole in the center of Messier 87.
* 101 folds: Not sure what's there, because it's beyond the observable universe.
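The doubling behind that list takes only a few lines to reproduce: starting from a sheet 1/250 of an inch thick, each fold doubles the stack. The figures above are approximate (they depend on exactly how you count the first fold), but the computed values land in the same neighborhood:

```python
SHEET = 1 / 250            # starting thickness in inches
INCHES_PER_MILE = 63360

for folds in [0, 10, 25, 43, 101]:
    thickness = SHEET * 2 ** folds  # each fold doubles the stack
    print(f"{folds:3d} folds: {thickness:.4g} inches "
          f"({thickness / INCHES_PER_MILE:.4g} miles)")
```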

And that's with the best algorithm we've got right now. Each of those "sheets" in the stack is a route the salesman could take, whose length we would need to check against all the other route lengths, and each fold is equivalent to adding one extra city to the list of cities he needs to visit. If we just blundered into the Traveling Salesman Problem by checking every possible solution to find the best one, we'd be looking at factorial time complexity, which grows even faster. The 128-bit number from our RSA encryption example was on the order of 2^128, and 101 folds is only 2^101, yet 35! already blows past 2^128 by a factor of about 30. It just gets worse with each additional increment in your input, and this is what makes the Traveling Salesman Problem so important and also so maddening.
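The factorial-versus-exponential comparison is easy to check directly, since Python handles integers of this size natively:

```python
import math

print(2 ** 101)            # the 101-fold stack
print(2 ** 128)            # the 128-bit RSA-style number
print(math.factorial(35))  # on the scale of brute-force route counts for ~36 cities

# 35! already exceeds 2^128, by roughly a factor of 30
print(math.factorial(35) // 2 ** 128)
```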

The Problem of Optimization Algorithms

We don't know how to find the right answer to the Traveling Salesman Problem because finding the best answer requires ruling out all the others, and we have no idea how to do that except by checking every possibility, or by keeping a record of the shortest route found so far and abandoning any route once it exceeds that number. That's the best we have, and it still leaves us at roughly exponential time complexity, so as a solution it isn't much of a solution at all. The Traveling Salesman Problem is special for many reasons, but the most important is that it is an optimization problem, and optimization problems pop up everywhere in day-to-day life. As input sizes go, 101 is not very large at all. In the real world, a single US state can contain that many small towns and cities in the delivery area of a large commercial distributor. Figuring out the single shortest route between all the stops its trucks need to make to various customers each day would save an incalculable amount of money in labor and fuel costs.
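The keep-a-record-and-abandon strategy described above is the core of branch-and-bound search, which can be sketched in a few lines. We extend routes one city at a time, remember the best complete tour seen so far, and prune any partial route that is already longer than it. The distance matrix is invented for illustration:

```python
def tsp_branch_and_bound(dist):
    """Exact shortest-tour length, pruning routes that exceed the best so far."""
    n = len(dist)
    best = [float("inf")]  # record of the shortest complete tour found

    def extend(route, length, remaining):
        if length >= best[0]:
            return  # prune: already worse than the best complete tour
        if not remaining:
            # close the tour back to city 0 and update the record
            best[0] = min(best[0], length + dist[route[-1]][0])
            return
        for city in remaining:
            extend(route + [city],
                   length + dist[route[-1]][city],
                   remaining - {city})

    extend([0], 0, set(range(1, n)))
    return best[0]

dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]
print(tsp_branch_and_bound(dist))  # shortest round trip for this matrix: 23
```

Pruning often skips huge swaths of the search tree in practice, but in the worst case nothing gets pruned and we are back to checking every route, which is exactly the article's point.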

If you are sourcing parts from overseas for your factory, which route and combination of delivery methods will cost you the least? Which configuration of protein folds is the one that can defeat cancer? Finding an algorithm that solves the Traveling Salesman Problem in something close to polynomial time would change everything, and it would do so overnight.

Part of the problem, though, is that because of the nature of the problem itself, we don't even know whether a polynomial-time solution is mathematically possible. This comes down to the way we classify problems, and the Traveling Salesman Problem belongs to a very special classification in that system, one that poses one of the greatest challenges in mathematics and computer science, with far-reaching implications for the real world.

The sixth article in our series on Algorithms and Computation, P Vs. NP, NP-Complete, and the Algorithm for Everything, can be found here.