Unjamming Traffic with Computers

Insights gleaned from realistic simulations are already
moving from computer screens to asphalt

by Kenneth R. Howard

 


 
As any driver who has crawled in the "fast lane" of a crowded highway can attest, traffic has become a plague. Road infrastructure strains under the ever-growing number of cars. And although colossal sums of money are being spent on solutions--$140 billion over five years by the federal government's Highway Trust Fund alone--tangible results have been elusive, and road planning remains somewhat akin to gambling without knowing the odds.

Traffic problems can stem from cars slowing or stopping, sometimes in response to accidents. Yet the more usual cause is simply too many people wanting to be in the same place at the same time. Limitations in the road system or minor inconsistencies in the behavior of drivers can then compound to cause torturous slowdowns. Planners and scientists often state the problem as "too many people, not enough roadway."

Past methods of predicting traffic patterns relied on statistical models that treated traffic as a homogeneous fluid, ignoring differences between individual drivers. Sections of transportation networks were often analyzed in isolation, without regard for the interactions among their components. Refinements in the mathematical treatment of complexity, however, coupled with vastly greater computing power, have revolutionized traffic analysis.

"Transportation is in a very cool spot between a social system and a physical system," explains Christopher L. Barrett, who studies traffic at Los Alamos National Laboratory. "Traffic lies in the middle. It goes beyond physics to a human scale." On the physical side are the stunning variety and number of vehicles and road systems, each contributing its own peculiarities, as well as the weather and other environmental factors. The social, behavioral side encompasses not only the individual preferences and second-by-second reactions of drivers but also actions taken by the rest of society--from corporate choices about where to locate headquarters to sports teams' play strategies that influence attendance at games. To understand the forces acting on traffic flow, transportation planners must analyze the many possible outcomes from this snarled network of decisions.

As a further complication, seemingly logical solutions to traffic problems can have counterintuitive consequences. One instance is known as Braess's paradox, named for the German operations researcher Dietrich Braess, who in 1968 first noticed the phenomenon. He discovered that raising a network's traffic capacity can sometimes slow the average travel speed. "You have to be on the lookout for this problem when designing traffic networks," says Joel E. Cohen, mathematician and professor of populations at the Rockefeller University. Cohen explains that in road networks, adding a lane or route can increase driving time as unpredicted bottlenecks arise from what was thought to be a fix--for example, when too many drivers pile onto a new shortcut, causing gridlock.
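
The arithmetic behind the paradox is simple enough to check directly. The sketch below runs the standard textbook example of Braess's network; the travel-time formulas and the figure of 4,000 drivers are illustrative assumptions, not numbers from the article:

```python
# Braess's paradox on the standard textbook network (all numbers are
# illustrative assumptions). 4,000 drivers travel from A to B along
# two symmetric routes:
#   A -> C -> B : (cars on A-C)/100 minutes, then a fixed 45 minutes
#   A -> D -> B : a fixed 45 minutes, then (cars on D-B)/100 minutes

drivers = 4000

# Without the "improvement," the routes are symmetric, so at
# equilibrium the drivers split evenly: 2000/100 + 45 = 65 minutes.
before = (drivers / 2) / 100 + 45

# After a free shortcut C -> D is added, every driver's selfish best
# choice is A -> C -> D -> B, piling all 4,000 cars onto both
# congestible links: 40 + 0 + 40 = 80 minutes.
after = drivers / 100 + 0 + drivers / 100

print(f"average trip without the shortcut: {before:.0f} min")  # 65 min
print(f"average trip with the shortcut:    {after:.0f} min")   # 80 min
```

The added capacity leaves every driver 15 minutes worse off: once the shortcut exists, individually rational route choices no longer add up to the best outcome for the group.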

As Steen Rasmussen of Los Alamos National Laboratory points out, "When you design roads you want to maximize throughput [traffic flow], but it turns out at the point of most throughput, predictability drops. This means as you go toward capacity, reliability of the traffic system breaks down." As variability within the transportation system explodes, more and more parts of it are pushed into a "critical regime," he says, where "small perturbations can cause the system to break down." The goal, then, is to design systems that function just under capacity.
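
One way to see why reliability collapses near capacity is a back-of-the-envelope queueing calculation--an analogy, not Rasmussen's model. In a single-server queue with random arrivals, the average delay is 1/(mu - lambda), which grows without bound as the arrival rate lambda approaches the service rate mu:

```python
# Average time in a single-server (M/M/1) queue: W = 1 / (mu - lam).
# A standard queueing-theory result, used here as an analogy for a
# road bottleneck; it is not Rasmussen's model.

mu = 1.0  # "service rate" of the bottleneck: 1 car per second

for utilization in (0.50, 0.80, 0.90, 0.95, 0.99):
    lam = utilization * mu              # arrival rate (demand)
    wait = 1.0 / (mu - lam)             # mean time in system, seconds
    print(f"demand at {utilization:.0%} of capacity -> "
          f"average delay {wait:6.1f} s")
```

Near capacity the curve turns almost vertical: pushing demand from 95 to 99 percent of capacity quintuples the average delay. Small shifts in demand produce enormous swings in performance--exactly the fragility Rasmussen describes.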

Silicon Traffic

To that end, recent simulations have made important strides in mimicking the broad dynamics of transportation systems. Many traffic researchers now view transportation systems as what complexity theorists call "self-organizing systems"--entities that manifest a cohesive behavior even though they lack a central controller. According to Rasmussen, traffic can be looked at as a system of diverse elements widely distributed over space. "The elements that interact with one another are like biological systems. They are dynamical hierarchies with controls at many different levels, like organelles, cells, tissues, humans," he says. The challenge for designers of transportation simulations is isolating and modeling the different elements, then bringing them together to operate as a whole.

The tools that allow complexity researchers to unite the myriad components and organizational layers into a real-time or faster computer model of a traffic system are often cellular automata. Cellular automata are a type of computer simulation best known through the "Game of Life," invented by John Conway in 1970. Various agents, or elements with defined properties, are placed on a grid and assigned an initial state. (The states for car agents might be "moving" or "not moving," for example.) As time advances, each agent changes state in keeping with the rules of its own behavior and the current state of its neighbors. A typical rule could be that an agent is in motion if adjacent to two or fewer other agents and at rest if surrounded by more than two. According to Barrett, with each tick of the clock, the computer updates the state of every agent by looking at all its neighbors; from these local interactions a global system emerges.
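
A toy version of such an automaton can be written in a few lines. In the sketch below, the one-lane ring road, the four-cell neighborhood and the starting density are assumptions chosen for illustration; the rule itself is the one described above:

```python
# A toy cellular automaton in the spirit of the rule described above.
# The one-lane ring road, the four-cell neighborhood and the starting
# density are assumptions for illustration. Each cell holds at most
# one car ('#'); empty cells are '.'.

import random

random.seed(1)
ROAD_LENGTH = 30
road = [random.random() < 0.4 for _ in range(ROAD_LENGTH)]  # True = car

def step(road):
    n = len(road)
    new = [False] * n
    for i in range(n):
        if not road[i]:
            continue
        # Count cars among the four nearest cells.
        neighbors = sum(road[(i + d) % n] for d in (-2, -1, 1, 2))
        ahead = (i + 1) % n
        if neighbors <= 2 and not road[ahead]:
            new[ahead] = True   # "moving": advance one cell
        else:
            new[i] = True       # "not moving": stay in place
    return new

for tick in range(10):
    print("".join("#" if cell else "." for cell in road))
    road = step(road)
```

Even this toy shows the hallmark of cellular automata: purely local rules that generate jams, gaps and waves at the scale of the whole road.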

Barrett applied the supercomputers of Los Alamos to create a model of traffic scenarios called the Transportation Analysis Simulation System (TRANSIMS). John L. Casti, a complexity theorist at the Santa Fe Institute, describes TRANSIMS as "copying in silicon the traffic of a metropolitan area and putting it in real time as if you were in a helicopter watching the second-by-second movements."

The model creates a lab for testing traffic scenarios. Transportation planners can use it to predict, with reasonable accuracy, what the effects of building a bridge or adding a highway lane might be--options too costly to test in the real world. In this way, a Braess's paradox might be caught before a planning error was set in concrete. The simulation also brings the scientific method to traffic planning; an experimental situation can be precisely re-created.

TRANSIMS, which is sponsored by the U.S. Department of Transportation and the Environmental Protection Agency, was first used in 1993 to model the traffic system of Albuquerque, N.M. The simulations successfully mimicked the actual observed traffic patterns. In its second incarnation, which began in late 1995, it simulated traffic for Dallas/Fort Worth, Tex., an area of 3,600 square miles with 2.3 million residents. In 1998, work will begin on a more advanced simulation of Portland, Ore., that will attempt to incorporate realistic patterns of people changing travel modes, such as driving to a train station, taking the train and then riding a bus to finish a commute to work.

TRANSIMS is still only in a research and development phase, but experts with knowledge of local highways can study the results of the simulations and offer insights into why certain traffic patterns emerge. Says Casti: "There is a parallel in evolutionary biology. It is hard to predict change, but hindsight can give a good explanation of why things turned out as they did."

Meanwhile transportation planners are learning from simulations by the Massachusetts Institute of Technology's Intelligent Transportation Systems Program. This system (which is not based on cellular automata) mathematically models behavior down to individual driver habits--creating digital cars with a penchant for cutting off other cars, speeding down lanes and generally exhibiting traits seen every day on the highways in and around Boston. The program is being used by the city's $8-billion Central Artery/Third Harbor Tunnel project. In addition to testing road-plan scenarios before construction, the simulations are helping with designs for traffic management systems (such as traffic signal algorithms and driver information systems) that will smooth the flow of traffic. The strategy, according to Moshe Ben-Akiva, professor of civil and environmental engineering at M.I.T., is to alleviate traffic congestion by designing a physical system that guides drivers toward better choices.
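
The flavor of such driver-level modeling can be sketched briefly. In the snippet below, the Driver class, its aggressiveness parameter and the gap-acceptance rule are all invented for illustration; they are not M.I.T.'s actual model:

```python
# A sketch of driver-level heterogeneity. The Driver class, its
# aggressiveness parameter and the gap-acceptance rule are invented
# for illustration; they are not M.I.T.'s actual model.

import random

class Driver:
    def __init__(self, aggressiveness):
        self.aggressiveness = aggressiveness  # 0 = timid, 1 = reckless

    def accepts_gap(self, gap_m):
        """Will this driver cut into a gap of gap_m meters?"""
        # A timid driver wants ~30 m of clear road; a reckless one
        # will squeeze into ~5 m.
        required = 30 - 25 * self.aggressiveness
        return gap_m >= required

random.seed(0)
fleet = [Driver(random.random()) for _ in range(5)]
for d in fleet:
    print(f"aggressiveness {d.aggressiveness:.2f}: "
          f"takes a 12 m gap? {d.accepts_gap(12)}")
```

Populating a simulated road network with thousands of such agents, each with its own temperament, is what allows planners to test signal timing and driver-information schemes against plausibly messy behavior.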

Using computer simulations to find the best solution to traffic problems ultimately calls for the inclusion of many factors beyond driver behavior, traffic density and the like. Possible fixes such as congestion pricing for tolls--with higher prices for peak-hour usage--and mass transit could be taken into account. Pollution analysis is another important consideration, one mandated by law and subject to paradoxical effects. (Shortening the distances that cars travel, for example, might seem like a good way to reduce emissions. But during a short trip, a car's engine and its catalytic converter stay too cool to run efficiently and so proportionally emit more pollutants.) The ideal traffic simulators would consider aspects of air chemistry and construction patterns, because the geometry of buildings affects air movement.

The accumulating complexity of all these variables cannot yet be modeled or predicted easily. Planners will therefore have to wait for more complete computational tests. In the meantime, however, simulation is still likely to provide the insights necessary to keep traffic moving in the right direction.


KENNETH R. HOWARD is a writer based in New York City.