City-savvy drivers often strategize every time they jump in their car, planning the fastest way to their destination. Few, however, take it as far as Dr. Alex de Barros, a civil engineering professor at the University of Calgary.
De Barros is looking at ways to help drivers commute more quickly by processing raw traffic data into usable information that will be available on the Internet.
“We’re looking into ways of providing better information on traffic conditions to drivers so they can make better-informed decisions,” said de Barros. “One of the problems we have today is that if you, as a driver, want to go somewhere and you have several options for routes, you have no idea which route is going to be congested.”
The objective of his research is to come up with a technology that will collect traffic data in real time and input it into a computer model that will predict traffic conditions on all routes available.
“[Global positioning systems tell] you where you are, and the GPS devices that are now installed in automobiles have the ability to tell you the shortest distance between where you are and where you want to go,” said de Barros. “But they do not tell you anything about traffic conditions. So it may very well be that the shortest route is not going to be the best route, because it’s going to take you longer to get there.”
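De Barros’s distinction between the shortest and the fastest route can be illustrated with a small sketch. The road network and numbers below are hypothetical, and the routing uses Dijkstra’s algorithm, a standard shortest-path technique; the only change between the two queries is whether edges are weighted by distance or by current travel time.

```python
import heapq

def shortest_path(graph, start, goal, weight):
    """Dijkstra's algorithm; `weight` picks which edge attribute to minimize."""
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    visited = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        for nbr, attrs in graph.get(node, {}).items():
            nd = d + attrs[weight]
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    # Walk the predecessor links back from the goal to recover the route.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]

# Hypothetical network: each road segment carries a distance (km) and a
# current travel time (minutes) that reflects congestion.
roads = {
    "A": {"B": {"km": 5, "min": 25}, "C": {"km": 8, "min": 10}},
    "B": {"D": {"km": 4, "min": 20}},
    "C": {"D": {"km": 7, "min": 9}},
}

print(shortest_path(roads, "A", "D", "km"))   # shortest by distance: A-B-D
print(shortest_path(roads, "A", "D", "min"))  # fastest by travel time: A-C-D
```

In this toy network the shortest route (9 km via B) takes 45 minutes, while the longer route (15 km via C) takes only 19 — exactly the situation de Barros describes, where distance alone is the wrong thing to minimize.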
De Barros said media traffic reports are not much more helpful.
“Radio traffic information is very limited in its content,” he said. “It can tell you that Glenmore Trail is congested now and that Heritage Drive is free, but if you are in the NW and you are planning to go south, it’s going to take you at least 30 minutes to get there, and by the time you get there the conditions may have changed significantly. This technology we are working on is about traffic predictions. It’s trying to predict traffic conditions, not only now but in the next 30 to 60 minutes.”
De Barros explained that to be effective, the algorithm must run repeatedly to include drivers’ responses to information. Much like a radio report, a single run would only result in diverting traffic elsewhere.
“It’s going to do one run and then it’s going to find that because of the drivers’ reactions, now everyone is congesting the other route,” he said. “Then it’s going to do another run, and so on and so forth, until it reaches the point of an equilibrium.”
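The repeated-run process de Barros describes can be sketched with a standard traffic-assignment technique, the method of successive averages: assign all drivers to whichever route is currently fastest, blend that assignment with the previous flows, and repeat until no route is faster than another. The congestion function and all numbers below are hypothetical (a BPR-style curve is assumed), not de Barros’s actual model.

```python
def travel_time(free_flow, capacity, flow):
    """Hypothetical BPR-style congestion curve: travel time rises with flow."""
    return free_flow * (1 + 0.15 * (flow / capacity) ** 4)

def equilibrate(demand, routes, iterations=500):
    """Method of successive averages: each run sends all drivers to the
    currently fastest route, then averages with past flows (step 1/n),
    converging toward an equilibrium where route times are equal."""
    flows = [demand / len(routes)] * len(routes)
    for n in range(1, iterations + 1):
        times = [travel_time(r["t0"], r["cap"], f) for r, f in zip(routes, flows)]
        best = times.index(min(times))
        target = [demand if i == best else 0.0 for i in range(len(routes))]
        flows = [f + (t - f) / n for f, t in zip(flows, target)]
    times = [travel_time(r["t0"], r["cap"], f) for r, f in zip(routes, flows)]
    return flows, times

# Two hypothetical routes: one short but narrow, one longer but high-capacity.
routes = [{"t0": 10, "cap": 1000}, {"t0": 15, "cap": 2000}]
flows, times = equilibrate(3000, routes)
```

A single run (one iteration) dumps every driver onto the short route and congests it, exactly the problem de Barros raises; repeated runs settle into flows where both routes take nearly the same time, so no driver gains by switching.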
De Barros said the infrastructure is all in place; it’s just a matter of developing the technology and algorithms to process the data.
“This is all part of a broader area of research, called intelligent transportation systems, which looks into using technology to improve the transportation systems themselves,” he explained. “The only problem is, especially here in North America, the development of it has focused too much on the hardware, on developing the sensors and computers, and the gadgets to collect the information and convey it to the public. But very little has been done to actually develop the intelligence in the systems (the algorithms) to make something intelligent of the data. An algorithm is a computer system that is going to process that information and then come up with some information that is usable to the driver.”
According to de Barros, the project is still in its early stages because there are currently no adequate cost estimates available.
“The idea is that it is going to be available to anyone who can access the web,” said de Barros. “[However,] it depends on who is going to be providing the service. If the city is going to be providing the service, then it will be available to everyone. If it’s going to be a company that is developing a commercial product, then you will have to pay a fee.”
De Barros said he is also examining the effect his research could have on vehicle emissions and the environment.
“That’s the very reason we’re doing this research,” he said. “Because we want to find out the impact it can have on the environment.”