September 27, 2016

Efstathios Bakolas

Efstathios Bakolas, an assistant professor in the Department of Aerospace Engineering and Engineering Mechanics, has received funding from the National Science Foundation to develop path-planning and decision-making algorithms that will help autonomous vehicles maneuver around other moving objects, and prioritize which objects to react to first.

“This is not a traditional path-planning problem where you have one vehicle you want to take from point A to point B,” Bakolas said. “What we’re doing is helping vehicles make informed decisions about how to optimally maneuver in the presence of obstacles that are moving, could be intelligent, and could be malicious or benign.”

The research is important for keeping autonomous vehicles safe as they execute tasks, and could help inform laws governing their operation across sectors.

For people, maneuvering around obstacles is second nature. We change direction quickly to navigate around people on a busy street, or sprint away from a dangerous encounter. We can change lanes on a highway, while also preparing to take a detour further up the road.

Bakolas’ goal is to create algorithms that allow autonomous vehicles—such as unmanned aerial vehicles (UAVs)—to make the same kinds of short-term and long-term decisions in real time. To do that he’s turning to a principle that’s well known to chess masters and market analysts, as well as autonomous systems researchers: game theory.

Game theory provides a framework for understanding how interactions between individuals influence their decisions and actions toward one another. It predicts how the decision or action of a particular individual, given a certain set of choices at its disposal, can influence outcomes in the near and long term. Bakolas said that algorithms informed by game-theoretic logic will help keep autonomous vehicles on the safest path to a destination.
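To make the idea concrete, here is a minimal sketch of a game-theoretic decision of the kind the article describes. It is not Bakolas' actual algorithm: the maneuvers, the payoff matrix, and the function name are illustrative assumptions. A vehicle facing a possibly adversarial obstacle picks the maneuver that maximizes its worst-case separation distance, a so-called security strategy in a zero-sum game.

```python
# Illustrative sketch only: a zero-sum "pursuit" game in which the vehicle
# (the evader) chooses the maneuver maximizing its worst-case distance from
# an obstacle that may respond adversarially. All values are hypothetical.

def minimax_maneuver(payoff, maneuvers):
    """Return the maneuver that maximizes the minimum payoff
    over all possible obstacle responses (pure-strategy minimax)."""
    best = max(range(len(payoff)), key=lambda i: min(payoff[i]))
    return maneuvers[best]

# Rows: vehicle maneuvers; columns: obstacle responses.
# Entries: resulting separation distance in meters (made-up numbers).
payoff = [
    [5.0, 1.0, 2.0],   # hold course
    [4.0, 3.0, 3.5],   # veer left
    [2.0, 2.5, 6.0],   # veer right
]
maneuvers = ["hold course", "veer left", "veer right"]

print(minimax_maneuver(payoff, maneuvers))  # → veer left
```

In this toy example "veer left" wins because its worst case (3.0 m of separation) is better than the worst case of holding course (1.0 m) or veering right (2.0 m); a benign-looking option can be the riskiest one against an intelligent opponent.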

Bakolas and his team may eventually test their algorithms on drones similar to the one shown here.

“Because you have multiple entities sharing the same environment, you have many decisions to make in a possibly short period of time. You have to account for the most imminent threat, but also other threats that may become imminent in the future,” Bakolas said.

At this stage, Bakolas said, the research is focused on the logic that will underlie the algorithms. While game theory is now applied in a wide array of fields, from economics to biology, it began as a way to evaluate human decision-making behavior using applied mathematics. Bakolas plans to use such mathematical reasoning to simulate scenarios an autonomous vehicle might encounter.

Another part of the research is making the algorithms adjustable to the hardware of the autonomous vehicle. An off-the-shelf drone likely has less processing power than a self-driving car, so vehicles with different capabilities will require different decision-making approaches to work in real time.

Bakolas will test the algorithms in simulation. By the end of the grant period, however, he said the algorithms may be tested on actual UAVs in the department’s robotics research facilities.

Early tests could be a preview of how autonomous vehicles may behave in the public sphere. Bakolas said that current regulations restricting UAVs in public airspace and self-driving cars on public roads err on the side of caution. If methods are developed showing that vehicles can quickly make risk-informed decisions and successfully maneuver around all types of moving obstacles, the skies and roads could open.

“It has become a very relevant problem. I think this type of research can give us insights on how we can regulate because it will help us to understand better the problem,” Bakolas said.