The students positioned themselves at the starting line, holding up signs designed to get their robots to follow them. The winner would be the student-robot tandem that crossed the finish line first.
This is the race in which 24 Master’s students competed in early June as part of the Data and AI for Transportation class taught by Alexandre Alahi, an assistant professor at EPFL’s Visual Intelligence for Transportation (VITA) laboratory.
Programming recognition algorithms
“We had to program our robot to recognize a visual signal captured by its onboard camera and then follow that signal. That required developing our own algorithm and loading it onto the robot,” says Rayan Abi Fadel, a student in the College of Management of Technology. All of the competing teams had to use the same robot – a Segway Loomo – and the same base algorithm developed by two VITA PhD students, Yuejiang Liu and George Adaimi. But they could adapt the algorithm and reconfigure it using deep learning and AI methods.
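The follow-the-signal behavior described above boils down to steering toward the detected target and keeping it at a set size in the frame. The sketch below illustrates that idea; the function names and gains are illustrative assumptions, not the students' code or the Loomo API:

```python
# Minimal sketch of a visual-following controller, assuming an upstream
# detector has already returned the target's bounding box in the image.
# All names and gain values here are hypothetical.

def steering_command(bbox_center_x, frame_width, gain=1.0):
    """Map the target's horizontal offset from the image center
    to a turn rate, normalized to [-gain, gain]."""
    offset = (bbox_center_x - frame_width / 2) / (frame_width / 2)
    return gain * offset

def speed_command(bbox_height, frame_height, desired_ratio=0.4, gain=0.5):
    """Drive forward until the target fills the desired fraction of the
    frame height (i.e., until the robot is close enough), never reversing."""
    ratio = bbox_height / frame_height
    return max(0.0, gain * (desired_ratio - ratio))
```

A target centered in the frame yields zero turn rate, and a target that already fills the desired fraction of the frame yields zero forward speed, so the robot settles into trailing the runner at a roughly constant distance.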
The base algorithm was designed to train a deep neural network to detect an object’s position in any kind of image, using a database of synthetic images obtained by “pasting” the object onto random backgrounds. “That let us fine-tune some of the algorithm’s parameters and consequently improve our robot’s performance,” says Alexandre Carlier, a computer science student who came in third place. “For example, since we wanted our robot to follow an object while the student holding it was running, we artificially blurred the background of the images used to train the robot, to simulate the effect of rapid movement.”
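The synthetic-data trick the students describe can be sketched in a few lines: paste the object crop at a random position on a background (blurred, if simulating motion) and record where it was pasted as the training label. The noise background and function names below are illustrative assumptions, not the VITA codebase:

```python
import numpy as np

def horizontal_motion_blur(img, kernel_size=9):
    """Approximate motion blur by averaging horizontally shifted copies
    of the image (a crude stand-in for a real blur kernel)."""
    shifted = [np.roll(img, s, axis=1) for s in range(kernel_size)]
    return np.mean(shifted, axis=0).astype(np.uint8)

def synthetic_sample(obj, bg_shape, rng, blur_background=True):
    """Paste an object crop at a random position on a random background.
    Returns the composite image and the ground-truth box (x, y, w, h)."""
    bg = rng.integers(0, 256, size=bg_shape, dtype=np.uint8)
    if blur_background:
        # Blur only the background so the target itself stays sharp,
        # mimicking a runner moving against a fast-changing scene.
        bg = horizontal_motion_blur(bg)
    h, w = obj.shape[:2]
    y = int(rng.integers(0, bg_shape[0] - h + 1))
    x = int(rng.integers(0, bg_shape[1] - w + 1))
    bg[y:y + h, x:x + w] = obj
    return bg, (x, y, w, h)
```

Because the paste position is chosen by the generator, every synthetic image comes with a free, exact bounding-box label, which is what makes this kind of data cheap to produce at scale.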
Most of the students who competed in the race already knew how to program, but not all. Those who had to learn along the way included Linah Charif and Sergej Gasparovich, the two civil engineering students who came in first. “It’s great to see that students who come into the class not knowing how to program can pick up that skill and do so well,” says Alahi.
Some of the main challenges the students had to overcome were screening out the interference caused by other race participants and coping with lighting fluctuations in the room where the race was held. “Turns were another big difficulty. We had to make sure we were always in our robot’s line of sight so that it could keep following us,” says Carlier.
Most student teams programmed their robots to recognize an image rather than a face, as image signals are clearer and less susceptible to interference. The images ranged from a red circle or a banana to the Swiss flag, a glass of wine, or even Mickey Mouse.
Technology for self-driving cars, and more
The algorithms that the students developed are similar to those used in self-driving cars, which enable the vehicles to recognize street signs, traffic lights, pedestrians, and other cars. “Our laboratory aims to develop technology that helps humans and machines coexist,” says Alahi. For applications ranging from drones that deliver packages to robots that help the elderly carry their luggage or groceries, it’s essential that the machines have a basic grasp of human behavior and act with a certain amount of social intelligence. For instance, a robot moving through a crowd needs to be able to follow the right social and ethical conventions for a given situation. The race at EPFL was a small-scale demonstration of humans and robots sharing the same space.
Students taking part in the race receive a grade that depends not only on where they finished. “They are also graded on the work they did upfront on the base algorithm,” says Adaimi. “I hope that the race made our class more interesting and will encourage students to go further in this field of research.”