Uber Starts Self-Driving Taxis
A select group of Pittsburgh Uber users will get a surprise the next time they request a pickup: the option to ride in a self-driving car.
The announcement comes a year and a half after Uber hired dozens of researchers from Carnegie Mellon University’s robotics center to develop the technology. Uber recently gave a few members of the press a preview, when a fleet of 14 Ford Fusions equipped with radar, cameras, and other sensing equipment pulled up to Uber’s Advanced Technologies Campus (ATC) northeast of downtown Pittsburgh.
During a 45-minute ride across the city, it became clear that this is not a bid to launch the first fully formed autonomous cars. Instead, it is a research exercise: Uber wants to learn how self-driving cars act in the real world and refine their behavior accordingly. That includes how the cars react to passengers, and how passengers react to them.
“How do drivers in cars next to us react to us? How do passengers who get into the backseat who are experiencing our hardware and software fully experience it for the first time, and what does that really mean?” said Raffi Krikorian, director of Uber ATC.
The autonomous Ford Fusions that Uber is now dispatching to riders appear to be, for the most part, regular cars. The most noticeable difference is the array of sensors that juts out of the roof. Additional sensors are integrated into the cars’ sides.
A Lidar unit uses a laser to collect 1.4 million map points a second, producing a 360-degree image of the car’s surroundings. Cameras and a GPS system supply additional data.
On the test ride, the self-driving car detected obstacles, people, and even potholes, and responded intelligently. Handling the expected is already mundane; the bigger challenge for Uber is planning for the unexpected.
Uber is first offering autonomous pickups in a few Pittsburgh neighborhoods, and within a few weeks it will expand to the airport and a northern suburb. The rollout is gradual because Uber pre-maps the roads its cars travel, a practice that Carnegie Mellon University researcher Aaron Steinfeld, who is unaffiliated with Uber, assured me is standard. The cars receive pre-collected maps containing speed limits and other fixed information, so that their onboard systems can focus on real-time detection of variables like pedestrians.
Uber logs each of its road tests and uses the data to tweak how the cars respond in specific situations. For example, the cars know that at a four-way stop they should proceed in the order they arrived. But what happens when another car fails to respect that order? A car should stop if another vehicle jumps the gun, but it should also know to go if the car whose turn it is waits too long.
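The four-way-stop behavior described above can be caricatured as a simple arbitration rule. This is purely an illustrative sketch; the function, its parameters, and the timeout value are invented here and are not Uber's actual code.

```python
# Hypothetical sketch of the four-way-stop logic described in the article:
# proceed in arrival order, yield if another car jumps the gun, and go
# anyway if the cars ahead of us stall for too long. All names and the
# timeout value are invented for illustration.

STALL_TIMEOUT = 4.0  # seconds before we treat a car ahead of us as stalled


def should_proceed(my_arrival, other_arrivals, intersection_clear, waited):
    """Decide whether our car should enter the intersection.

    my_arrival         -- timestamp when our car stopped at the intersection
    other_arrivals     -- timestamps when the other stopped cars arrived
    intersection_clear -- False if another car is already moving through
    waited             -- seconds spent waiting on cars ahead of us in order
    """
    # If another car jumped the gun, stop and let it pass.
    if not intersection_clear:
        return False
    # It's our turn if every other car arrived after us...
    if all(my_arrival < t for t in other_arrivals):
        return True
    # ...or if the cars ahead of us have taken too long to move.
    return waited > STALL_TIMEOUT
```

For instance, a car that arrived first and sees a clear intersection proceeds; one that arrived second waits, unless the first car stalls past the timeout.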
Humans still rely on many social cues when they drive. They make eye contact with other drivers and can read the subtle body language of a jogger who is thinking about cutting across the street. Uber’s cars can predict the likelihood that a pedestrian is about to cross the road, but reading actual social cues remains a goal rather than a capability.
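The kind of pedestrian-crossing prediction mentioned above could be imagined, in its crudest form, as a weighted score over observable features. The features, weights, and function below are entirely invented for illustration; they do not describe Uber's actual models.

```python
# Toy heuristic for "is this pedestrian about to cross?" -- every feature
# and weight here is invented for illustration, not Uber's real system.

def crossing_likelihood(distance_to_curb_m, speed_mps, facing_road):
    """Return a rough 0..1 score that a pedestrian is about to cross.

    distance_to_curb_m -- pedestrian's distance from the curb, in meters
    speed_mps          -- walking speed toward the road, in meters/second
    facing_road        -- True if the pedestrian is oriented toward traffic
    """
    score = 0.0
    if distance_to_curb_m < 1.0:
        score += 0.4                              # standing right at the curb
    score += min(speed_mps / 2.0, 1.0) * 0.4      # moving toward the road
    if facing_road:
        score += 0.2                              # body oriented toward traffic
    return min(score, 1.0)
```

A real system would learn such weights from logged sensor data rather than hand-tuning them, which is part of why Uber logs every road test.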
The company plans to cut down to a single ride-along engineer within the next six months. Eventually, that final engineer could be replaced by a remote help center: when a car encounters an unfamiliar situation, it contacts a human at the center for help. Uber is also researching how to prevent accidental gridlock and how cars should behave when many pedestrians are in the street.