What’s it like to ride in a self-driving car?

Autonomous vehicles (AVs) are one of the most talked-about technologies of the moment. And little wonder: they promise to revolutionise the transport of people and physical goods just as dramatically as the internet transformed the delivery of information. But they raise many questions. When will they be available? Will they be safe? Will they make car ownership obsolete? And most of all: what is it like to ride in a car that drives itself?

I’ve spent the past few months working on a 10,000-word special report on AVs for The Economist, which was published in this week’s issue. The focus of my report is mostly on the long-term implications of AVs, based on the assumption (a reasonable one, I think) that the technology can be made to work reliably in the next few years. Rather than focusing on the minutiae of things like the ever-changing industry alliances, or who is suing whom, I concentrated instead on the impact on urban planning, the transformation of retailing and the broader social and political implications of cars that can drive themselves. I spoke to as many urban planners and social historians as machine-learning experts and car-industry executives. All this horizon-scanning and future-gazing was fun. But to kick off the report, I had to actually go in a self-driving car. Which is how I found myself, on a snowy morning a few weeks ago, standing in a car park in Pittsburgh, waiting for an automated ride.

Three years ago I went in a self-driving car in Shanghai. It was quite a basic example: what is known in the field as a “Level 2” vehicle. This means it can steer itself, and maintain a safe distance from the car in front, while driving in highway traffic. I rode in an Audi A7, and a few cars now on the market (notably those made by Tesla) are capable of Level 2 automation. But the driver is required to keep hands on or near the wheel, and to pay attention to the surroundings, in case anything unexpected happens. With the next level up, Level 3, the car takes more responsibility for monitoring its surroundings, allowing the driver to relax a bit more. But if the car encounters a situation it cannot deal with, it sounds a warning telling the driver to resume control. The first Level 3 vehicle, the Audi A8, goes on sale this year.

Level 2 and Level 3 are really just glorified forms of cruise control. A truly self-driving vehicle doesn’t just follow the road ahead of it. It does route-planning and knows where it’s going. It handles junctions, crossings, traffic lights and road signs, and interacts smoothly with other vehicles, pedestrians and cyclists. This is, to say the least, a big leap. A Level 4 vehicle is defined as one that can do all of this, without any input from a human driver, within a limited area: in practice, a city neighbourhood that has been mapped in very fine detail, to give the car a big head start in understanding its surroundings. A Level 5 vehicle (something that does not yet exist) is one that can in theory drive anywhere, like a human driver. That may be an unattainable goal: some people dislike driving at night, or in snow, and I would not volunteer to drive a car in Delhi unless I really had to. The upshot is that the most advanced AVs on the roads today are generally Level 4 vehicles that operate within specific regions of particular cities.
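The taxonomy above can be boiled down to a small lookup. This is purely an illustrative sketch: the enum and the `needs_human_fallback` helper are my own shorthand for the levels as described here, not part of any SAE standard or carmaker's software.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """Shorthand for the SAE driving-automation levels discussed above."""
    L2 = 2  # steers and keeps its distance on highways; driver must supervise
    L3 = 3  # monitors its surroundings, but hands back control when stuck
    L4 = 4  # fully self-driving, but only within a finely mapped area
    L5 = 5  # drives anywhere a human could (does not yet exist)

def needs_human_fallback(level: AutomationLevel) -> bool:
    """Levels 2 and 3 still rely on a human being ready to take over."""
    return level <= AutomationLevel.L3
```

The dividing line the article draws falls between Levels 3 and 4: below it the human is the fallback, above it the car (within its mapped area) is on its own.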

Different cities offer different testing environments. Phoenix, Arizona, is popular because it has a regular grid system and reliably good weather (snow can confuse the LIDAR sensors that AVs use to scan their surroundings). AVs can also be seen on the roads in and around Mountain View, for similar reasons, and because so many technology firms are based in the Bay Area. The early history of self-driving vehicles was shaped by the rivalry between Stanford University in California and Carnegie Mellon University in Pittsburgh, which have produced many of the engineers now leading AV projects around the world. This has made Pittsburgh another hub of AV research and testing. The city is considered a more challenging environment than Phoenix, because its road layout is more complex, and the weather is worse. San Francisco’s urban environment is particularly complex, which is why Kyle Vogt, the boss of Cruise, an AV startup acquired by General Motors, says it is the best place for testing (check out his blog post for some very impressive video footage). If you can make it there, you might say, you can make it anywhere.

A self-driving Uber vehicle. That round thing on the top is the LIDAR sensor.

Anyway, back to Pittsburgh. Uber has hired a lot of engineers from Carnegie Mellon, and Uber’s Advanced Technologies Group, which is developing its self-driving cars, is based in the city. The vehicle I climbed into was a modified Volvo XC90, with a bundle of extra sensors, including cameras and a spinning LIDAR unit, on its roof. Ryan, the vehicle’s safety driver, manually drove the vehicle out of the car park and onto the public roads, before pressing a button to engage the self-driving system. And then the car started driving itself.

At first, the experience is thrilling. It seems like magic when the steering wheel turns by itself, or the car gently slows to a halt at a traffic light. The autonomous Uber drove carefully but confidently in downtown traffic and light snow, slowing down when passing a school or approaching the brow of a hill, and putting its foot down (as it were) when faced with an open, straight road with no other traffic. The most noticeable difference from a human driver was that the vehicle made no attempt to avoid Pittsburgh’s notorious potholes, making the ride slightly bumpy at times. Sitting in the back seat, I could see a digital representation, displayed on an iPad mounted between the front seats, of how the car perceived the world, with other vehicles, pedestrians and cyclists highlighted in clusters of blue dots. I felt as though I was living in the future. But then, after a minute or two, the novelty wore off. When technology works as expected, it’s boring.

How the car sees the world. Objects of particular interest are shown in blue.

This is, in fact, exactly the reaction that engineers are hoping for. Noah Zych, Uber’s head of system safety for autonomous cars, told me that after working on AVs for ten years, he was finally able to offer his parents a ride in a self-driving car last year when they came to visit. After their ride ended, he asked them what they thought of it. “And my mom said, ‘actually, it was kind of boring’. And that’s the response that we really want,” he says. Uber has offered some riders in Pittsburgh and Phoenix the option to travel in its self-driving vehicles, provided the start and end points of their ride fall within the cars’ area of operation. (Riders can say no if they want to.) Around 50,000 people have travelled in Uber’s self-driving cars in the past couple of years. Uber wants to understand how to design the in-car experience (such as what information should be shown on the screen), and it also wants to reassure both riders and other road-users about the safety of autonomous vehicles. “The best way to convince people that a self-driving vehicle is going to be safe and capable of driving them around in the future is to give them that first experience,” says Mr Zych.

Ryan, the safety driver in my self-driving Uber, had to take over occasionally, for example to steer the car around a delivery truck that had blocked the road — the car was programmed to play things safe and wait, rather than cross the double-yellow lines in the middle of the road — and to guide the car through roadworks where the lane markings had been recently changed. A couple of times he also took over when the car looked as though it might be passing a bit too close to another vehicle. In each case a collision was unlikely, Ryan explained, but if people think a collision is imminent, they will not feel safe. So part of his job is to flag up instances where the car’s driving style could be tweaked to provide a better experience for passengers. At the end of each day, the contents of the car’s on-board computers are downloaded for analysis. Each time the safety driver has to take over (an event known as a “disengagement”), the corresponding data can be analysed to see how the car’s software could be improved. It is then possible to simulate how the car would have responded with various modifications to its algorithms. “We can play it back again and again, vary the scenario and see the distribution of outcomes,” says Mr Zych. After being tested in simulation, the improved software is then rolled out in real vehicles. It is first tested on a small set of “canonical” routes, which exercise different aspects of its behaviour. If it works as expected, the software is then rolled out for general use.
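Uber has not published its simulation tools, but the loop Mr Zych describes (play the scenario back, vary it, look at the distribution of outcomes) can be caricatured in a few lines. Everything here is invented for illustration: the recorded clearance, the comfort threshold and the noise model are toy numbers, not real data from any AV.

```python
import random

# A toy stand-in for one logged disengagement: the car planned to pass a
# stopped truck with this much lateral clearance, in metres (number invented).
RECORDED_CLEARANCE_M = 0.45

def replay(planner_margin_m: float, trials: int = 10_000) -> float:
    """Replay the scenario with small random variations and return the share
    of runs in which the clearance stays above a comfort threshold."""
    COMFORT_M = 0.5
    comfortable = 0
    for _ in range(trials):
        noise = random.gauss(0.0, 0.1)  # vary the scenario on each replay
        clearance = max(RECORDED_CLEARANCE_M, planner_margin_m) + noise
        if clearance >= COMFORT_M:
            comfortable += 1
    return comfortable / trials

random.seed(0)
baseline = replay(planner_margin_m=0.45)  # current software
tweaked = replay(planner_margin_m=0.75)   # candidate with a wider margin
print(f"comfortable in {baseline:.0%} of replays (baseline) vs {tweaked:.0%} (tweaked)")
```

In reality the “scenario” is the full recorded sensor log and the planner is the actual driving stack, but the shape of the comparison is the same: rerun many perturbed copies of the event and compare the outcome distributions before and after a software tweak.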

I spent most of my hour in a self-driving Uber discussing disengagements, algorithm design and user interfaces, which (to me, at least) are just as exciting as being in a futuristic robocar. And even when the driving algorithms are working perfectly, there are several practical questions that still have to be addressed. For example, how will people actually hail driverless vehicles? They can’t just stop anywhere, or they will block traffic and annoy people. Human drivers can pick a good place to stop, but machines will need help. Uber has already started identifying good pick-up and drop-off points in some cities, and suggesting them to riders of human-driven vehicles. But it may be that in future, streets will have designated pick-up and drop-off areas; already, some university campuses and apartment blocks are being built with ride-hailing in mind. And how will a self-driving vehicle be able to tell that everyone is on board and ready to go? Some kind of “start” button will be needed — and a “stop” button, too, in case a rider suddenly wants to get out of the vehicle. (The driverless Uber has a “pull over” button for this.)
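Snapping a rider to a designated pick-up point is, at its simplest, a nearest-neighbour lookup over the curated spots. The point names and coordinates below are invented for illustration; a real service would also weigh kerb space, traffic rules and walking distance, not just straight-line proximity.

```python
import math

# Hypothetical designated pick-up points for one neighbourhood (coordinates invented).
PICKUP_POINTS = {
    "Market Sq (north kerb)": (40.4406, -79.9959),
    "Station entrance":       (40.4440, -79.9961),
    "Hotel loading bay":      (40.4417, -79.9900),
}

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) pairs."""
    R = 6_371_000
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

def suggest_pickup(rider_pos):
    """Snap the rider's location to the nearest designated pick-up point."""
    return min(PICKUP_POINTS, key=lambda name: haversine_m(rider_pos, PICKUP_POINTS[name]))
```

A rider standing a block from the square would be directed to its kerb rather than having the car stop mid-street beside them.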

Waymo, the self-driving unit of Google’s parent company, hopes to launch a robotaxi service in Phoenix later this year. Waymo has the lowest disengagement rate in the industry, and is generally considered the leader in the field; its autonomous vehicles can now operate in Phoenix without the need for safety drivers. GM’s Cruise, which is fast catching up with Waymo, hopes to launch a robotaxi service in 2019, using autonomous Chevy Bolts that do not have a steering wheel, pedals or any other kind of manual controls. So they, too, will have to be able to operate entirely autonomously without a safety driver. Dozens of other firms are also working on self-driving vehicles. Over the coming months and years more AVs will take to the roads in more cities, and the areas in which they operate will gradually expand. Probably sometime in the 2020s, you will take your first ride in a self-driving car. It will be exciting at first — but then, if all goes well, it should quickly become reassuringly boring.
