
Why Driverless Cars Still Can’t Handle Snow

If all the snow and ice over the past week has you fed up, you might consider moving to San Francisco, Los Angeles, Phoenix, Austin, or Atlanta. These five cities receive little to no measurable snow in a given year; subtropical Atlanta technically gets the most, maybe a couple of inches per winter, though often none. Even this weekend’s bomb cyclone, which dumped 7 inches across parts of northeastern Georgia, left the Atlanta suburbs with too little accumulation to make a snowman.

San Francisco and the aforementioned Sun Belt cities are also the five pilot locations of the all-electric autonomous-vehicle company Waymo. That’s no coincidence. “There is no commercial [automated driving] service operating in winter conditions or freezing rain,” Steven Waslander, a University of Toronto robotics professor who leads WinTOR, a research program aimed at extending the seasonality of self-driving cars, told me. “We don’t have it completely solved.”

Snow and freezing rain, in particular, are among the most hazardous driving conditions, and 70% of the U.S. population lives in areas that experience such conditions in winter. But for the same reasons snow and ice are difficult for human drivers — reduced visibility, poor traction, and a greater need to react quickly and instinctively in anticipation of something like black ice or a fishtailing vehicle in an adjacent lane — they’re difficult for machines to manage, too.

The technology that enables self-driving cars to “see” the road and anticipate hazards ahead comes in three varieties. Tesla Autopilot uses cameras, which Tesla CEO Elon Musk has lauded for operating naturally, like a human driver’s eyes; the trouble is that they share the human eye’s limitations when conditions deteriorate.

Lidar, used by Waymo and, soon, Rivian, emits pulses of light that bounce off objects and return to sensors, building a 3D image of the surrounding environment. Lidar struggles in snowy conditions because those pulses also reflect off airborne particles, including moisture and falling flakes, which clutters the point cloud with false returns and attenuates the signal. (Not to mention, lidar is up to 32 times more expensive than Tesla’s comparatively simple, inexpensive cameras.) Radar, the third option, isn’t affected by darkness, snow, fog, or rain; its long radio wavelengths essentially bend around water droplets in the air. But it also has the worst resolution of the bunch. It’s good at detecting cars, but not smaller objects, such as blown tire debris, so it typically needs to be paired with another sensor, like lidar, as it is on Waymo cars.
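Research on lidar in snowfall has produced filters that exploit exactly this failure mode: falling flakes show up as sparse, isolated returns, while solid objects produce dense clusters. Below is a minimal NumPy/SciPy sketch of one published idea, dynamic radius outlier removal (DROR); the parameter values are illustrative, not production-tuned.

```python
import numpy as np
from scipy.spatial import cKDTree

def dror_filter(points, angular_res=0.004, beta=3.0,
                min_neighbors=3, r_min=0.05):
    """Drop isolated lidar returns, which in falling snow are mostly flakes.

    points      -- (N, 3) array of returns in the sensor frame, meters
    angular_res -- horizontal angular resolution of the lidar, radians
    beta        -- multiplier on the expected point-to-point spacing
    """
    horiz_range = np.linalg.norm(points[:, :2], axis=1)
    # Point spacing grows linearly with range, so the search radius must
    # too; a fixed radius would discard distant but legitimate returns.
    radius = np.maximum(r_min, beta * angular_res * horiz_range)
    tree = cKDTree(points)
    keep = np.array([
        # query_ball_point counts the point itself, hence the strict '>'
        len(tree.query_ball_point(pt, r)) > min_neighbors
        for pt, r in zip(points, radius)
    ])
    return points[keep]
```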

Driving in the snow is still “definitely out of the domain of the current robotaxis from Waymo or Baidu, and the long-haul trucks are not testing those conditions yet at all,” Waslander said. “But our research has shown that a lot of the winter conditions are reasonably manageable.”

Waymo, for its part, is now testing its vehicles in Tokyo and London, with Denver, Colorado, set to become the company’s first true “winter city.” It also has ambitions to expand into New York City, which received nearly 12 inches of snow last week during Winter Storm Fern.

But while scientists are still divided on whether climate change is making disruptions of the polar vortex more frequent (such events push extremely cold Arctic air down into the warmer, moister air over the U.S., resulting in heavy snowfall), we do know that as the planet warms, places that used to stay frozen all winter will go through freeze-thaw-refreeze cycles that make driving more dangerous. Freezing rain, which requires layered warm and cold air to form, could also increase in frequency. All that variability means autonomous vehicles will need to navigate these conditions even in nominally mild climates such as Georgia’s.

Snow and ice throw several wrenches at autonomous vehicles. Cars need to be taught how to brake or slow down on slush, soft snow, packed snow, melting snow, and ice: every variation of winter road condition. Other drivers and pedestrians also behave differently in snow than in clear weather, which machine learning models must account for. And the car itself behaves differently, with traction changing at critical moments, such as when approaching an intersection or crosswalk.
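The traction problem is, at bottom, simple physics: under constant deceleration, stopping distance scales as v²/(2μg), and the friction coefficient μ collapses on snow and ice. The sketch below uses rough textbook μ values, purely for illustration; a real planner would have to estimate traction on the fly.

```python
# Idealized stopping distance d = v^2 / (2 * mu * g): same speed, same
# brakes, very different outcomes depending on the road surface.
G = 9.81  # gravitational acceleration, m/s^2

SURFACES = {  # rough textbook friction coefficients, illustrative only
    "dry asphalt": 0.75,
    "wet asphalt": 0.5,
    "packed snow": 0.25,
    "ice": 0.1,
}

def stopping_distance(speed_kmh: float, mu: float) -> float:
    v = speed_kmh / 3.6                # km/h -> m/s
    return v * v / (2 * mu * G)        # meters, constant deceleration

for surface, mu in SURFACES.items():
    print(f"{surface:>12}: {stopping_distance(50, mu):5.1f} m from 50 km/h")
```

From 50 km/h, that works out to roughly 13 meters on dry asphalt versus nearly 100 meters on ice, which is why a braking policy learned on clear roads does not transfer.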

Expanding the datasets (or “experience”) of autonomous vehicles will help solve the problem on the technological side. But reduced sensor accuracy remains a big concern — because you can only react to hazards you can identify in the first place. A crust of ice over a camera or lidar sensor can prevent the equipment from working properly, which is a scary thought when no one’s in the driver’s seat.

As Waslander suggested, there are a few obvious coping mechanisms for robotaxi and autonomous-vehicle makers: You can defrost, thaw, wipe, or apply a coating to a sensor to keep it clear. Or you can choose something altogether different.

Recently, a fourth kind of sensor has entered the market. At CES in January, the company Teradar demonstrated its Summit sensor, which operates in the terahertz band of the electromagnetic spectrum, a “Goldilocks” zone between the visible light used by cameras (and the human eye) and the radio waves used by radar. “We have all the advantages of radar combined with all the advantages of lidar or camera,” Gunnar Juergens, the SVP of product at Teradar, told me. “It means we get into very high resolution, and we have a very high robustness against any weather influence.”
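The “Goldilocks” framing is easy to make concrete with the relation λ = c/f. The snippet below compares representative wavelengths; note that Teradar hasn’t published Summit’s exact operating frequency, so 1 THz is a stand-in assumption.

```python
# Wavelength = speed of light / frequency, for the three sensing bands.
# Band choices are approximate and for comparison only; the 1 THz entry
# is an assumed, representative value, not Teradar's published spec.
C = 3.0e8  # speed of light, m/s

BANDS = {
    "visible light (camera, ~545 THz)": 545e12,
    "terahertz (assumed ~1 THz)": 1e12,
    "automotive radar (77 GHz)": 77e9,
}

for name, freq_hz in BANDS.items():
    print(f"{name:>33}: {C / freq_hz * 1e3:12.6f} mm")
```

At roughly 0.3 millimeters, a terahertz wave is hundreds of times longer than visible light yet more than ten times shorter than an automotive radar’s, which is the basis of the claim that it can combine weather robustness with high resolution.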

The company, which raised $150 million in a Series B funding round last year, says it is in talks with top U.S. and European automakers, with the goal of making it onto a 2028 model vehicle; Juergens also told me the company imagines possible applications in the defense, agriculture, and health-care spaces. Waslander hadn’t heard of Teradar before I told him about it, but called the technology a “super neat idea” that could prove to be a “really useful sensor” if it is indeed able to capture the advantages of both radar and lidar. “You could imagine replacing both with one unit,” he said.

Still, radar and lidar are well-established technologies with decades of development behind them, and “there’s a reason” automakers rely on them, Waslander told me. Using the terahertz band, “there’s got to be some trade-offs,” he speculated, such as lower measurement accuracy or higher absorption rates. In other words, while Teradar boasts the upsides of both radar and lidar, it may come with some of their downsides, too.

Another point in Teradar’s favor: it doesn’t use a lens at all, so there’s nothing to fog, freeze, or salt over. The sensor could also help satisfy a fundamental requirement of autonomy; as Juergens put it, “if you transfer responsibility from the human to a machine, it must be better than a human.” There are “very good solutions on the road,” he went on. “The question is, can they handle every weather or every use case? And the answer is no, they cannot.” Until sensors can demonstrably match or exceed human performance in snowy conditions, whether through a combination of lidar, cameras, and radar or through a new technology such as Teradar’s Summit sensor, that will remain true.

If driving in winter weather can eventually be automated at scale, it could theoretically save thousands of lives. Until then, you might still consider using that empty parking lot nearby to brush up on your brake pumping.

Otherwise, there’s always Phoenix; I’ve heard it’s pleasant this time of year.
