Image-Quality Lidar and Laser-Accurate Cameras

A lidar startup named Innovusion recently raised $30 million to deliver “image-quality” lidar. As optics.org points out, several different startups are racing to build laser-based autonomous vehicle sensors: Luminar, AEye, Quanergy, Oryx, Velodyne (the industry standard), and surely a few others.

Credit to Innovusion, though: “image-quality lidar” really frames the issue in a way that I hadn’t seen before. Cost aside, is it really possible to replace cameras with lidar?

Watching the Innovusion demo video, the answer seems to be “closer, but not yet”. The quality of the lidar scan is terrific. However, “image quality” isn’t quite right: the signs that appear in the video are illegible, and at least to my eyes the traffic signal was not classifiable.

It’s exciting to see how much work is being done to push lidar resolution to a point that it is competitive with cameras.

It would also be exciting to see cameras develop their measurement abilities to compete with lidar and radar. Generally, cameras are terrific for detection and classification tasks, but they measure distances, heights, and other dimensions poorly.
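
To make that gap concrete, here is a minimal sketch of the usual monocular trick for getting range from a single camera: assume the true size of a detected object, then back the distance out of its apparent size with the pinhole model. The function and every number below are illustrative assumptions, not real sensor specs; the point is just how quickly a pixel of detection error turns into meters of range error.

```python
# Minimal sketch: monocular range estimate from a pinhole camera model.
# All numbers are illustrative assumptions, not real sensor specs.

def estimate_distance_m(focal_length_px: float,
                        assumed_height_m: float,
                        bbox_height_px: float) -> float:
    """Pinhole model: distance = f * H / h, where f is the focal length
    in pixels, H the assumed real-world object height, and h the object's
    height in the image (pixels)."""
    return focal_length_px * assumed_height_m / bbox_height_px

f_px = 1400.0          # assumed focal length in pixels
car_height_m = 1.5     # assumed class-average car height
bbox_px = 30.0         # detected bounding-box height in pixels

d = estimate_distance_m(f_px, car_height_m, bbox_px)
d_off = estimate_distance_m(f_px, car_height_m, bbox_px - 1.0)

print(f"Estimated range: {d:.1f} m")                  # ~70 m
print(f"With a 1 px detection error: {d_off:.1f} m")  # ~72.4 m
```

Lidar sidesteps this entirely by measuring range directly from time of flight, which is exactly the measurement ability cameras would need to close.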

And while many startups are jumping into the lidar space, comparatively few are working on perception with cameras. That work tends to be left to academic researchers, automotive manufacturers, and Tier 1 suppliers. Mobileye, the most prominent automotive computer vision company (now part of Intel), has mostly kept quiet about its work.

With so many companies pushing lidar resolution to camera-like levels, there might be an opening for some computer vision startups to push camera measurements to lidar-like levels.

Navigate on Autopilot

On Friday, the Tesla blog announced the introduction of the Navigate on Autopilot feature to its Enhanced Autopilot system. Navigate on Autopilot will drive from exit to exit on the highway, and automatically change lanes to pass vehicles along the way.

Near the top of the post, Tesla writes, “until truly driverless cars are validated and approved by regulators, drivers are responsible for and must remain in control of their car at all times.”

That is a prominent disclaimer, but this feature basically looks like Level 3 conditional autonomy. Depending on how aggressively Tesla requires drivers to keep their hands on the wheel, it’s not hard to imagine drivers diverting their attention elsewhere.

And that could be a great thing!

Tesla could start out by requiring drivers to basically keep their hands on the wheel at all times. Over time, as the software proves itself, Tesla could use over-the-air updates to slowly relax the requirements that drivers monitor the road.

Of course, Tesla could botch the rollout and cause lots of distracted driving accidents. But so far Tesla Autopilot has a great safety record, so I feel pretty good about this.

As the blog post notes, “Since we launched Autopilot in 2015, more than 1 billion miles of real-world driving data have been used to support the feature.”

Argo AI To Test in DC

The Washington Post reports that Ford Motor Company and Argo AI have announced they will begin testing self-driving cars in Washington, DC, next year.

This is exciting to me as a Ford alumnus, and because I grew up in the Virginia suburbs of Washington, DC, I know the city well.

Beyond my personal connections, this just seems like another step in the increasingly rapid expansion of self-driving vehicles to more and more cities.

Lyft and Aptiv are testing with the general public in Las Vegas right now, as is Drive.AI in the Dallas suburbs.

Uber has tested in Pittsburgh in the past, and probably will test again in the future, as has nuTonomy in Boston.

Waymo says they will open their Phoenix-area fleet to the general public this year. Cruise will open their fleet to the public in San Francisco next year. Now Ford says they will open to the public in Washington, DC.

New York, Los Angeles, Chicago, and Houston are the four largest cities in America. Presumably self-driving cars will get there in the next few years, too.

Regulators Block Self-Driving School Bus

The US National Highway Traffic Safety Administration shut down a self-driving school bus pilot program in Florida. The pilot was run by the French firm Transdev, and involved a small shuttle that travels at a glacial 8 mph. To put that in context, elite Boston marathoners run faster than this shuttle.

There’s a little bit of back-and-forth jawing between the two. NHTSA says, “Using a non-compliant test vehicle to transport children is irresponsible, inappropriate, and in direct violation of the terms of Transdev’s approved test project.”

Transdev, for its part, says it “believed the pilot met the requirements of the testing and demonstration project approved by NHTSA for adults and children to ride on the same route.”

Realistically, it’s hard to imagine anyone getting hurt at 8 mph. I mean, it’s possible, but the speed is so slow.

On the other hand, painting a self-driving shuttle yellow and calling it a school bus is basically inviting a public outcry, at least at this point in the development of autonomous vehicles. If the purpose of the trial was to demonstrate that adults and children can ride in a vehicle together, it seems like there are several intermediate steps to hit before calling anything a school bus.

There’s also relatively little to gain by automating school buses. Buses are remarkably safe. And since the cost of the driver is amortized over all of the passengers, the financial benefits of automation are low.
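
To put rough numbers on that amortization point, here is a back-of-envelope sketch. Every figure in it (driver cost, riders per bus, school days) is an assumption chosen purely for illustration.

```python
# Back-of-envelope sketch of the amortization argument.
# Every figure below is an assumption chosen for illustration only.

driver_cost_per_year = 40_000   # assumed fully loaded annual cost of one driver
students_per_bus = 50           # assumed riders served by one bus
school_days_per_year = 180

cost_per_student_per_year = driver_cost_per_year / students_per_bus
cost_per_student_per_day = cost_per_student_per_year / school_days_per_year

print(f"Driver cost per student per year: ${cost_per_student_per_year:,.0f}")      # $800
print(f"Driver cost per student per school day: ${cost_per_student_per_day:.2f}")  # ~$4.44
```

Under those made-up numbers, removing the driver saves only a few dollars per student per day, which is a thin margin to justify the technology and the public-relations risk.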

Tesla Deprecates “Full Self-Driving” Term

For the past two years or so, Tesla has provided a “full self-driving” option on its vehicles. The option cost $5,000.

Tesla CEO Elon Musk announced new mid-range options for the Model 3 this week, and the associated design website renames the “Full Self-Driving” feature to “Enhanced Autopilot”.

Musk addressed the change in a tweet.

I am a bit confused by the meaning of Musk’s tweet. Is “full self-driving” actually still available, on an alternate menu? Does he mean “full self-driving” has been unavailable for a week?

The simplest explanation seems to be that Tesla decided to rebrand their Autopilot feature.

A lot of people have been skeptical about the ability of Tesla to create a fully autonomous vehicle with only cameras and a single radar, and no lidar. Perhaps this is a nod in that direction.

Autopilot is still the best ADAS system on the market, though.

Udacity Festival 2018 This Weekend

Udacity Festival 2018 is coming this weekend!

This is an online event for Udacity students and alumni to learn from and connect with each other, as well as to hear from Udacity staff. Since Udacity is an online education institution with students all around the world, the Festival takes place entirely online, throughout the weekend.

The Festival will feature:

  • “Presentations covering everything from pitching projects and landing new jobs, to career change and entrepreneurial success.
  • Exclusive digital meetups for each Udacity school — Artificial Intelligence, Autonomous Systems, Business, Data Science, and Programming.
  • Panel discussions with alumni sharing their career advancement strategies.
  • … and so much more!”

As a teaser, the School of Autonomous Systems event will feature a ride in Carla, Udacity’s self-driving car!

As if that weren’t enough, I will engage in a special round of Carla Karaoke. You will not believe my closing number. Like, literally, you will not believe it. You will watch me sing it and still will not believe I chose this song.

RSVP here!

Self-Driving Ships

The Verge reports on a partnership between Intel and Rolls-Royce to build “self-driving” ships. The article blends discussion of three different scenarios:

  1. autonomous long-haul shipping
  2. remote-control operation
  3. pilot assistance for docking and similar scenarios

I have almost no knowledge of shipping or boats or the ocean or even water. I do know how to swim.

Nonetheless, I speculate that #3 is the most useful.

The gains achieved by removing a human crew from a cargo ship seem minimal. In the context of a massive shipping vessel stuffed with rectangular containers, the cost of the human crew just doesn’t seem that significant.

But in the context of the close quarters of a harbor or port, I can imagine that there might be substantial performance gains from automation or pilot assistance.

Again, not knowing much about the actual constraints of maritime shipping, I could imagine harbors as bottlenecks, where ships queue up in lines waiting for relatively scarce tugboats and harbor pilots. Furthermore, ships do not turn on a dime, and so presumably need to maintain substantial buffer distances.

Autonomous shipping in close quarters might improve both the latency of docking (by allowing ships to skip the line) and the throughput (by allowing ships to shrink buffer distances).
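
As a toy illustration of that latency and throughput speculation, here is a quick back-of-envelope calculation. The speeds, buffer distances, pilot counts, and docking times are all invented for the example.

```python
# Toy calculation of the harbor latency/throughput speculation above.
# All parameters are assumptions invented for illustration.

def channel_throughput(speed_knots: float, buffer_nm: float) -> float:
    """Ships per hour through a single-lane channel when each ship keeps
    `buffer_nm` nautical miles of separation while moving at `speed_knots`."""
    return speed_knots / buffer_nm

baseline = channel_throughput(speed_knots=8.0, buffer_nm=2.0)   # 4 ships/hour
automated = channel_throughput(speed_knots=8.0, buffer_nm=1.0)  # 8 ships/hour

# Latency side: ships waiting for one of a few scarce harbor pilots.
ships_waiting = 6
pilots_available = 2
minutes_per_docking = 45
wait_for_last_ship = (ships_waiting / pilots_available) * minutes_per_docking  # 135 min

print(f"Baseline throughput: {baseline:.0f} ships/hour")
print(f"With halved buffers: {automated:.0f} ships/hour")
print(f"Rough wait for the last ship in the pilot queue: {wait_for_last_ship:.0f} minutes")
```

Under those invented numbers, halving the buffer distance doubles the channel’s throughput, and waiting for scarce pilots dominates docking latency, which is where skipping the line would help.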

TiE Autonomous Vehicles Panel

Come join me next Tuesday, October 9, at the TiE Silicon Valley panel on the Future of Autonomous Vehicles!

The panel will be from 5:30pm to 8:30pm in Sunnyvale, California, at the offices of Micro Focus. It’s quite a lineup they’ve assembled:

  • David Hall: Founder and CEO of Velodyne
  • Manji Suzuki: VP at Denso
  • Ashish Karandikar: VP at NVIDIA
  • Qasar Younis: Founder and CEO of Applied Intuition
  • Vijay Nadkarni: VP at Visteon
  • Hyunggi Cho: Founder and CEO of Phantom AI

If you want to learn what is going on at the cutting-edge of self-driving cars in Silicon Valley, this seems like the place to do it. I’m excited myself to learn what the other panelists have to say about the future of autonomous vehicles!

Register here 🚗