Where Are The ADAS Startups?

The second quarter of 2020 has been pretty great for Phantom.ai. The Silicon Valley-based ADAS startup closed a $22 million Series A financing in April, led in part by Ford Motor Company. Today, they announced a partnership with Renesas, the Japanese automotive semiconductor supplier, to develop “full stack Level 2 advanced driver assistance systems.”

This makes Phantom.ai one of the very few startups targeting what would seem to be a lucrative and promising market.

An oddity of the self-driving car revolution is that startups have so far had much more success tackling Level 4 full autonomy than Level 2 advanced driver assistance.

Level 2 means that a driver still needs to be in control of the vehicle, which leaves startups with one of two difficult paths:

  1. Become a manufacturer and build vehicles for consumers.
  2. Sell ADAS packages into the existing automotive ecosystem, with lead times approaching a decade.

Faced with that challenge, and perhaps also out of safety concerns, most startups have opted to instead work on Level 4 autonomy. This is a much harder problem, but it carries the potential of deploying robotaxis directly and probably (maybe?) avoiding the existing automotive supply chain.

The only company that has cracked this nut, Mobileye, cracked it in a huge way, exiting to Intel in 2017 for $15 billion. On the one hand, I would have thought more entrants would’ve been attracted to this space. On the other hand, it took Mobileye 18 years to achieve this success, highlighting how long the automotive supplier road can be.

That leaves Phantom.ai, which has survived since 2016 on a mere $5 million seed round and overcame a cringe-inducing 2018 rear-end collision with a press crew on board. Kudos to them as the leading startup in the space.

Even Phantom’s own employees seem a little dumbstruck by this state of affairs. One anonymous employee wrote in a 2019 Glassdoor review:

“If they exist, we don’t know who our competitors are, other than MobilEye. Is another company going to come in and steal our thunder? It’s my biggest worry. [Our competitors, by the way, are not Waymo, Aurora, Cruise, etc… their product is for a different market.]”

The main competitor I can think of is Comma.ai. I own their EON DevKit and have installed it in several different vehicles. The performance of the OpenPilot software it runs is impressive. I wish it would get to market in a bigger way than it has so far.

But cracking the automotive supply chain is tough.

Tesla Takes A Baby Step Toward Ridesharing

Elon Musk famously tweeted that Tesla vehicles will be appreciating assets, which would be a first for automobiles if it comes to pass. The logic stems from another controversial Musk claim: that Teslas will eventually become robotaxis, generating passive income for their owners.

Recently, Electrek and other outlets wrote that Tesla has taken a baby step toward the robotaxi vision. Nothing self-driving; much more pedestrian (excuse the pun) than that.

Tesla has created an “Add Drive” feature in its app.

Tesla does not yet appear to be advertising this feature, and I don’t own a Tesla, so I can’t confirm it myself. But apparently Tesla owners can now give anybody access to their car just by adding an email address. No key necessary, just the Tesla app and a confirmed email address.

Even if the robotaxis are a long time coming, you could imagine this might make it a lot easier for Tesla owners to rent their vehicles to other drivers through sites like Turo.

Mobileye: Driving In Jerusalem

Mobileye, which now belongs to Intel, recently published a video of a 40-minute autonomous drive through urban Jerusalem. Mobileye is based in Jerusalem, whose streets may be among the most challenging that any self-driving car currently handles.

The recording clocks in at about 26 minutes, shorter than the actual drive, because parts of the drive are played back at 2x speed. I annotated the video with my thoughts.

[1:05] Mobileye’s sensing display is pretty sparse, but does a nice job focusing on the important items.

[1:05] I am a little surprised that Mobileye’s in-cabin display set-up is clearly aftermarket. I guess that shouldn’t surprise me, but it’s a reminder that Mobileye is a supplier, not an OEM.

[1:50] Merges from a standing start are tough. Nice work.

[3:10] Overtaking the parked truck is very safe and deliberate, but I could imagine eventually that will need to happen faster.

[4:25] The pedestrian is detected well before the crosswalk. Awesome.

[5:15] “We use environmental cues to determine that these cars are parked, and not just stuck in traffic.” I wonder what those cues are? The map?
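
For fun, here’s a toy sketch in Python of the kinds of cues I could imagine feeding that judgment. Every signal and threshold below is my own invention, not anything Mobileye has disclosed.

```python
# Pure speculation: cues a stack might combine to decide a car is
# parked rather than stuck in traffic. Signals and thresholds are
# illustrative inventions, not anything Mobileye has disclosed.
from dataclasses import dataclass

@dataclass
class VehicleObservation:
    seconds_stationary: float
    lateral_offset_m: float       # distance from lane center
    hazards_on: bool
    in_mapped_parking_zone: bool  # the map cue I suspected
    occupant_visible: bool

def looks_parked(obs: VehicleObservation) -> bool:
    score = (
        1 * (obs.seconds_stationary > 20)
        + 2 * (obs.lateral_offset_m > 1.0)  # hugging the curb
        + 1 * obs.hazards_on
        + 2 * obs.in_mapped_parking_zone
        + 1 * (not obs.occupant_visible)
    )
    return score >= 3

print(looks_parked(VehicleObservation(30.0, 1.4, False, True, False)))  # True
```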

[6:20] This drive is camera-only, but Mobileye says they are building an entirely separate, redundant sensing stack that is radar plus lidar. They want to achieve full self-driving with each subsystem independently. They call this “true redundancy.” Interesting choice to build two separate stacks, divided by modality, as opposed to equipping the vehicle with two independent sensor suites that each carry all modalities.
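
Here’s how I picture that architecture, as a toy Python sketch: each stack builds a complete world model on its own, and the two get cross-checked only at the decision level. All names, numbers, and the agreement test are mine, purely for illustration.

```python
# My reading of "true redundancy": two independent stacks, each building
# a complete world model from its own modality, cross-checked only at
# the decision level. Every name and number here is my invention.
from dataclasses import dataclass

@dataclass
class WorldModel:
    object_count: int          # stand-in for a full object list
    nearest_obstacle_m: float  # distance to the closest obstacle

def camera_stack() -> WorldModel:
    return WorldModel(object_count=12, nearest_obstacle_m=18.0)  # placeholder

def radar_lidar_stack() -> WorldModel:
    return WorldModel(object_count=11, nearest_obstacle_m=17.5)  # placeholder

def models_agree(a: WorldModel, b: WorldModel) -> bool:
    # Toy check; a real system would compare entire object lists.
    return abs(a.nearest_obstacle_m - b.nearest_obstacle_m) < 2.0

cam, rl = camera_stack(), radar_lidar_stack()
if models_agree(cam, rl):
    print("proceed: both subsystems see a consistent world")
else:
    print("disagree: fall back to a minimal-risk maneuver")
```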

[8:15] Mobileye’s Road Experience Management technology pulls anonymized sensor data from BMW, VW, Nissan, and SAIC vehicles. Mobileye fuses this data into its mapping system to keep its maps up to date around the world. This is part of the dream of ADAS — that you can get much more data from a production fleet of human drivers than from a fleet of test vehicles. “We are basically leveraging Mobileye’s strong ADAS position to build and maintain a near-real-time HD map.” I wonder exactly what data they are pulling, and whether the manufacturers will agree to this in the long run.
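
To make the crowdsourcing idea concrete, here’s a toy aggregation step of my own devising (not Mobileye’s actual pipeline): many cars report noisy landmark positions, and a backend averages them into refreshed map coordinates.

```python
# Toy crowdsourced map maintenance: fuse noisy landmark observations
# from many production vehicles into refreshed HD-map positions.
# Data, names, and the averaging scheme are all illustrative.
from collections import defaultdict
from statistics import mean

# (landmark_id, x_m, y_m) reports from different vehicles' ADAS cameras.
reports = [
    ("sign_42", 101.2, 55.1),
    ("sign_42", 101.4, 54.9),
    ("sign_42", 101.3, 55.0),
    ("lane_edge_7", 88.0, 12.5),
    ("lane_edge_7", 88.2, 12.4),
]

by_landmark = defaultdict(list)
for lid, x, y in reports:
    by_landmark[lid].append((x, y))

# Many noisy observations average into one refined map position.
hd_map = {lid: (mean(x for x, _ in pts), mean(y for _, y in pts))
          for lid, pts in by_landmark.items()}
print(hd_map)
```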

[10:17] This route includes some very narrow Jerusalem streets. This one, at least, is totally straight. I’m not sure this “proves” AVs can operate in places like India, but this is certainly a more challenging environment than, say, Phoenix.

[11:10] The unprotected left turn felt a little tense, but basically okay.

[12:20] Nice job detecting a pedestrian dragging a forklift. This scenario is reminiscent of the situation that led to Uber ATG’s fatal collision with Elaine Herzberg in Arizona. Mobileye seems to handle it here without any trouble.

[13:15] Really interesting and successful “negotiation” to merge around a stopped vehicle.

[15:00] The human driver takes over to pull over in a bus stop zone, so that the drone operator (riding shotgun) can land the drone and change the battery. I am surprised the human driver had to take over here. Compared to a lot of autonomous maneuvers in this video, “pull over” seems pretty basic (and necessary).

[15:00] This stop highlights that the human driver does not seem to provide any input to the AV during the whole drive. The route appears pre-programmed from start to finish. I wonder how strong Mobileye’s in-vehicle UX is.

[15:30] This shot reminds me how impressive drone operators are. You take it for granted, but this drone operator is sitting in the passenger seat of a moving vehicle. He’s keeping a drone he can’t see in place directly above the car, hundreds of feet off the ground, for forty minutes!

[17:15] Roundabouts are tough for Americans. I’d pay a self-driving car to handle roundabouts for me.

[17:55] Even self-driving cars want to change to the next lane if it looks faster!

[19:05] Super-narrow street with lots of cars. I’d be nervous driving here. Impressive, especially for a camera-only system! Localization typically relies on lidar. Mobileye can clearly localize effectively with just cameras.

[19:35] A driver exits a parked car to wave the AV around. The AV doesn’t seem to “understand” the wave, but once the driver gets out of the way, it figures out to pass the parked car.

[21:27] That was a challenging unprotected left turn. I’m impressed again.

[22:45] Interesting that the AV does not yield to the moped (I think it’s a moped) in the crosswalk. The system seems to recognize the vehicle as a moped, but the moped is trying to use the crosswalk like a pedestrian. Tricky situation.

[24:20] The sensing UI seems to recognize a leading car quite far ahead — so far ahead that it doesn’t appear in the drone shot. I wonder what the system’s range is.

[25:25] Once again, the safety driver takes control to pull over and end the ride. I’m puzzled why that wasn’t pre-programmed, like the rest of the ride.

Overall, this was a lot of fun to watch and a really impressive performance by Mobileye. Jerusalem seems like a tough place to drive!

Help Wanted

It’s been a tough 18 months for self-driving cars. The enthusiasm (and cash) that poured into the industry from 2016 to 2018 has been dampened as everyone realizes that Level 4 driverless robotaxis are not immediately around the corner.

But companies are still making progress and hiring.

VentureBeat reports that Aurora now boasts 500 employees (including interns!). CEO Chris Urmson says, “With the industry shakeup right now, there’s a lot of new talent on the market, an opportunity we intend to take full advantage of.”

Good to read!

Meanwhile, Zoox had to lay off some employees and might be purchased by Amazon. In a twist, Cruise is taking advantage of the uncertainty at Zoox to poach engineers!

Cruise has laid off employees recently as well, although supposedly that was in the interest of focusing on engineering, which would align with the Zoox hires.

Reuters reports that Cruise founder and CTO Kyle Vogt sent quite an email to Zoox engineers:

“Cruise is willing to recognize the full value of the rewards you’ve earned at Zoox — something that is very unlikely to occur via an acquisition in this environment.”

Companies are still competing for great talent!

The Six NVIDIA Xavier Processors

NVIDIA’s Xavier system on a chip (SoC) for self-driving cars recently passed TÜV SÜD’s ISO 26262 functional safety assessment. Reading NVIDIA’s blog post on this achievement, I was struck by just how many specialized processors Xavier has, many of which were new to me.

Also, did you know there exists a site called WikiChip?

GPU
Of course an NVIDIA SoC will have a GPU, in this case a Volta GPU. The Volta GPU on the Xavier is optimized for inference. That means the neural network is probably going to be trained somewhere else and then loaded onto this platform when it’s ready for production deployment.
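
To make that “train somewhere else, deploy here” workflow concrete, here’s a minimal sketch using PyTorch and ONNX as generic stand-ins. NVIDIA’s real pipeline (TensorRT and friends) is more involved, and the tiny network below is a hypothetical placeholder.

```python
# A minimal sketch, not NVIDIA's toolchain: a stand-in network is
# "trained" offline, then exported to ONNX, an interchange format that
# inference runtimes on embedded targets can consume.
import torch
import torch.nn as nn

# Hypothetical placeholder for a perception network trained in the cloud.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
)
model.eval()  # freeze into inference mode: no dropout/batch-norm updates

# Fixed input shape: deployment runtimes often pre-plan kernels for
# static shapes, which is part of what makes inference chips fast.
dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy_input, "perception_net.onnx", opset_version=11)
```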

WikiChip lists this GPU at 22.6 tera-operations per second (TOPS). For comparison, Tesla’s purpose-built self-driving chip boasts 36 TOPS. I confess I don’t know enough about just how far to the redline these chips go to understand whether 22.6 TOPS vs. 36 TOPS is basically the same thing or wildly different.

CPU
Although NVIDIA is a GPU company, the Xavier has a CPU. The CPU has eight of NVIDIA’s custom Carmel Arm cores. I assume it’s fast.

VPU
Xavier includes a vision processing unit (VPU), which makes sense for a SoC designed for lots of cameras.

NVIDIA sometimes calls this a “Stereo/Optical Flow accelerator.” Optical flow is a computer vision technique that estimates the apparent motion of each pixel between sequential frames; combined with stereo matching, it can yield distance and velocity estimates. I assume more generally the goal is to accelerate these dense, per-pixel computations on streams of video.
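
For a feel for what’s being accelerated, here’s dense optical flow on a pair of synthetic frames using OpenCV’s Farneback method; this is exactly the kind of per-pixel computation a VPU bakes into hardware.

```python
import cv2
import numpy as np

# Two synthetic grayscale frames: the second is the first shifted right
# by 3 pixels, so the true motion is roughly (+3, 0) everywhere.
noise = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
prev = cv2.GaussianBlur(noise, (5, 5), 0)  # smooth texture helps flow
curr = np.roll(prev, shift=3, axis=1)

# Dense Farneback flow. Positional args: pyramid scale, levels, window
# size, iterations, polynomial neighborhood, polynomial sigma, flags.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
print("median pixel displacement:", np.median(mag))  # should be near 3
```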

ISP
I had not heard of image signal processors before. Like a VPU, an ISP accelerates work on camera data, but it sits earlier in the pipeline: it turns each raw, high-resolution sensor frame into a clean image through steps like demosaicing, white balance, noise reduction, and tone mapping, before downstream algorithms ever see it.
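
Here’s a toy version of that first stage, demosaicing, using OpenCV on a synthetic raw frame. This is just to illustrate the concept; a real automotive ISP is a fixed-function hardware pipeline.

```python
# A toy version of one classic ISP stage: demosaicing a raw Bayer frame
# into a color image with OpenCV. The raw frame here is synthetic noise;
# a hardware ISP runs this (plus denoise, white balance, tone mapping)
# at full camera frame rate.
import cv2
import numpy as np

# 12-bit raw values in a 16-bit container, standing in for sensor output.
raw_bayer = np.random.randint(0, 2**12, size=(1080, 1920), dtype=np.uint16)

# Interpolate the single-channel Bayer mosaic into a 3-channel image.
bgr = cv2.cvtColor(raw_bayer, cv2.COLOR_BayerBG2BGR)
print(bgr.shape)  # (1080, 1920, 3)
```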

PVA
Vision is clearly a strength of the Xavier. The programmable vision accelerator is an NVIDIA proprietary technology. The best documentation I could find is a patent that seems to focus on collapsing multiple loops into a single loop in order to accelerate vision calculations.

The “programmable” qualifier presumably means that firmware engineers can customize this chip to their specific needs.

DLA
The deep learning accelerator (NVDLA) is an open-source architecture NVIDIA has released for building neural network inference accelerators. It’s really cool that NVIDIA has open-sourced this technology.

As with the PVA, the DLA is adaptable: NVDLA’s design is published as Verilog RTL, so customers can modify the hardware itself, not just the firmware, to meet their needs.

Most likely one goal of the DLA is to accelerate neural networks over lidar and other data that aren’t a good fit for the vision-optimized chips elsewhere on the Xavier.

That is a lot of processing power and specialization on one SoC!

Here’s NVIDIA CEO Jensen Huang touting the DRIVE AGX Xavier Developer Kit, which contains two Xavier SoCs.

Delivering Goods and People

Lots of self-driving companies are back to testing, in limited capacity, in the US. Right now, they’re typically testing by delivering goods, not people, to vulnerable communities.

As an aside, Jewel Li from AutoX mentioned on a recent Autonocast episode that Chinese self-driving companies are totally back to normal, testing at full capacity, and working in the office.

But here in the US, lockdowns are still mostly in effect and self-driving companies are trying to both do the right thing and get back out on the road by becoming delivery services.

I imagine this plays especially well for self-driving companies that were founded from the start as delivery services, not robotaxis. First and foremost in that list is Nuro, which announced a partnership with CVS to deliver prescriptions.

Interestingly, from Nuro’s announcement: “As with all our pilots, we will begin service with our autonomous Prius fleet to make deliveries, before introducing deliveries with R2, our custom-built delivery bot.”

I wonder what Nuro’s stages are, moving from a Prius with (presumably) a safety operator, to a driverless R2 (possibly with a safety operator trailing in another vehicle?), to a driverless R2 with no Nuro staff in the vicinity. I did a quick scan of Nuro’s blog and didn’t see anything, but I haven’t followed them closely on this particular issue.

On the other end of the spectrum, robotaxis face the challenge of providing a safe vehicular environment for many, many passengers to share (albeit at different times).

Early in the COVID crisis my old boss, Oliver Cameron, who is now co-founder and CEO of Voyage, tweeted about this problem, asking for suggestions.

Oliver is so good at Twitter. Things that normal people like me would spend days and even real dollars on, Oliver puts on Twitter and gets answers.

You can read in the Twitter thread that he got a lot of suggestions. We’ll see if any of them pan out. The immediate upshot seemed to be captured by a GIF he subsequently posted.

In the medium term, a big question for robotaxi companies will be whether such sanitation measures become mandatory, or whether COVID diminishes as a real public health concern, leaving the world the way it was in mid-2019.

If COVID doesn’t go away soon, a lot of robotaxi companies might be tempted to become delivery companies.

Optimus Ride Delivering Meals at Paradise Valley Estates

Optimus Ride is testing its self-driving vehicles and delivering meals at Paradise Valley Estates, a retirement community in California. This news is over a year old, but somehow it slipped past my radar.

It makes sense that a self-driving operation would test in a retirement community — there’s a natural geofence, the speed limits are low, and there is a built-in customer base.

But until now, Voyage seemed to be the only company pursuing this particular niche, having made it central to its development strategy.

Voyage’s deployment at The Villages in Florida is on a vastly different scale than Paradise Valley Estates: 25,000 acres in Florida vs. 80 acres in California.

But from a business perspective, the characteristics are at least broadly similar.

I wonder if eventually bidding wars will break out to serve these communities, similar to what you might see with National Park concessionaires.

Indy Autonomous Challenge

The Indy Autonomous Challenge is a self-driving race series organized by the Indianapolis Motor Speedway, home of the Indy 500. The organizers have explicitly modeled the competition after the DARPA Challenges, which kicked off the self-driving car boom fifteen years ago.

The Pittsburgh Post-Gazette has a great article on the effort to get the challenge off the ground and involve university students.

The series also bears similarity to Roborace, the Formula E autonomous series that has run for the last few years.

There is so much to learn from these races!

Most of the day-to-day challenges of self-driving cars center around perception and planning. Those skills are less central (although still critical) to race car driving. On the track, control of the vehicle is key, especially when we push the machine to its limits.

There is a whole new set of skills to be learned at hundreds of miles an hour. Eventually, that research will make its way to street-legal autonomous vehicles.

Don’t Sleep on Mobility as a Service

I read a few posts today by and about Bestmile, an under-the-radar Swiss autonomous vehicle company. Although Bestmile was born out of an autonomous vehicle demonstration, its focus now seems to be on “orchestrating” mobility through a combination of ridesharing, micro-transit, autonomous shuttles, and robotaxis.

Of the companies operating today, this is probably the closest I’ve seen to a true “mobility as a service” platform.

I’ve never interacted with anyone from Bestmile or used the service, so it’s certainly possible the reality is a lot different than the vision. But the idea of “AWS but for mobility” is exciting.

Some of the ridesharing companies have taken steps in this direction, by adding bicycles and scooters and mopeds and even helicopters to their apps. But I haven’t really had a seamless experience where I moved from one transportation mode to another.

More to the point, what will be really exciting is when entrepreneurs can use mobility-as-a-service networks to build their own businesses, the same way entrepreneurs (and now giant corporations) use cloud computing providers.

I don’t know that Bestmile will be the winning solution — it seems early and the commodity components don’t really exist yet, especially in the critical autonomy realm. I’m excited to watch this develop, though.

Level 3: Mercedes-Benz EQS Flagship Sedan

Mercedes-Benz recently launched an online video series called “Meet Mercedes Digital.” The first episode featured CEO Ola Källenius, who briefly teased the launch of the Mercedes-Benz EQS sedan in the second half of 2020.

“This is a special year for us. It’s the year where we launch our flagship car, the S-Class. That only comes around every so often…It’s happening in the second half of the year and we’re quite excited about it.”
Ola Källenius

The EQS is a futuristic luxury vehicle that should be a big shot in the arm for Daimler, the parent company of Mercedes-Benz.

They could use it, too. Like most automotive companies, Daimler has been hit hard by COVID-19, with the stock price down nearly 50% this year.

The EQS will be all-electric, all-wheel drive, with a top speed of “> 200 km/h” (over 124 mph).

Most exciting to me, the vehicle will feature Level 3 autonomy. Mercedes doesn’t dance around this term, either. Right in the middle of the vehicle overview, they state:

“The Vision EQS show car supports the driver with highly-automated driving at Level 3, e.g. on longer motorway journeys. Thanks to the modular sensor systems, the level of autonomy can be extended up to fully-automated driving in the future.”

Well, maybe they dance around it a little by writing about the “Vision EQS show car”, instead of the 2021 production EQS. But that is a bold and refreshing statement.

Given Audi’s recent step back from Level 3 technology, due to liability concerns, it will be interesting to see whether Level 3 will be available at launch this fall.

I’m excited to get behind the wheel and take my hands off.