Robotics Funding Picks Up

Every month Robotics Business Review compiles a list of private financing deals for robotics companies.

In April, as the world shut down for COVID-19, funding basically dried up.

“Robotics Business Review tracked about 26 transactions worth a total of more than $600 million last month, compared with 29 deals worth $2.7 billion in March 2020 and 30 transactions worth $6.5 billion in April 2019.”

Many of the April transactions that did occur were in China.

May, however, showed a meaningful uptick. The May 2020 numbers were comparable to a year ago, and within the same order of magnitude as the March 2020 figures.

“In May 2020, Robotics Business Review tracked 18 deals worth about $1.5 billion, compared with 26 robotics transactions worth more than $600 million in April 2020 and $1.5 billion in 27 transactions in May 2019.”

The May figures were led by huge funding rounds for Waymo and Didi. The rest of the May transactions totaled only $250 million.

For comparison, the May 2019 figures were even more concentrated, with the bulk of the month’s investments driven by a huge fundraising round for Cruise Automation.

I’m not quite ready to declare a return to normalcy yet, but it’s a big step in the right direction.

Tesla Takes A Baby Step Toward Ridesharing

Elon Musk famously tweeted that Tesla vehicles will be appreciating assets, which would be a first for automobiles if it comes to pass. The logic stems from another controversial Musk claim: that Teslas will eventually become robotaxis, generating passive income for their owners.

Recently, Electrek and other outlets reported that Tesla has taken a baby step toward the robotaxi vision. Nothing self-driving; much more pedestrian (excuse the pun) than that.

Tesla has created an “Add Drive” feature in its app.

Tesla does not yet appear to be advertising this feature, and I don’t own a Tesla, so I can’t confirm for myself. But apparently Tesla owners can now give access to their car to anybody, just by adding an email address. No key necessary, just the Tesla app and a confirmed email address.

Even if the robotaxis are a long time coming, you could imagine this might make it a lot easier for Tesla owners to rent their vehicles to other drivers through sites like Turo.

Mobileye: Driving In Jerusalem

Mobileye, which now belongs to Intel, recently published a video of a 40-minute autonomous drive through urban Jerusalem. Mobileye is based in Jerusalem, and its streets may be among the most challenging that self-driving cars currently handle.

The recording clocks in at about 26 minutes, shorter than the actual drive, because parts of the drive are played back at 2x speed. I annotated the video with my thoughts.

[1:05] Mobileye’s sensing display is pretty sparse, but does a nice job focusing on the important items.

[1:05] I am a little surprised that Mobileye’s in-cabin display set-up is clearly aftermarket. I guess that shouldn’t surprise me, but it’s a reminder that Mobileye is a supplier, not an OEM.

[1:50] Merges from a standing start are tough. Nice work.

[3:10] Overtaking the parked truck is very safe and deliberate, but I could imagine eventually that will need to happen faster.

[4:25] The pedestrian is detected well before the crosswalk. Awesome.

[5:15] “We use environmental cues to determine that these cars are parked, and not just stuck in traffic.” I wonder what those cues are? The map?

[6:20] This drive is camera-only, but Mobileye says they are building an entirely separate, redundant sensing stack that is radar plus lidar. They want to achieve full self-driving with each subsystem independently. They call this “true redundancy.” Interesting choice to build two separate stacks, divided by modality, as opposed to equipping the vehicle with two independent sensor suites that each include all modalities.

[8:15] Mobileye’s Road Experience Management technology pulls anonymized sensor data from BMW, VW, Nissan, and SAIC vehicles. Mobileye fuses this data into its mapping system to keep its maps up to date around the world. This is part of the dream of ADAS — that you can get much more data from a production fleet of human drivers than from a fleet of test vehicles. “We are basically leveraging Mobileye’s strong ADAS position to build and maintain a near-real-time HD map.” I wonder exactly what data they are pulling, and whether the manufacturers will agree to this in the long run.

[10:17] This route includes some very narrow Jerusalem streets. This one, at least, is totally straight. I’m not sure this “proves” AVs can operate in places like India, but this is certainly a more challenging environment than, say, Phoenix.

[11:10] The unprotected left turn felt a little tense, but basically okay.

[12:20] Nice job detecting a pedestrian dragging a forklift. This scenario is reminiscent of the situation that led to Uber ATG’s fatal collision with Elaine Herzberg in Arizona. Mobileye seems to have no problem with this here.

[13:15] Really interesting and successful “negotiation” to merge around a stopped vehicle.

[15:00] The human driver takes over to pull over in a bus stop zone, so that the drone operator (riding shotgun) can land the drone and change the battery. I am surprised the human driver had to take over here. Compared to a lot of autonomous maneuvers in this video, “pull over” seems pretty basic (and necessary).

[15:00] This stop highlights that the human driver does not seem to provide any input to the AV during the whole drive. The route appears pre-programmed from start to finish. I wonder how strong Mobileye’s in-vehicle UX is.

[15:30] This shot reminds me how impressive drone operators are. You take it for granted, but this drone operator is sitting in the passenger seat of a moving vehicle. He’s keeping a drone, which he can’t see, in place directly above the car, hundreds of feet off the ground, for forty minutes!

[17:15] Roundabouts are tough for Americans. I’d pay a self-driving car to handle roundabouts for me.

[17:55] Even self-driving cars want to change to the next lane if it looks faster!

[19:05] Super-narrow street with lots of cars. I’d be nervous driving here. Impressive, especially for a camera-only system! Localization typically relies on lidar. Mobileye can clearly localize effectively with just cameras.

[19:35] A driver exits a parked car to wave the AV around. The AV doesn’t seem to “understand” the wave, but once the driver gets out of the way, it figures out to pass the parked car.

[21:27] That was a challenging unprotected left turn. I’m impressed again.

[22:45] Interesting that the AV does not yield to the moped (I think it’s a moped) in the crosswalk. The system seems to recognize the vehicle as a moped, but the moped is trying to use the crosswalk like a pedestrian. Tricky situation.

[24:20] The sensing UI seems to recognize a leading car quite far ahead — so far ahead that it doesn’t appear in the drone shot. I wonder what the system’s range is.

[25:25] Once again, the safety driver takes control to pull over and end the ride. I’m puzzled why that wasn’t pre-programmed, like the rest of the ride.

Overall, this was a lot of fun to watch and a really impressive performance by Mobileye. Jerusalem seems like a tough place to drive!

Help Wanted

It’s been a tough 18 months for self-driving cars. The enthusiasm (and cash) that poured into the industry from 2016 to 2018 has dampened as everyone realizes Level 4 driverless robotaxis are not immediately around the corner.

But companies are still making progress and hiring.

VentureBeat reports that Aurora now boasts 500 employees (including interns!). CEO Chris Urmson says, “With the industry shakeup right now, there’s a lot of new talent on the market, an opportunity we intend to take full advantage of.”

Good to read!

Meanwhile, Zoox had to lay off some employees and might be purchased by Amazon. In a twist, Cruise is taking advantage of the uncertainty at Zoox to poach engineers!

Cruise has laid off employees recently as well, although supposedly this was in the interest of focusing on engineering, which would align with the Zoox hires.

Reuters reports that Cruise founder and CTO Kyle Vogt sent quite an email to Zoox engineers:

“Cruise is willing to recognize the full value of the rewards you’ve earned at Zoox — something that is very unlikely to occur via an acquisition in this environment.”

Companies are still competing for great talent!

The Six NVIDIA Xavier Processors

NVIDIA’s Xavier system on a chip (SoC) for self-driving cars recently passed TÜV ISO 26262 functional safety testing. Reading NVIDIA’s blog post on this achievement, I was struck by just how many specialized processors Xavier has, many of which were new to me.

Also, did you know there exists a site called Wikichip?

GPU
Of course an NVIDIA SoC will have a GPU, in this case a Volta GPU. The Volta GPU on the Xavier is optimized for inference. That means the neural network is probably going to be trained somewhere else and then loaded onto this platform when it’s ready for production deployment.

Wikichip lists this GPU at 22.6 tera-operations per second (TOPS). For comparison, Tesla’s purpose-built self-driving chip boasts 36 TOPS. I confess I don’t know enough about just how far to the redline these chips go to understand whether 22.6 TOPS vs. 36 TOPS is basically the same thing or wildly different.

CPU
Although NVIDIA is a GPU company, the Xavier has a CPU. The CPU has 8 Carmel cores. I assume it’s fast.

VPU
Xavier includes a vision processing unit (VPU), which makes sense for a SoC designed for lots of cameras.

NVIDIA sometimes calls this a “Stereo/Optical Flow accelerator.” Optical flow is a computer vision technique that estimates the apparent motion of pixels between sequential frames; paired with stereo matching, it can yield distance and velocity estimates. I assume more generally the goal is to accelerate vision algorithms that operate on sequential frames of video.
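As a toy illustration of the kind of computation being accelerated (this is not Mobileye’s or NVIDIA’s actual algorithm, just a minimal sketch), here is an exhaustive block-matching motion estimator in Python. It recovers the displacement between two sequential frames, which is the essence of what an optical-flow accelerator computes per pixel block:

```python
import numpy as np

def estimate_flow(prev, curr, max_disp=3):
    """Estimate a single global displacement between two frames by
    exhaustive block matching (a toy stand-in for optical flow)."""
    best, best_err = (0, 0), np.inf
    h, w = prev.shape
    m = max_disp
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            # Compare only the region where the two frames overlap
            # after shifting by (dy, dx).
            a = prev[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
            b = curr[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)]
            err = np.mean((a - b) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

rng = np.random.default_rng(0)
frame = rng.random((32, 32))
shifted = np.roll(frame, shift=(2, 1), axis=(0, 1))  # move down 2, right 1
print(estimate_flow(frame, shifted))  # recovers the (2, 1) shift
```

A real accelerator does this densely, per block, at video frame rates, which is exactly the kind of regular, parallel arithmetic that benefits from dedicated silicon.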

ISP
I had not heard of image signal processors before. Like a VPU, an ISP is designed to accelerate work on camera data. ISPs focus on individual high-resolution frames, typically converting raw sensor output into clean images (demosaicing, noise reduction, and the like) before downstream tasks such as classifying things like signs.

PVA
Vision is clearly a strength of the Xavier. The programmable vision accelerator is an NVIDIA proprietary technology. The best documentation I could find is a patent that seems to focus on collapsing multiple loops into a single loop in order to accelerate vision calculations.

The “programmable” qualifier presumably means that firmware engineers can customize this chip to their specific needs.
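To make the loop-collapsing idea from the patent concrete, here is a hypothetical sketch (not NVIDIA’s implementation) of the same per-pixel operation written first as nested loops, then collapsed into a single flat loop. A single loop is easier for specialized hardware to drive with one counter and to pipeline:

```python
# Nested form: one loop per image dimension.
def brighten_nested(img, delta):
    out = [row[:] for row in img]
    for y in range(len(img)):
        for x in range(len(img[0])):
            out[y][x] = img[y][x] + delta
    return out

# Collapsed form: one flat loop over all pixels; the row and column
# indices are recovered from a single loop counter.
def brighten_collapsed(img, delta):
    h, w = len(img), len(img[0])
    flat = [img[i // w][i % w] + delta for i in range(h * w)]
    return [flat[y * w:(y + 1) * w] for y in range(h)]

img = [[1, 2], [3, 4]]
print(brighten_nested(img, 10))     # [[11, 12], [13, 14]]
print(brighten_collapsed(img, 10))  # same result, single loop
```

Both produce identical output; the payoff of the collapsed form shows up in hardware, not in Python.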

DLA
The deep learning accelerator is an open-source architecture NVIDIA has released to create accelerators for neural network inference. It’s really cool that NVIDIA has open-sourced this technology.

As with the PVA, the DLA is customizable: NVIDIA publishes the architecture’s hardware description in Verilog, so customers can adapt the design to meet their needs.

Most likely a goal of the DLA is to accelerate inference on lidar and other data that the vision-optimized chips on the Xavier are not designed to handle.

That is a lot of processing power and specialization on one SoC!

Here’s NVIDIA CEO Jensen Huang touting the DRIVE AGX Xavier Developer Kit, which contains two Xavier SoCs.

Delivering Goods and People

Lots of self-driving companies are back to testing, in limited capacity, in the US. Right now, they’re typically testing delivering goods — not people — to vulnerable communities.

As an aside, Jewel Li from AutoX mentioned on a recent Autonocast episode that Chinese self-driving companies are totally back to normal, testing at full capacity, and working in the office.

But here in the US, lockdowns are still mostly in effect and self-driving companies are trying to both do the right thing and get back out on the road by becoming delivery services.

I imagine this plays especially well for self-driving companies that were founded from the start as delivery services, not robotaxis. First and foremost in that list is Nuro, which announced a partnership with CVS to deliver prescriptions.

Interestingly, “As with all our pilots, we will begin service with our autonomous Prius fleet to make deliveries, before introducing deliveries with R2, our custom-built delivery bot.”

I wonder what Nuro’s stages are, moving from a Prius with (presumably) a safety operator, to a driverless R2 (possibly with a safety operator trailing in another vehicle?), to a driverless R2 with no Nuro staff in the vicinity. I did a quick scan of Nuro’s blog and didn’t see anything, but I haven’t followed them closely on this particular issue.

On the other end of the spectrum, robotaxis face the challenge of providing a safe vehicular environment for many, many passengers to share (albeit at different times).

Early in the COVID crisis my old boss, Oliver Cameron, who is now co-founder and CEO of Voyage, tweeted:

Oliver is so good at Twitter. Things that normal people like me would spend days and even real dollars on, Oliver puts on Twitter and gets answers.

You can read in the Twitter thread that he got a lot of suggestions. We’ll see if any of them pan out. The immediate upshot seemed to be captured by this GIF he subsequently posted.

In the medium-term, a big question for robotaxi companies will be whether this becomes mandatory, or whether COVID diminishes as a real public health concern, leaving the world the way it was in mid-2019.

If COVID doesn’t go away soon, a lot of robotaxi companies might be tempted to become delivery companies.

Optimus Ride Delivering Meals at Paradise Valley Estates

Optimus Ride is testing its self-driving vehicles and delivering meals at Paradise Valley Estates, a retirement community in California. This news is over a year old, but somehow it slipped past my radar.

It makes sense that a self-driving operation would test in a retirement community — there’s a natural geofence, the speed limits are low, and there is a built-in customer base.

But until now, Voyage seemed to be the only company pursuing this particular niche, and it has made retirement communities central to its development strategy.

Voyage’s deployment at The Villages in Florida is on a vastly different scale than Paradise Valley Estates: 25,000 acres in Florida vs. 80 acres in California.

But from a business perspective, the characteristics are at least broadly similar.

I wonder if eventually bidding wars will break out to serve these communities, similar to what you might see with National Park concessionaires.

Indy Autonomous Challenge

The Indy Autonomous Challenge is a self-driving race series organized by the Indianapolis Motor Speedway, home of the Indy 500. The organizers have explicitly modeled the competition after the DARPA Challenges, which kicked off the self-driving car boom fifteen years ago.

The Pittsburgh Post-Gazette has a great article on the effort to get the challenge off the ground and involve university students.

The series also bears similarity to Roborace, the Formula E autonomous series that has run for the last few years.

There is so much to learn from these races!

Most of the day-to-day challenges of self-driving cars center around perception and planning. Those skills are less central (although still critical) to race car driving. On the track, control of the vehicle is key, especially when we push the machine to its limits.

There is a whole new set of skills to be learned at hundreds of miles an hour. Eventually, that research will make its way to street-legal autonomous vehicles.

SpaceX Automated Docking

Yesterday SpaceX launched a pair of astronauts into space. This was a huge deal, for reasons NASA captured on Twitter:

One small part of this effort was automated docking at the International Space Station. As The Verge explains:

“The vehicle is designed to autonomously approach the ISS and latch on to a standardized docking port, without any input from its human passengers…The predecessor to the capsule, SpaceX’s cargo Dragon, did not have this capability when it delivered supplies and food to the ISS. For all of those cargo missions, astronauts on board the ISS had to use the station’s robotic arm to grab hold of an approaching cargo Dragon and bring it onto a docking port. That technique is known as berthing, and it requires a lot of work from the astronauts on board the ISS. The Crew Dragon’s automated capabilities should help free up time for the astronauts to work on other things when new crews arrive.”

The SpaceX video that captures the automated docking is anticlimactic compared to the rocket launch, but what’s going on behind the scenes is plenty impressive.

State estimation and control must have been huge challenges to make this work. On the ground — in automobiles, for example — gravity and the earth reduce the complexity of motion control from three dimensions down to only two dimensions. In a car you can go left or right, forward or backward, but you can’t go straight up or down.

In space — or in the air — that third dimension makes motion control much harder.

What’s also hard, and less obvious, is state estimation. In self-driving cars this is sometimes called just “localization,” because that’s really all there is to the problem (believe me, localization alone is hard enough). But in three dimensions, keeping track of your present state becomes a real challenge.
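To give a flavor of the difference, here is a toy sketch (nothing like SpaceX’s actual guidance software) of the prediction step of a constant-velocity state estimator. The full 3D state simply carries more dimensions than its ground-vehicle counterpart, and every added dimension compounds the estimation problem:

```python
import numpy as np

# Toy constant-velocity state propagation in 3D.
# State vector: [x, y, z, vx, vy, vz]. A ground vehicle could
# drop z and vz and track heading instead.
dt = 0.1
F = np.eye(6)
F[0:3, 3:6] = dt * np.eye(3)  # position += velocity * dt

state = np.array([0.0, 0.0, 0.0, 1.0, 0.0, 0.5])  # drifting in x and z
for _ in range(10):  # propagate 1 second, 10 steps
    state = F @ state

print(state[:3])  # position after 1 s, roughly [1.0, 0.0, 0.5]
```

A real docking filter would fuse noisy sensor measurements (the update step) and track orientation as well, but even this skeleton shows how the state grows once you leave the ground plane.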

Hats off to SpaceX!

Don’t Sleep on Mobility as a Service

I read a few posts today by and about Bestmile, an under-the-radar Swiss autonomous vehicle company. Although Bestmile was born out of an autonomous vehicle demonstration, its focus now seems to be on “orchestrating” mobility through a combination of ridesharing, micro-transit, autonomous shuttles, and robotaxis.

This is probably the closest of any current company I’ve seen to a true “mobility as a service” platform.

I’ve never interacted with anyone from Bestmile or used the service, so it’s certainly possible the reality is a lot different than the vision. But the idea of “AWS but for mobility” is exciting.

Some of the ridesharing companies have taken steps in this direction by adding bicycles, scooters, mopeds, and even helicopters to their apps. But I haven’t really had a seamless experience where I moved from one transportation mode to another.

More to the point, what will be really exciting is when entrepreneurs can use mobility-as-a-service networks to build their own businesses, the same way entrepreneurs (and now giant corporations) use cloud computing providers.

I don’t know that Bestmile will be the winning solution — it seems early and the commodity components don’t really exist yet, especially in the critical autonomy realm. I’m excited to watch this develop, though.