This week Microsoft and Cruise announced a $2 billion investment from the former into the latter. The focus of the partnership is squarely on cloud computing. Press releases from both companies specified Microsoft as the "preferred cloud provider" of both Cruise and General Motors.
"Microsoft, as Cruise's preferred cloud provider…" "As Cruise and GM's preferred cloud, we will apply the power of Azure to help them scale…" "GM will work with Microsoft as its preferred public cloud provider"
What does it mean to be a "preferred cloud provider"?
Preference vs. Exclusivity
For starters, it seems likely that "preferred" does not mean "exclusive."
That's notable because a number of recent Waymo partnerships (with auto manufacturers, not with cloud providers) have referred to Waymo as an "exclusive" partner. For example:
"Waymo is now the exclusive global L4 partner for Volvo Car Group…"
Credits vs. Cash
I also wonder whether this framing means that Microsoft didn't invest actual cash in Cruise, or at least not the headline $2 billion.
I remember hearing rumors (or maybe it was official) after Honda's investment in Cruise that much of that investment came in the form of manufacturing credits at Honda plants, not cash dollars. Similarly, I wonder whether any of the $2 billion takes the form of Azure credits.
Valuation
This investment values Cruise at $30 billion, which is basically the same as Waymo's valuation from about a year ago. This is a testament to Cruise's progress. The valuation might also indicate how eager Microsoft is for Cruise to become a credible competitor to Waymo and, more importantly, Alphabet.
Waymo seems to mostly adhere to this philosophy. Its "partnerships", mostly with automotive manufacturers, seem to largely amount to vendor-customer relationships.
Cruise, like most other companies in the self-driving industry, tends toward a wider range of partnerships. The Microsoft investment might fall in that category, depending on the structure. Of course, it may also be a straightforward cash-for-equity transaction.
In any case, $30 billion is pretty amazing. Go Cruise!
Alphabet is shutting down Loon. You could be forgiven for not knowing Loon was even a thing that existed for Alphabet to shut down. Forgive the incredibly uncreative pun, but Loon never really got off the ground.
"When we unveiled Loon in June 2013, we meant everything in its name. It was a way-out-there and risky venture. Not just fragile-balloons-on-the-edge-of-space risky, but risky at the core of the question it was asking. Could this be the radical idea that might finally bring abundant, affordable Internet access, not just to the next billion, but to the last billion? To the last unconnected communities and those least able to pay?"
I have very little knowledge of Loon and point you instead to Teller's blog post and the Wikipedia page on Loon.
What interests me is this very public act of shutting down a startup. The folks who worked on Loon were probably very, very successful and smart, otherwise they wouldn't have made it into X, Alphabet's Moonshot Factory, in the first place.
And yet, even for them, this one didnât work out.
You'll be on your way up! You'll be seeing great sights! You'll join the high fliers who soar to high heights.
You won't lag behind, because you'll have the speed. You'll pass the whole gang and you'll soon take the lead. Wherever you fly, you'll be best of the best. Wherever you go, you will top all the rest.
Except when you don't. Because, sometimes, you won't.
I'm sorry to say so but, sadly, it's true that Bang-ups and Hang-ups can happen to you.
You can get all hung up in a prickle-ly perch. And your gang will fly on. You'll be left in a Lurch.
The good news, and it is good news, is that shutting down Loon liberates everyone to work on something else.
When things aren't going well, inertia can keep us tethered (again with the puns) to fruitless endeavors. Conceding failure is almost physically painful.
But I've found that once I make the decision to move on, all sorts of opportunities open up.
I expect that's what will happen to the folks at Loon, and we'll all be better off for it. Them most especially.
Levandowski is one of the foundational participants in the self-driving car industry, and one of the youngest. He was there from the start, at the original DARPA Grand Challenge, with an autonomous motorcycle called Ghost Rider.
The motorcycle fell over almost immediately, but Levandowski's career in the industry was just beginning. He joined Sebastian Thrun at Google, working first on Google Maps and later on the Google Self-Driving Car Project. Eventually he left to found his own start-up, Otto, which is where the trouble began.
Otto didn't last long as a stand-alone company before it was acquired by Uber for hundreds of millions of dollars. Shortly thereafter, Google sued Uber, claiming that Levandowski had stolen tens of thousands of documents from the Google Self-Driving Car Project. Google believed that IP was illegally benefiting Otto, which was by then owned by Uber.
These events intersected lightly with my own history in the self-driving car ecosystem. I joined Udacity in the summer of 2016, working with Sebastian Thrun to build the Self-Driving Car Engineer Nanodegree Program. Sebastian quickly introduced me to Otto, whose engineers offered to help teach the program.
I only met Levandowski briefly, but when the lawsuit hit a few months later, it was surreal to find myself connected, however tangentially, to the drama.
The Google-Uber lawsuit ended with a massive settlement from Uber to Google, and led to Levandowski pleading guilty to downloading a project tracking spreadsheet from his job at Google. According to Wikipedia, "Levandowski admitted to accessing the document about one month after leaving Google."
I never could figure out whether Levandowski was really guilty, and if he was, whether it even mattered. Co-mingling personal computers and phones with cloud emails and documents presumably leaves enormous amounts of data downloaded onto most corporate employees' personal devices. Often, we don't even know this is happening: the emails and documents get downloaded in the background. When we do load and review something, it's not always clear whether that information exists locally on a personal device or is stored in the cloud.
And if the worst thing Levandowski did was look at a project planning spreadsheet a month after he left a job, that seems negligible.
But it did cost Levandowski hundreds of millions of dollars, as well as jail time.
There are many more worthy recipients and potential recipients of mercy than a brilliant engineer who made and then lost a fortune, and who is young enough and brilliant enough to make it all again. But neither do I begrudge Levandowski the pardon. Frankly, I'm glad he received it. I only wish that many more people, from all walks of life, would receive such forgiveness.
The official explanation of the pardon, such as it is, has already been wiped from the White House website, only hours into the next presidential administration. But there's always the Wayback Machine, which records the justification for posterity.
"President Trump granted a full pardon to Anthony Levandowski. This pardon is strongly supported by James Ramsey, Peter Thiel, Miles Ehrlich, Amy Craig, Michael Ovitz, Palmer Luckey, Ryan Petersen, Ken Goldberg, Mike Jensen, Nate Schimmel, Trae Stephens, Blake Masters, and James Proud, among others. Mr. Levandowski is an American entrepreneur who led Google's efforts to create self-driving technology. Mr. Levandowski pled guilty to a single criminal count arising from civil litigation. Notably, his sentencing judge called him a 'brilliant, groundbreaking engineer that our country needs.' Mr. Levandowski has paid a significant price for his actions and plans to devote his talents to advance the public good."
I just purchased new tires for my 2004 Toyota Highlander, which made me cringe a little bit at the rubber being chewed up in this video. Otherwise, it's awesome.
Chris Gerdes's lab at Stanford has been working on autonomous donuts and drifting for a few years. Now they've partnered with Toyota Research Institute.
I imagine this work requires incredibly accurate state estimation and motion control. The former senses when the vehicle has crossed boundaries between different states, such as "traction" and "side-slip." These states are what an engineer or mathematician would call "non-linear." That's basically just a mathematical way of saying what most drivers intuitively know: the vehicle starts to handle much differently when it's in a skid.
The motion controller must then be tuned for several different states, and respond appropriately as the vehicle transitions between states.
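The idea of a controller that retunes itself across handling regimes can be sketched in a few lines. This is purely illustrative: the slip-angle threshold, the gain values, and the two-mode structure are my own assumptions, not anything published by the Stanford or TRI teams.

```python
# Hypothetical sketch of mode-dependent control. All thresholds and gains
# below are invented for illustration.

def estimate_mode(slip_angle_deg: float) -> str:
    """Classify the vehicle's handling regime from an estimated slip angle."""
    return "traction" if abs(slip_angle_deg) < 5.0 else "side-slip"

# Separate gain sets, tuned per regime. A real controller would blend
# between them, since switching gains abruptly can itself upset the car.
GAINS = {
    "traction": {"kp_steer": 2.0, "kp_throttle": 0.5},
    "side-slip": {"kp_steer": 0.8, "kp_throttle": 1.5},
}

def control(slip_angle_deg: float, heading_error: float, speed_error: float):
    """Pick the gain set for the current regime and compute commands."""
    mode = estimate_mode(slip_angle_deg)
    g = GAINS[mode]
    steer = g["kp_steer"] * heading_error
    throttle = g["kp_throttle"] * speed_error
    return mode, steer, throttle
```

The key design point is that the state estimator, not the controller, decides which regime the car is in; the controller just responds with the appropriate tuning.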
I might also imagine that a very finely tuned simulator, modeling the physical components of the vehicle, comes into play.
All of this is a ways away from the more common problems that self-driving cars face, like object tracking and detection.
But high-performance state estimation is necessary for both map-less driving and autonomous flight. Even though this is a car, I bet a lot of what they're learning could translate to airborne vehicles.
The motion control advances here might eventually allow autonomous vehicles to safely and comfortably travel at higher speeds than humans have ever been able to handle.
(Truly, we were sitting down. I know some people do the standing desk thing, and I probably should do that, too. Voyage even supplies those standing desks during Covid, if I could haul myself down to the office to pick one up. But for now, I sit. So does Jason.)
David: What is an Engineering Manager for the Perception Team?
Jason: We're looking for someone who can lead our Perception Team. It's a small but mighty team of about half a dozen engineers who work with the sensors on Voyage's robotaxis in order to perceive the environment.
Perception!
What is perception?
Specifically, Voyageâs Perception Team handles three main tasks:
Detection: finding objects and agents in the environment, primarily using deep learning for computer vision and sensor fusion, but also other robust techniques
Tracking: figuring out if a car we detected one second ago is the same car we're detecting right now
Localization: calculating our vehicle's position with respect to the environment and our high-definition maps
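The tracking task above boils down to data association between frames. As a minimal sketch, here is greedy nearest-neighbor matching between last frame's tracks and this frame's detections; production systems use motion models and proper assignment solvers (e.g. the Hungarian algorithm), and the distance threshold here is an invented example value, not Voyage's approach.

```python
# Greedy nearest-neighbor association between tracks and detections,
# both given as (x, y) positions in a shared frame. Illustrative only.
import math

def associate(tracks, detections, max_dist=2.0):
    """Return (matches, unmatched): det_index -> track_index, plus new dets."""
    matches, unmatched, used = {}, [], set()
    for det_id, det in enumerate(detections):
        best, best_d = None, max_dist
        for trk_id, trk in enumerate(tracks):
            if trk_id in used:
                continue
            d = math.dist(trk, det)
            if d < best_d:
                best, best_d = trk_id, d
        if best is None:
            unmatched.append(det_id)  # likely a newly appeared object
        else:
            matches[det_id] = best
            used.add(best)
    return matches, unmatched
```

A detection close to an existing track inherits that track's identity; anything too far from every track starts a new one.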
The ideal candidate for this position would have really deep knowledge of at least one of these tasks, and familiarity with the others.
Whatâs exciting about this role? How will it help a candidate grow?
Voyage is at the cutting edge of both computer vision and sensor fusion. We've implemented an Active Learning approach that automatically curates and selects the most valuable sensor data from our massive data set. For example, over time our Active Learning system might discover that we need to concentrate more on golf cart samples, relative to pedestrian samples. Training on a smaller subset of the most valuable data dramatically accelerates our development process and improves performance.
Voyageâs Active Learning system selects small data subsets for optimal performance.
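One common way to implement this kind of curation is uncertainty sampling: prioritize the samples the current model is least sure about. The least-confidence scoring rule and the toy model below are my own illustration, not a description of Voyage's actual system.

```python
# Uncertainty-based active learning sketch. `predict` maps a sample to a
# list of class probabilities; everything here is illustrative.

def least_confidence(probs):
    """Higher score = model is less sure = more valuable to label and train on."""
    return 1.0 - max(probs)

def select_for_labeling(samples, predict, budget):
    """Rank unlabeled samples by uncertainty and keep the top `budget`."""
    ranked = sorted(samples, key=lambda s: least_confidence(predict(s)), reverse=True)
    return ranked[:budget]

# Toy usage: a fake "model" that is confident about pedestrians but
# uncertain about golf carts, so the golf cart sample gets selected.
fake_probs = {"pedestrian": [0.98, 0.02], "golf_cart": [0.55, 0.45]}
picked = select_for_labeling(list(fake_probs), fake_probs.get, budget=1)
```

With a budget of one, the uncertain golf cart sample wins out over the confidently classified pedestrian, mirroring the example in the text.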
One of this team's most important deliverables is camera-based Depth Perception. We train deep learning models on both camera images and lidar point clouds, so that we can ultimately use camera images alone to infer depth: the distance to objects in a two-dimensional image. This is incredibly important for redundancy, safety, and performance. And it's easy to generate a proof-of-concept, but what we require is fully robust performance under all manner of conditions.
Voyage trains an ensemble of different deep learning models on lidar point clouds.
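Training a camera-only depth model requires ground-truth depth, and a standard source is lidar points projected into the camera image to form sparse labels. Here is a minimal pinhole-projection sketch; the intrinsic parameters are made up, and a real pipeline would also handle extrinsic calibration, lens distortion, and occlusion properly.

```python
# Project camera-frame lidar points (x, y, z, in meters) into pixel
# coordinates to build sparse depth labels. Intrinsics are illustrative.

FX, FY, CX, CY = 500.0, 500.0, 320.0, 240.0  # assumed pinhole intrinsics

def project_lidar_to_depth(points, width=640, height=480):
    """Return {(u, v): depth_m} labels for points visible in the image."""
    depth = {}
    for x, y, z in points:
        if z <= 0:  # behind the camera plane
            continue
        u = int(FX * x / z + CX)
        v = int(FY * y / z + CY)
        if 0 <= u < width and 0 <= v < height:
            # Keep the nearest return per pixel as rough occlusion handling.
            if (u, v) not in depth or z < depth[(u, v)]:
                depth[(u, v)] = z
    return depth
```

A point straight ahead of the camera lands at the principal point with its range as the depth label; the network is then trained to reproduce those labels from the image alone.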
We're also working with an ensemble of multiple deep learning models for point cloud detection, tracking, and sensor fusion. Very few leaders have access to the engineering team, volume of data, and real-world validation opportunities to push the cutting edge in this domain. The Engineering Manager of the Perception Team at Voyage will have all of those tools!
Is this role more about people management or technical leadership?
More technical leadership. The engineers on the Perception Team will report to this manager, so there's an important people management aspect. But our engineers are fairly senior and strong, so what we really need is a technical expert who can serve as a sounding board and leader for architectural design decisions.
Could this be somebodyâs first managerial role?
Ideally, we're looking for someone who has managed a team before, but it could be somebody's first managerial role; we're open to that. In that case, what we'd really want to see is strong technical project leadership experience. A Technical Lead who hasn't officially managed people, but has shipped large-scale computer vision and sensor fusion projects to completion, could potentially be a good candidate for this role.
The real world is crazy! Do you see those turkeys? Voyage's perception stack does!
Would Voyage hire somebody from outside the robotaxi industry?
We'd consider it. Computer vision and sensor fusion experience is critical for this role, so we'd want to see that. And we find that there are a lot of specific nuances to building safety-critical Level 4 autonomous driving systems. That's an advantage for engineers who already work in AV. But this position could be a good fit for somebody from a related industry, like drones, computer vision, or other areas of robotics.
Lidar is a key component of Voyage's perception stack.
Could the candidate be remote? Work from Hawaii?
Having someone based in the Bay Area would be preferred, given the nature of the role, but we have the flexibility to hire a really strong candidate remotely. Voyage supports remote work!
Jason Wong, VP of Talent at Voyage
Alright, let's get down to brass tacks. What's the interview process? Is it hard?
Ha. We have an amazing Perception Team at Voyage and we're looking for an amazing leader for that team. We don't set out to make the interview process "hard", per se, but by the end of the process, we want to be confident that a candidate is phenomenal!
The first stage is a technical domain interview with a member of the Perception Team. In this stage, we'll gauge a candidate's expertise and skill in the perception domain. And the candidate can start to gauge us and the work we're doing!
The second stage is an interview with our VP of Engineering, Davide Bacchet. This position reports to Davide, so we want to make sure there's a strong relationship. Even more importantly, Davide will want to discuss the candidate's vision of leadership and team growth.
The final stage is a set of panel interviews with the engineers on the team. These interviews will focus on domain expertise and managerial philosophy. It's really important to us that every engineer on the team get to meet the candidate before we extend an offer. And we want the incoming manager to feel good about all the team members, too. The strength of this team is one of the selling points of the role.
Is there a coding interview? Does the candidate need to be able to reverse a linked list in five minutes or less?
LOL. No, there's no coding interview for this position. The type of candidate we're looking for does code and can reverse a linked list, but that's not part of the selection process. We're much more focused on deep domain expertise, thorough system design, and technical leadership.
I interviewed Ben Alfi, the CEO of Blue White Robotics, and wrote for Forbes.com about the company's aspirations to provide a vendor-neutral cloud robotics platform.
The company aspires to support any type of robot on its platform. The management and orchestration that Blue White Robotics aims to provide its customers is reminiscent of the functionality that cloud computing providers, such as Amazon Web Services or Microsoft Azure, offer. Just as cloud computing services typically don't sell servers to customers, but rather rent them on-demand, Blue White Robotics hopes to achieve the same with autonomous vehicles.
This expands on my post about the company from a few weeks ago. After writing about them a little bit here, I was intrigued and was fortunate to be able to talk with their executive leadership for a deeper dive. I enjoyed it and I hope you do, too!
Like other people, I like to start the year by making predictions about what will happen, particularly with respect to self-driving cars and autonomous vehicles. Following the example of Scott Alexander, I assign probabilities to my predictions.
100% Confidence
No Level 5 self-driving cars will be deployed anywhere in the world.
90% Confidence
Level 4 driverless vehicles, without a safety operator, will remain publicly available, somewhere in the world.
No "self-driving-only" public road will exist in the U.S.
Tesla will remain the industry leader in Advanced Driver Assistance Systems.
An autonomy company will be acquired for at least $100 million.
Level 4 autonomous vehicles, with or without a safety operator, will remain publicly available in China.
80% Confidence
C++ will remain the dominant programming language for autonomous vehicles.
A lidar-equipped vehicle will be available for sale to the general public.
My parents will not ride in an autonomous vehicle (except at Voyage or anywhere else I might work).
Tesla will not launch a robotaxi service.
Fully driverless low-speed vehicles will transport customers (not necessarily the general public).
70% Confidence
Waymo will expand its public driverless transportation service beyond Phoenix.
A Chinese company will offer self-driving service, with or without a safety operator, to the public, outside of China.
A self-driving Class 8 truck will make a fully driverless trip on a public highway.
Aerial drone delivery will be available to the general public somewhere.
Tesla will remain the world's most valuable automaker.
60% Confidence
Fully driverless grocery delivery will be available somewhere in the US.
Tesla Full Self-Driving will offer Level 3 (driver attention not necessary until requested by the vehicle) functionality somewhere in the world.
A member of the public will die in a collision involving a Level 4 autonomous vehicle (including if the autonomous vehicle is not at fault).
A company besides Waymo will offer driverless service to the general public, somewhere in the US.
A company will deploy driverless vehicles for last-mile delivery.
50% Confidence
Level 4 self-driving, with or without a safety operator, will be available to the public somewhere in Europe.
A Level 3 vehicle will be offered for sale to the public, by a company other than Tesla.
The US will require driver-monitoring systems in new vehicles.
The industry will coalesce around a safety standard for driverless vehicles.
Self-driving service will be available to the general public, with or without a safety operator, in India.
Normally, I'd wrap up 2020 by looking back at the predictions I made at the beginning of 2020. Except… I didn't make any predictions at the beginning of 2020. I skipped a year, so I'll have to dig back two years to look at the predictions I made at the start of 2019.
Following the example of Scott Alexander, I assign probabilities to my predictions. This allows a finer-grained evaluation of how accurate my predictions were. Unfortunately, scoring one-year predictions two years later kind of nullifies this exercise, but here we go.
100% Certain
No Level 5 self-driving cars will be deployed anywhere in the world.
90% Certain
Level 4 autonomous vehicles will be on the road, at least in test mode, somewhere in the US.
Deep learning will remain the dominant tool for image classification.
Human drivers will be permitted on all public roads in the US.
No car for sale anywhere in the world will include vehicle-to-traffic-light communication. [Maybe by now this is true in China?]
C++ will be the dominant programming language for autonomous vehicles.
Autonomous drone delivery will be available commercially somewhere in the world. [Google and Walmart have announced pilots; unclear if those pilots are currently ongoing.]
80% Certain
Level 4 self-driving cars will be available to the general public (with or without a safety operator) somewhere in the US.
Waymo will have recorded more autonomously-driven miles (all-time) than any other company.
Level 4 vehicles will operate, at least in test mode, without a safety operator, somewhere in the US.
No vehicle available for sale to the general public will come with OEM-installed lidar. [I think the Audi A8 still has a lidar, but it's unclear. Volvo announced Luminar-equipped vehicles, but they're not yet in production.]
No dominant technique will emerge for urban motion planning.
70% Certain
Level 4 vehicles will be available to the general public somewhere in Europe.
Level 4 vehicles will be available to the general public somewhere in China.
An autonomous shuttle running on public roads will be open to the general public somewhere in the world. [This seems like it must be true, but I'm not sure where. Public shuttles from May Mobility and Navya pop up periodically, but they always seem to be short-term engagements.]
A company will be acquired primarily for its autonomous vehicle capabilities with a valuation above $100M USD. [Luminar, Uber ATG, Zoox, although all of those are special cases in their own ways.]
Grocery delivery via autonomous vehicles, with no safety operator, will be available somewhere in the world.
60% Certain
No Level 4 self-driving cars will be available to the general public, without a safety operator, anywhere in the US.
Tesla will offer the best-performing Advanced Driver Assistance System available to the public. ["Best-performing" is subjective. Various ratings have downgraded Tesla Autopilot and Full Self-Driving, largely due to poor communication and driver monitoring. But Tesla still seems to me to be clearly in the lead with ADAS.]
All publicly available Level 4 vehicles will use lidar.
A member of the public will die in a collision involving a Level 4 autonomous vehicle (including if the autonomous vehicle is not at fault). [Not that I'm aware of since January 1, 2019.]
Self-driving cars will be available to the general public somewhere in India. [Not that I know of.]
50% Certain
A Level 3 vehicle will be for sale to the general public somewhere in the world. [Audi has pulled back on this. Volvo has announced but not yet delivered.]
Tesla's full self-driving hardware will include a custom-designed computer.
Amazon will make routine (e.g. non-demonstration) deliveries using autonomous vehicles. [Supposedly Scout is still testing deliveries, but they're pretty under-the-radar and I don't consider these yet "routine."]
A company will be acquired primarily for its autonomous vehicle capabilities with a valuation above $1B USD. [Zoox, although this is not quite what I expected when I made the prediction.]
Two of the US Big Three and German Big Three (i.e. two of six) will merge.
Evaluation
Evaluating one-year predictions over a two-year horizon isn't really fair, but here's how I scored.
100% confidence = 100% accuracy
90% confidence = 100% accuracy
80% confidence = 60% accuracy
70% confidence = 40% accuracy
60% confidence = 40% accuracy
50% confidence = 40% accuracy
The graph should ideally be a straight line up and to the right. Instead, my graph looks like this.
Not terrible, but room to improve, for sure.
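This kind of calibration scoring is simple to compute: group predictions by stated confidence and compare each bucket against the fraction that came true. The sample data below is invented for illustration, not my actual prediction record.

```python
# Calibration scoring: bucket predictions by stated confidence and compare
# against realized accuracy. Sample data is illustrative.

def calibration(predictions):
    """predictions: list of (confidence, came_true) -> {confidence: accuracy}."""
    buckets = {}
    for conf, came_true in predictions:
        buckets.setdefault(conf, []).append(came_true)
    return {conf: sum(hits) / len(hits) for conf, hits in sorted(buckets.items())}

# A well-calibrated forecaster's accuracy tracks the stated confidence:
# 90% predictions should come true about 90% of the time.
sample = [(0.9, True), (0.9, True), (0.8, True), (0.8, False), (0.8, False)]
curve = calibration(sample)
```

Plotting `curve` with confidence on the x-axis and accuracy on the y-axis gives exactly the straight-line-up-and-to-the-right diagnostic described above.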
Looking over what I got wrong, it seems like two-year-ago-David thought there would be much more widespread public testing of Level 4 vehicles (Europe! India! Deliveries! Fatalities!) but all with safety operators. Instead, we've seen steady and cautious progress (more miles, removing the safety driver) by the largest companies in the markets in which they were already operating.
I've always been curious about what it would be like to buy a rental car, so I enjoyed the opportunity to talk with Greg Nierenberg, who leads Avis Car Sales. I wrote up the details in Forbes. Check it out!
Nierenberg explains that the Ultimate Test Drive is technically a rental, which gives Avis more flexibility than the typical car dealership. Indeed, a 2017 advertisement for the Ultimate Test Drive opens with the statistic that "the average test drive lasts for 17 minutes," but the Ultimate Test Drive lasts for up to three days.