(Truly, we were sitting down. I know some people do the standing desk thing, and I probably should do that, too. Voyage even supplies those standing desks during Covid, if I could haul myself down to the office to pick one up. But for now, I sit. So does Jason.)
David: What is an Engineering Manager for the Perception Team?
Jason: We’re looking for someone who can lead our Perception Team. It’s a small but mighty team of about half a dozen engineers who work with the sensors on Voyage’s robotaxis in order to perceive the environment.
Perception!
What is perception?
Specifically, Voyage’s Perception Team handles three main tasks:
Detection — finding objects and agents in the environment, primarily using deep learning for computer vision and sensor fusion, but also other robust techniques
Tracking — figuring out if a car we detected one second ago is the same car we’re detecting right now
Localization — calculating our vehicle’s position with respect to the environment and our high-definition maps
The ideal candidate for this position would have really deep knowledge of at least one of these tasks, and familiarity with the others.
What’s exciting about this role? How will it help a candidate grow?
Voyage is at the cutting edge of both computer vision and sensor fusion. We’ve implemented an Active Learning approach that automatically curates and selects the most valuable sensor data from our massive data set. For example, over time our Active Learning system might discover that we need to concentrate more on golf cart samples, relative to pedestrian samples. Training on a smaller subset of the most valuable data dramatically accelerates our development process and improves performance.
Voyage’s Active Learning system selects small data subsets for optimal performance.
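Voyage hasn’t published its implementation, but the core selection step of an active learning system can be sketched in a few lines. This is a hypothetical illustration using uncertainty sampling (rank unlabeled samples by the entropy of the model’s predictions), one of the simplest selection criteria; it is not Voyage’s actual system:

```python
import numpy as np

def select_most_valuable(probs: np.ndarray, k: int) -> np.ndarray:
    """Rank unlabeled samples by predictive entropy and return the
    indices of the k most uncertain ones -- the samples most worth
    labeling and adding to the training set.

    probs: (n_samples, n_classes) softmax outputs from the current model.
    """
    # Entropy is highest where the model is least sure of the class
    # (e.g. a golf cart vs. pedestrian confusion scores high).
    eps = 1e-12
    entropy = -np.sum(probs * np.log(probs + eps), axis=1)
    # Highest-entropy samples first
    return np.argsort(entropy)[::-1][:k]

# Toy example: three detections with softmax scores over two classes
probs = np.array([
    [0.99, 0.01],   # confident -> low labeling value
    [0.55, 0.45],   # uncertain -> high labeling value
    [0.80, 0.20],
])
picked = select_most_valuable(probs, k=1)
print(picked)  # -> [1] (the uncertain sample)
```

Training then proceeds on the selected subset rather than the full data set, which is where the speedup described above comes from.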
One of this team’s most important deliverables is camera-based Depth Perception. We train deep learning models on both camera images and lidar point clouds, so that we can ultimately use camera images alone to infer depth — the distance to objects in a two-dimensional image. This is incredibly important for redundancy, safety, and performance. And it’s easy to generate a proof-of-concept, but what we require is fully robust performance under all manner of conditions.
Voyage trains an ensemble of different deep learning models on lidar point clouds.
We’re also working with an ensemble of multiple deep learning models for point cloud detection, tracking, and sensor fusion. Very few leaders have access to the engineering team, volume of data, and real-world validation opportunities to push the cutting edge in this domain. The Engineering Manager of the Perception Team at Voyage will have all of those tools!
Is this role more about people management or technical leadership?
More technical leadership. The engineers on the Perception Team will report to this manager, so there’s an important people management aspect. But our engineers are fairly senior and strong, so what we really need is a technical expert who can serve as a sounding board and leader for architectural design decisions.
Could this be somebody’s first managerial role?
Ideally, we’re looking for someone who has managed a team before, but it could be somebody’s first managerial role; we’re open to that. In that case, what we’d really want to see is strong technical project leadership experience. A Technical Lead who hasn’t officially managed people, but has shipped large-scale computer vision and sensor fusion projects to completion, could potentially be a good candidate for this role.
The real world is crazy! Do you see those turkeys? Voyage’s perception stack does!
Would Voyage hire somebody from outside the robotaxi industry?
We’d consider it. Computer vision and sensor fusion experience is critical for this role, so we’d want to see that. And we find that there are a lot of specific nuances to building safety-critical Level 4 autonomous driving systems. That’s an advantage for engineers who already work in AV. But this position could be a good fit for somebody from a related industry, like drones, computer vision, or other areas of robotics.
Lidar is a key component of Voyage’s perception stack.
Could the candidate be remote? Work from Hawaii?
Ideally, the candidate would be based in the Bay Area, given the nature of the role, but we have the flexibility to hire a really strong candidate remotely. Voyage supports remote work!
Jason Wong, VP of Talent at Voyage
Alright, let’s get down to brass tacks. What’s the interview process? Is it hard?
Ha. We have an amazing Perception Team at Voyage and we’re looking for an amazing leader for that team. We don’t set out to make the interview process “hard”, per se, but by the end of the process, we want to be confident that a candidate is phenomenal!
The first stage is a technical domain interview with a member of the Perception Team. In this stage, we’ll gauge a candidate’s expertise and skill in the perception domain. And the candidate can start to gauge us and the work we’re doing!
The second stage is an interview with our VP of Engineering, Davide Bacchet. This position reports to Davide, so we want to make sure there’s a strong relationship. Even more importantly, Davide will want to discuss the candidate’s vision of leadership and team growth.
The final stage is a set of panel interviews with the engineers on the team. These interviews will focus on domain expertise and managerial philosophy. It’s really important to us that every engineer on the team get to meet the candidate before we extend an offer. And we want the incoming manager to feel good about all the team members, too. The strength of this team is one of the selling points of the role.
Is there a coding interview? Does the candidate need to be able to reverse a linked list in five minutes or less?
LOL. No, there’s no coding interview for this position. The type of candidate we’re looking for does code and can reverse a linked list, but that’s not part of the selection process. We’re much more focused on deep domain expertise, thorough system design, and technical leadership.
I interviewed Ben Alfi, the CEO of Blue White Robotics, and wrote for Forbes.com about the company’s aspirations to provide a vendor-neutral cloud robotics platform.
The company aspires to support any type of robot on its platform. The management and orchestration that Blue White Robotics aims to provide its customers is reminiscent of the functionality that cloud computing providers, such as Amazon Web Services or Microsoft Azure, offer. Just as cloud computing services typically don’t build servers themselves, but rather rent them to customers on-demand, Blue White Robotics hopes to achieve the same with autonomous vehicles.
This expands on my post about the company from a few weeks ago. After writing about them a little bit here, I was intrigued and was fortunate to be able to talk with their executive leadership for a deeper dive. I enjoyed it and I hope you do, too!
Like other people, I like to start the year by making predictions about what will happen, particularly with respect to self-driving cars and autonomous vehicles. Following the example of Scott Alexander, I assign probabilities to my predictions.
100% Confidence
No Level 5 self-driving cars will be deployed anywhere in the world.
90% Confidence
Level 4 driverless vehicles, without a safety operator, will remain publicly available, somewhere in the world.
No “self-driving-only” public road will exist in the U.S.
Tesla will remain the industry leader in Advanced Driver Assistance Systems.
An autonomy company will be acquired for at least $100 million.
Level 4 autonomous vehicles, with or without a safety operator, will remain publicly available in China.
80% Confidence
C++ will remain the dominant programming language for autonomous vehicles.
A lidar-equipped vehicle will be available for sale to the general public.
My parents will not ride in an autonomous vehicle (except at Voyage or anywhere else I might work).
Tesla will not launch a robotaxi service.
Fully driverless low-speed vehicles will transport customers (not necessarily the general public).
70% Confidence
Waymo will expand its public driverless transportation service beyond Phoenix.
A Chinese company will offer self-driving service, with or without a safety operator, to the public, outside of China.
A self-driving Class 8 truck will make a fully driverless trip on a public highway.
Aerial drone delivery will be available to the general public somewhere.
Tesla will remain the world’s most valuable automaker.
60% Confidence
Fully driverless grocery delivery will be available somewhere in the US.
Tesla Full Self-Driving will offer Level 3 (driver attention not necessary until requested by the vehicle) functionality somewhere in the world.
A member of the public will die in a collision involving a Level 4 autonomous vehicle (including if the autonomous vehicle is not at-fault).
A company besides Waymo will offer driverless service to the general public, somewhere in the US.
A company will deploy driverless vehicles for last-mile delivery.
50% Confidence
Level 4 self-driving, with or without a safety operator, will be available to the public somewhere in Europe.
A Level 3 vehicle will be offered for sale to the public, by a company other than Tesla.
The US will require driver-monitoring systems in new vehicles.
The industry will coalesce around a safety standard for driverless vehicles.
Self-driving service will be available to the general public, with or without a safety operator, in India.
Normally, I’d wrap up 2020 by looking back at the predictions I made at the beginning of 2020. Except…I didn’t make any predictions at the beginning of 2020. I skipped a year, so I’ll have to dig back two years to look at the predictions I made at the start of 2019.
Following the example of Scott Alexander, I assign probabilities to my predictions. This allows a finer-grained evaluation of how accurate my predictions were. Unfortunately, scoring one-year predictions two years later kind of nullifies this exercise, but here we go.
100% Certain
✓ No Level 5 self-driving cars will be deployed anywhere in the world.
90% Certain
✓ Level 4 autonomous vehicles will be on the road, at least in test mode, somewhere in the US.
✓ Deep learning will remain the dominant tool for image classification.
✓ Human drivers will be permitted on all public roads in the US.
✓ No car for sale anywhere in the world will include vehicle-to-traffic-light communication. [Maybe by now this is true in China?]
✓ C++ will be the dominant programming language for autonomous vehicles.
✓ Autonomous drone delivery will be available commercially somewhere in the world. [Google and Walmart have announced pilots — unclear if those pilots are currently ongoing.]
80% Certain
✓ Level 4 self-driving cars will be available to the general public (with or without a safety operator) somewhere in the US.
✓ Waymo will have recorded more autonomously-driven miles (all-time) than any other company.
✓ Level 4 vehicles will operate, at least in test mode, without a safety operator, somewhere in the US.
✗ No vehicle available for sale to the general public will come with OEM-installed lidar. [I think the Audi A8 still has a lidar, but it’s unclear. Volvo announced Luminar-equipped vehicles, but they’re not yet in production.]
✓ No dominant technique will emerge for urban motion planning.
70% Certain
✗ Level 4 vehicles will be available to the general public somewhere in Europe.
✓ Level 4 vehicles will be available to the general public somewhere in China.
✗ An autonomous shuttle running on public roads will be open to the general public somewhere in the world. [This seems like it must be true, but I’m not sure where. Public shuttles from May Mobility and Navya pop up periodically, but they always seem to be short-term engagements.]
✓ A company will be acquired primarily for its autonomous vehicle capabilities with a valuation above $100M USD. [Luminar, Uber ATG, Zoox, although all of those are special cases in their own ways.]
✗ Grocery delivery via autonomous vehicles, with no safety operator, will be available somewhere in the world.
60% Certain
✗ No Level 4 self-driving cars will be available to the general public, without a safety operator, anywhere in the US.
✓ Tesla will offer the best-performing Advanced Driver Assistance System available to the public. [“Best-performing” is subjective. Various ratings have downgraded Tesla Autopilot and Full Self-Driving, largely due to poor communication and driver monitoring. But Tesla still seems to me to be clearly in the lead with ADAS.]
✓ All publicly available Level 4 vehicles will use lidar.
✗ A member of the public will die in a collision involving a Level 4 autonomous vehicle (including if the autonomous vehicle is not at-fault). [Not that I’m aware of since January 1, 2019.]
✗ Self-driving cars will be available to the general public somewhere in India. [Not that I know.]
50% Certain
✗ A Level 3 vehicle will be for sale to the general public somewhere in the world. [Audi has pulled back on this. Volvo has announced but not yet delivered.]
✓ Tesla’s full self-driving hardware will include a custom-designed computer.
✗ Amazon will make routine (e.g. non-demonstration) autonomous deliveries using autonomous vehicles. [Supposedly Scout is still testing deliveries, but they’re pretty under-the-radar and I don’t consider these yet “routine.”]
✓ A company will be acquired primarily for its autonomous vehicle capabilities with a valuation above $1B USD. [Zoox, although this is not quite what I expected when I made the prediction.]
✗ Two of the US Big Three and German Big Three (i.e. two of six) will merge.
Evaluation
Evaluating one-year predictions over a two-year horizon isn’t really fair, but here’s how I scored.
100% confidence = 100% accuracy
90% confidence = 100% accuracy
80% confidence = 60% accuracy
70% confidence = 40% accuracy
60% confidence = 40% accuracy
50% confidence = 40% accuracy
The graph should ideally be a straight line up and to the right. Instead, my graph looks like this.
Not terrible, but room to improve, for sure.
Looking over what I got wrong, it seems like two-year-ago-David thought there would be much more widespread public testing of Level 4 vehicles (Europe! India! Deliveries! Fatalities!) but all with safety operators. Instead, we’ve seen steady and cautious progress (more miles, removing the safety driver) by the largest companies in the markets in which they were already operating.
Reuters reports that Ouster, a five-year-old, San Francisco-based lidar startup, plans to go public via a SPAC (special purpose acquisition company), at a market capitalization of nearly $2 billion. Kudos to Paul Lienert at Reuters, who also broke a recent story on Apple’s car efforts and is having quite a week.
According to Reuters, Ouster is the fifth lidar company this year to “agree” to go public via a SPAC, after Velodyne, Luminar, Innoviz, and Aeva. That’s kind of amazing, especially given that the primary customer of these companies will presumably be self-driving car manufacturers, almost none of whom have even launched a product yet — much less built profitable businesses.
I confess to not fully understanding the advantages of SPACs. I assume they bypass a lot of the paperwork and headaches associated with traditional IPOs. But I also imagine that in theory they should come with quite high capital costs. The number of SPACs available to take a startup public is much smaller than the number of institutional investors who would buy shares in a traditional IPO.
However, the outsized valuations of Luminar and Ouster, in particular, show that companies can achieve really high valuations via SPACs. According to the CEO of Colonnade Acquisition Corp., which will acquire Ouster and take it public, “It’s not a business plan — they’re selling real products to real customers right now.”
That’s kind of a surprising quote for a $2 billion valuation.
Yesterday, Zoox unveiled its long-awaited vehicle. It doesn’t yet have a name (the Zoox website lists it simply as, “VEHICLE”), although the press describes it as a “carriage”, at least in form factor. It resembles the Cruise Origin more than a little bit, including the glass elevator-style doors.
Zoox has done some amazing technical work with this vehicle. Most notably, the vehicle can reportedly move not only forward and backward, like a normal car, but also side-to-side, like a dolly.
That said, I am a little skeptical about the utility of a four-person passenger vehicle as the true form factor for the self-driving future. We’re used to four-person vehicles now because consumers have to purchase cars that fill lowest-common-denominator needs. In a transportation-as-a-service world, though, I suspect we’ll all want to travel in our own personal vehicles.
They cover AI, transportation, digital medicine, autonomous flight, Udacity, the future of technology, and more. You even get to hear Sebastian talk about a refrigerator flirting with a dishwasher.
(I contribute to Forbes.com; it’s a stretch to call Steve Forbes my “boss”. I’ve actually never met Steve Forbes, but just go with it.)
I always enjoy learning about new autonomous vehicle companies. Recently I heard from an Israeli startup called Blue White Robotics that is working with lots of different types of autonomous vehicles.
BWR offers an end-to-end service that includes robots, software, operations, and even “boots on the ground.” Their website has photos of drones and autonomous shuttles and self-driving cars, alongside business objectives that range from agricultural pollination to medical transportation to HAZMAT.
Their website sizes the company at 60 employees, which implies a fairly large existing operational portfolio.
Like Bestmile, the goal for Blue White Robotics appears to be a multi-modal platform that customers can configure for their specific needs. A little bit like “AWS for autonomy.”
This is a big goal and I am excited to see companies like BWR aiming for it.
Although neither Amazon nor Walmart is known first and foremost as a self-driving company, both have been doing a lot of work in the space. But they have gone about that work differently: Amazon has been investing, while Walmart has been partnering.