My latest Forbes.com article reviews Embraer’s Q1 2020 results, which were buffeted by the failed sale of its commercial aviation division to Boeing, the COVID-19 pandemic, and delinquent customers, but bolstered by surprising strength in its executive aviation and defense units.
The headline numbers were down: revenue decreased 24% compared to Q1 2019, and commercial aircraft deliveries declined 55% from the previous year. Nonetheless, Embraer heads into the rest of 2020 in a strong cash position, with $2.5 billion on its balance sheet, in large part due to strong Q4 2019 results.
A few weeks ago, I had the delight of visiting Europe with Udacity’s Berlin-based European team, meeting both automotive partners and Udacity students. The trip was so much fun!
We started in Stuttgart, where we met with our partners at Bosch and toured their Abstatt campus. Their campus reminds me of a plush Silicon Valley office, except instead of overlooking Highway 101, they overlook vineyards and a European castle.
Thanks to Udacity student Tolga Mert for organizing!
We discussed the self-driving ecosystem and, of course, how to get a job working on self-driving cars at Bosch.
The next day we headed to Berlin to prepare for our deep learning workshop at Automotive Tech.AD. What a great collection of autonomous vehicle engineers from companies across Europe!
In the evening we hosted a Meetup for current and prospective Udacity students at our Berlin office. It is always a delight to meet students and hear firsthand what they love about Udacity, and how they feel we can improve the student experience.
It’s a lot of fun to fly across the globe and see different places, but the best experience of all is getting to meet students from all different parts of the world.
We learn what our students are working on, what excites them about self-driving cars, and about the difference Udacity has made in their lives. It’s wonderful!
If you’re interested in becoming a part of our global Self-Driving Car community, consider enrolling in one of our Nanodegree programs. No matter your skills and experience, we’ve got a program for you!
Udacity Self-Driving Car Engineer Nanodegree program
The second lesson of the Udacity Self-Driving Car Nanodegree program is actually a lesson followed by a project. In “Finding Lane Lines”, my colleague Ryan Keenan and I teach students how to use computer vision to extract lane lines from a video of a car driving down the road.
Students are able to use this approach to find lane lines within the first week of the Nanodegree program! This isn’t the only way to find lane lines, and with modern machine learning algorithms it’s no longer the absolute best way to find lane lines. But it’s pretty effective, and it’s amazing how quickly you can get going with this approach.
Here’s a photo of Interstate 280, taken from Carla, Udacity’s own self-driving car:
The first thing we’re going to do is convert the image to grayscale, which will make it easier to work with, since we’ll only have one color channel:
Next, we’ll perform “Canny edge detection” to identify edges in the image. An edge is a place where the color or intensity of the image changes sharply:
Now that we have the edges of the image identified, we can use a technique called a “Hough transform” to find lines in the image that might be the lane lines we are looking for:
All of these tools have various parameters we can tune: how sharp the edges should be, how long the lines should be, and what slope a line should have. If we tune the parameters just right, we can get a lock on our lane lines:
Apply these lane lines to the original image, and you get something like this “Finding Lane Lines” project, submitted by our student Jeremy Shannon:
For the last year and a quarter, I’ve been working with a team at Udacity to build the Self-Driving Car Engineer Nanodegree program. This is a nine-month program that prepares software engineers for jobs working on autonomous vehicles.
Over the coming weeks and months, I’m going to produce a new post about each of the lessons in the Nanodegree program, to help you explore what you can learn. As of right now, there are 67 lessons, so I anticipate this process will take me several months to complete. But I’m excited to spend time reviewing and sharing what we’ve built!
During our program we cover: computer vision, deep learning, sensor fusion, localization, path planning, control, advanced electives, and finally system integration. In the final part of the program, students even get to put their own code on Carla, Udacity’s actual self-driving car.
I’ll start today with a quick post about our first lesson, entitled “Welcome”.
I still wear the orange and purple, even though the Suns are 3–8 and sit near the bottom of the Western Conference. Charles Barkley 4ever.
And so I paid good money and trekked to the Oracle Arena tonight, all to watch the Suns collapse in the final minutes of the game, as soon as the Warriors decided to actually start trying.
On my way in, I saw a giant Uber sign, with an arrow pointing into the stadium lots. As far as I can tell, the Warriors are actively facilitating Uber, even though it undercuts their take from stadium parking.
Maybe it’s because fans demand it. Maybe it’s because Joe Lacob, the Warriors’ venture capitalist owner, has a stake in Uber.
Buying and renting anything — a home, a car, a movie — involves a tradeoff between stability and flexibility. Buying provides the stability of permanent ownership and availability, whereas renting provides the flexibility of adjustment to fit changing needs and wants.
The automotive market is moving from an ownership model to a rental model, as ride-sharing services push the stability-flexibility trade in favor of renting, rather than owning. And what we’ve seen with ride-sharing is just the tip of the iceberg. Self-driving cars will push this tradeoff an order of magnitude further.
As consumers come to value flexibility in transportation, we can take lessons from the manufacturing industry on the practice of mass customization.
Today, car buyers have to purchase a one-size-fits-all vehicle. If I need to drive in snow twenty days a year, I might get a four-wheel drive vehicle, even though I would be better off with a compact car the other 345 days. Similar considerations govern the purchase of a car capable of occasional carpooling, or downtown parking, or a client visit.
In the self-driving car future, we’ll be able to rent the car we want, and the companies that win will get good at doing this really fast.
Need a minivan this morning? It’ll be there in 30 seconds.
Want a convertible this evening? It’ll be there in 45 seconds.
What Do People Want?
In this world, getting the right car to somebody’s door in 60 seconds or less might be the easy part. Mass customization has been studied and optimized and is mostly a solved problem.
The harder challenge is to figure out what people want.
We have some basic starting points: sedans, vans, SUVs, pickups, sports cars.
But these are all built for human drivers in a one-size-fits-all world.
In a mass customization world, we no longer have to make tradeoffs between scenarios. We can tune each vehicle option to a specific use case.
It could even be that we’ll hail one car service if we want a maneuverable short-haul vehicle, and a different service if we want a fast, long-haul vehicle.
What kinds of vehicles would you like to see in a self-driving world?
How do you envision the future of vehicle mass customization? Share your thoughts in the comments. Thanks!
Recently I’ve had a few discussions with people who are nervous about self-driving cars, mostly because they find driving fun. They’re worried that we’re entering a brave new world where people won’t be allowed to drive for fun anymore.
I think this is a legitimate concern, but my response is that driving will become like biking.
Biking today is primarily a leisure activity that people do for fun. Except in certain uncommon (and usually urban) instances, biking is rarely the most efficient or fastest way to transport yourself.
Once self-driving cars become common, I expect to see much the same thing. We might see certain roads designated only for human-driven cars, just like many paths are designated specifically for bikes today.
And it won’t shock me if we see a replay of some of the cyclist-vs.-driver road rage, this time between human drivers and human passengers in self-driving cars, all trying to use the same road.
My model for thinking about how human-driven cars will map onto the self-driving road system is to think about how bikes map onto the current human-driven road system.
There are a lot of bike-only paths, often along scenic routes. Outside of those routes, cyclists will often use slower, smaller, residential streets for biking. The instances in which cyclists need to use main commuting thoroughfares are the situations in which bike-car conflict is the greatest.
So I can imagine scenic roads, like US 1 in California, or Skyline Drive in Virginia, being set aside specifically for human drivers. This might be especially true if self-driving cars ultimately attain speeds far beyond what human drivers can safely handle today.
Big interstate highways, though, might become the domain of computer-driven cars traveling hundreds of miles per hour.