Maybe this is all just about profit margins and exiting underperforming markets to focus on stronger ones. Ford is cutting 10% of its workforce for similar reasons.
But it’s also easy to imagine these are early steps in a longer-term disruption of the automotive industry — a change from automotive sales to mobility.
To be sure, consumer automotive sales will continue for a long time, especially in rural areas. But the bright future for automotive manufacturers probably lies in fleet sales to ride-sharing companies, or maybe even in becoming ride-sharing companies themselves. Pulling back from foreign markets might be part of that trend.
The guiding star of the Udacity Self-Driving Car Nanodegree Program is to prepare students for jobs working on autonomous vehicles. So we were excited that Galen found his dream job working on autonomous vehicles for HERE in Boulder, Colorado. He also gives lots of credit to Udacity, which is generous of him 🙂
“The private Slack channel for students is filled with a tangible excitement. I’ve never been part of such a large student body, let alone a student body that is committed to the success of every student (no grading curve here). Between Slack, the dedicated forums, and your own private mentor, there is no reason to be stuck on a problem — there are so many people willing to help answer your questions. Instead, you can focus on finding your own way to improve the foundations of the projects.”
Ricardo provides a thorough rundown of his Behavioral Cloning project, which runs on both simulators. He synthesized and built on the insights of earlier Udacity students:
“My first step was to evaluate the driver log steering histograms for 100 bins and do all the required transformations, drops, and augmentation to balance it. Here I followed the same methodology as in the well-explained pre-processing from Mez Gebre. Thanks, Mez!”
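Here is a rough sketch of what that balancing step can look like (the file name, column name, and drop threshold are hypothetical; Ricardo's actual transformations may differ):

```python
import numpy as np
import pandas as pd

# Hypothetical driving log with a 'steering' column, as in the Udacity data.
log = pd.read_csv("driving_log.csv")
counts, bin_edges = np.histogram(log["steering"], bins=100)

# Near-zero steering angles dominate the log. Cap each bin at the mean count
# by randomly dropping rows from overrepresented bins, flattening the histogram.
target = int(counts.mean())
balanced_bins = []
for i in range(len(counts)):
    in_bin = log[(log["steering"] >= bin_edges[i]) & (log["steering"] < bin_edges[i + 1])]
    balanced_bins.append(in_bin.sample(n=min(len(in_bin), target), random_state=0))
balanced = pd.concat(balanced_bins)
```

Augmentation (for example, flipping images and negating the steering angle) then fills the sparse bins back in, rather than just shrinking the dataset.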
Yazeed was struggling with the Behavioral Cloning Project when he realized he mixed up his colorspaces. Students on Slack pointed out that this might have been because I mixed up the colorspaces in some demo code. Oops!
“I spent about 15 days training my network over and over in the third project of the Self-Driving Car Engineer Nanodegree by Udacity, and it drove me crazy because it couldn’t keep itself on the road. I reviewed the code hundreds of times and found nothing wrong with it. To tell you the truth, when I had almost given up, I read an article that had nothing to do with this project. It mentioned that OpenCV reads images as BGR (Blue, Green, Red) and guess what, drive.py uses RGB. What the hell have I been doing for 2 weeks!!!”
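The fix is a one-liner once you know to look for it. A minimal illustration of the mismatch (the file name is hypothetical):

```python
import cv2

# cv2.imread returns channels in BGR order, while drive.py feeds the model
# RGB frames from the simulator. Train on BGR and predict on RGB, and the
# network sees systematically wrong colors at inference time.
bgr = cv2.imread("center_camera.jpg")
rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB)  # convert before training
```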
Hadi went back and reviewed everything he learned during Term 1. He included all of his project videos, which are awesome!
“The fifth and last project of the first term is to write a program to detect vehicles by drawing a bounding box on each detected vehicle. The project is done using a Support Vector Machine (SVM), a kind of classifier used to differentiate between different classes. In this case, the classifier takes multiple features of images as inputs and learns to classify them into two classes: cars and non-cars.”
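A minimal sketch of that idea, using scikit-image HOG features and scikit-learn's LinearSVC (random arrays stand in for the real labeled 64x64 training patches):

```python
import numpy as np
from skimage.feature import hog
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

def extract_features(gray_patch):
    # HOG summarizes local gradient structure, which separates car shapes
    # from road and background far better than raw pixels do.
    return hog(gray_patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

# Stand-in data: in the real project these are 64x64 patches cut from
# labeled car and non-car images.
cars = [np.random.rand(64, 64) for _ in range(100)]
notcars = [np.random.rand(64, 64) for _ in range(100)]

X = np.array([extract_features(p) for p in cars + notcars])
y = np.array([1] * len(cars) + [0] * len(notcars))  # 1 = car, 0 = non-car

scaler = StandardScaler().fit(X)  # SVMs are sensitive to feature scale
clf = LinearSVC().fit(scaler.transform(X), y)
```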
Dominic wrote up the generative adversarial network that he trained for the Udacity Deep Learning Nanodegree Foundations Program. Normally I downvote blog posts from other programs, but Dominic is a Self-Driving Car student and DLFND is great and GANs are very cool, so I’ll let it slide.
“The training took approximately 15–20 minutes and even though the first few iterations looked like something demonic — the end result was fine.”
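Dominic's actual architecture isn't reproduced here, but for readers new to GANs, a minimal Keras sketch of the adversarial loop (toy fully-connected networks and hypothetical shapes) looks like this:

```python
import numpy as np
from tensorflow.keras import layers, models, optimizers

latent_dim = 100   # noise vector size (hypothetical)
img_dim = 28 * 28  # flattened image size (hypothetical)

generator = models.Sequential([
    layers.Dense(128, activation="relu", input_dim=latent_dim),
    layers.Dense(img_dim, activation="tanh"),
])

discriminator = models.Sequential([
    layers.Dense(128, activation="relu", input_dim=img_dim),
    layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer=optimizers.Adam(1e-4), loss="binary_crossentropy")

# The combined model trains the generator to fool a frozen discriminator.
discriminator.trainable = False
gan = models.Sequential([generator, discriminator])
gan.compile(optimizer=optimizers.Adam(1e-4), loss="binary_crossentropy")

def train_step(real_images, batch_size=64):
    noise = np.random.normal(size=(batch_size, latent_dim))
    fakes = generator.predict(noise, verbose=0)
    # Discriminator learns: real -> 1, fake -> 0.
    discriminator.train_on_batch(real_images, np.ones((batch_size, 1)))
    discriminator.train_on_batch(fakes, np.zeros((batch_size, 1)))
    # Generator learns to push the discriminator's output toward 1 on fakes.
    return gan.train_on_batch(noise, np.ones((batch_size, 1)))
```

The “demonic” early iterations are expected: the generator starts from pure noise and only picks up structure as the discriminator's feedback sharpens.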
“Details about the deal between Waymo and Lyft were scant. The companies declined to comment on what types of products would be brought to market as a result of it or when the public might see the fruits of the collaboration.”
Basically, there’s a deal but nobody knows what’s in it.
The big tease, of course, is that Waymo would put its self-driving cars on Lyft’s network.
That would be totally awesome, but somewhat at odds with past indications of Waymo’s plans for commercializing its self-driving technology.
Previously, Waymo has appeared to be launching a ride-sharing service, then a lidar manufacturing business, and now possibly a self-driving car operation (without owning the actual ride-sharing service). At other points, people have speculated that Waymo might try to make its software the default tech stack of autonomous vehicles, similar to what Android has become for mobile phones.
NVIDIA CEO Jensen Huang is famous for his keynote addresses at the company’s GPU Technology Conference.
Jensen delivered this year’s keynote yesterday, and the focus was deep learning and artificial intelligence. NVIDIA GPUs are critical for training deep neural networks, so NVIDIA is fast becoming as much an AI company as a gaming company.
Engadget created a super-fast mashup of Jensen’s keynote, if you only have 13 minutes to catch the highlights. In particular, check out the autonomous vehicle announcements around NVIDIA Drive and Guardian Angel.
Ricardo’s lane-finding pipeline works amazingly well on the challenge video for the Advanced Lane Finding Project. He has an incredibly thorough rundown of his pipeline: calibration, undistortion, color transforms, perspective transform, lane detection, curvature, and unwarping:
“Once you know where the lines are in one frame of video, you can do a highly targeted search for them in the next frame. This is equivalent to using a customized region of interest for each frame of video, which helps to track the lanes through sharp curves and tricky conditions.”
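A sketch of that targeted search, assuming the usual representation of a lane line as a second-order polynomial x = f(y) fit in the warped binary image (the function and variable names here are mine, not Ricardo's):

```python
import numpy as np

def search_around_poly(binary_warped, prev_fit, margin=100):
    """Refit a lane line by searching only near last frame's polynomial."""
    nonzeroy, nonzerox = binary_warped.nonzero()
    # x-position of the previous fit at each candidate pixel's row.
    center = prev_fit[0] * nonzeroy**2 + prev_fit[1] * nonzeroy + prev_fit[2]
    # Keep only pixels within `margin` pixels of the previous line.
    keep = np.abs(nonzerox - center) < margin
    return np.polyfit(nonzeroy[keep], nonzerox[keep], 2)
```

If too few pixels land inside the band, pipelines typically fall back to a full sliding-window search.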
I just met Yazeed yesterday at NVIDIA’s GPU Technology Conference, and then I found his blog post today. He loves GPUs and deep learning so much he flew all the way from Saudi Arabia for the conference!
“For me it was a gamble to buy the $1,200 Titan Xp, which had been released just 17 hours before I bought it, with a promise from NVIDIA to support macOS, and I don’t have the Thunderbolt 3 port, which is the supported port for eGPU. So, like Richard Branson, I said “Screw It, Let’s Do It” and it works like a charm. Without further ado, let’s dive in.”
A while back Mez published his results on behavioral cloning with SqueezeNet, but he’s back with a super-enthusiastic blog post on his network. Only 52 parameters and 6-second epochs on a CPU!
“One good rule of thumb I developed from this project is to try and reduce the number of variables you are tuning to gain better results faster.”
Arsen has a great writeup on using a neural network to calculate both steering and throttle values for the Behavioral Cloning Project. Also, he uses early stopping to prevent overfitting the data.
“The validation set helped determine if the model was over- or underfitting. I used EarlyStopping (utils.py line 299) to stop training when validation MSE stopped improving.”
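Arsen's utils.py isn't reproduced here, but Keras's built-in callback does the same job. A minimal sketch with a toy model and stand-in data:

```python
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import EarlyStopping

# Toy stand-in: the real network maps camera images to control values.
model = models.Sequential([
    layers.Dense(64, activation="relu", input_dim=10),
    layers.Dense(2),  # [steering, throttle]
])
model.compile(optimizer="adam", loss="mse")

X = np.random.rand(1000, 10)
y = np.random.rand(1000, 2)

# Stop once validation MSE fails to improve for 3 epochs, keeping best weights.
stopper = EarlyStopping(monitor="val_loss", patience=3, restore_best_weights=True)
model.fit(X, y, validation_split=0.2, epochs=100, callbacks=[stopper])
```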
JC’s writeup of his Behavioral Cloning Project covers a really important topic — how to debug, or at least visualize, what’s going on inside a neural network.
“During the process of training, I felt very uneasy, as it is almost like a black box. Whenever the model failed to proceed at a certain spot, it was very hard to tell what went wrong. Although my model passed both tracks, the process of trial and error and meddling around with different combinations of configurations was quite frustrating.”
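One common way to crack the black box open is to look at intermediate activations. A sketch in Keras (a toy network, with a random frame standing in for a real camera image):

```python
import numpy as np
from tensorflow.keras import layers, models

# Toy stand-in; in practice, load your trained behavioral-cloning model.
model = models.Sequential([
    layers.Conv2D(16, 5, activation="relu", input_shape=(66, 200, 3)),
    layers.Conv2D(32, 5, activation="relu"),
    layers.Flatten(),
    layers.Dense(1),  # steering angle
])

# Expose every conv layer's output so one forward pass yields all feature maps.
conv_outputs = [l.output for l in model.layers if isinstance(l, layers.Conv2D)]
activation_model = models.Model(inputs=model.input, outputs=conv_outputs)

frame = np.random.rand(1, 66, 200, 3)  # stands in for a preprocessed camera frame
for act in activation_model.predict(frame, verbose=0):
    print(act.shape)  # (1, H, W, channels): plot each channel to see responses
```

Plotting those feature maps frame by frame often reveals whether the network is actually looking at lane lines or latching onto scenery.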
Mobileye banks on mapping revenue. There have been very few companies able to supply the high-definition maps that autonomous vehicles need. Mobileye plans to be one of them.
K-City will dwarf M-City. Korea is building an urban testing environment for self-driving cars. It will be almost three times bigger than the environment that Ford and the University of Michigan built.
Last week I traveled with colleagues to Sonoma Raceway for Safe Driver Training, a mandatory class for autonomous vehicle operators.
The class itself is not oriented around autonomous vehicles, but rather around how to anticipate and evade dangerous situations on the road. The logic of requiring this class for autonomous vehicle operators, I suppose, is that if you have to take over the vehicle in an emergency, hopefully you are able to anticipate and evade a collision.
The biggest lesson is to look as far ahead as possible. Sit lower in the vehicle and raise your eyes toward the horizon. Then, when performing an evasive maneuver, lock your eyes on where you want the vehicle to go. Or, as the instructors say, “Keep your eyes on safety.”
The class was a lot of fun. Several of the exercises involved negotiating tight turns at high speed, just like if an obstacle popped out at the last minute. Other exercises required us to spin out the vehicle in a tight turn, then regain control and proceed through a gate.
Here’s a practice run for one of the tight-turn exercises — the procedure gets tougher when they don’t tell you which way to turn until the last second:
Since the program is held at Sonoma Raceway, there are all sorts of cool racecars around.
Nearly all of my savings are in various index funds, but I do own stock in one individual company: Berkshire Hathaway.
It’s mostly for sentimental reasons. I went to Omaha a couple of times during business school: once for the Berkshire annual conference (“Woodstock for Capitalists”) and once to meet the Oracle himself, as part of a school trip.
I’ve known for a while that autonomous vehicles would hurt insurance, which is one big part of Berkshire’s business. The logic is that insurance companies only exist because drivers need to insure themselves against the costs of accidents. If accidents diminish, the need for insurance diminishes.
But a question at this year’s annual meeting pointed out that another big part of Berkshire’s business is highly vulnerable to autonomous vehicles: railroads.
Berkshire purchased the Burlington Northern Santa Fe (BNSF) railroad for $26.5 billion in 2010, and it’s been a good investment.
That investment will come under intense pressure from self-driving trucks, however. Once trucks can operate nearly constantly, without the cost or physical limitations of a driver, the cost advantage of transportation by rail will diminish, or maybe even disappear completely.