BMW is turning its focus to self-driving cars. “The youngest head of a major carmaker, Krueger is part of a generational shift that’s now looking for ways to respond to new challengers such as Apple Inc. and Google, which the BMW CEO on Monday described as competitors.”
Arrowstreet is a Boston-based architectural firm that specializes in parking garage design.
So they’ve naturally been thinking about what autonomous vehicles mean for the future of parking garages.
Their hypothesis is that garages will evolve in two stages.
The first stage involves modifying conventional parking garages to support both conventional and self-driving cars. Conventional cars would park closer to pedestrian entrances, to minimize effort for drivers, while autonomous vehicles would park themselves further back (or higher up) in the garage.
The second stage of development will be garages oriented completely towards autonomous vehicles. These garages will have very tight parking in a restricted area, with vehicles driving out to specified zones for passenger retrieval.
Arrowstreet even thinks that conventional garages can be retrofitted as residential and commercial space once parking is no longer a necessity.
NPR has a short interview with Chris Urmson, technical director of Google’s self-driving car project.
The interview focuses on whether human drivers should be able to take over from the computer or not.
Urmson has a neat analogy I hadn’t heard before:
You wouldn’t imagine that in the back of a taxi, we put an extra steering wheel or brake pedal there for the passenger to grab ahold of anytime. It would just be crazy to think about doing that.
Interestingly, Urmson notes that Google might allow human drivers to take control of the car from a standing start, because people might enjoy driving on the weekend.
The Motley Fool reports that Tesla has hired away two top-notch chip designers from Apple, and also that Elon Musk is being coy about whether Tesla wants to design its own chips.
The Motley Fool concludes that this is insane and chip-making is not Tesla’s business.
That seems about right to me, but one question is whether Tesla hired these chip designers as carrots or sticks.
The carrot approach is that, by having amazing chip designers on staff, Tesla can better work with NVIDIA and other manufacturers, guiding product development.
The stick approach is that Tesla might like to credibly pressure chip manufacturers, as a means to getting what it wants.
For years, Stanford’s Chris Gerdes has been working with students to build a self-driving race car.
The car recently hit speeds of 120mph at Thunderhill Raceway in Willows, California, and the video shows what it looks like to have a car weave around a track with nobody at the wheel.
Of course, a racetrack lacks many of the variables and obstacles that cars encounter in real life. But raw performance is important, particularly since I dream of one day commuting in self-driving cars at 300mph 🙂
Google has accepted “some responsibility” for its first at-fault accident in autonomous mode.
We clearly bear some responsibility, because if our car hadn’t moved, there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.
Google’s self-driving car has been in a number of accidents over the years, but none were the fault of the autonomous driving software. The accidents all either occurred when the vehicle was in “human-driver” mode or were the fault of the driver of a different vehicle (Google’s cars have been rear-ended several times).
On Valentine’s Day, however, Google filed an accident report for what may be the first accident in which the self-driving car software was at fault.
This was a very minor accident with no injuries, and it’s not completely clear from Google’s self-report who was at fault, although it seems like the Google car was. I would be curious to see how the insurance companies involved parcel out blame.
From the report:
A Google Lexus-model autonomous vehicle (“Google AV”) was traveling in autonomous mode eastbound on El Camino Real in Mountain View in the far right-hand lane approaching the Castro St. intersection. As the Google AV approached the intersection, it signaled its intent to make a right turn on red onto Castro St. The Google AV then moved to the right-hand side of the lane to pass traffic in the same lane that was stopped at the intersection and proceeding straight. However, the Google AV had to come to a stop and go around sandbags positioned around a storm drain that were blocking its path. When the light turned green, traffic in the lane continued past the Google AV. After a few cars had passed, the Google AV began to proceed back into the center of the lane to pass the sandbags. A public transit bus was approaching from behind. The Google AV test driver saw the bus approaching in the left side mirror but believed the bus would stop or slow to allow the Google AV to continue. Approximately three seconds later, as the Google AV was re-entering the center of the lane it made contact with the side of the bus. The Google AV was operating in autonomous mode and traveling less than 2 mph, and the bus was travelling at about 15 mph at the time of contact.
The Google AV sustained body damage to the left front fender, the left front wheel and one of its driver-side sensors. There were no injuries reported at the scene.
Automotive News has a short and fun interview with Raj Nair (whose LinkedIn title is “Executive Vice President and Chief Technical Officer — Global Product Development at Ford Motor Company”).
The write-up is mercifully concise. Here’s a highlight:
The human body is an amazing array of sensors — two great optical sensors, auditory sensors and balance sensors that provide information that the brain doesn’t only perceive but also filters.
To reproduce all of that with a combination of lidar [a kind of radar based on laser beams], radar and ultrasonic sensors is a big challenge. Then there are the algorithms. They are reasonably straightforward for the basic aspects of driving. If you can see the white lines it’s not that hard to steer the vehicle between them. But for all the other things that your mind works through when driving, you need to be prepared for all of them. This increases the level of sensor capability processing you need.