More companies are trying to bring self-driving cars to the masses than
ever before. Yet a truly autonomous vehicle still doesn't exist.
And it's not clear if, or when, our driverless future will arrive.
Proponents like Elon Musk have touted aggressive timelines but missed their goals, and others in the industry have also missed their projections.
Well, our goal is to deploy these vehicles in 2019.
So you'll have the option to not drive.
It's not happening in 2020.
It's happening today. We wanted to check in.
Where exactly are we with self-driving cars?
And when can we expect them to be part of our daily lives?
The current state of driverless cars is very interesting because we've
passed what people refer to as peak hype and we've entered what's called
the trough of disillusionment.
Which is, even people within the industry are saying, gee, it turns out it's a lot harder than we thought.
We're definitely not anywhere near as far along as a lot of people thought
we would be three years ago.
But I think over the last 18 to 24 months, there's been a real injection
of reality. There was a sense maybe a year or two ago that our algorithms
are so good, we're ready to launch, we're gonna launch driverless cars any
minute. And then obviously there's been these setbacks of people getting
killed or accidents happening and now we're a lot more cautious.
Several big players have begun to walk back their predictions on how soon
we could see this technology.
Even Waymo's Chief External Officer admitted that the hype around its
self-driving cars has become unmanageable.
The technology has come a long way, but there's still a lot of work to be
done. There's the perception, which is, using the sensors to figure out
what's around the vehicle, in the environment around the vehicle.
Prediction, figuring out what those road users are going to be doing next
in the next few seconds.
Turns out the perception and especially prediction are really, really hard problems to solve.
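To make that perception-then-prediction split concrete, here is a minimal, purely illustrative Python sketch of how such a stack is often layered; every name in it is hypothetical and it is not any company's actual software.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical, simplified types -- real stacks track far richer state.
@dataclass
class DetectedObject:
    kind: str                      # e.g. "pedestrian", "cyclist", "car"
    position: Tuple[float, float]  # meters, in the vehicle's frame
    velocity: Tuple[float, float]  # meters per second

@dataclass
class PredictedPath:
    obj: DetectedObject
    future_positions: List[Tuple[float, float]]  # the next few seconds

def perceive(sensor_frames: dict) -> List[DetectedObject]:
    """Perception: fuse camera, lidar and radar frames into a list of
    objects around the vehicle. Stubbed out; stands in for trained models."""
    raise NotImplementedError

def predict(objects: List[DetectedObject], horizon_s: float = 3.0,
            dt: float = 0.5) -> List[PredictedPath]:
    """Prediction: estimate where each road user will be over the next
    few seconds. A constant-velocity rollout is the simplest baseline;
    real systems use learned, interaction-aware models."""
    steps = int(horizon_s / dt)
    paths = []
    for obj in objects:
        pts = [(obj.position[0] + obj.velocity[0] * dt * i,
                obj.position[1] + obj.velocity[1] * dt * i)
               for i in range(1, steps + 1)]
        paths.append(PredictedPath(obj, pts))
    return paths
```

A planning layer would then consume those predicted paths to choose a safe trajectory, which is where much of the remaining difficulty sits.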
Companies tackling self-driving today are taking two general approaches. Some are building a self-driving car from the ground up. Others are developing the brains that drive the car.
An early leader was Google, which started its self-driving car project in
2009. Known as Waymo today, the company is developing hardware and
software that can function as the brains in a self-driving car.
Aurora is taking a similar approach.
Founded in 2017 by early players from Uber, Tesla and Google's
self-driving initiatives, it's already raised $620 million in funding from
Amazon and other big-name investors.
Aurora is testing vehicles on the road in Pittsburgh, Pennsylvania and out
here in the Bay Area. We don't yet let the public in our cars.
Our cars are on the road, we have two of our test operators in there.
The technology we're building can operate from a compact electric car, to
a minivan, to even a big, long haul truck.
Argo AI and Aptiv are examples of other companies taking a similar
approach. Lyft is developing its own self-driving systems now too and
offering self-driving rides on its app through partnerships in select
areas. Self-driving is too big for just one company and one effort.
And if you look at our strategy, that is why we're working with partners
on the open platform, Aptiv and Waymo, and why we're building the tech
here. Companies like Tesla, Zoox and GM, with its Cruise division, are making their own vehicles, aiming for self-driving cars that can operate in all environments.
This is the engineering challenge of our generation.
We've raised seven and a quarter billion dollars of capital.
We have deep integration with both General Motors and Honda, which we
think is central when you're building mission critical safety systems and
building those in a way that you can deploy them at very large scale.
Cruise, which was acquired by General Motors in 2016, has been testing its
fleet of vehicles in San Francisco with safety drivers onboard.
To give you a sense for the magnitude of the difference between suburban driving and what we're doing every day on the streets of San Francisco: our cars on average see more activity in one minute of San Francisco driving than they see in one hour of driving in Arizona.
Zoox, led by the former chief strategy officer at Intel, is working on
creating an all-in-one self-driving taxi system with plans to launch in
2020. Instead of retrofitting cars with sensors and computers and saying, hey, here's a self-driving car, we think there's an opportunity to create a new type of vehicle that from the very beginning was designed to move people around autonomously.
Nissan and Tesla both have semi-autonomous systems on the roads today.
Tesla's has been available in beta on its vehicles since 2015 and drivers
have been known to use the current system hands-free.
Tesla is promising that its full self-driving software is just around the corner.
It's going to be tight, but it still does appear that we'll be at least in
limited, in early access release, of a feature complete full self-driving
feature this year. I think Tesla is actually a lot further back than they
would like the world to believe they are because they are, in fact, so
much more limited in terms of their hardware.
Others are making self-driving shuttles that operate along designated
routes only or focusing on trucks with long haul highway routes.
And then there are companies like Ghost and Comma.ai
working on aftermarket kits.
Essentially hardware that could be installed in older cars to bring them
new self-driving capabilities one day.
For all players in this space, the path ahead is filled with challenges.
Chief among them, proving the technology is safe.
Driverless systems have to meet a very high safety bar that has to be
better than a human before they're deployed at scale.
There are no federally established standards or testing protocols for
automated driving systems in the U.S.
today, but there have been fatal crashes.
A woman named Elaine Herzberg was killed by an autonomous Uber with a
safety driver who was paying no attention.
This woman was crossing the street, walking her bicycle, should easily
have been seen by the autonomous vehicle, was not, was run over.
Nobody stepped on the brakes.
In 2016, a Tesla fan named Joshua Brown died in a crash while using Autopilot hands-free in Florida. Other Autopilot-involved accidents are now under investigation.
Still, the industry is hopeful that autonomous vehicles will make the
roads far safer than they are today.
Really, the kind of zero to one moment for the industry will be when we
can remove those safety drivers safely and the vehicle can operate without
the presence of any human. Others, like Elon Musk, have said it's almost
irresponsible not to have these vehicles out there because they are safer
and will be safer than human drivers.
Even if we could say that an autonomous vehicle was better than a human
driver, it doesn't mean that an autonomous vehicle is better than a human
driver plus all of the advanced driver assist systems we have.
When looking at when the tech could actually be ready, one of the principal metrics touted by companies is the number of miles driven, but not all miles are created equal when testing automated systems.
You could take an autonomous vehicle and go, put it on an oval track or
just a straight road, and you could drive 100 million miles.
But that's not really gonna tell you much about how well the system
actually functions because it's not encountering the kinds of things that
are actually challenging in a driving environment.
Testing self-driving vehicles out on public roads isn't enough.
They need to be exposed to every imaginable scenario, so companies rely on
simulation. We can create situations that we're basically never going to
see or very rarely see.
So, for example, we might want to simulate what happens as a bicycle comes
through an intersection, runs a red light and crashes into the side of our
car. Turns out that doesn't happen very often in the real world, but we want to know that if that happens, our vehicles are going to do something safe. Simulation basically allows the car to practice up in the cloud instead of on the road.
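To illustrate that kind of scenario-based testing, here is a small, hypothetical Python sketch of defining one rare event, a cyclist running a red light, and sweeping variations of it in a toy simulator; the names and the physics are stand-ins, not a real simulator's API.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """A hypothetical rare-event test case for a driving simulator."""
    name: str
    ego_speed_mps: float      # the autonomous vehicle's initial speed
    cyclist_speed_mps: float  # the cyclist crossing against the light
    cyclist_offset_m: float   # how far from the conflict point the cyclist starts

def simulate(scenario: Scenario) -> dict:
    """Stand-in for a full physics and behavior simulator: a real one
    would run the whole driving stack against the scenario."""
    # Toy check: can the ego vehicle stop before the cyclist reaches the
    # conflict point? Assume roughly 6 m/s^2 of hard braking.
    time_to_conflict = scenario.cyclist_offset_m / scenario.cyclist_speed_mps
    time_to_stop = scenario.ego_speed_mps / 6.0
    return {"braked_in_time": time_to_stop <= time_to_conflict}

# Sweep many variations of the same rare event "in the cloud".
for offset in (5.0, 10.0, 20.0):
    case = Scenario("cyclist_runs_red_light", ego_speed_mps=12.0,
                    cyclist_speed_mps=5.0, cyclist_offset_m=offset)
    outcome = simulate(case)
    print(case.name, offset, "safe" if outcome["braked_in_time"] else "needs work")
```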
When you're testing autonomous vehicles out on public roads, not only are the people riding in that car part of the experiment, but so is everybody else around you. And they didn't consent to being part of an experiment. I remain concerned that humans will be used as test dummies.
Instead of self-certification and deregulation, I want to see strong, independent safety regulations from the agencies in front of us today.
The self-certification approach did not work out well for the Boeing 737
Max 8 and now Boeing is paying the price.
We should heed that lesson when it comes to finding out the best way to
deploy autonomous vehicles.
Lawmakers held hearings this month to figure out how to keep the public
safe without holding back self-driving innovation.
In September, the National Highway Traffic Safety Administration released
new federal guidelines for automated driving systems.
But they're only voluntary suggestions at this point.
State legislation is further along.
As of October, 41 states have either enacted laws or signed executive
orders regulating autonomous vehicles.
With regulatory questions looming, it's no surprise that self-driving
companies are proceeding cautiously at first.
What we're going to be seeing in the next several years is more limited
deployments in very specific areas where there's confidence that the
technology can work. I think we'll see limited deployments of self-driving
vehicles in the next five years or so.
You'll see these moving goods and you'll see them moving people, but
you'll see them specifically in fleet applications.
Aurora says its systems could be integrated into any vehicle, from fleets
of taxis to long haul trucks.
The cost of self-driving technology is another deciding factor for how it
will be deployed. Most consumers are never going to own a vehicle that's
really autonomous because the technology is expensive and there's a whole
raft of issues around product liability and making sure that it's properly
maintained and sensors are calibrated.
That's one reason ride-hailing companies Lyft and Uber are getting in the game. We have two autonomous initiatives.
One is the open platform where we're connecting Lyft passengers with our
partner self-driving vehicles.
And so this is Aptiv in Las Vegas and Waymo in Chandler, Arizona.
And then also kind of the product experience for the tech that you see
here, which is Level 5. As AV companies inch toward the mainstream, public perception, or simply how well people understand the tech, has become another issue that could impact progress.
Some in particular in the industry have done a disservice to the public in
overhyping the technology before it's really ready.
It's still not very clear to most people what we mean when we say
driverless car. Waymo and General Motors Cruise Automation are very
close to having what they referred to as level five cars most of the time.
In other words, again, they can in theory function all by themselves.
But so far, it seems that they function like a 15-year-old driver hoping to get a driver's license.
There's a lot of people who think that you can buy autonomous vehicles
today, especially when you can go out and buy a car, buy an option that's
called full self-driving and pay for that.
You expect that it actually exists.
And the fact is, it does not exist today.
With an uncertain timeline and a history of missed targets, public
confusion is no surprise.
Despite big developments, most companies have recognized we are still
years away from having truly self-driving cars as part of our daily lives.
One big question is when is the car ready?
You have to have a good sense of all of the scenarios and all of the
situations that the vehicle will need to encounter.
And that just takes time.
We expect level four vehicles to be feasible in small quantities within
the next five years.
And what that means is you'll probably see hundreds or maybe thousands of
vehicles out either delivering packages or moving people through
neighborhoods or maybe hauling goods on our freeways.
And now, even the experts hesitate to make promises on when true
self-driving will get here.
You always have to assume that the user is going to find a way to misuse
the technology. Assume the worst and then design for that.
I think it's a mistake to be over promoting the technology, over hyping it
when it's still very much a work in progress.
This is something we need to do with society, with the community and not
at society. And we take that very seriously.
We're building mission critical safety systems that are going to have a
huge positive impact on people's lives.
And the tech adage of move fast and break things most assuredly does not
apply to what we're doing here.