
Humans Not Allowed

March 28, 2017

Copyright Li Jiang, still a human, for now.

In 2015, computer scientist Jerry Kaplan wrote a book titled Humans Need Not Apply.

At that time, it provoked debate among technologists and policy makers alike about the rate at which technology may make humans obsolete.

Yet not even two years later, I propose that Jerry didn’t go far enough.

What if the headline of our future isn’t “Humans Need Not Apply” but “Humans Not Allowed”?

At least in saying Humans Need Not Apply, we leave the door ajar for people to make a choice, even if suboptimal, to hire humans to keep doing their jobs.

But what if people were just banned from doing certain jobs?

Here is a case in point. Drumroll please…

You knew I was going to pick Tesla.

On October 9, 2014, the company introduced Tesla Autopilot as part of the $2,500 “Tech Package” option. Since then, Tesla has released a number of hardware and software updates, most notably the “Hardware 2” update in October 2016 that allows for fully autonomous operation (SAE Level 5).

Tesla, Elon, and the media announced several milestones for the number of miles driven by Tesla Autopilot throughout 2016, including [0]:

May 2016: 100 million miles

August 2016: 140 million miles

October 2016: 222 million miles

November 2016: 300 million miles

In this period, Tesla has seen 1 fatality, which means Autopilot already beats the U.S. average of 1 fatality per 94 million miles driven.

Let’s Run The Math

Let’s assume that by January 1, 2017, Tesla had logged 400 million miles on Autopilot based on the data we have from 2016. The number of Autopilot miles added per day has been reported to be north of 1 million.

We know that Tesla delivered 50,580 cars in 2015 and 76,230 cars in 2016 for a total of 126,810 cars. Let’s conservatively estimate that there are 100,000 Autopilot vehicles in Tesla’s fleet.

I assume the average Autonomous Miles per car per day is roughly 12.5 miles, meaning Tesla is adding 1.25 million miles of Autopilot data per day.

Again, these assumptions are pretty conservative and Elon is going to tweet about how he disagrees with this.

But let’s play this out. The number of Autonomous Vehicles is going to increase dramatically as Tesla pushes to hit their 500,000 car production goal for 2018. The number of Autonomous Miles per car per day is likely to increase as well as people get more comfortable using it.

By March 31, 2017, I estimate Tesla is adding 1.75 million Autopilot Miles per day, with a total of 533 million autonomous miles driven.

I’ve assumed Tesla will deliver 185,000 vehicles in calendar year 2017, which is more than double its 2016 production. By the end of 2017, Tesla will be adding 4.76 million Autopilot Miles per day, with 1.37 BILLION cumulative Autopilot miles driven. If Tesla can be fatality free in 2017, it will be 14.5 TIMES safer than a human driver (1.37 billion over 94 million).

If Tesla comes close to its 500,000 vehicle goal in 2018 (I’ve assumed 405,000), it will be adding 14.4 million Autopilot Miles per day by 12/31/2018 with 4.61 BILLION cumulative Autopilot Miles driven.
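
If you want to sanity-check the arithmetic, here is a rough back-of-the-envelope sketch in Python. The fleet sizes and per-car daily Autopilot mileage below are loose assumptions I’ve chosen so the outputs land near the figures above; they are not the exact inputs to my spreadsheet.

```python
# Rough sketch of the projection above. Fleet sizes and per-car daily
# Autopilot mileage are loose assumptions chosen to land near the figures
# in this post, not the exact spreadsheet inputs.

US_MILES_PER_FATALITY = 94_000_000  # U.S. average: ~1 fatality per 94M miles driven

def daily_autopilot_miles(autopilot_fleet, miles_per_car_per_day):
    """Autopilot miles added per day across the whole fleet."""
    return autopilot_fleet * miles_per_car_per_day

# Start of 2017: ~100,000 Autopilot-equipped cars at ~12.5 miles/day each.
print(daily_autopilot_miles(100_000, 12.5))   # 1,250,000 miles/day

# End of 2017: ~185,000 more deliveries, heavier per-car usage (~16.7 mi/day).
print(daily_autopilot_miles(285_000, 16.7))   # ~4.76 million miles/day

# End of 2018: ~405,000 more deliveries, ~21 miles/day per car.
print(daily_autopilot_miles(690_000, 20.9))   # ~14.4 million miles/day

# ~1.37 billion fatality-free miles by the end of 2017 versus the U.S.
# average of one fatality per 94 million miles:
print(1.37e9 / US_MILES_PER_FATALITY)         # ~14.6, i.e. roughly 14.5x safer
```

The point of the sketch is only to show how the daily rate scales with fleet size and per-car usage; the exact per-car figure matters far less than the direction it is heading.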

See my model here.

So What?

If my model is “in the ballpark”, then by the end of 2018, every 6.5 days that the Tesla Autopilot fleet goes without a fatality will be the equivalent of saving a life compared to human drivers.[1]

So with this data in hand, couldn’t Tesla present a persuasive argument to regulators that Autopilot should be legal in all states in the land? But let’s extrapolate this one more step.

I should add a note here to explain why Tesla is deploying partial autonomy now, rather than waiting until some point in the future. The most important reason is that, when used correctly, it is already significantly safer than a person driving by themselves and it would therefore be morally reprehensible to delay release simply for fear of bad press or some mercantile calculation of legal liability.

— Elon Musk

If Tesla could generate data that demonstrates Autopilot being 5x safer than the average human driver, or 10x safer, or 20x safer, could regulators in 2018/2019 decide that it would be “morally reprehensible” to actually let humans drive?

What about people with DUIs or other accident records? Would it be “morally reprehensible” not to require these drivers to purchase a self-driving car if their financial circumstances allow for it? Would it be “morally reprehensible” not to require trucking fleets to use autonomous drivers instead?

Forward

This is just one area where it feels like “oh, that’s really far off,” but we are very close to the “tipping point” where we will have generated enough data to prove that machines are 2x, 5x, 10x, 20x safer than humans at performing a job. Whether it is driving, working in manufacturing, or diagnosing diseases and prescribing treatments, each will reach its own “tipping point” where reasonable people and regulators may be compelled to come to the inevitable conclusion of “Humans Not Allowed”.

Notes

[0] I’m using the total number of miles driven by Autopilot. There are other data points, including the number of miles driven by cars equipped with Autopilot and the total number of miles driven by the Tesla fleet. I’m using the more conservative (and probably most accurate) data point: miles for which Autopilot is actively engaged and driving the vehicle.

[1] A little more math in case that was confusing. Every day Tesla will be cranking out 14.4 million miles. In 6.5 days the fleet will drive 94 million miles, the U.S. average miles driven per fatality.
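
A quick check of that arithmetic, under the same assumptions as the sketch above:

```python
DAILY_AUTOPILOT_MILES = 14_400_000   # projected fleet-wide rate at the end of 2018
US_MILES_PER_FATALITY = 94_000_000   # U.S. average miles driven per fatality

# Days the Autopilot fleet needs to cover one fatality's worth of average miles.
print(US_MILES_PER_FATALITY / DAILY_AUTOPILOT_MILES)  # ~6.5 days
```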
