
Tesla owner hacks Autopilot’s debugging mode – giving insights into the back-end of Tesla’s semi-autonomous system

At this point, 8 years after Google put a spotlight on self-driving technology, there are over two dozen somewhat serious companies with autonomous driving programs at different stages of development.

Tesla’s Autopilot is among the most well-known and arguably one of the most exciting, since it is already powering features in vehicles owned by customers. For better or worse, it lets owners experiment with some aspects of the system, and through those experiments we now get a look at Autopilot’s debugging mode – giving insights into the back-end of Tesla’s semi-autonomous system.

Tesla’s second generation Autopilot is quite complex, but in short, it consists of a computer vision system called Tesla Vision that uses images fed from the 8 cameras around the vehicle (currently mainly the 3 front-facing cameras) to steer the vehicle with the help of GPS and radar data.
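To make those inputs a little more concrete, here is a minimal Python sketch of the kind of per-tick sensor bundle such a fusion stack would consume. All names and types here are hypothetical – Tesla’s internal interfaces are not public – this is only an illustration of the inputs described above.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SensorFrame:
    """One tick of the inputs a Tesla Vision-style stack would fuse (illustrative only)."""
    camera_images: List[bytes]                # 8 camera frames as raw image buffers
    radar_targets: List[Tuple[float, float]]  # (range in m, relative speed in m/s) per detected object
    gps_fix: Tuple[float, float]              # (latitude, longitude) in degrees

def front_facing(frame: SensorFrame) -> List[bytes]:
    """Return the 3 front-facing camera frames, assuming they are stored first in the list."""
    return frame.camera_images[:3]
```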

With the data gathered through its entire fleet, Tesla is also building “high-precision maps”, and its vehicles can download “tiles” based on their location and use them to steer themselves more accurately.
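Tesla hasn’t published how its tiles are indexed, but the general idea of turning a GPS fix into a downloadable tile is well established. Here is a short Python sketch using the standard Web Mercator “slippy map” convention – not Tesla’s actual scheme – purely to illustrate the concept:

```python
import math

def latlon_to_tile(lat_deg: float, lon_deg: float, zoom: int):
    """Convert a GPS fix into an (x, y) tile index at the given zoom level,
    using the standard Web Mercator 'slippy map' scheme (not Tesla's actual format)."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# Example: which tile a car near Palo Alto, CA would request at zoom level 16
print(latlon_to_tile(37.4419, -122.1430, 16))
```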

At any given time, Autopilot operates on one of these inputs or on a fusion of them. The Tesla Vision system can also steer by either following a lead vehicle or detecting lane markings.
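As a rough illustration of that kind of arbitration, here is a hedged Python sketch of how a system might pick which signal to steer from. The thresholds, priorities, and labels are invented for the example and only loosely mirror what the debug overlay reports:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PerceptionState:
    lane_confidence: float                   # 0..1, how well lane markings are currently tracked
    lead_vehicle_distance: Optional[float]   # metres to the car ahead, or None if no lead vehicle
    map_tile_available: bool                 # high-precision map data loaded for this stretch of road

def pick_steering_source(state: PerceptionState) -> str:
    """Pick which input to steer from; values and priorities are illustrative, not Tesla's."""
    if state.lane_confidence > 0.7:
        return "lane_markings"
    if state.lead_vehicle_distance is not None and state.lead_vehicle_distance < 80.0:
        return "lead_vehicle"
    if state.map_tile_available:
        return "map_tile"
    return "driver_takeover"  # no reliable source: hand control back to the driver

print(pick_steering_source(PerceptionState(0.9, 35.0, True)))  # -> lane_markings
```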

Tesla’s Autopilot debug mode, which Tesla Motors Club member ‘verygreen’ managed to hack, tells us exactly which of those inputs the system is using to make its decisions. He posted his latest discoveries from the system in an interesting thread on the forum.

It shows some Autopilot settings currently unavailable to Tesla owners (picture credits to ‘verygreen’):

Of course, ‘Augmented Vision’ caught everyone’s attention, especially after all the talk about heads-up displays, but the options in the tab don’t tell us a lot about it:

verygreen noted that it should “be displaying a video feed of some sort”, but he can’t make it work on his car.

As we previously noted in reports about Tesla owners hacking their vehicles, Tesla has a single software build that it pushes to all of its vehicles, which is then limited on the user’s end. For example, a development vehicle in Tesla’s internal fleet could have the same software build as verygreen’s but with access to the functions that he is seeing in the debug mode.

When driving with the debug mode, he can see in real-time the information that Autopilot is using, like the GPS data and map tiles:

He even posted a video of the debug mode as he was driving his Model S. You can see what the Autopilot is seeing in real-time:

 

