Tesla Model S REST API Authentication Flaws

As many of you know, APIs matter to me. I have lightbulbs that have APIs. Two months ago, I bought a car that has an API: The Tesla Model S.

For the most part, people use the Tesla REST API via the iPhone and Android mobile apps. The apps enable you to do any of the following:

  • Check on the state of battery charge
  • Muck with the climate control
  • Muck with the panoramic sunroof
  • Identify where the hell your car is and what it’s doing
  • Honk the horn
  • Open the charge port
  • Change a variety of car configuration settings
  • More stuff of a similar nature

For the purposes of this article, it’s important to note that nothing in the API can (or at least should be able to) cause an accident if someone malicious were to gain access. Having said that, there is enough here to do real economic damage, both in terms of excess electricity usage and forced wear on the batteries.

The Authentication Protocol

The authentication protocol in the Tesla REST API is flawed. Worse, it’s flawed in a way that makes no sense. Tesla ignored most conventions around API authentication and wrote their own. As much as I talk about the downsides to OAuth (a standard for authenticating consumers of REST APIs—Twitter uses it), this scenario is one that screams for its use.

Authentication happens when you call the /login action with the email address and password of the Tesla customer. This is the same email address and password used to log into www.teslamotors.com. Every customer has one because this website is where the customer builds their car.

The authentication action creates a “token” that is valid for 3 months. All further requests use that token for validation; you don’t send the email address and password again until the token expires 3 months later (assuming the application stores the token somewhere).
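As a rough sketch, the flow looks like this. The payload fields and the exact token lifetime below are my illustration of the description above, not Tesla’s actual API:

```python
import time

# Illustrative sketch of the login/token flow described above.
# Field names and the precise lifetime are assumptions, not Tesla's
# documented API.
TOKEN_LIFETIME_SECONDS = 90 * 24 * 3600  # roughly three months

def build_login_payload(email, password):
    # The /login call sends the owner's real teslamotors.com credentials.
    return {"email": email, "password": password}

def token_is_valid(issued_at, now=None):
    # Every subsequent request reuses the token until it lapses.
    now = time.time() if now is None else now
    return (now - issued_at) < TOKEN_LIFETIME_SECONDS
```

Note that the one secret being transmitted is the owner’s actual website password, not an application-specific credential.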

This model suffers from the following flaws:

  • It cannot safely operate over any channel but a trusted SSL connection (minor)
  • It requires the sharing of the user’s password with third-parties (major)
  • No mechanism exists for cataloging applications with active tokens (significant)
  • Only an inconsistent, blunt-force mechanism (changing the account password) exists for revoking the access of a compromised application (major)
  • The automated expiration of tokens in 3 months encourages applications to improperly store your email and password (significant)
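That last flaw is worth spelling out: because tokens silently expire, a third-party application that wants to keep working unattended has to hold on to the owner’s credentials so it can log in again. A sketch of that anti-pattern (all names here are illustrative, not any real application’s code):

```python
# Sketch of the anti-pattern the 3-month token expiry encourages.
# Nothing here is Tesla's real API; names are illustrative.

def fetch_token(email, password):
    """Stand-in for the real /login call, which returns a fresh token."""
    return f"token-for-{email}"

class ThirdPartyApp:
    def __init__(self, email, password):
        # To survive token expiry unattended, the app stores the owner's
        # actual website credentials -- exactly what it should never do.
        self.email = email
        self.password = password
        self.token = fetch_token(email, password)

    def on_token_expired(self):
        # Silent re-login: the user never finds out, and the password
        # now sits in this app's database waiting to be stolen.
        self.token = fetch_token(self.email, self.password)
        return self.token
```

The design doesn’t force applications to do this, but it makes credential storage the path of least resistance for any service that runs in the background.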

Potential Attack Vectors

This architectural flaw poses no immediate danger to the safety of the Model S. However, it does open up the following potential attacks:

  • You want to leverage a tool on a web site with some useful functionality. You enter your email/password. They willfully and incorrectly store that information and are subsequently compromised (or worse, they use it themselves).
  • An attacker gains access to a web site’s database of authentication tokens. The attacker then has free access to all of that site’s cars for up to 3 months, and the owners can do nothing about it short of changing their TeslaMotors.com passwords, which revokes access for all third-party applications (give or take some unspecified caching interval).

As noted above, the impact of any of these very real attack vectors is pretty much economic. I can target a site that provides value-added services to Tesla owners and force their cars to use far more electricity than necessary, shortening their battery lives dramatically. I can also honk their horns, flash their lights, and open and close the sunroof. While none of this is catastrophic, it can certainly be surprising and distracting while someone is driving (though not all functions are supported remotely while the car is in motion).

Perhaps the scariest bit is that the API could be used to track your every move.

The core issue, however, isn’t how bad an attack could be as a result of these specific flaws. Instead, I’m commenting on the larger picture of the Internet of Things, in which everything has an API and everything needs to be secured reasonably. I don’t think the Tesla software engineers have given the security of the REST API its proper due, and I see a common theme among Internet-connected “things” (the Hue being a good example) of not thinking through the security impacts of what they are doing.

What’s truly stupid here is that there was no reason for Tesla to write their own API authentication. There are, IMHO, two common models for API authentication:

  • User-to-system (as is the case with the Tesla API)
  • System-to-system

OAuth is the proper authentication mechanism for User-to-System authentication, and it’s truly unfortunate that Tesla ignored it.
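To make the contrast concrete, here is a toy, in-memory model of what OAuth-style per-application tokens provide and the current scheme doesn’t: a catalog of authorized applications and targeted revocation. This illustrates the concept only; it is not a real OAuth implementation, and the app names are made up:

```python
import secrets

# Toy model of OAuth-style per-application tokens: each app gets its
# own credential, so one can be revoked without touching the rest.
# Conceptual illustration only -- not a real OAuth implementation.
class TokenRegistry:
    def __init__(self):
        self._tokens = {}  # token -> application name

    def grant(self, app_name):
        """User approves an app; it never sees the account password."""
        token = secrets.token_hex(16)
        self._tokens[token] = app_name
        return token

    def active_apps(self):
        """The catalog of applications with active tokens Tesla lacks."""
        return sorted(set(self._tokens.values()))

    def revoke_app(self, app_name):
        """Targeted revocation: cut off one compromised app, keep the rest."""
        self._tokens = {t: a for t, a in self._tokens.items() if a != app_name}

    def is_valid(self, token):
        return token in self._tokens
```

With this model, the blunt-force password change disappears: a compromised application loses access, every other application keeps working, and the owner can see exactly who holds a token.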

Other Notes

  1. Applications don’t talk directly to the car. They talk via a web portal that processes requests and sends them on to the car. As far as I can tell (with very limited visibility), Tesla keeps a very strong security separation among the various operational components of the car.
  2. This is NOT about Tesla in the end. It’s about how we should be approaching API design in a world in which everything has an API. Hint: it starts with security and ends with functionality, not the reverse, which is how Tesla, Hue, and most vendors are doing it today.
  3. Hue has a much easier-to-exploit flaw, but all an attacker can do is turn lights on and off and change their colors.


People have gotten a variety of inconsistent results from testing token revocation. Because it clearly does work at least some of the time, I’ve made some changes to the original article, noted above. My best guess is that the inconsistent results stem from some kind of caching mechanism, but that doesn’t explain why, in some cases, tokens have remained active for hours or days beyond a password change.

Either way, brute-force removal of access for all applications isn’t the proper way to handle revocation for connected devices, and it does nothing to address the other issues.
