Bug bounty alert: Musk lets pro hackers torpedo Tesla firmware risk free

Tesla will allow vetted security researchers to hunt for vulnerabilities in its vehicle firmware risk free – as long as it is done under its now-tweaked bug bounty program. The luxury electric automaker said this week it will reflash the firmware on cars that have been bricked by infosec bods probing for exploitable bugs in …

  1. Steve Aubrey
    Meh

    Easy-peasy!

    I have mixed emotions about this. At one level - yes, great, providing some level of protection is good.

    At another level, I'm not sure I'd want to brick my car and still face towing charges in the *hope* that they can fix the problem. For a limited number of occurrences.

    And at the humor level, what a great opportunity to have the black-hat hackers provide their information in advance!!

  2. Anonymous Coward
    Anonymous Coward

    Elon Musk

    Putting the pedo into Torpedo.

  3. Big Al 23

    Another lawsuit filed against Tesla for autopilot crash

    According to Reuters, another Tesla Model S customer has filed suit against Tesla over a crash blamed on Autopilot. According to the NHTSA, which is investigating multiple Tesla Autopilot deaths, the vehicles actually sped up just prior to crashing in several instances. In the latest suit the woman was seriously injured but not killed.

    1. Giovani Tapini

      Re: Another lawsuit filed against Tesla for autopilot crash

      This doesn't sound like a software security issue, though; more an undocumented feature.

      This idea of formally enabling researchers, within reason, does sound a good idea in principle. If it works as intended then I hope other corporates are brave enough to do the same. I assume most of the Tesla software is their own, though. I suspect most other brands have bought in almost every element, making the "fixing" process commercially complex within any useful timescale.

    2. My-Handle

      Re: Another lawsuit filed against Tesla for autopilot crash

      @Big Al 23

      The latest incident I heard of that matches what you say is this:

      https://arstechnica.com/tech-policy/2018/09/tesla-hit-with-lawsuit-after-woman-crashed-her-model-s-while-in-autopilot/

      I assume (perhaps incorrectly) that this is the incident you are referring to. The woman broke her ankle, and caused the accident because she was using her phone and rear-ended a fire engine. She took no action to prevent this, in the apparently mistaken belief that the car would do the braking for her. I believe the police reprimanded her for the US equivalent of driving without due care and attention.

      If I've got the wrong incident, can you provide a link to the Reuters article?

      1. Jellied Eel Silver badge

        Re: Another lawsuit filed against Tesla for autopilot crash

        The car did do the breaking for her foot... But the Utah case and others sound like a combination of driving without due care and attention, due to an over-reliance on the 'autopilot', which may have some order-of-operations issues. Utah and some others seem to have been due to cruise control following other vehicles: car ahead changes lane, Tesla speeds up and drives into an obstruction.

        That seems to be a large risk with 'smart' cars and dumb drivers. So figure on a worst-case example of 'smart' car tailgating a car with all-wheel steering that makes a sudden lane change to avoid <something>. If the vehicle or driver following isn't paying attention, there's a collision. I guess it'd be possible to add (or legislate) warnings based on how far ahead a vehicle's sensors can see & detect obstructions and the vehicle's speed.

        But figuring out the rules used by 'autopilot' software could be an interesting part of Tesla security testing. And also if it's possible to create your own system backup so you could reflash your motor in the event of an OTA update failing, or Tesla's systems becoming unavailable due to litigation..

    3. Brangdon

      Re: Another lawsuit filed against Tesla for autopilot crash

      Speeding up is expected behaviour if the car in front changes lanes to reveal open highway. The cruise control had been slowed by the car in front, and when that car is gone it accelerates back to its set speed.

      Sometimes that behaviour interacts with a heuristic about ignoring stationary obstacles. Apparently when the car is moving fast, a lot of cruise control algorithms assume anything stationary is probably clutter, because if it was an obstacle it wouldn't be on the highway.
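      The heuristic described above can be sketched roughly as follows. This is purely illustrative pseudologic, not Tesla's actual code: the threshold value and function names are invented for the example.

      ```python
      # Rough sketch of the adaptive-cruise-control behaviour described above.
      # The 50 mph "stationary clutter" threshold is an invented assumption.
      STATIONARY_FILTER_SPEED = 50

      def target_speed(set_speed, ego_speed, obstacle_speed=None):
          """Return the speed (mph) the cruise control will aim for.

          obstacle_speed is the speed of the nearest object ahead,
          or None if the lane is clear.
          """
          if obstacle_speed is None:
              # Lane clear (e.g. the car in front just changed lanes):
              # resume the driver's set speed.
              return set_speed
          if ego_speed > STATIONARY_FILTER_SPEED and obstacle_speed < 1:
              # Heuristic: at highway speed, a stationary radar return is
              # assumed to be clutter (sign, bridge, parked car off-lane)
              # and ignored - the dangerous case discussed in this thread.
              return set_speed
          # Otherwise match the slower vehicle ahead.
          return min(set_speed, obstacle_speed)
      ```

      Under these assumptions, with a set speed of 70 and a stationary fire engine ahead at highway speed, the object is filtered out as clutter and the car accelerates back towards 70 – exactly the failure mode described above.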

  4. John Smith 19 Gold badge
    Unhappy

    "Those who want to be enrolled in the research program will need to contact Tesla directly"

    And have a Tesla of course.

    Which sort of raises the entry fee a bit.

    1. Wellyboot Silver badge

      Re: "Those who want to be enrolled in the research program will need to contact Tesla directly"

      Existing Tesla owners could set up a guinea pig owners club in the spirit of helpfulness!

      Because they've all bought the cars to help save the planet (I'm told)

    2. Elmer Phud Silver badge

      Re: "Those who want to be enrolled in the research program will need to contact Tesla directly"

      " owners will need to register both themselves and their cars"

      Reading is soooo last week . . .

      1. Anonymous Coward
        Anonymous Coward

        Re: "Those who want to be enrolled in the research program will need to contact Tesla directly"

        Reading is soooo last week . . .

        And Slough is soooo last century. Shitholes both.

  5. Anonymous Coward
    Anonymous Coward

    No “pedos” though...

    ^ see title.

  6. /\/\j17

    Musk lets pro hackers torpedo Tesla firmware risk free*

    * Well, other than the risk of him randomly branding you a paedophile on Twitter and having a child bride (who's in her 40s).

  7. regregular

    It looks good on the surface, but it seems to be mainly a shoddy compromise for PR purposes.

    So, you can enroll IF you happen to have a Tesla and are a known security researcher. Put that into a Venn diagram and how many people does it leave you with?

    And they will do OTA or reasonable actions at a service center, if deemed appropriate. That is such a dirty weasel-clause. To me it reads: If we can't OTA it, you're free to tow your car to the service center at your own cost. If you shoot out the control unit beyond what the Tesla diagnostic service computers can deal with you're on your own. If you genuinely brick the device beyond physical repair you're on your own.

    How about: we will OTA or attempt to service the control units for anyone who requests it, even hobbyists. If you are a vetted researcher, you can be assured of receiving a replacement control unit if necessary, and you are eligible to receive a set of control units to work over in a proper bench setup to your heart's content, without freezing your gonads off in a cold garage.

    That would be a worthy recognition of free security research.

    1. Lord Elpuss Silver badge

      @regregular

      It's a large step in the right direction, and as such worthy of recognition. The fact that the step isn't as massive as you might like it to be, i.e. reaching the end goal in one fell swoop, doesn't mean it's worthless.

      And the 'dirty weasel' clause, as you put it, sounds like sensible limitation of collateral damage on Tesla's part - if you happen to be in a part of the world where there are no Tesla service centers, it might cost them many thousands to recover your bricked vehicle; and why the hell should they do that when they didn't ask you to do the bricking?

      Sheesh. Reminds me of this: Dudley's Presents.

  8. Nick Kew Silver badge
    Alert

    Fine wheeze

    The cynical view may or may not apply here. Even if it wasn't Musk's intention, the idea is now out there.

    If you come out and say some security researchers are licensed to hack without fear of reprisals, you implicitly threaten other researchers with a tonne-of-bricks treatment, and the argument in court that IF not on our approved programme THEN black hat.

    1. John Brown (no body) Silver badge

      Re: Fine wheeze

      "If you come out and say some security researchers are licensed to hack without fear of reprisals, you implicitly threaten other researchers with a tonne-of-bricks treatment, and the argument in court that IF not on our approved programme THEN black hat."

      Not sure why you got downvoted. I can see other companies following suit and acting exactly as you describe, even if Tesla themselves don't. Someone like MS, for example, I can quite easily imagine coming up with something similar, then gradually tightening the terms and conditions and/or revoking the status of people they don't like (because they embarrass MS, or find things MS would rather weren't found) until the only "independent" infosec bods left are all MS shills.

  9. Brewster's Angle Grinder Silver badge

    Help!

    I'm a good faith researcher doing bad faith research. Do I qualify?

    What about if I'm a bad faith researcher but doing good faith research?

    Researchers: we have inquiring minds.

  10. Anonymous Coward
    Anonymous Coward

    From what I have observed, much of the Tesla hacking has been/is being done in Europe, mostly in countries where Tesla does not sell cars or provide service. Tesla doesn't sell parts or its diagnostic tools to unauthorised mechanics/garages, but those who have chosen to take Teslas to these countries as grey imports need to have their cars serviced and repaired somewhere. This has generated a market for secondhand Tesla parts (sometimes stolen) and people specialising in providing firmware updates, enabling features like Autopilot and even tweaking performance – all entirely unauthorised and unvetted by Tesla.

  11. jaffa99

    "risk-free"

    Plenty of caveats in their terms that make it definitely not "risk-free"

  12. EnviableOne Bronze badge

    What IF...

    Someone playing around on Shodan finds x number of devices with a certain port open, connects to one such device, manages to open a remote console, and discovers they now have root access to x Tesla vehicles. This is a rather important flaw. Now, they have not registered with Tesla, and neither have the owners of the vehicles they have access to. Are they going to get hit by US legal intervention? Are the owners going to have issues getting their cars reflashed if said researcher made cosmetic changes as a PoC?


Biting the hand that feeds IT © 1998–2019