In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also hitting a child-sized mannequin. The Tesla was also fooled by simulated rain and fog.

    • Crampon@lemmy.world · 20 hours ago

      I have no doubt the car will crash.

      But I do feel there is something strange about the car disengaging Autopilot (cruise control) just before the crash. How can the car know it's crashing while simultaneously not knowing it's crashing?

      I drive a Model 3 myself, and there is so much bad shit about the Autopilot and rain sensors. But I have never experienced, or heard of anyone else experiencing, a false positive where the car disengages Autopilot under any conditions the way shown in the video, with no sound or visual cue. Considering how bad the sensors on the car are, it's strange that they're state of the art every time an accident happens. There is dissonance between the claims.

      Mark shouldn't have made so many cuts in the upload. He locks the car at 39 mph in the video, but it crashes at 42 mph. He should have kept it clean and honest.

      I want to see more of these experiments in the future. But Mark's video is pretty much a commercial for the lidar manufacturer. And commercials shouldn't be trusted.