A team of researchers from prominent universities – including SUNY Buffalo, Iowa State, UNC Charlotte, and Purdue – was able to turn an autonomous vehicle (AV) running the open-source Apollo driving platform from Chinese web giant Baidu into a deadly weapon by tricking its multi-sensor fusion system, and suggests the attack could be applied to other self-driving cars.

  • Beryl@lemmy.world

    You don’t even have to rig a bomb; a better analogy to the sensor spoofing would be to just shine a sufficiently bright light in the driver’s eyes from the opposite side of the road. Things will go sideways real quick.

    • EvilBit@lemmy.world

      It’s not meant to be a perfect example; it’s a comparable principle. Subverting the self-driving system like that is more or less equivalent to any other means of attempting to kill someone with their car.

      • Beryl@lemmy.world

        I don’t disagree; I’m simply trying to present a somewhat less extreme (and therefore, I think, more appealing) version of your argument.