• hperrin@lemmy.world · 1 year ago

    The point of “OK Google” is to start listening for commands, so it needs to be really good and accurate. The point of “fluffy blanket,” by contrast, is to show you an ad for fluffy blankets, so the model can be poorly trained and wildly inaccurate. It wouldn’t take much money to train a model that listens for a handful of ad keywords and is just accurate enough to get a return on investment.

    (I’m not saying they are monitoring you, just that it would probably be a lot less expensive than you think.)
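    A “good enough for ads” spotter doesn’t even need its own speech model - fuzzy matching over a sloppy transcript is enough. A minimal sketch in Python, assuming the audio has already been (badly) transcribed to text; the keyword list and threshold are made up for illustration:

```python
# Toy sketch of the "accurate enough for ads" keyword spotter.
# A false positive just shows someone a slightly wrong ad, so loose
# fuzzy matching is acceptable in a way it never would be for "OK Google".
import difflib

AD_KEYWORDS = ["blanket", "mattress", "vacation"]  # hypothetical ad triggers

def spot_keywords(transcript: str, threshold: float = 0.8) -> set[str]:
    """Return ad keywords fuzzily matched in a noisy transcript."""
    hits = set()
    for word in transcript.lower().split():
        for kw in AD_KEYWORDS:
            # SequenceMatcher tolerates misheard characters,
            # e.g. "blankit" still matches "blanket".
            if difflib.SequenceMatcher(None, word, kw).ratio() >= threshold:
                hits.add(kw)
    return hits
```

    Here a misheard “blankit” still triggers the blanket ad, and a miss or false positive costs nothing but one wasted ad impression.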

    • jard · 1 year ago

      deleted by creator

      • Monument@lemmy.sdf.org · 1 year ago

        I think what the person is saying is that if you aren’t listening for keywords to wake a smart speaker, but instead just ‘bugging’ a home, you don’t need much in the way of hardware in the consumer’s home.

        Assuming you aren’t consuming massive amounts of data to transmit the audio and making a fuss on someone’s home network, this can go relatively unnoticed, or the traffic can be hidden among other traffic. A sketchy device maker (or, more likely, an app developer) can bug someone’s home or device with sketchy EULAs and murky device permissions, then send the audio to their own servers, where they process it, extract keywords, and sell the metadata for ad targeting.
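        To put rough numbers on how little traffic this needs (back-of-envelope figures assumed for illustration, not measurements of any real device):

```python
# Back-of-envelope comparison: streaming raw audio home vs. sending
# compressed audio vs. sending only extracted keyword metadata.
# All figures are illustrative assumptions.

SECONDS_PER_DAY = 24 * 60 * 60

# Raw audio at a modest 16 kHz, 16-bit mono.
raw_audio_bytes = SECONDS_PER_DAY * 16_000 * 2      # ~2.8 GB/day

# Speech-codec-compressed audio at ~16 kbit/s (Opus-class).
compressed_bytes = SECONDS_PER_DAY * 16_000 // 8    # ~173 MB/day

# Keyword metadata only: say 200 spotted keywords/day, ~32 bytes each.
metadata_bytes = 200 * 32                           # ~6 KB/day

print(f"raw audio:  {raw_audio_bytes / 1e9:.1f} GB/day")
print(f"compressed: {compressed_bytes / 1e6:.0f} MB/day")
print(f"keywords:   {metadata_bytes / 1e3:.1f} KB/day")
```

        Compressed audio is already modest next to everyday video streaming, and if keywords are extracted before upload, the payload is small enough to hide inside ordinary app telemetry.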

        Advertising companies already misrepresent the efficacy of their ads, while marketers have fully drunk the Kool-Aid - leading to advertisers actually scamming marketers. (There was a better article on this, but I couldn’t find it.) I’m not sure the accuracy of the speech interpretation would matter to them.
        I would not be surprised to learn that advertisers are doing legally questionable things to sell misrepresented advertising services. … but I also wouldn’t be surprised to learn that an advertising company is misrepresenting its capabilities to commit a little (more) light fraud against marketers.

        *sigh* Yay, capitalism. We’re all fucked.

      • pixxelkick@lemmy.world · 1 year ago

        I was about to write this but you took the words right out of my mouth, so I will just write “this ^”

      • jimmycrackcrack@lemmy.world · 1 year ago

        This, along with much else that’s been pointed out, makes the whole idea of devices capturing audio to process keywords for ads seem unlikely. But one thing worth noting is that people do sell bad products that barely do, or even just plain don’t do, what they told their customers they would. Someone could sell a listen-for-keywords-to-target-ads solution to interested advertisers that just really sucks and is super shit at its job. From the device user’s standpoint it’d be a small comfort to know the device was listening to your conversations but really sucked at it, often thinking you were saying something totally different from what you said - but I’d still be greatly dismayed that they were attempting, albeit poorly, to listen to my conversations.