• Kattail_@lemmy.blahaj.zone · 1 year ago

    Do I want a microchip? Yes. Would I take one if it was made by a greedy corporation? Fuck no. I want my HUD, but that feels like too much of a gamble for it.

  • q47tx@lemmy.world · 1 year ago

    I’d be ok with a microchip in my brain if the code was open-source and I could make it at home. In any other case, no.

      • vzq@lemmy.blahaj.zone · 1 year ago

        This. I’m a hardcore nerd and open-source enthusiast. My primary, always-needs-to-work device is an iPhone with developer mode disabled.

        Not because it’s in any way better, but because it’s way easier for me to refrain from messing with it.

        • Norah - She/They@lemmy.blahaj.zone · 1 year ago

          I think if you’re running it stock, then iOS is better for privacy than (most) stock Android ROMs. You can absolutely make Android more private, but you need to mess with it to achieve that.

          I don’t know, maybe that’s a controversial opinion ¯\_(ツ)_/¯

          • Skimmer@lemmy.zip · 1 year ago

            I agree with you. Stock Android OSes include so much proprietary bloatware, spyware, and other garbage from OEMs and Google themselves that it’s a pretty horrible experience. They don’t take privacy and security seriously at all, and most of the time it isn’t even good from a usability perspective imo, since it all leads to worse performance and battery life, etc. I would much rather use iOS over any stock version of Android, even despite the many problems of iOS.

            The only way to make stock Android somewhat usable is to remove what you can through ADB, but even that is far from ideal and won’t solve all of the issues.
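
            For example, here’s a minimal sketch of that ADB debloating (the package names are made-up placeholders; `pm uninstall -k --user 0` only removes an app for the current user and keeps its data, so it’s reversible with `pm install-existing`):

            ```python
            # Sketch: remove OEM/Google bloat from a stock Android phone over ADB.
            # Assumes `adb` is on PATH and USB debugging is enabled on the device.
            import subprocess

            # Placeholder package names; list the real ones on your device with:
            #   adb shell pm list packages
            BLOAT = [
                "com.example.oem.weather",
                "com.example.oem.gamehub",
            ]

            def uninstall_for_current_user(package: str) -> None:
                # Removes the app for user 0 only and keeps its data (-k),
                # so a factory reset or `pm install-existing` brings it back.
                subprocess.run(
                    ["adb", "shell", "pm", "uninstall", "-k", "--user", "0", package],
                    check=False,  # some packages refuse removal; keep going anyway
                )

            if __name__ == "__main__":
                for pkg in BLOAT:
                    uninstall_for_current_user(pkg)
            ```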

            Overall though, by far the best option is to just use an alternate Android OS like Graphene; it beats iOS or stock Android any day. But between iOS and stock Android, if I had to pick, I’d easily choose iOS.

          • Lemongrab@lemmy.one · 1 year ago

            Android has partial sandboxing of applications and a whole bunch of different permission options for limiting an app to a single function. To my understanding, iOS limits you to the first-party app store (without sideloading). I understand why limiting the available apps improves security, but it means you’re locked into using a lot of proprietary, closed-source apps (which sucks). Apple also requires the use of an Apple account. I also don’t think comparing default configs is worth much, because to judge security/privacy I would look at the ceiling of what you can harden (just basic settings, not dev stuff), since defaults are aimed at a general userbase.

    • Lemongrab@lemmy.one · 1 year ago

      With the increasing complexity of machine learning models, even the designers can’t understand how they function (what input leads to a given output). Open source doesn’t mean safe at all. And even if it functions as intended, what happens when there is a vulnerability (or a zero-day), or when the device reaches end of service life?

  • M500@lemmy.ml · 1 year ago

    Outside of having some debilitating problem that can only be fixed with a microchip in my brain, I’m opting out.

    But if I was blind and it allowed me to see, sign me up.

    • Norah - She/They@lemmy.blahaj.zone · 1 year ago

      Even then, I wouldn’t want it to have any functionality to update the code it runs once it’s implanted. And I’d want that code to be incredibly well tested and verified alongside the hardware. No bugs beforehand means no reason to update it later.

      • Lemongrab@lemmy.one · 1 year ago

        No bugs is a hard thing to accomplish, especially for an emerging technology (e.g. a zero-day vulnerability).

          • gregoryw3@lemmy.ml · 1 year ago

            Not sure that counts? That was unfortunately due to a completely untested system, designed by one guy way in over his head (who ethically should have reported it to some governing body), and a company that lied about the nonexistent testing. It wasn’t just a single bug but a failure throughout.

            • Norah - She/They@lemmy.blahaj.zone · 1 year ago

              And yet, afterwards, the code running medical devices has been held to the same standards that we set for the tools themselves. The code embedded in a life-support machine can’t fail.

              I think you also proved my point anyway: the problem was a system set up such that testing wasn’t done, not that the testing itself wasn’t possible. It’s just expensive, so companies won’t do it unless they’re forced to by regulation.

              • gregoryw3@lemmy.ml · 1 year ago

                Ohhh, yeah. I have no idea why, back then, code wasn’t seen for what it is. I’ve been told by older people that back then the attitude was “if it compiles, it’s fine”… or something along those lines. I think today we still have a ton of those issues, because every framework and language is so different and lacks standardization.

                Throughout everything I’ve ever learned, the biggest realization I’ve had is that without policies forcing them, companies will do whatever is necessary to line their pockets.

        • Norah - She/They@lemmy.blahaj.zone · 1 year ago

          I was actually discussing this with my girlfriend, and we were thinking of a system where it can give you a two-factor authentication code via thought. That way you can use that code to unlock it for updating the firmware.
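
          Just to sketch the idea (this is hypothetical, but the code part is the standard TOTP construction from RFC 6238, same as an authenticator app): the implant and the update tool would share a secret, the implant “thinks” a time-based code at you, and the updater refuses to flash new firmware unless you type the matching code back.

          ```python
          import base64
          import hashlib
          import hmac
          import struct
          import time

          def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
              """RFC 6238 time-based one-time password (HMAC-SHA1, 30 s steps)."""
              key = base64.b32decode(secret_b32, casefold=True)
              counter = int(time.time()) // step            # current time step
              msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
              digest = hmac.new(key, msg, hashlib.sha1).digest()
              offset = digest[-1] & 0x0F                    # dynamic truncation
              value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
              return str(value % (10 ** digits)).zfill(digits)

          # Hypothetical flow: the secret is baked in at manufacture, the implant
          # surfaces totp(SECRET) "via thought", and the firmware updater only
          # proceeds if the wearer types the same code back within the time step.
          SECRET = "JBSWY3DPEHPK3PXP"  # example base32 secret, not a real credential

          if __name__ == "__main__":
              print("code the implant would surface:", totp(SECRET))
          ```

          Real hardware would obviously need more than this (pairing, signed firmware, replay protection), but the unlock step itself can be that small.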

    • CoderKat@lemm.ee · 1 year ago

      I’m hearing impaired and would love if some brain implant could fix me. I already almost have this, with a cochlear implant (it’s not technically in the brain, but it is an implant in my head). It’s not enough for me, though, cause my hearing still sucks.

      • Deepus@lemm.ee · 1 year ago

        Oh, I thought cochlear implants made it like being able to hear normally, is that not the case?

        • CoderKat@lemm.ee · 1 year ago

          They can, and I had hoped they were gonna for me. But my problem must be heavily neurological. The cochlear implant did help some, but I’m a far cry from normal hearing (I especially struggle with accents, low tones, and overlapping sounds).

    • Uriel238 [all pronouns]@lemmy.blahaj.zone · 1 year ago

      My vision’s been going since my forties, and ever since the notion of cyber-eyes in the 80s I’ve imagined one day getting some nice Canons or Nikons and being able to read from a kilometer away.

      But we don’t have the kind of tech support now that we did then; instead we get connected to some chatbot with a small troubleshooting tree. Also, current brain interfaces might kill me or, worse, leave me alive and impaired.

  • Abnorc@lemm.ee · 1 year ago

    The purpose of dystopian sci-fi is to help us understand these kinds of things. If Black Mirror helps you think about how technology will impact our future, all the power to you.

  • Alien Nathan Edward@lemm.ee · 1 year ago

    I want to want this so badly. Made with full transparency by someone I can trust, the idea is ssssooooo cool. Unfortunately this thing will be born fully enshittified, and the ability to have the definitions of words I don’t understand just float into my field of vision as I’m reading them isn’t cool enough to negate the constant, nagging urge to buy myself a new Ford truck with the extra-expensive trim package, and for some reason when I think about financing it out for 10 years at 35% I get a feeling I haven’t felt since the first time I held my newborn child.

    • Common sense has always been a thought-terminating cliché. Even Disney has an old PSA cartoon about the notion (featuring the old term, “horse sense”).

      At best, common sense reflects the notion that arguments should be obviously true and agreeable. But it’s better if we actually express them.

      So in the case of neuro-interfaced microchips: the installation procedure is still high-risk, and there aren’t yet any uses for them worth that risk. Maybe that changes at the point where one can restore consciousness to a comatose patient, or eyesight to the blind.
