I found this while browsing Reddit, and getting past the initial reaction of “this is terrible”, I think it can spark an interesting discussion on machine learning and on how our own societal problems can create bad habits and immortalize those issues when we build such systems.

So, what do you guys think about this?

  • Dragon@lemmy.ml · 12 points · 4 years ago

    If the Google English dictionary allowed for gender-neutral pronouns, this wouldn’t be a problem. I’m less mad about an algorithm accurately representing a sexist culture, and more upset that it inaccurately interpreted the original text.

    • joojmachine@lemmy.ml (OP) · 4 points · 4 years ago

      As I stated in another comment, in cases of gender neutrality like this one it could easily use “he/she” instead of assuming. It would interpret the text more accurately while being respectful, and without going outside the defined dictionary by using “them” or something like that.

      • Flelk@lemmy.ml · 11 points · 4 years ago

        As a lifelong grammarian, I’ve always hate hate hated that English lacks a technically “correct” gender-neutral third person singular pronoun, and I’m frankly rather relieved to see an emerging consensus forming around “they.” It may seem awkward for now, but this is how languages evolve - a grammatical “error” gets wedged into a niche to serve a linguistic need. The change is already happening, and in fifty years no one will remember or care that it used to be wrong.

        • joojmachine@lemmy.ml (OP) · 7 points · 4 years ago

          Boy, I wish I could say the same for Portuguese (my native language). It has the same issue, but the second someone tries to bring the idea up, they are instantly treated like “Twitter cancer trying to destroy our language”.

  • pancake@lemmy.ml · 8 points · 4 years ago

    The problem with AI is that we don’t “program” it directly. It learns on its own, absorbing any data you throw at it and naïvely interpreting it, just like a small child might make inappropriate comments based on what they have heard, since being respectful to other people requires awareness of them.

    • joojmachine@lemmy.ml (OP) · 9 points · edited · 4 years ago

      Exactly. The problem is, with a small child you can properly teach it what’s right and wrong, while with AI it’s much more complicated to do so. People who develop this kind of software (in this case Google) should give some consideration to the issues it can create, since it basically parrots societal behaviors.

  • k_o_t@lemmy.ml · 7 points · edited · 4 years ago

    in the case of google translate (or any translation tool, for that matter) it’s not even an issue with the ml algorithm itself: separate handling can be created specifically for languages that have non-gendered pronouns, to output something like “they” or “he/she” or whatever. for other, less concrete cases it’s a different issue, of course
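
    The separate-handling idea above can be sketched as a small post-processing step. This is a hypothetical illustration, not how Google Translate works: the function, the pronoun table, and the use of Hungarian “ő” (a genderless third-person pronoun) are all invented for the example.

```python
# Sketch: when the source language uses a gender-neutral pronoun, replace
# whichever gendered English pronoun the model guessed with "he/she",
# instead of letting the model pick a gender on its own.

NEUTRAL_PRONOUNS = {"hu": {"ő"}}  # Hungarian third-person singular is genderless

def degender(source_lang: str, source: str, translation: str) -> str:
    """Neutralize gendered pronouns when the source sentence carried none."""
    tokens = source.lower().split()
    if not NEUTRAL_PRONOUNS.get(source_lang, set()).intersection(tokens):
        return translation  # source specified a gender; keep the translation
    mapping = {"he": "he/she", "she": "he/she", "him": "him/her", "her": "him/her"}
    return " ".join(mapping.get(w.lower(), w) for w in translation.split())

print(degender("hu", "ő szép", "she is beautiful"))  # -> he/she is beautiful
```

    A sentence whose source does name a gender (e.g. “a nő”, “the woman”) passes through untouched, so the fix only fires where the model was guessing.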

    i’m actually against using ml everywhere it remotely makes sense. imo the entire movement has been made worse by the hype around it, which skewed its applications away from areas where it could be very helpful and bring improvements to society (things like science and medicine), toward things which are easy to monetize. we now have people with phds in ml trying to discover new ways to keep users longer on youtube to watch more ads

    my point being: if you threw away all the unnecessary applications of ml where gender/race/ethnicity bias could be a problem (like automated job hiring, crime profiling, information gathering for monetization purposes), there wouldn’t be that many left, and for the ones left the easy fix would just be getting more non-standard data, where [semi]supervised learning is concerned, of course

    but maybe i’m wrong, i’m curious what you think

    • joojmachine@lemmy.ml (OP) · 5 points · 4 years ago

      I’m not that knowledgeable about ML, but from what I’ve seen, I wholeheartedly agree. It shouldn’t be used for tasks where any bias is an issue, unless it can be developed in a way that properly deals with those biases. Failing to do so always ends up reinforcing the issues you mentioned.

  • AgreeableLandscape@lemmy.ml · 7 points · edited · 4 years ago

    Google Translate is based on AI, so someone on Mastodon suggested it might be a gender bias in the training data.

    Also, English has “they” for gender neutral, get on that, Google.

  • Metawish@lemmy.ml · 6 points · 4 years ago

    Haha, “they” is a gender-neutral singular pronoun; it’s in the dictionary. I don’t mind machine learning in a sense, because it helps to illuminate things like this. If it just spits out what it’s learned, that’s literally what humans do, and what we are learning in turn. The only difference is humans can review things like this and change.

  • DrKozaky@lemmy.ml · 4 points · 4 years ago

    Sorry, but I laughed a bit when you said “immortalizing those issues”. It would be so funny if our societal issues went into machine algorithms, lmao.

  • xarvos@lemmy.ml · 3 points · 4 years ago

    The data fed to the algorithm is probably not balanced in terms of gender.

  • AlmaemberTheGreat@lemmy.ml · 1 point · 4 years ago

    Bojler eladó :) (Hungarian: “boiler for sale”)

    Come on though, nobody takes gtranslate seriously. This is probably the smallest translation error in existence.

      • joojmachine@lemmy.ml (OP) · 6 up, 1 down · 4 years ago

      It could and should still do better than this, especially considering the stereotypical assumptions it made.

      In cases like this it should, for example, make it clear that the gender isn’t defined, by using “(he/she)” instead of just assigning one.

        • aronkvh@lemmy.ml · 3 points · 4 years ago

        which would help with understanding it. maybe a singular “they” could also be better than a random pronoun. I also tested some other sentences: “Egy ember” (“a person”) was translated to “a man”. but Translate has improved a lot in the last couple of years, since the translations at least make sense now