• shufflerofrocks@beehaw.org

    No lol

    In my experience ChatGPT sucks at giving proper answers. I tried using it to generate code and to summarise documents for me, and it was bad at both.

    I can google well enough to almost always get what I need, but I can also see areas where Google Search is pretty shit right now - almost everything non-tech-related that I google gives me a feed of SEO-keyword-bloated pages with no actual content.

    Problem is, I don’t think any other search engine comes close to Google yet. I’ve tried DuckDuckGo, and in my experience it’s crap.

    I’ve had good luck with Yandex, but everything else is meh.

    What are your search engine recommendations?

  • cavemeat@beehaw.org

    I moved to a non-Google search engine. The problem with ChatGPT is that it sounds very plausible and truthful, but is often just making shit up.

    • noodlejetski@beehaw.org

      it sounds very plausible and truthful, but is often just making shit up.

      I’ve seen someone call it “mansplaining as a service”.

    • Mersampa@beehaw.org

      It helps to understand what ChatGPT is, and what it isn’t.

      ChatGPT does not understand anything you say. And it only produces one word at a time (technically part of a word, called a token, but let’s keep it simple).

      What it’s doing is guessing the most likely next word based on the words that came before it. Your phone’s keyboard probably suggests words as you type; ChatGPT is like hitting the recommended word over and over until it has an answer. It spits out words based on how likely each word is to come next. That is all (there’s a toy sketch of this loop at the end of this comment).

      It uses advanced machine learning to do that, and whether it counts as AI is for the reader to decide. But it’s certainly not planning out a thoughtful answer for you.

      And that’s not even taking into account that the training data largely comes from the internet, the place where people continuously make shit up.
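
      A minimal sketch of that “hit the recommended word over and over” loop in Python - a toy word-level model that is nothing like GPT’s actual neural network, but it follows the same predict-append-repeat pattern (the tiny corpus is made up purely for illustration):

      ```python
      # Toy "hit the suggested word over and over" generator:
      # count which word most often follows which word, then always pick
      # the most frequent continuation - like mashing a keyboard suggestion.
      from collections import Counter, defaultdict

      corpus = ("the cat sat on the mat . the dog sat on the rug . "
                "the cat chased the dog .").split()

      # Count next-word frequencies for every word in the corpus.
      next_counts = defaultdict(Counter)
      for prev, nxt in zip(corpus, corpus[1:]):
          next_counts[prev][nxt] += 1

      def generate(start, max_words=10):
          words = [start]
          for _ in range(max_words):
              candidates = next_counts.get(words[-1])
              if not candidates:           # nothing ever followed this word
                  break
              best, _ = candidates.most_common(1)[0]
              words.append(best)           # "press the suggestion" again
          return " ".join(words)

      print(generate("the"))  # quickly falls into a repetitive rut of likely words
      ```

      Real models score tens of thousands of possible tokens with a neural network and usually add some randomness instead of always taking the top pick, but it is still the same one-word-at-a-time process, with no plan and no notion of truth.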

  • Mindless_Enigma@beehaw.org

    Not at all. ChatGPT is a great tool for helping with brainstorming or creating a base for further work, but I wouldn’t rely on it at all for accurate answers to questions. ChatGPT’s main function is to create responses to prompts that look like real answers. It has no inclination to give you a correct answer and really has no clue what a correct or incorrect answer is. With the amount of effort you have to put in to research and verify that the answer ChatGPT gave you is correct, you’re probably better off skipping it and just doing the research yourself.

  • jjsearle@lemmy.ml

    I find too often that ChatGPT will just make things up when asked about anything beyond the basics. Most of the time it is quicker to search for the official documentation and read it than to rely on ChatGPT’s answers being right.

  • noodlejetski@beehaw.org

    given that ChatGPT often gives you answers that sound right but are completely wrong - including, but not limited to, dates and causes of death for the very people asking it for biographical notes about themselves, made-up articles, and made-up legal cases - I don’t see how anyone can use it as a search engine replacement.

    • SolarSailer@beehaw.org

      Glad someone mentioned the lawyer that screwed up by including ChatGPT’s fake cited sources. It will be interesting to see what comes from this.

      Additionally, not a lot of people realize that they’ve agreed to an indemnification clause when using ChatGPT (or know what that means).

      Basically, OpenAI can send you the legal bills for any lawsuits that come from your use of ChatGPT. So if you “jailbroke” ChatGPT and posted an image of it telling you the recipe for something illegal, OpenAI could end up with a lawsuit on their hands, and they could bill that user for all of the legal fees incurred.

      Possibly the first case of this we’ll see will be related to the defamation case that a certain Mayor from Australia could have against OpenAI. https://gizmodo.com/openai-defamation-chatbot-brian-hood-chatgpt-1850302595

      Even if OpenAI wins the lawsuit they will most likely bill the user who posted the image of ChatGPT defaming the mayor.

    • argv_minus_one@beehaw.org

      Imagine being told that you’re dead.

      “You died on 2 August 1979. You were on a fishing ship at sea that sank with all hands.”

      You are a computer programmer, you were born in 1985, you’ve never been within 200 miles of any body of water larger than a river, and come to think of it, you’ve always had a peculiar fear of oceans and lakes.

      Did the AI make up a story of your death…or is it somehow aware that that’s the day your previous life ended?

  • JackFromWisconsin@midwest.social

    Not because of ChatGPT. But I encourage everyone to stop using Google and use one of the many other search engines. Break that Google supremacy.

  • Hexorg@beehaw.org

    I use ChatGPT to discover new programming libraries that fit my requirements, since it’s much easier to relay my requirements to ChatGPT than to Google - e.g. a lightweight REST framework for C, not C++, that provides a streaming JSON parser.

    ChatGPT lists suggestions, and then I go look up those libraries and refine my chat if needed.

    But for a fact-check it’s just much more reliable to use a search engine.

  • SolarSailer@beehaw.org

    I switched to non-Google search engines (Brave Search/DuckDuckGo) before ChatGPT.
    I use ChatGPT for ideas and suggestions, or when I get stuck on something. I treat it like a very knowledgeable person who usually gets their sources mixed up.
    ChatGPT points me in the right direction and that’s enough to dig deeper into whatever it is I’m trying to do.

    It’s excellent for pseudo-code, but in my experience it isn’t reliable with actual code.

    • Mersampa@beehaw.org

      Yes! It’s who you ask when you aren’t sure what the idea, concept, or term is. Once you are armed with that information, use a proper search engine.

      • SolarSailer@beehaw.org

        Yeah, another thing I know I’ll be using it (or at least Bing’s Chat) for is summarizing large documents (especially in the sense of becoming a more informed voter).

        I don’t have time to read through the thousands of pages of legalese that our lawmakers come up with. But instead of having to wait, or rely only on summaries from others, I can run a bill through an AI to get a summary of each section and then read into anything that piques my interest (rough sketch at the end of this comment).

        It might even be interesting to train a smaller LLM that does this more efficiently.

        The next step would be an LLM that pays more attention to unintended consequences of laws arising from the way they’re written. But for something really effective, I imagine that would require the assistance of a large number of experts in the field, and/or a lot of research on laws being overturned, loopholes being fixed, etc.

        Even then it’s important that we understand that these tools are far from perfect. And we should question results rather than accepting them at face value.
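
        A rough sketch of what that section-by-section summarizing could look like in Python. Both helpers here are hypothetical placeholders: split_into_sections stands in for whatever chunking rule the document needs, and llm_summarize stands in for the actual model call (ChatGPT, Bing Chat, a local model, …):

        ```python
        # Hypothetical sketch: summarise a long bill one section at a time.
        # Both helper functions are placeholders, not a real API.

        def split_into_sections(text: str) -> list[str]:
            # Naive placeholder: split wherever a new line starts with "SECTION".
            # Real bills need smarter chunking, and each chunk has to fit the
            # model's context window.
            parts = text.split("\nSECTION")
            return [parts[0]] + ["SECTION" + p for p in parts[1:]]

        def llm_summarize(section: str) -> str:
            # Stand-in so the sketch runs: echo the first dozen words.
            # In practice this is where you would send the section to the model
            # with a "summarise this in plain language" prompt.
            return " ".join(section.split()[:12]) + " ..."

        def summarize_bill(text: str) -> str:
            summaries = []
            for i, section in enumerate(split_into_sections(text), start=1):
                summaries.append(f"{i}. {llm_summarize(section)}")
            return "\n".join(summaries)

        bill = ("SECTION 1. Short title. This Act may be cited as the Example Act.\n"
                "SECTION 2. Funding. There is appropriated $5,000,000 for road repair.")
        print(summarize_bill(bill))
        ```

        You’d still spot-check any section that matters, which is where the “question results rather than accepting them at face value” caveat above comes in.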

  • Barbarian@lemmy.ml

    ChatGPT is absolutely NOT the right tool for the job if you want answers to questions. People need to understand that ChatGPT is not an AI. It doesn’t even have a concept of truth versus fiction, let alone the ability to differentiate between them.

    It is a very sophisticated and advanced autocomplete, using probabilities to predict the next word (or token, if you want to get technical) over and over again until probability says it’s done. It’s great for writing boilerplate documents that don’t contain facts it has to get right, for creative writing (although it tends to produce very clichéd text, understandably), and for boilerplate code that has been written millions of times before in its training data. Do NOT use it for anything where facts matter.

  • raj@lemmy.one

    I prefer ChatGPT, but it’s more cumbersome to use on the phone than Google Assistant.

  • bappity@lemmy.one

    sometimes I use it to write small blocks of code to save a BUNCH of time; it also saves having to look through Stack Overflow >_>

  • potcandan@lemmy.one

    No, but I’ve started almost googling questions in a ChatGPT format before stopping myself and realizing it’s a ChatGPT question. Like, for example, asking an anthropology question such as “what do we know about x species from x period?” I could google it, but then I’d have to do some clicking before finding my answer - or not finding it at all. ChatGPT, on the other hand, can give me something to then go google for more detail. One day soon Google (or whatever the world’s largest search engine is) will be an advanced ChatGPT.

    Also, ChatGPT is too early-stage to be reliable; it’s wrong waaay too much, and in the dumbest ways. If you don’t also use Google to find sources you can verify stuff with, you may end up looking really silly.