• jaemo@sh.itjust.works · 10 months ago

    Ollama is actually pretty decent at stuff now, and comparable in speed to ChatGPT on a sort-of-busy day. I’m enjoying having a constant rubber duck to bounce ideas off.
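    For anyone curious, here is a minimal sketch of querying a locally running Ollama instance from Python. It assumes Ollama is listening on its default port 11434 and that a model has already been pulled; the model name and prompt are just placeholders.

    ```python
    import requests

    # Ask a locally running Ollama instance a question (default API port 11434).
    # Assumes a model such as "llama3" was already pulled with `ollama pull llama3`.
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",
            "prompt": "Explain why my recursive function might blow the stack.",
            "stream": False,  # return the whole answer as one JSON payload
        },
    )
    print(response.json()["response"])
    ```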

    • harsh3466@lemmy.ml · 10 months ago

      That’s cool. I haven’t looked at any local/FOSS LLMs or other generators, largely because I don’t have a use case for them.

      • FaceDeer@fedia.io · 10 months ago

        If your concern is that we’re “not getting anything” in exchange for the training data AI trainers have gleaned from your postings, then those open-source AIs are what you should be taking a look at. IMO they’re well worth the trade.

        • harsh3466@lemmy.ml · 10 months ago

          Agree. When I feel like playing and/or have a use case for myself, I’ll be looking at open-source AI.

        • jaemo@sh.itjust.works · 10 months ago

          I’ve been playing with a locally installed instance of big-AGI. I really like the UI, but it’s missing the RAG part. I’m also cobbling my own together for fun and not profit, to try to stay relevant in these hard times. LangChain is some wild stuff.
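          In case it helps anyone, a rough sketch of the RAG pattern with LangChain over a local Ollama model: module paths and chain classes shift between LangChain releases, and the file name and model name here are only placeholders, so treat it as illustrative rather than exact.

          ```python
          # Rough sketch of a local RAG pipeline: split documents, embed them into a
          # vector store, then let a retriever feed relevant chunks to a local LLM.
          # Module paths move around between LangChain releases; adjust to your version.
          # FAISS requires the faiss-cpu package.
          from langchain_community.llms import Ollama
          from langchain_community.embeddings import OllamaEmbeddings
          from langchain_community.vectorstores import FAISS
          from langchain.text_splitter import RecursiveCharacterTextSplitter
          from langchain.chains import RetrievalQA

          # Split the source text into overlapping chunks so retrieval stays focused.
          splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
          chunks = splitter.split_text(open("notes.txt").read())  # placeholder file

          # Embed the chunks locally and index them for similarity search.
          store = FAISS.from_texts(chunks, OllamaEmbeddings(model="llama3"))

          # Wire the retriever and the local model into a question-answering chain.
          qa = RetrievalQA.from_chain_type(
              llm=Ollama(model="llama3"),
              retriever=store.as_retriever(),
          )
          print(qa.invoke({"query": "What did I write about vector stores?"})["result"])
          ```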

        • harsh3466@lemmy.ml · 10 months ago

          Thank you. Gonna save the link for when I have a use case and/or want to play around.