• paddirn@lemmy.world
    2 months ago

    I really want to like AI, I’d love to have an intelligent AI assistant or something, but I just struggle to find any uses for it outside of some really niche cases or basic brainstorming tasks. Otherwise, it just feels like a lot of work for very little benefit, or for results that I can’t even trust or use.

    • brucethemoose@lemmy.world
      2 months ago

      It’s useful.

      I keep Qwen 32B loaded on my desktop pretty much whenever it’s on, as an (unreliable) assistant to analyze or parse big texts, to do quick chores or write scripts, to bounce ideas off of, or even as an offline replacement for Google Translate (though I specifically use Aya 32B for that).

      It does “feel” different when the LLM is local: you can manipulate the prompt syntax so easily, hammer it with multiple requests that come back really fast when it seems to get something wrong, and not worry about refusals or data leakage and such.
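      By “manipulate the prompt syntax” I mean you can build the raw template yourself instead of going through someone else’s chat API. A minimal sketch of the ChatML format Qwen models are trained on (the helper function is just illustrative):

      ```python
      def chatml_prompt(system: str, user: str) -> str:
          """Build a raw ChatML prompt, the chat template Qwen models use."""
          return (
              f"<|im_start|>system\n{system}<|im_end|>\n"
              f"<|im_start|>user\n{user}<|im_end|>\n"
              f"<|im_start|>assistant\n"
          )

      # You hand this string straight to the local inference server,
      # so you can tweak any part of the template between requests.
      prompt = chatml_prompt(
          "You are a terse assistant.",
          "Summarize the following text in one sentence: ...",
      )
      ```

      With a hosted API you never see this layer; locally you can edit the system block, prefill the assistant turn, or rewrite the whole template on every retry.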

        • brucethemoose@lemmy.world
          2 months ago

          Soldered is better! It’s sometimes faster, and definitely faster if it happens to be LPDDR.

          But TBH the only thing that really matters is “how much VRAM do you have.” Qwen 32B slots in at 24GB, or maybe 16GB if the GPU is totally empty and you tune your quantization carefully. And the cheapest way to get that (until 2025) is a used MI60, P40 or 3090.
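          The VRAM math is just back-of-the-envelope: parameter count times the quantized bit-width, plus some slack for KV cache and activations. A rough sketch (the overhead number is a guess and grows with context length):

          ```python
          def vram_estimate_gb(params_b: float, bits_per_weight: float,
                               overhead_gb: float = 2.0) -> float:
              """Rough VRAM needed: quantized weights plus a flat allowance
              for KV cache and activations (overhead_gb is a guess)."""
              weights_gb = params_b * bits_per_weight / 8  # billions of params -> GB
              return weights_gb + overhead_gb

          # 32B at ~4.5 bits/weight (a typical Q4_K_M-style quant):
          print(round(vram_estimate_gb(32, 4.5), 1))  # → 20.0
          ```

          Which is why 32B models are squarely “24GB card” territory, and only squeeze into 16GB at aggressive quantization with nothing else on the GPU.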

    • dan@upvote.au
      2 months ago

      I receive alerts when people are outside my house, using security cameras, Blue Iris, CodeProject AI, Node-RED and Home Assistant, with a Google Coral handling the AI inference locally. It’s entirely local - no cloud services apart from Google’s notification system to get notifications to my phone while I’m not home (which most Android apps use). That’s a good use case for AI, since it avoids the false positives that occur with regular motion detection.
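      The reason it beats motion detection is that you only alert on confident detections of classes you care about, not on every pixel change. A minimal sketch of that filter (field names are illustrative; Blue Iris, CodeProject AI and Frigate each have their own output schema):

      ```python
      def should_alert(detections, min_confidence=0.7, wanted=("person",)):
          """Fire only on confident detections of classes we care about,
          instead of alerting on every motion event."""
          return any(
              d["label"] in wanted and d["confidence"] >= min_confidence
              for d in detections
          )

      # A cat walking by and a low-confidence blob shouldn't page me:
      frame = [
          {"label": "cat", "confidence": 0.91},
          {"label": "person", "confidence": 0.55},
      ]
      print(should_alert(frame))  # → False
      ```

      Swaying trees, headlights and pets all trip plain motion detection; this filter ignores them unless the detector is actually confident it sees a person.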

      • WalnutLum@lemmy.ml
        2 months ago

        I’ve been curious about the Google Coral, but its memory is so tiny that I’m not sure what kinds of models you can run on it.

        • dan@upvote.au
          2 months ago

          A lot of people use them for the use case I described (object detection for security cameras), using either Blue Iris or Frigate. They work pretty well for that use case.

          Wake word detection is a good use case too (e.g. if you’re making your own smart assistant).

          The Coral site lists a few use cases.