With a fair amount of system integration missing (no wake word available), of course. Which rather sounds like a feature.

  • 1984@lemmy.today · 11 months ago

    I’m just happy to see Google drop the ball on so many things. Fuck Google.

        • Big P@feddit.uk · 11 months ago

          First of all, fuck companies that just run on investor money, taking a huge loss to price out everyone else. Fuck OpenAI especially for doing that, and for somehow convincing a bunch of respectable companies to integrate their AI bullshit in places it doesn’t belong by making the price so low; that will come back to bite consumers when OpenAI inevitably raises the price. Fuck them for overselling the capability of their AI, and fuck them for taking advantage of the openness of the Internet to create it.

  • batcheck@beehaw.org · 11 months ago

    You can already somewhat do that on iOS with Shortcuts if you have the ChatGPT app. But as OP says, it’s only for talking to; you can’t use it to set a timer or a reminder. It’s neat, but a lot of my voice assistant use is “call X person” or “reply to X”. If I want to talk to ChatGPT, I usually just open the app and turn on voice for a session.

    If ChatGPT can weasel its way into being a true assistant, with the ability to perform certain actions, it might be a game changer for the voice assistant space. It’s so much better at understanding context than the current assistants on your local device.
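
    For context, the “perform certain actions” part is roughly what OpenAI’s tool/function-calling API is meant to enable: the model returns a structured call that your app then executes. Below is a minimal sketch, assuming the openai Python SDK (v1+); the set_reminder tool and its schema are made up for illustration.

        # Sketch only: the "set_reminder" tool is hypothetical; your app, not
        # the model, would actually create the reminder after parsing the
        # returned tool call.
        from openai import OpenAI

        client = OpenAI()  # reads OPENAI_API_KEY from the environment

        tools = [{
            "type": "function",
            "function": {
                "name": "set_reminder",
                "description": "Create a reminder at a given time",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "text": {"type": "string"},
                        "time": {"type": "string", "description": "ISO 8601 time"},
                    },
                    "required": ["text", "time"],
                },
            },
        }]

        response = client.chat.completions.create(
            model="gpt-4o",  # any tool-capable model; the name is just an example
            messages=[{"role": "user", "content": "Remind me to call Alex at 5pm"}],
            tools=tools,
        )

        msg = response.choices[0].message
        if msg.tool_calls:  # the model decided an action is needed
            call = msg.tool_calls[0]
            print(call.function.name, call.function.arguments)
        else:
            print(msg.content)  # it answered in plain text instead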

    • Pixel@beehaw.org · 11 months ago

      It’s one of the use cases where AI truly makes sense to me: voice assistant technology feels like it has plateaued, and an LLM seems like a good way to process natural language.

      • batcheck@beehaw.org · 11 months ago

        I somewhat bought into the hype early and convinced work to pay for ChatGPT Plus. At first I struggled to use it. Then one day I went “I bet it can’t help with X”, and it did. Now I’m at the point where I default to it. There’s this odd assumption that it will only be right some of the time, but in my experience it’s rarely wrong. Usually it has just misunderstood the direction I was trying to go in, and once I fix that with a follow-up prompt I get what I want.

        I don’t think I do prompt engineering per se. It’s like Google-fu, though: you need to learn to be descriptive enough that the LLM can infer some context, and even a year in it still feels surreal. So far GPT-4 is the top for me; Llama does well, and a lot of the open models are nice. But if I want code, or to think through some work problem, GPT-4 gets me where I want to go amazingly fast. I make it do online research for me and then I have it validate my thoughts. I have to keep in mind “hey, it’s mainly predicting the next word”, but I rarely go “wow, it was truly off here”. Trust but verify is where I’m at.

        I’m at the point where I feel like I do my 40-hour work week in 25 or so. I have a ton more free time. I have to be careful not to share any directly work-related info, but that’s easy: I give it generic info and fill in the blanks myself.

        • Baut [she/her] auf.@lemmy.blahaj.zone · 11 months ago

          I make it do online research for me and then I have it validate my thoughts.

          That’s precisely the issue. The words sound convincing, but this way of thinking leads to it becoming a yes-man. Either it confirms what you think, or your prompt is wrong.

          • batcheck@beehaw.org · 11 months ago

            Honestly, I do verify it, because I use it for work. I had it do some research comparing a bunch of VDI solutions (the VMware/Broadcom thing has forced us to rethink things), and it did a really good job summarizing them. I used to work in consulting, so I already knew what the comparison should look like; it saved me hours of having to write that report. I usually verify in the sense of “does it make sense”, the same way I’d sanity-check a Stack Overflow post before using its code, and so on.

  • jarfil@beehaw.org · 11 months ago

    Android users can install Bing right now and get GPT-4 for free!

    Whether you’d rather funnel all your queries to Microsoft instead of Google… maybe let’s wait for another option.

  • salarua · 11 months ago

    I can sort of see the appeal if it were able to plug into your smart home or something so it could respond to queries like “where’s the dog”, but as a general knowledge assistant it’s worse than useless (unless it magically doesn’t confabulate anything anymore)

    • Otter@lemmy.ca · 11 months ago

      Yeah, that’s what I was thinking:

      “What’s the weather like”

      Searching the web for “Whats the weather like” [⬜⬜🔳🔳🔳]

  • flux@lemmyis.fun · 11 months ago

    You already can, at least through Home Assistant; it can even be set as your default assistant on Android through the HA companion app.

    I just integrated it into Home Assistant last week, and it can now control my lights, tell me the temperature in different rooms, and interact with any smart devices in my home, in addition to everything regular ChatGPT can do. I also integrated ElevenLabs voice so it sounds like a posh British gentleman. Then I named him Jarvis, of course.
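
    Under the hood this goes through Home Assistant’s conversation API, which routes the text to whatever conversation agent you have configured (the OpenAI integration in this case). Here is a minimal sketch of that call, assuming a long-lived access token; the URL, token, and question are placeholders, and the response shape may differ between HA versions.

        # Sketch: send a natural-language request to Home Assistant's
        # conversation endpoint. HA_URL and TOKEN below are placeholders.
        import requests

        HA_URL = "http://homeassistant.local:8123"  # your HA instance
        TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"      # created in your HA user profile

        resp = requests.post(
            f"{HA_URL}/api/conversation/process",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={"text": "What is the temperature in the living room?", "language": "en"},
            timeout=30,
        )
        resp.raise_for_status()

        # The spoken reply sits under response -> speech -> plain -> speech in
        # recent HA versions; adjust if your version differs.
        print(resp.json()["response"]["speech"]["plain"]["speech"])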

    It’s been pretty entertaining, but more of a gimmick than truly useful.

  • AutoTL;DR@lemmings.world (bot) · 11 months ago

    🤖 I’m a bot that provides automatic summaries for articles:

    ChatGPT started as a text-only generative AI but received voice and image input capabilities in September.

    Usually, it’s the Google Assistant with system-wide availability in Android, but that’s not special home cooking from Google—it all happens via public APIs that technically any app can plug into.

    The assistant APIs are designed to be powerful, keeping some parts of the app running 24/7 no matter where you are.

    Rahman found that ChatGPT version 1.2023.352, released last month, included a new activity named “com.openai.voice.assistant.AssistantActivity.”

    As with Bixby and Alexa, there are no good apps to host your notes, reminders, calendar entries, shopping list items, or any other input-based functions you might want to do.

    It’s also reportedly working on a different assistant, “Pixie,” which would apparently launch with the Pixel 9, but that will be near the end of 2024.


    Saved 67% of original text.

  • Omega_Haxors@lemmy.ml · 11 months ago

    I’ll take it. Google Assistant is fucking trash. At least ChatGPT would occasionally give really funny responses.