• 1 Post
  • 13 Comments
Joined 1 year ago
Cake day: July 13th, 2023





  • Using tools from physics to create something that is popular but unrelated to physics is enough for the Nobel Prize in Physics?

    If only! It’s not even that: neither Boltzmann machines nor Hopfield networks led to anything used in the modern spam- and deepfake-generating AI, nor in image-recognition AI, or the like. This is the kind of stuff that struggles to get above 60% accuracy on MNIST (handwritten digits).
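
    (For the curious, the whole mechanism fits in a few lines. Below is a minimal numpy sketch of a Hopfield network, my own illustration rather than anything from the original papers: store patterns with a Hebbian outer-product rule, recall by repeated thresholding. Capacity tops out around 0.14 patterns per unit, which hints at why this line of work never scaled.)

    ```python
    import numpy as np

    # Minimal Hopfield network sketch (illustrative, not from any paper).
    # Patterns are +1/-1 vectors; capacity is only ~0.14 * n_units patterns.

    def train(patterns):
        # Hebbian rule: sum of outer products, no self-connections
        W = sum(np.outer(p, p) for p in patterns) / len(patterns)
        np.fill_diagonal(W, 0)
        return W

    def recall(W, state, steps=5):
        state = state.copy()
        for _ in range(steps):
            # Synchronous update: each unit takes the sign of its input
            # (the classic formulation updates one unit at a time)
            state = np.where(W @ state >= 0, 1, -1)
        return state

    patterns = np.array([[1, -1, 1, -1, 1, -1],
                         [1, 1, -1, -1, 1, 1]])
    W = train(patterns)
    noisy = np.array([1, -1, 1, -1, -1, -1])  # pattern 0, one bit flipped
    print(recall(W, noisy))  # recovers [ 1 -1  1 -1  1 -1]
    ```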

    Hinton went on to do different work based on backpropagation and gradient descent, on newer computers than were available to the people who came up with those methods long before him, and he got the Turing Award for that. It’s a wee bit controversial because of the whole “they did it before, but on worse computers, so they got no award” thing, but at least it is for work that is on the path leading to modern AI, not for work that sits on the vast list of things that just didn’t work, where it’s extremely hard to explain why you would even think they would work in the first place.
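
    (And for contrast, the thing that did pan out is, at its core, embarrassingly simple. A toy sketch of gradient descent fitting a single parameter, nothing specific to Hinton’s actual work:)

    ```python
    # Toy gradient descent: fit y = w * x by minimizing mean squared error.
    xs = [1.0, 2.0, 3.0]
    ys = [2.0, 4.0, 6.0]  # generated with true w = 2

    w = 0.0
    lr = 0.05  # learning rate
    for _ in range(200):
        # dL/dw for L = mean((w*x - y)^2) is mean(2 * (w*x - y) * x)
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    print(w)  # converges to ~2.0
    ```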







  • I love the “criti-hype”. AI peddlers absolutely love any concerns that imply that the AI is really good at something.

    Safety concern that LLMs would go Skynet? Say no more, I hear you and I’ll bring it up in Congress!

    Safety concern that terrorists might use it to make bombs? Say no more! I agree that the AI is so great for making bombs! We’ll restrict it to keep people safe!

    Sexual roleplay? Yeah, good point, I love it. Our technology is better than sex itself! We’ll restrict it to keep mankind from falling into the sin of robosexuality and going extinct! I mean, of course, you can’t restrict something like that, but we’ll try, at least until we release a hornybot.

    But raise any concern about language modeling being fundamentally the wrong tool for some job (do you want to cite a paper, or do you want to sample from the underlying probability distribution?), and it’s hey hey, how’s about we talk about the Skynet thing instead?


  • “Hallucination” used to mean things like false positives in computer vision, where the term is sort of appropriate: the AI is seeing something that’s not there.

    Then the machine-translation people started misusing the term for cases where their software mistranslated by adding something that was not present in the original text. They may already have been trying to mislead with this word choice: “hallucination” implies that the error happens while parsing the input text, which distracts from a very real concern that the added material may have been plagiarized from the training dataset (which carries a risk of IP contamination).

    Now, what’s happening is that language models are very often the wrong tool for the job. When you want to cite a court case as a precedent, you want a court case that actually existed, not a sample from the underlying probability distribution of possible court cases! LLM peddlers don’t want to ever admit that an LLM is the wrong tool for that job, so instead they pretend that it is the right tool that, alas, sometimes “hallucinates”.
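
    To make the distinction concrete, here’s a toy sketch (the case names and probabilities are made up for illustration): a language model hands you a distribution over plausible continuations and samples from it, which is a categorically different operation from looking a record up.

    ```python
    import random

    # Toy next-token sampler. The model only knows which continuations
    # are statistically plausible, not which ones are true. (Vocabulary
    # and probabilities below are made up for illustration.)
    next_token_probs = {
        "v. Connecticut": 0.40,
        "v. Ohio": 0.35,
        "v. Wainwright": 0.25,
    }

    def sample(probs):
        # Standard categorical sampling from a probability distribution
        return random.choices(list(probs), weights=list(probs.values()))[0]

    # The sampler happily emits a fluent case name whether or not such a
    # case exists; a database lookup would return the record or fail loudly.
    print("Smith", sample(next_token_probs))
    ```

    A retrieval system can answer “not found”; a sampler, by construction, always returns something.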


  • YOU CAN DO THAT WITHOUT AI.

    Can they, though? Sure, in theory Google could hire millions of people to write overviews that are equally idiotic, but obviously that is not something they would actually do.

    I think there’s an underlying ethical theory at play here, which goes something like: it is fine to fill the internet with half-plagiarized nonsense, as long as nobody dies, or at least as long as Google can’t be held culpable.