In my experience, LLMs aren’t really that good at summarizing.
It’s more like they can “rewrite more concisely,” which is a bit different.
Summarizing requires understanding what’s important, and LLMs don’t “understand” anything.
They can reduce word counts, and they have some statistical models that can tell them which words are fillers. But the hilarious state of Apple Intelligence shows how frequently that breaks.
I used to play this game with Google Translate when it was newish.
There is, or maybe was, a YouTube channel that would run well known song lyrics through various layers of translation, then attempt to sing the result to the tune of the original.
Gradually watermelon… I like shapes.
Twisted Translations
Sounds about right to me.
🎵Once you know which one, you are acidic, to win!🎵
Translation Party!
Throw Japanese into English into Japanese into English ad nauseam, until an “equilibrium” statement is reached.
…which was quite often nowhere near the original statement in either language, but at least the translation algorithm agreed with itself.
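For anyone who never saw the site, the loop is simple enough to sketch. Here’s a rough Python version of the idea; `translate` is whatever machine-translation function you care to plug in (hypothetical, not the site’s actual code):

```python
from typing import Callable

def translation_party(text: str,
                      translate: Callable[[str, str, str], str],
                      max_rounds: int = 50) -> str:
    """Bounce text English -> Japanese -> English until it stops changing."""
    english = text
    for _ in range(max_rounds):
        japanese = translate(english, "en", "ja")      # English -> Japanese
        next_english = translate(japanese, "ja", "en")  # ... and back again
        if next_english == english:                     # the "equilibrium" statement
            return english
        english = next_english
    return english  # give up after max_rounds if it never converges
```

Nothing guarantees the fixed point resembles the input, which was the whole joke.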
you mean hallucinate
If it isn’t accurate to the source material, it isn’t concise.
LLMs are good at reducing word count.
In case you haven’t seen it, Tom7 created a delightful exploration of using an LLM to manipulate word counts.