BrikoX@lemmy.zip (mod) to Technology@lemmy.zip, English · 5 months ago
Researchers upend AI status quo by eliminating matrix multiplication in LLMs (arstechnica.com)
Cross-posted to: machine_learning@programming.dev, technology@lemmy.world
Pennomi@lemmy.world · 5 months ago:
Only for maximum efficiency. LLMs already run tolerably well on normal CPUs, and this technique would make them much more efficient there as well.
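For context on why this helps CPUs in particular: the linked article describes replacing matrix multiplication with ternary weights, so each dot product reduces to additions and subtractions, which cheap hardware handles well. A minimal NumPy sketch of that general idea (illustrative only; the shapes, names, and random data are assumptions, not the paper's actual kernel):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(4, 8))   # ternary weight matrix, entries in {-1, 0, +1}
x = rng.standard_normal(8)             # input activations

# Standard path: a full matrix-vector multiplication.
y_matmul = W @ x

# "MatMul-free" path: for each row, select activations where the weight
# is +1 or -1 and accumulate them -- no multiplications at all.
y_addsub = np.array([x[row == 1].sum() - x[row == -1].sum() for row in W])

# Both paths produce the same result.
assert np.allclose(y_matmul, y_addsub)
```

The accumulate-and-subtract loop is written for clarity, not speed; the point is only that ternary weights turn the inner product into pure addition, which is where the claimed efficiency on commodity CPUs comes from.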