• 0 Posts
  • 11 Comments
Joined 11 days ago
Cake day: January 24, 2025

  • @llama@lemmy.dbzer0.com Depends on the inference engine. Some of them will try to load the model anyway and just run out of memory, which can cause its own problems, but it won’t overheat the phone, no. If you DO use a model the phone can actually run, though, then like any intense computation it can make the phone heat up. Best not to run a long inference prompt while the phone is in your pocket, I think.
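    For illustration, here’s a minimal sketch of the kind of pre-load memory check an inference wrapper could do to avoid the blow-up case described above. This is an assumption, not how any particular engine works; `MODEL_PATH`, the `psutil` dependency, and the 1.5x headroom factor are all illustrative choices.

    ```python
    # Sketch: refuse to load a model that likely won't fit in available RAM.
    # MODEL_PATH and the headroom factor are hypothetical, for illustration only.
    import os
    import psutil

    MODEL_PATH = "model.gguf"  # hypothetical local model file
    HEADROOM = 1.5             # assume the runtime needs ~1.5x the file size

    model_bytes = os.path.getsize(MODEL_PATH)
    available = psutil.virtual_memory().available  # free RAM in bytes

    if model_bytes * HEADROOM > available:
        raise MemoryError(
            f"Model needs roughly {model_bytes * HEADROOM / 1e9:.1f} GB "
            f"but only {available / 1e9:.1f} GB is free; refusing to load."
        )
    # Otherwise it should be safe to hand MODEL_PATH to the inference engine.
    ```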