- cross-posted to:
- hardware@lemmy.world
cross-posted from: https://sopuli.xyz/post/18617680
In contrast to stuff like AI training or crypto, chips at least fulfill an actually useful function, so I don't see the issue with their manufacturing consuming a lot of energy. Or should we make the same comparison for cars or medicine?
Right? I was just thinking that entire countries run on chips so it sort of sounds about right at least.
What exactly do you think these chips are used for?
Because it’s often enough AI, crypto and bullshit IoT.
Cars, manufacturing, microwaves, washers/dryers, dishwashers, cellphones/tablets, anything wireless. There are more non-crypto/AI products than not.
The vast, vast majority of chips produced are “old generation” chips used for relatively mundane purposes. The high-tech stuff you see in the news is a minority (though it’s pricey enough that it doesn’t look that way in company earnings reports).
Think power supplies, middle-of-the-road CPUs, ASICs for common I/O like USB and ethernet, timing devices, and wireless communication modules.
And that's mostly the "bullshit IoT" category. It's not like demand for phones and laptops has exploded in recent years; it's IoT, AI and other useless crap - regardless of the process node.
Even a non-IoT electronic device still runs on many different chips.
There are people who want AI, crypto, and IoT things. If there weren’t then there’d be no money to be made in selling it.
Okay. What are we supposed to do, not use chips? They’re kind of a main character of the 21st century.
This would be a great application of those nuke plants fuckin’ Google and Amazon want to build.
We could start by not requiring new chips every few years.
For 90% of users, there hasn't been any real gain within the last 5-10 years. Older computers work perfectly fine, but artificial slowdowns and bad software make laptops feel sluggish for most users.
Phones haven't really advanced either. But apps and OSes are too bloated and the hardware is impossible to repair, so a new phone it is.
Every device nowadays needs wifi and AI for some reason, so of course a new dishwasher has more computing power than an early Cray, even though none of that is ever used.
Tech companies are terrified of becoming commodities, even though a good chunk of them basically are at this point.
Intel would probably be in a better spot if they’d just leaned into that rather than try to regain the market dominance they once had.
Are 7 nm chips more energy intensive to manufacture than older 100 nm ones?
Or is it just scale - more chips to manufacture, more energy needed?

Cutting-edge chips consume more electricity to manufacture because there are a crapload more process steps than in older fabs. All chips are made on the same size silicon wafers regardless of the fabrication process (rough per-chip numbers sketched below).
Gamers Nexus has some good videos about chip manufacturing if you are interested
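To put rough numbers on the "more steps, same wafer" point, here's a minimal back-of-envelope sketch. Every figure in it (kWh per wafer, die area, yield) is a made-up round number purely for illustration, not measured fab data.

```python
# Back-of-envelope: manufacturing energy per chip.
# Every number here is an illustrative assumption, not measured fab data.
import math

WAFER_DIAMETER_MM = 300  # wafers are the same size regardless of the node

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude estimate: wafer area / die area, ignoring edge loss and scribe lines."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

def kwh_per_good_die(kwh_per_wafer: float, die_area_mm2: float, yield_fraction: float) -> float:
    """More process steps -> more kWh per wafer; bigger dies and worse yield -> fewer good dies."""
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction
    return kwh_per_wafer / good_dies

# Hypothetical mature node (fewer steps) vs. leading-edge node (many more steps):
print(f"mature node:  {kwh_per_good_die(600, die_area_mm2=50, yield_fraction=0.95):.2f} kWh per chip")
print(f"leading edge: {kwh_per_good_die(2000, die_area_mm2=100, yield_fraction=0.80):.2f} kWh per chip")
```

The spread is obviously sensitive to every one of those assumptions; the only real point is that per-wafer energy scales with the number of process steps, while per-chip energy also depends on die size and yield.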
Older chips definitely consume more watts per unit of processing power; newer ones are usually better on top of that too.
Talking about usage, not construction.
Are the most power-hungry steps of semiconductor production running 24/7? Could we simply align manufacturing times with useful solar production times? Then there'd be no need to store all the solar power, since most of it would be consumed immediately for manufacturing. And then pass a rule that semiconductor fabs have to build out their own solar arrays to cover most of their power consumption.
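Quick sketch of that idea, with purely made-up numbers (a 100 MW fab, a 300 MW on-site array, an idealized clear-sky curve) - not real fab or solar data - just to see how much of a round-the-clock load daytime-only generation covers without storage:

```python
# Rough sketch: how much of a constant 24/7 fab load can on-site solar cover
# *directly*, i.e. without any storage?  All numbers are illustrative assumptions.
import math

FAB_LOAD_MW = 100      # assumed constant draw, 24 hours a day
ARRAY_PEAK_MW = 300    # assumed nameplate capacity of the on-site array
DAYLIGHT_HOURS = 12    # idealized half-sine output between 06:00 and 18:00

def solar_output_mw(hour: float) -> float:
    """Idealized clear-sky half-sine generation profile; zero at night."""
    if 6 <= hour <= 18:
        return ARRAY_PEAK_MW * math.sin(math.pi * (hour - 6) / DAYLIGHT_HOURS)
    return 0.0

direct_mwh = sum(min(FAB_LOAD_MW, solar_output_mw(h)) for h in range(24))
load_mwh = FAB_LOAD_MW * 24
print(f"covered directly: {direct_mwh:.0f} / {load_mwh} MWh per day "
      f"({100 * direct_mwh / load_mwh:.0f}%)")
```

Under these assumptions, even an array rated at three times the fab's draw covers less than half of a 24/7 load directly, so the "shift manufacturing to solar hours" question really comes down to whether the expensive tools can be idled and restarted cheaply, or whether you accept storage or grid power for the rest.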