Yes, power usage is constantly predicted by utilities. Production must match consumption exactly at every moment. This means weather forecasting is an essential part of managing a power grid, and doubly so with intermittent renewables.
I think the local overloading has something to do with transformers not being able to handle the massive local overproduction. It’s not just power not being consumed, it’s power being injected into the grid.
But it outputs 230 V, so how would that ever get to 250 V? Keep in mind, I'm not an electronics engineer, just guessing with what I know.
For current to flow out of your house, the voltage inside the house has to be slightly higher than outside. Not by much, but a little. So the inverter runs its output voltage a little above line voltage by design. If everyone does this and some of the power has nowhere to go, the average voltage goes up measurably.
This wouldn't be a problem if the grid had been designed to carry power out of residential areas, but my casual understanding is that this doesn't work very well with existing infrastructure. So with a bunch of extra power that has a hard time getting out, the voltage keeps climbing until some inverters hit their safety shutoff.
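Here's a toy numeric sketch of that effect in Python. Every number in it (feeder impedance, surplus per house, trip threshold) is an illustrative assumption, not a real grid parameter; real feeders also have loads, reactance, and tap-changing transformers that this ignores.

    # Toy model: voltage rise on a residential feeder as more houses export PV.
    # All parameters below are illustrative assumptions.
    GRID_V = 230.0      # nominal voltage at the transformer (V)
    FEEDER_OHMS = 0.4   # assumed feeder impedance seen by the houses (ohms)
    EXPORT_W = 3000.0   # assumed surplus each house tries to export (W)
    TRIP_V = 253.0      # assumed over-voltage trip point, ~230 V + 10%

    def local_voltage(houses_exporting: int) -> float:
        """Very simplified, resistive-only: the exported current times the
        feeder impedance adds on top of the nominal grid voltage."""
        current = houses_exporting * EXPORT_W / GRID_V  # I = P / V, roughly
        return GRID_V + current * FEEDER_OHMS           # V = V_grid + I * R

    for n in range(9):
        v = local_voltage(n)
        status = "inverters trip" if v >= TRIP_V else "ok"
        print(f"{n} houses exporting -> ~{v:.1f} V ({status})")

With these made-up numbers the local voltage crosses the trip point once about five houses export at full tilt, which is exactly the "voltage keeps climbing until some inverters hit their safety shutoff" picture.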
Ah yes, that makes sense! The grid is pushing 230 V in, so to get power out you push back harder, for example at 240 V. Thanks!
I know inverters have a safety feature to shut down if the grid voltage is not in range, so they don't push power onto an open net, etc. I've had people tell me that inverters doing that was a problem, but discovered they shut down if the input isn't right!
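That check is easy to picture in code. A minimal sketch, assuming a window of roughly 230 V plus or minus 10% and a typical frequency band around 50 Hz; real inverters follow grid codes that also dictate how fast they must disconnect and how they detect islanding:

    # Minimal sketch of an inverter's grid-window check.
    # The limits are assumptions, roughly 230 V +/- 10% and ~50 Hz.
    V_MIN, V_MAX = 207.0, 253.0   # assumed voltage trip points (V)
    F_MIN, F_MAX = 47.5, 51.5     # assumed frequency trip points (Hz)

    def grid_ok(voltage: float, frequency: float) -> bool:
        """True if the measured grid is inside the allowed window. Outside it,
        the inverter stops feeding in: this covers both over-voltage from too
        much local export and a dead/open net with no valid grid to push into."""
        return V_MIN <= voltage <= V_MAX and F_MIN <= frequency <= F_MAX

    print(grid_ok(231.0, 50.0))  # True: normal grid, keep exporting
    print(grid_ok(256.0, 50.0))  # False: over-voltage, stop feeding in
    print(grid_ok(0.0, 0.0))     # False: open net, never energize it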