
> When it is higher, there is a local surplus, when it is lower, there is a high load.

Or perhaps the voltage got too low, and an on-load tap changer in one of the transformers increased the output voltage. Voltage does not necessarily follow the load. AFAIK, the main feedback signal the generators themselves use is not voltage but frequency; frequency isn't a useful signal for consumers to watch, though, because the generators hold it much more tightly to its nominal value.
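
For what it's worth, here's a toy sketch of the tap-changer behavior (not any real controller; the deadband and step size are made-up numbers) showing how an on-load tap changer can mask a sagging feeder voltage from the customer by stepping up its turns ratio:

    # Toy sketch of an on-load tap changer (OLTC). Numbers are illustrative only.
    NOMINAL_V = 230.0   # target secondary voltage (V)
    DEADBAND = 0.0125   # +/-1.25% band before the OLTC reacts
    STEP = 0.00625      # each tap step changes the ratio by ~0.625%

    def adjust_tap(primary_v, ratio):
        """Step the turns ratio up or down when the secondary voltage
        drifts outside the deadband around nominal."""
        secondary_v = primary_v * ratio
        if secondary_v < NOMINAL_V * (1 - DEADBAND):
            ratio += STEP   # upstream voltage sagged -> raise output voltage
        elif secondary_v > NOMINAL_V * (1 + DEADBAND):
            ratio -= STEP   # upstream voltage rose -> lower output voltage
        return ratio

    # A sagging feeder: the OLTC keeps stepping up, so the customer sees
    # near-nominal voltage even though the upstream voltage keeps dropping.
    ratio = 1.0
    for primary_v in (230.0, 228.0, 226.0, 224.0):
        ratio = adjust_tap(primary_v, ratio)
        print(f"primary={primary_v:.1f} V  ratio={ratio:.4f}  secondary={primary_v * ratio:.1f} V")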



Frequency doesn't change with load.

Load causes the voltage to drop (that's what's happening when a "brownout" is triggered). Some loads cause the current to lead or lag the voltage waveform (capacitive vs. inductive loads; most are inductive, particularly heavy-duty equipment). That doesn't change the frequency, only the phase of the current relative to the voltage. This is all wrapped up in a number called the "power factor" (see https://en.wikipedia.org/wiki/Power_factor ): the farther the current is shifted from the voltage, the more of the power plants' output goes into heating the grid wires rather than doing useful work.
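
To put rough numbers on that: with RMS voltage V, RMS current I, and the current lagging the voltage by an angle phi, the wires carry the full apparent power V*I, but only V*I*cos(phi) of it is real power doing work. A quick illustration (example numbers are mine):

    import math

    # Illustrative numbers: 240 V RMS, 10 A RMS, current lagging voltage by phi.
    V_RMS, I_RMS = 240.0, 10.0

    for phi_deg in (0, 30, 60):
        phi = math.radians(phi_deg)
        apparent = V_RMS * I_RMS             # volt-amperes the wires must carry
        real = apparent * math.cos(phi)      # watts actually doing useful work
        reactive = apparent * math.sin(phi)  # volt-amperes just sloshing back and forth
        pf = math.cos(phi)                   # power factor
        print(f"phi={phi_deg:2d} deg  PF={pf:.2f}  S={apparent:.0f} VA  P={real:.0f} W  Q={reactive:.0f} var")

The lower the power factor, the more current (and I^2*R heating in the wires) is needed per watt actually delivered.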

So, power grids will do two things. First, they'll work to keep the current and voltage in phase, which they do by switching in extra capacitors or inductors (rough sizing sketch below).

Second, they work to maintain the voltage at their tie-in to the grid.
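
To put numbers on the "extra capacitors" part of the first point: the usual back-of-the-envelope sizing is Q = P*(tan(phi_old) - tan(phi_new)) for the reactive power the bank must supply, and C = Q / (2*pi*f*V^2) for a shunt capacitor. The formula is standard; the load, voltage, and target power factor below are made-up example values, and this is a single-phase simplification:

    import math

    # Made-up example: a 100 kW inductive load at PF 0.80, corrected to PF 0.95
    # on a 400 V, 50 Hz feeder with a shunt capacitor bank (single-phase simplification).
    P = 100e3            # real power (W)
    V, f = 400.0, 50.0   # line voltage (V RMS) and frequency (Hz)

    phi_old = math.acos(0.80)
    phi_new = math.acos(0.95)

    Q_cap = P * (math.tan(phi_old) - math.tan(phi_new))  # reactive power the caps must supply (var)
    C = Q_cap / (2 * math.pi * f * V**2)                 # required capacitance (F)

    print(f"Capacitor bank: {Q_cap/1e3:.1f} kvar  ->  {C*1e6:.0f} uF")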

Generally speaking, the type of power plant matters as well. Base-load plants simply dump power onto the grid at a constant rate (without really caring what the voltage is), while peaker and load-following plants vary their output relative to the voltage they see to try to keep grid voltage stable.
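
As a crude illustration of that load-following behavior (just a proportional "droop" on measured tie-in voltage with made-up numbers, not how any particular plant's controls actually work):

    # Toy voltage-droop dispatch for a load-following plant. All numbers illustrative.
    NOMINAL_V = 1.00          # per-unit voltage setpoint at the tie-in
    BASE_MW = 200.0           # scheduled output
    DROOP_MW_PER_PU = 2000.0  # extra MW per per-unit volt of sag
    MAX_MW, MIN_MW = 400.0, 50.0

    def dispatch(measured_v_pu):
        """Raise output when the tie-in voltage sags, lower it when voltage rises."""
        mw = BASE_MW + DROOP_MW_PER_PU * (NOMINAL_V - measured_v_pu)
        return max(MIN_MW, min(MAX_MW, mw))

    for v in (1.00, 0.98, 0.97, 1.02):
        print(f"V={v:.2f} pu -> output {dispatch(v):.0f} MW")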

You are correct, the voltage variation can be misleading at the customer level if the transformer is actively adjusting its voltage ratio. I didn't consider that.



