We know the equation kVA = kW / power factor. When the power factor falls, kVA increases. The substation authorities have to pay a penalty if the demand indicator (kVA) exceeds a limit over an integrating period. How is the maximum demand (kVA) calculated in substations? And are there any ways to avoid shedding critical loads in a substation when the maximum demand indicator exceeds its limit because the power factor has dropped?
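To make the question concrete, here is a minimal sketch of how a maximum-demand indicator might work, assuming a 15-minute integrating period (15 or 30 minutes are common, but the actual period and tariff rules depend on the utility): the meter averages apparent power over each window and records the highest window average.

```python
def kva(kw, power_factor):
    """Apparent power from real power: kVA = kW / power factor."""
    return kw / power_factor

def max_demand(kw_samples, pf_samples, samples_per_period=15):
    """Highest average kVA over any integrating period.

    Assumes one sample per minute and a 15-minute sliding window;
    real meters may use fixed or rolling windows (utility-specific).
    """
    kva_samples = [kva(kw, pf) for kw, pf in zip(kw_samples, pf_samples)]
    window_averages = []
    for i in range(len(kva_samples) - samples_per_period + 1):
        window = kva_samples[i:i + samples_per_period]
        window_averages.append(sum(window) / len(window))
    return max(window_averages)
```

Note how the same real load at a lower power factor produces a higher recorded demand, which is exactly why power-factor correction reduces demand penalties.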
In a transmission or distribution substation there is very little the operators can do about frequency, at least directly: they can't speed the generators up or slow them down.
However, if frequency is falling, what they can do is shed load, so that the generators aren't loaded as heavily and can speed back up.
Over-frequency they can't manage: they can't artificially add load to slow the generators down, at least not easily, as it would mean reconfiguring the network to move loads from one section of the grid to another, which may affect the loading of individual generators.
But over-frequency may have other effects that need to be monitored in substations, such as transformer overfluxing, which is tracked as a volts-per-hertz (V/Hz) ratio, since core flux is roughly proportional to voltage divided by frequency.
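A volts-per-hertz check can be sketched in a few lines. This is illustrative only: the 1.05 per-unit threshold is a commonly quoted continuous V/Hz limit for transformers, not a universal figure, and a real overexcitation relay (ANSI device 24) also applies inverse-time characteristics rather than a single fixed threshold.

```python
def volts_per_hertz_pu(v_pu, f_pu):
    """Per-unit V/Hz ratio; transformer core flux is roughly proportional to this."""
    return v_pu / f_pu

def overflux_alarm(v_pu, f_pu, threshold_pu=1.05):
    """True if the V/Hz ratio exceeds the (assumed) continuous limit.

    Note the ratio rises with high voltage OR low frequency, so a
    frequency excursion matters most when voltage doesn't move with it.
    """
    return volts_per_hertz_pu(v_pu, f_pu) > threshold_pu
```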
Over-frequency occurs rarely and is usually caused by a sudden loss of load somewhere, i.e. loss of a major transformer or of the line to a large load. However, large loads (individually or collectively) are considered important and are rarely supplied by a single source feeder, so complete sudden loss of a load off the grid is uncommon.
It is a bit of semantics, but we don't have excess power as such, since power must be consumed as quickly as it is generated. What we have is excess generating capacity in one area that could supply load in another area where there is insufficient generating capacity (and where frequency would otherwise fall). In that case we do things that help power flow in certain directions, such as voltage control and SVCs.
However, if over-frequency does occur, it is seen at the generator, and the generator operators will generally take responsibility for slowing the machine down through their governor controls. They don't ring up the Tx or Dx utility and ask to be switched around to have more load connected to them, which would inherently take load off somewhere else.
However, whilst the generator can also correct under-frequency to some degree, it has limited ability to speed up if there is too much load. For example, suppose two generators are online, one rated 100 MVA and the other 10 MVA, both at full power. If the 10 MVA unit trips, the other one will be overloaded and slow down, which appears as under-frequency (as well as a current increase, depending on the controls and response times). Speeding the generator back up while it is trying to supply the full 110 MVA of load is hard to do, but an under-frequency load-shedding (UFLS) scheme will automatically help the generator "catch up".
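The staged behaviour of a UFLS scheme can be sketched as follows. The thresholds and shed percentages below are illustrative assumptions for a 50 Hz system, not standard values; real schemes are set by the grid code and coordinate stages across many substations.

```python
# Each stage: (frequency threshold in Hz, fraction of total load to shed).
# Illustrative values only, assuming 50 Hz nominal frequency.
UFLS_STAGES = [
    (49.0, 0.10),
    (48.8, 0.10),
    (48.6, 0.15),
]

def ufls_shed_fraction(frequency_hz):
    """Total fraction of load shed once frequency has fallen to the given value.

    Each stage trips when frequency drops to or below its threshold,
    cumulatively disconnecting blocks of (non-critical) load so the
    remaining generation can carry what is left and recover frequency.
    """
    shed = 0.0
    for threshold_hz, fraction in UFLS_STAGES:
        if frequency_hz <= threshold_hz:
            shed += fraction
    return shed
```

This also answers the practical side of the original question: critical loads are normally assigned to the last stages (or excluded from the scheme entirely), so the non-critical blocks trip first.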
But still, "minor" under/over frequency is controlled at the generator, not by adding/removing load. The whole point is to get power to the load, not remove the load.