Feeding solar power back into municipal grid: Issues and finger-pointing

Here's the problem:

Many of the load devices in a typical home (primarily the electric motors that run cooling systems: air conditioners, fridges, and freezers) are not capable of regulating their input voltage.

So when a secondary electricity source comes on-line (like a small PV system), then in order to push its current into the local grid it will have to *try* to raise its output voltage in order to see some current flow. It might only be a few volts, maybe less.

But does that mean there will be a measurable net reduction in the current being supplied by the high-voltage substation for that corner of the city?

Not if the typical load devices in the homes surrounding the PV system simply operate at a higher wattage.

The only sort of load that can effectively be regulated by a slight increase in local grid voltage is an electric heater. When you raise its input voltage slightly, it will put out more heat (BTU), and if its heat-output set-point doesn't change, then its operational duty cycle will change slightly.
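The heater behavior described above follows from the resistive-load formula P = V^2/R; a minimal sketch, with an illustrative element resistance (not a measured value):

```python
# Power dissipated by a fixed resistance R at an applied voltage V is P = V**2 / R.
R = 14.4  # ohms: roughly a 1 kW heater element at 120 V (illustrative value)
for v in (120.0, 121.0):
    print(f"{v:.0f} V -> {v**2 / R:.0f} W")
# A 1 V rise on 120 V means (121/120)**2 - 1, about 1.7% more heat output.
```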

But in the case of an AC compressor, the fact that it might be getting a slightly higher input voltage because a neighboring house is feeding PV power into the local grid won't mean that the compressor will reduce its current consumption from the municipal utility because of the extra current coming from a neighbor's roof-top solar array. It just means the motor will use BOTH sources of current and (I suppose) run a little hotter, but in the end do no extra cooling work in the process (its rotational speed won't change).

The same theory would hold true for lighting (incandescent especially). If you raise the input voltage, you'll get more light output - the bulb will simply consume all the juice it would normally get from the utility in addition to that being supplied by the neighborhood PV system.

The only way that a neighborhood PV system can actually supplement municipal utility power is when the PV system is wired up as a dedicated sole supply source for a few select branch circuits. The way I see it, you have to feed certain select loads 100% from a PV system (i.e., disconnect them from the municipal energy source) if you're going to make a meaningful contribution to the supply side of a municipal or city-wide grid.

Reply to
Home Guy

This explains generator output regulation by varying voltage:

formatting link
and this explains the effect on current into or out of a synchronous generator caused by varying the leading/lagging phase angle between the internal magnetic field and the line voltage:
formatting link

jsw

Reply to
Jim Wilkins

This is a fatal flaw in your argument. Transformers are not infinite sources. A utility transformer might supply a fault current 20x the rated current (for a "5% impedance" transformer). (While a transformer will supply a fault current larger than the rated current, that is not likely with PV. PV is basically a constant-current source.)

Using a real transformer, houses will have far less available fault current.

Cite where 100kA is required.

I agree that is very likely. One reason is that a higher rating is not necessary.

(SquareD, if I remember right, has a rating of 20kA downstream from both the main and branch circuit breaker.)

I doubt many Canadian house panels have fuse protection, or are different from US panels with circuit breaker protection rated around 10kA.

The interrupt rating required goes up with the service current rating. For a house, the utility is not likely to have over 10,000 A (10 kA) of available fault current. The transformers become too large, many houses are supplied with longer wires and higher resistance losses, and the system is much less safe.

I believe it would take a rather massive amount of PV installations to cause a problem. The PV installations would all have to be on the secondary of the same utility transformer. The transformer is then not likely to support the PV current back to the grid. If the fault current is 20x the transformer full load current, and the PV current is equal to the transformer full load current, the PV supply would increase the fault current by about 5% (assuming the inverter doesn't shut down). If there were too many PV installations the utility could put fewer houses on a transformer. Seems like a problem that is not that hard to handle for the utility, at least until PV generation becomes rather common.
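bud--'s 5% figure is simple arithmetic; a sketch under the assumptions stated in the post (a "5% impedance" transformer, PV current equal to the transformer's full-load current; the full-load current rating itself is an illustrative value):

```python
# Available fault current of a "5% impedance" transformer is roughly the
# full-load current divided by the per-unit impedance (i.e., 20x full load).
full_load_a = 200.0   # A, illustrative transformer secondary full-load current
z_pu = 0.05           # per-unit impedance ("5% impedance")
fault_a = full_load_a / z_pu
pv_a = full_load_a    # assumption from the post: PV output equals full-load current
increase = pv_a / fault_a
print(f"fault current: {fault_a:.0f} A; PV adds about {100 * increase:.0f}%")
```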

Reply to
bud--

A devastating analysis.

I am sure when the utilities read it they will stop paralleling generators, since that just causes the amount of electricity used to go up from what would be used by isolated systems.

Reply to
bud--


A devastating analysis.

I am sure when the utilities read it they will stop paralleling generators, since that just causes the amount of electricity used to go up from what would be used by isolated systems.

------------------

Careful! Sarcasm does not work well in a text medium, at all!

People cannot see your facial expression, and they are never sure it's sarcasm unless it is totally ridiculous.

I agree with your point but it is made very poorly in a text only medium.

mike

Reply to
m II

So, you don't increase current by raising the voltage, but you increase current by having a higher potential.

Now, difference in potential is voltage?

Reply to
g

1) The actual voltage increase will relate to the ratio of grid impedance vs. local impedance; i.e., your local power consumers (fridges, heaters, etc.) have a much higher impedance in relative terms, so the grid will "take" the majority of the generated power. The _only_ increase in voltage you will see results from the voltage drop in the grid components.
2) Pretty complex calculation, but yes, _somewhere_ one or more generating pieces of machinery will reduce its output. Makes sense intuitively, does it not?
3) You just set your PV system to operate at max power, the grid system will balance out automatically. See 1) above
4) The grid voltage does actually fluctuate a bit, depending on load. Power companies have means of adjusting line voltages depending on load fluctuations. The average subscriber never knows this.
5) That will be a very inefficient way to utilize your PV system.

A simplified way is to look at the grid as a battery. When your PV system generates more power than your local consumers, the surplus will flow into the grid. At all other times the grid and the PV will both supply the needed power to the local consumers.

6) Fairly close to impossible. How do you match the local power consumers to exactly the PV system's capacity?
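Point 1 above can be sketched as a current divider between the low-impedance grid and the high-impedance local loads; the impedance and current values below are made up for illustration:

```python
# Injected current splits in inverse proportion to the impedances it sees:
# the stiff grid takes almost all of it, the local loads very little.
z_grid = 0.05    # ohms, Thevenin impedance looking back into the grid (assumed)
z_local = 10.0   # ohms, aggregate impedance of nearby house loads (assumed)
i_inject = 40.0  # A from the PV inverter
i_grid = i_inject * z_local / (z_grid + z_local)
i_local = i_inject * z_grid / (z_grid + z_local)
print(f"into the grid: {i_grid:.1f} A; into local loads: {i_local:.1f} A")
```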
Reply to
g

On 4/12/2011 9:47 AM g spake thus:

No, no, no: increasing the current doesn't increase the potential (that's voltage). It increases the *flow* of electricity (= current), at least the maximum possible current. But that's not the same thing as potential difference.

Example: Let's say you run your house off 12 volt batteries (just for illustration). The *potential* of your power circuit is 12 volts (assuming the batteries are fully charged, and they'll actually be closer to 13.something, but let's call it 12).

Now let's say you add some more stuff to your house and find that your lights are going dim because the battery can't provide enough *current* (= amps) to the load. So what you do is add another battery in parallel with the first one. This doubles the available current (= amps), but it does *nothing* to change the voltage; it remains at 12 volts (nominal, as explained above). This is true no matter how many batteries you add *in parallel* with each other. But each battery increases the *available* current (= amps) you can draw from your power source.

Notice that adding more batteries does not "push" more current through the system; it increases the amount of current that can be "pulled" (drawn from) the batteries.

Which is exactly the situation when you connect your photovoltaic system to "the grid". It increases the *available current* to the grid. It does not change the voltage of the grid; it doesn't need to be at a higher voltage than the grid - it just needs to be at about the *same* voltage as the grid.
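The parallel-battery picture above can be sketched numerically; this is an idealized model (identical EMFs, illustrative internal resistance, no wiring resistance):

```python
# Equal-EMF batteries in parallel: the open-circuit voltage stays the same,
# but the combined internal resistance drops, raising the available current.
emf = 12.0     # V per battery
r_int = 0.05   # ohms internal resistance per battery (illustrative)
for n in (1, 2, 4):
    r_eq = r_int / n          # n equal resistances in parallel
    i_avail = emf / r_eq      # maximum (short-circuit) current
    print(f"{n} battery(ies): {emf:.0f} V open-circuit, {i_avail:.0f} A available")
```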

Yes. Please refer to any good basic guide to electricity for more details.

Reply to
David Nebenzahl

bud-- full-quoted:

How many utilities connect the output of new parallel generating sources to the 120/208 connection side of a grid, instead of at the sub-station high-voltage side?

Reply to
Home Guy

I'm not arguing that the grid can't or won't take any, the majority, or all of the generated power.

The question here is: what exactly must the inverters do in order to get as much current as the PV system can supply into the grid?

If our analogy is pipes, water, and water pressure, then we have some pipes sitting at 120 PSI and we have a pump that must generate at least 121 PSI in order to push water into the already-pressurized pipes. So the local pipe system now has a pressure of 121 PSI. If you measure the pressure far away from your pump, it will be 120 PSI.

Not sure I understand what you're trying to say there.

No, I don't agree.

Hypothetically speaking, let's assume the local grid load is just a bunch of incandescent lights. A typical residential PV system might be, say, 5 kW. At 120 volts, that's about 42 amps. How are you going to push 42 amps out to the grid? You're not going to do it by matching the grid voltage. You have to raise the grid voltage (at least as measured at your service connection) by, let's say, 1 volt. So all those incandescent bulbs being powered by the local grid will now see 121 volts instead of 120 volts. They're going to burn a little brighter - they're going to use all of the current that the local grid was already supplying to them, plus they're going to use your current as well.
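For reference, the terminal-voltage rise an inverter needs to source a given current depends on the impedance between it and the grid; a sketch of that relationship, with the service-drop impedance as an assumed value for illustration:

```python
# Voltage rise at the inverter terminals needed to push current I into the
# grid through a series impedance Z: dV = I * Z.
p_pv = 5000.0     # W, PV output from the post
v_nom = 120.0     # V, nominal line voltage
z_service = 0.05  # ohms, assumed impedance of the service drop back to the grid
i = p_pv / v_nom
dv = i * z_service
print(f"current: {i:.1f} A; terminal voltage rise: {dv:.1f} V")
```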

Doesn't matter if we're talking about incandescent bulbs or AC motors. Switching power supplies are a different story - but they're not a big part of the load anyway.

I don't see how - not at the level of the neighborhood step-down transformer. I don't see any mechanism for "balancing" to happen there.

If you're getting paid for every kWh of juice you're feeding into some revenue load, then the concept of "efficiency" doesn't apply; what does apply is ergonomics and practicality. I agree that a small-scale PV system can't be counted on to supply a reliable amount of power 24/7 to a revenue-load customer (or even a dedicated branch circuit of one) to make such an effort workable - but I still stand by my assertion that the extra current a small PV system injects into the local low-voltage grid will not result in a current reduction from the utility's substation to the local step-down transformer.

The extra current injected by the PV system will result in a small increase in the local grid voltage, which in turn will be 100% consumed by local grid loads (motors, lights) and converted into waste heat, with no additional useful work done by those load devices.

Reply to
Home Guy

Bad analogy. The 1V will be lost in the internal resistance of the inverter connection, which is much higher than that of the grid. Think of pouring water from a bucket into a lake. There's NO measurable rise in the lake level.

jsw

Reply to
Jim Wilkins

If that were the case, then your 42 amps would be converted into a tremendous amount of heat as it is dissipated in that internal resistance, and there would be no measurable current for your revenue meter to measure.

For me to pour water into a lake, I have to raise it higher than the lake level.

Think of height as equivalent to voltage potential.

Unless water is compressible, there has to be a change in lake level. The fact that I may not have a meter sensitive enough to measure it doesn't mean there's no change in the level.

Reply to
Home Guy

If you do a little research I think you will change your mind.

formatting link
At the bottom of that page you will find this link

formatting link

More specifically, I wrote "forcing power back into the grid". Power is watts or kW - that's volts times amps. The current will only flow if there is a difference in voltage.

It takes very little voltage difference to drive a lot of current when the "load" is the grid.

Yes, I do. Now let's see if you can understand this.

Take two 12 volt car batteries, with one discharged to 11 volts. Connect them in parallel and check the voltage. It will be some value between what they measured separately. While they are connected like this, current is flowing from the charged battery to the discharged battery. Power is being forced into it, raising its state of charge. If the two batteries had been at exactly the same voltage, there would have been no current flow.

Now take 8 AAA batteries connected in series to give 12 volts. Disconnect the charged battery and connect the AAAs to the 11 volt battery. Measure the voltage. It will be, for all practical purposes, unchanged from 11 volts. The AAA cells are charging the bigger battery, but they are so small compared to it that they seem insignificant. That is how your PV system looks to the grid.
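Bruce's battery experiment can be sketched with Millman's theorem (parallel sources settle at a resistance-weighted average voltage); the internal resistance values are assumptions for illustration:

```python
# Two sources in parallel: node voltage is the conductance-weighted average of
# the EMFs, and current flows from the higher-EMF source into the lower one.
e1, r1 = 12.0, 0.05   # charged car battery: EMF (V), internal resistance (ohms)
e2, r2 = 11.0, 0.05   # discharged car battery
v = (e1 / r1 + e2 / r2) / (1 / r1 + 1 / r2)
i = (e1 - v) / r1     # current flowing into the discharged battery
print(f"car batteries paralleled: {v:.2f} V, {i:.1f} A of charging current")

# Swap the charged battery for a string of AAAs: same EMF, far higher resistance.
e3, r3 = 12.0, 2.0    # 8 AAA cells in series (assumed ~2 ohm total resistance)
v2 = (e3 / r3 + e2 / r2) / (1 / r3 + 1 / r2)
print(f"AAAs paralleled with the car battery: {v2:.2f} V (barely changed from 11 V)")
```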

The voltage on the grid can vary by up to + or - 10%. It is usually kept within + or - 5%. Take a voltmeter and check the voltage at your wall outlet. Check it several times during the day and you will find that it varies. Don't try to claim that it doesn't; check it and you will find that it does. As loads are put on the grid, they drag the voltage down. The power company responds by generating more power to bring the voltage back up. As loads are taken off, the voltage will climb, and it is brought back down by producing less power.

The point here is that there is no such thing as the grid not being able to accept the power you have produced. As long as you are connected you can always force your KW in.

Reply to
Bruce Richmond

I don't think the issue is whether or not you can force current into the grid via your 120/208 VAC service connection.

The question is:

a) Does your power source need to overcome the instantaneous line voltage in order to achieve a flow of current? (Answer: yes, and to the extent that your power source has the capacity to do so, you raise the output voltage as high as you can - because if you don't, then you have excess capacity that is not going to make it out to the grid, and hence you won't gain revenue for the entire potential of your generating system.)

b) By raising the voltage on your local 120/208 grid, can your local step-down transformer adjust its own operation by sensing that higher voltage and reduce its own output voltage in an attempt to regulate the system back down to the desired setpoint? (Answer: I don't know - probably not. The neighborhood step-down transformers probably weren't designed to compete with sources of current connected to their distribution outputs.)

c) So if the voltage on your local 120/208 grid is being raised slightly because of your PV system and its desire to push as much current back into the grid as it can generate, will this actually reduce the amount of current that the regional substation is sending to your local step-down transformer? (Answer: the substation probably doesn't have a direct line to your local step-down transformer, and any alterations it can make to its output voltage are probably seen by many step-down transformers - including yours - that are all wired to the same circuit. So in reality it's doubtful that the regional substation would even sense that your PV system has raised the local grid voltage.)

d) So your PV system is raising the local grid voltage, and you're probably pushing out 40 amps at 120 VAC (or 20 amps at 240 VAC) on a sunny summer day. So what is that extra juice doing? Well, it's flowing through the compressor motors of 10 to 20 of your neighbors' AC units - whether they need it or not. Because you've raised the local grid voltage slightly, that translates into a few extra watts (maybe 250 watts for each house that's fed from the same step-down transformer). So all the fridge compressors, AC compressor motors, and lights - all linear loads - are going to blow away that extra line voltage as heat instead of useful work.
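The per-house wattage in d) depends on how much resistive load each house is actually running; a back-of-envelope sketch with assumed numbers (house load and house count are illustrative guesses):

```python
# For a purely resistive load, power scales as V**2, so a small voltage rise
# dV on a nominal V adds roughly 2*dV/V in power.
v_nom, v_new = 120.0, 121.0
house_load_w = 2000.0   # W, assumed resistive load per house (illustrative)
houses = 15             # assumed houses on the same step-down transformer
frac = (v_new / v_nom) ** 2 - 1
extra_per_house = frac * house_load_w
print(f"{100 * frac:.1f}% voltage-squared rise -> "
      f"{extra_per_house:.0f} W per house, {houses * extra_per_house:.0f} W total")
```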

Nuf said?

Reply to
Home Guy

The inverter must transform the DC from the PV array to the frequency and voltage of the grid. To ensure a flow of current into the grid, it must try to raise its output voltage. Because there are losses between the inverter and the grid, the voltage at the inverter will be slightly higher than at the grid.

Fairly good analogy, and the internal resistance of the pipe must be overcome by having a higher pressure. Don't forget that somewhere someone else has to reduce the water flow into the pipe system in order to avoid pressure buildup, because the water in the pipe system is consumed at the same rate it is supplied.

See the pipe analogy above: the power lines from the inverter have some resistance, which results in a voltage drop. Therefore the voltage measured at the inverter will be slightly higher than that measured a distance away.

Why? Take a hypothetical grid with 1 megawatt of consumption. Generating machinery produces that energy at a set voltage. Mr. Homeowner connects to the grid with a 10 kW PV array. If no power utility adjustment took place, the overall voltage of the grid would increase. OK for small fluctuations, but if enough PV arrays came online, somewhere energy production has to decrease, or bad things will happen due to high grid voltage.

You cannot, unless your local load is zero. You must subtract the local load from the generated PV array power if the house load is lower. If the house load is higher than the PV array output, then you will use all the PV array power, with the difference supplied from the grid.

Correct, due to a slightly raised voltage if there is a voltage drop between the inverter and the grid. (There is some drop)

Not possible: the current is controlled by the internal resistance of the lamp. It will draw a current given by voltage/resistance. So when the PV array produces current, grid current is reduced.

The voltage increase you will see at the output of the inverter is very small, but it does depend on the cables used.

An example: I have a 300-foot underground cable to the nearest utility transformer and a 100 A service panel.

If I max out the power, I will have a voltage drop over the cable of about 6 volts - much higher than a normal household's.
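That 6-volt figure is consistent with Ohm's law over a round-trip run; the per-foot conductor resistance below is an assumed value chosen to match, not a quoted cable spec:

```python
# Feeder voltage drop: V = I * R, where R covers the round trip (out and back).
length_ft = 300.0   # one-way run from the post
i_max = 100.0       # A, service panel rating from the post
r_per_kft = 0.1     # ohms per 1000 ft per conductor (assumed conductor size)
r_loop = 2 * (length_ft / 1000.0) * r_per_kft
print(f"drop at {i_max:.0f} A: {i_max * r_loop:.1f} V")
```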

When your PV array is producing full power, and your house load matches that, then the voltage difference between the grid and inverter is zero.

At any other house load, current will flow in the power utility lines, and the inverter voltage increase is a function of the loss in those lines.

Reply to
g

A lot of them were sold for mountain vacation homes. Thieves steal them often for the copper though there is not that much copper in them.

Reply to
JIMMIE

No, it most definitely is *not*. Turn the motor to the next commutator step and you'll see that the current reverses in the winding.

That's why there are no brushes in AC motors? ;-)

...and you need a rotating magnetic field. That is, you need AC. ;-)

Reply to
krw

Correct. David should refrain from any EE lectures. He hasn't got it in him.

Reply to
krw

HomeOwnersHub website is not affiliated with any of the manufacturers or service providers discussed here. All logos and trade names are the property of their respective owners.