OT: Relocatable Power Tap

Status: Not open for further replies.

Hans Sprungfeld

Undecided
Joined
Aug 26, 2011
Messages
12,956
Reaction Score
31,337
For the first time, I noticed a warning not to plug one grounded, UL-listed power strip into another. Here's the full list of what is plugged into the two strips: turntable, 20-watt receiver, dubbing cassette deck, USB adapter for mini-plug charging, USB adapter for lightning cord charging, then the second power strip, which has a Blu-ray disc player, 24-inch TV/monitor, Chromecast, and router extender. Maximum load would be watching video w/sound through the stereo while charging a device. Questions: do I have to unplug the second strip from the first? If so, can I plug two strips into the same grounded wall outlet? Fwiw, very little else is on the circuit. It's simply a secondary audio-visual setup located in a large, well-ventilated built-in.
 

gtcam

Diehard since '65
Joined
Sep 12, 2012
Messages
10,983
Reaction Score
29,031
I'm no electrician or expert, but I have been plugging one power strip into another more than once over the past 10 years - never ran into a problem at all. In fact, I have that setup in two rooms plus my garage right now.
 

tdrink

Pessimistic idealist
Joined
Sep 6, 2011
Messages
4,948
Reaction Score
1,186
You have to add up the amperage being drawn by the various devices. If it exceeds the number on the breaker, you will likely trip the breaker at some point. I'd guess that plugging a multi-strip into another has more effect on the surge protection of the multi-strip.
 

jleves

Awesomeness
Joined
Aug 27, 2011
Messages
4,262
Reaction Score
15,109
No - don't do it. Firstly, the normal circuit is good for 15 amps, but you should never go above 80% of capacity. A 15 amp circuit at 80% on a 110v circuit is 1,320 watts max. But most power strips are probably rated at 10 amps, so at 80% you only want to put 880 watts on a strip. Secondly, the weakest part of a circuit like that is the plug. Wires are solid and don't have a weak spot. Plugs are not 100% efficient, and there are drops in capacity at those connections. So now your first strip may be losing 10% efficiency and you're down to 792 watts. Plug the second one in and you're down to 713 watts. You are simply asking for one of those connections to get hot and start a fire. Find another outlet for your second strip. It's simply not worth a fire.
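
In rough Python, that chain of deratings looks like this (the 110 v figure, the 10 amp strip rating, and the 10% per-plug loss are the assumptions above, not measured values):

# Back-of-the-envelope version of the derating chain above.
# Assumptions: 110 V circuit, a 10 A strip, 80% max loading,
# and a rough, assumed 10% capacity loss per plug connection.
VOLTS = 110
STRIP_AMPS = 10
DERATE = 0.80
PLUG_LOSS = 0.10

one_strip = VOLTS * STRIP_AMPS * DERATE                 # 880 W safe on a strip
after_wall_plug = one_strip * (1 - PLUG_LOSS)           # ~792 W
after_second_strip = after_wall_plug * (1 - PLUG_LOSS)  # ~713 W
print(round(one_strip), round(after_wall_plug), round(after_second_strip))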
 

CL82

NCAA Men’s Basketball National Champions - Again!
Joined
Aug 24, 2011
Messages
56,857
Reaction Score
208,248
In order to decide whether you can plug the two strips together, you will have to do a little math homework. You need to calculate how much power the items plugged into the strips will draw. It really is pretty simple if you know what to look for and how to add up the loads.

The first thing to know is that circuits should only be loaded to 80% of their total capacity. To help you understand the concept: if you have a 15-amp circuit, the safe operating amperage would be no greater than 12 amps. Total wattage would be 1,800 watts, meaning the safe wattage usage would be 1,440 watts.

If you have a 20-amp circuit, the safe operating amperage would be no greater than 16 amps. The total wattage would be 2,400 watts, meaning the safe wattage usage would be 1,920 watts.

On a 30-amp circuit, the safe operating amperage would be no greater than 24 amps. The total wattage would be 3,600 watts, meaning the safe wattage usage would be 2,880 watts.

To determine the wattage, you take the voltage times the amperage. Check the tags on everything you are plugging into the strips for the required amperage rating. Add in any lighting load by totaling the wattage of the light bulbs on the same circuit. Once you've determined the total load on the circuit, you'll know whether you can plug the two strips together.
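
If it helps, here's that math as a quick Python sketch - the device wattages below are made-up placeholders, so substitute the numbers from your own labels:

# Wattage = voltage x amperage, then apply the 80% rule.
# Device wattages are hypothetical - read them off your own gear.
VOLTS = 120
CIRCUIT_AMPS = 15
safe_watts = VOLTS * CIRCUIT_AMPS * 0.80  # 1,440 W on a 15-amp circuit

devices = {
    "turntable": 15,
    "receiver": 120,
    "cassette deck": 20,
    "Blu-ray player": 15,
    "TV/monitor": 40,
    "Chromecast": 5,
    "router extender": 10,
}
total = sum(devices.values())
print(f"{total} W of {safe_watts:.0f} W safe capacity ->",
      "fine" if total <= safe_watts else "too much")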

I hope that helps.
 
Joined
May 6, 2015
Messages
1,142
Reaction Score
2,898
Gotta respectfully disagree with some of the previous advice.
If a properly wired circuit has a 15 amp breaker, then the safe load is any load less than 15 amps, which is when the breaker breaks. If 14.9 amps were unsafe, then why would they use a 15 amp breaker? The reason they suggest using 80% of the max rating is not safety, but reducing the irritation of breaker trips when current surges in the circuit - motor starts and the like - cause a momentary spike over the breaker amperage. If, for example, you put devices rated at 14.9 amps together on a 15 amp circuit, you are eventually going to trot to the basement to reset that breaker, because surges will take you over 15 amps.

So now let's talk about what the breaker is doing. What the breaker is doing is protecting the wire in the wall. That's it. What happens after the electricity leaves the wall is not the concern of the breaker. On a standard 15 amp breaker circuit with 14 gauge wiring, the breaker will not break unless 15 amps or more travels through the breaker. It doesn't matter how the 15 amps or more are connected - a single device pulling 16 amps, or 20 devices pulling 1 amp each.

So where is the real danger? The real danger is when you have Joe 6 Pack and Suzy Homemaker plug something into a 15 amp receptacle that is itself dangerous. For example, if somebody puts a very thin cord on a lamp and then puts a 150 watt bulb in the lamp, the 15 amp breaker will not trip (150 watts is about 1.4 amps on a house circuit), and the bulb will pull 150 watts through a very thin cord. That will result in the cord heating up, which could result in melting, fire, arcing, and the like. All bad.
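
For reference, the arithmetic behind that 1.4 amp figure is just watts divided by volts:

# amps = watts / volts: the bulb draws well under the breaker's limit,
# so the breaker never protects the too-thin lamp cord.
print(150 / 110)  # ~1.36 A on a 110 V circuit, vs. a 15 A breaker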

So now let's get to your specific case. What's the danger? Well, if the power strips are rated at least at the level of the circuit, then you should be fine. That is, if you are on a 15 amp circuit and the power strips are rated at least at 15 amps, then, theoretically, you should be able to plug 20 in a row and have no issues (other than voltage drop), because it would be impossible to overload your power strips without also overloading your breaker, which would trip at 15 amps - for which the circuit and both power strips are rated.

A problem will arise if you are using power strips rated at less than the level of your circuit. Let's say, for example, you are using a 20 amp circuit. You then put a power strip on it that is rated at 15 amps. If that power strip does not include a 15 amp breaker, or if the power strip has a faulty breaker, then you can draw over 15 amps - say 18 - without the main breaker breaking, thereby putting the power strip over its rating and risking overheating and fire. 18 amps is tough to draw with typical household items, however, which is why they don't want you plugging multiple strips together. If you daisy chain enough together, you'll eventually pull more current than the strip(s) can handle, and, for the reasons stated above, that is problematic.
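
Here's a small sketch of that failure mode, with hypothetical ratings and loads - the point being that each upstream strip carries the combined current of everything downstream of it:

# Hypothetical daisy chain on a 20 amp breaker. Each tuple is
# (strip's amp rating, watts plugged directly into that strip).
VOLTS = 120
BREAKER_AMPS = 20
chain = [(15, 600), (15, 900), (15, 700)]  # chain[0] is in the wall outlet

amps = 0.0
for rating, watts in reversed(chain):
    amps += watts / VOLTS
    print(f"{rating} A strip carries {amps:.1f} A:",
          "OK" if amps <= rating else "OVERLOADED")
# Total here is ~18.3 A: the wall-end strip is over its rating, yet the
# 20 A breaker never trips - exactly the hazard described above.
print("breaker trips" if amps > BREAKER_AMPS else "breaker never trips")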

Best bet - spend good money on a power strip that is rated at the same amperage as your circuit, that includes sufficient receptacles for all of your needs, and that has a built-in breaker. Spend the money, then keep it for life.
 

August_West

Universal remote, put it down on docking station.
Joined
Aug 29, 2011
Messages
51,262
Reaction Score
88,562
Run some new taps from the street.

 
Joined
Aug 27, 2011
Messages
24
Reaction Score
38
The 80% is for anything that is a continuous load, meaning it's used for more than 3 hours - which would be most branch circuits in a house. Code section 210.20(A), if I remember right. I work low voltage after graduating electrical school and don't really deal with code much at my job. It would not be a good thing to run 14.9 amps on a 15 amp breaker. Wires and breakers will heat up with time and trip the breaker. You can plug a 20 amp device into a 15 amp outlet, and chances are it would not trip right away. It would usually take a little time for the heat to build up until it tripped the breaker. The heat is what will cause damage over time and create the weak spots. If you did the calculations as mentioned above, it could be safe. You can buy meters to measure the amps being used.
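
As a sanity check, the continuous-load limit described there is easy to compute once you've measured the draw with one of those meters (the reading below is just an example):

# NEC-style continuous-load check: loads running 3+ hours should stay
# at or under 80% of the breaker rating. The measured value is a
# hypothetical reading from a plug-in power meter or clamp meter.
BREAKER_AMPS = 15
limit = 0.80 * BREAKER_AMPS  # 12 A continuous on a 15 amp breaker
measured = 9.5               # example reading, in amps
print("within continuous limit" if measured <= limit else "over the limit")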
 

Hans Sprungfeld

Undecided
Joined
Aug 26, 2011
Messages
12,956
Reaction Score
31,337
This is why I keep coming back - first to be ignored, then to be teased, then amused, then informed, then overinformed to the edge of needless confusion.

Next up: drill a couple of holes to reduce wire clutter and then safely position a suitable, newly purchased surge protector.

As always, thanks for the laughs, thanks for the help. And best wishes for the new year.
 
Joined
May 6, 2015
Messages
1,142
Reaction Score
2,898
Wires and breakers will heat up with time and trip the breaker.
This is inaccurate. While heat may be generated with time, heat causes copper to become less conductive, which means less current will flow through the circuit, and the breaker will be less likely to trip, not more. If something is rated for 15 amps, by definition it is rated to tolerate the heat produced by running 15 amps through it.

You can plug a 20 amp device into a 15 amp outlet and chances are it would not trip right away. It would usually take a little time to build up till it tripped breaker.
First, you could not plug a 20 amp device into a 15 amp receptacle, because 20 amp plugs require a different receptacle configuration. If, however, you were to jury-rig the plug and put it in, you would be able to run the device for precisely as long as it drew less than 15 amps, and not a moment more. A device warming up should not increase current draw - in fact, the opposite is usually true, with start-up amperage exceeding run amperage.
 
Joined
Aug 27, 2011
Messages
24
Reaction Score
38
I'm aware of the different receptacles; it was just an example, like if he overloaded his power strips. I did graduate electrical school. Yes, motors will pull a lot more on start-up, and there is code to account for that too, plus other protection for bigger motors. In his situation that would not be the case, from what he was saying, and I was trying to keep it simple and pertinent to what he was asking. Not trying to get too technical.

Running close to the amperage limit for long periods of time is not good. That is the point of the code: to protect from heat and other things. Heat is one of the leading causes of damage and fires. It can cause wire damage and loose connections at receptacles from expansion and contraction. I was just trying to keep it simple for him and not get too technical. I would not do it! With the calculations done - basically under 16 amps on a 20 amp circuit for the whole branch, not just the surge protectors - it could be safe. Running 16.1 to 19.9 amps on a 20 amp circuit for an extended amount of time is not safe, and heat would definitely be an issue. Again, just trying to answer his question and not bore people with too much info.
 