Can I use a dimmer on a transformer light

ted

Can I use a dimmer on a transformer light? I have a ceiling light that has a number of little bulbs in a wire ball, but it's too bright. I've heard that it's not possible to use a dimmer. Is this true?
 
It depends what you've got.
If the fitting has a regular transformer then it can be dimmed, but be warned: it will buzz.
If the fitting has an electronic transformer then it may or may not be a dimmable type. If it's built into your fitting it probably won't be dimmable.
But you may be able to change the transformer to a dimmable type.
The simplest way to find out what you've got is to try it with a dimmer.
Either it will work or it won't; you won't damage anything.
 
Either it will work or it won't; you won't damage anything.

Suck-and-see comments on electrics worry me.

Dimmers are rated for specific wattages/loads. Under-rating them will physically burn them out.

Dimmers lower the voltage. Lowering the voltage on the primary side of a transformer may well take it below that primary winding's rating relative to its load (the secondary side).

Lowering the primary voltage lowers the secondary voltage and hence raises the current load on the secondary, possibly above the secondary cable rating, leading to overheating.

ted, I suggest you contact the light/transformer manufacturer before interfering with its load.
 
Absolutely not true!

There's potential for overheating and consequent fire risk.

How is there potential for overheating?

Lowering the primary voltage lowers the secondary voltage and hence raises the current load on the secondary, possibly above the secondary cable rating, leading to overheating.

Lowering the voltage to a fixed load (i.e. a lamp) doesn't increase the current! Aside from being against the laws of physics, it's counter to what every electrician knows from experience: 230V lamps fed at 115V take less current.

Unless you can contact the light fitting's manufacturer, it's impossible to tell whether it will work on a dimmer from the info given, so I'd be with Hoagy on this one: stick it on a dimmer and see if it works.
 
If Hoagy and yourself are practicing as electricians, I'm all for much tighter regulation. The above is 2nd year apprenticeship science.

First year, actually, but in my case it was back in the '60s.

I think you are being unnecessarily personal about this, particularly since you're quite wrong.
I have no wish to engage in a slanging match, but to use the example you gave:
A 100W lamp on 100V draws 1A, hence lamp resistance R = V/I = 100/1 = 100Ω.
Now if you put the same 100Ω lamp onto a 50V supply, the current will be I = V/R = 50/100 = 0.5A and the power will be P = VI = 50 × 0.5 = 25W.
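
To put numbers on it, here's a minimal sketch of the same sums, assuming an idealised fixed-resistance lamp (real filaments have lower resistance when cold, but the direction of the effect is the same):

```python
# Ohm's law for a fixed-resistance lamp: halving the voltage
# halves the current and quarters the power.

def lamp_at_voltage(rated_power_w, rated_voltage_v, supply_voltage_v):
    """Model the lamp as a fixed resistance derived from its rating."""
    resistance = rated_voltage_v ** 2 / rated_power_w  # R = V^2 / P
    current = supply_voltage_v / resistance            # I = V / R
    power = supply_voltage_v * current                 # P = V * I
    return current, power

for volts in (100, 50):
    amps, watts = lamp_at_voltage(100, 100, volts)
    print(f"{volts}V supply: {amps:.2f}A, {watts:.1f}W")
# 100V supply: 1.00A, 100.0W
# 50V supply: 0.50A, 25.0W
```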

Also, dimmer switches don't work by lowering the voltage; they chop the sine wave at varying points in the cycle, and the full peak voltage is actually still there through the range of the dimmer.
And I've never heard of a dimmer failing by being under-loaded.
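
For illustration, a rough numeric sketch of an idealised leading-edge (triac) phase-cut waveform; the firing angles and sampling here are assumptions for the example, not any particular product:

```python
import math

# Idealised leading-edge (triac) phase-cut dimming: the output is zero
# until the firing angle, then follows the mains sine for the rest of
# the half-cycle. The peak voltage is unchanged; only the RMS drops.

def phase_cut_rms(v_peak, firing_angle_deg, samples=100_000):
    total = 0.0
    for n in range(samples):
        theta_deg = 180.0 * n / samples  # one half-cycle, 0..180 degrees
        v = v_peak * math.sin(math.radians(theta_deg)) if theta_deg >= firing_angle_deg else 0.0
        total += v * v
    return math.sqrt(total / samples)

v_peak = 230 * math.sqrt(2)  # about 325V peak for 230V RMS mains
for angle in (0, 45, 90, 135):
    rms = phase_cut_rms(v_peak, angle)
    print(f"firing angle {angle:3d} deg: RMS ~ {rms:5.1f}V, peak still ~ {v_peak:.0f}V")
```

At a 90° firing angle the RMS has fallen to about 163V, yet the 325V peak is still hitting the load on every half-cycle.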
 
Hoagy/Copper, sincerest apologies; I take back what was said. I confused resistance with power.
When I said under-rating above, I meant the dimmer being rated lower than the load.
Note: I am not a practicing electrician.
 
How is there potential for overheating?

It may be over 20 years since I got a degree in electronic engineering, and I'm the first to admit my knowledge of power electrics was never that great, but without boring you with the details of inductive loading on supplies, I can assure you that without knowing the characteristics of the transformer (and possibly the type of dimmer too), it's impossible to say what will happen.

You can certainly put dimmers on low-voltage lights (I have them myself), but only if the transformers are rated to take them. Not all transformers (even the "electronic" ones) are rated to do this.

Any advice other than to consult someone who knows the specifics of the products being used and is qualified to give an opinion is downright dangerous.
 
Hoagy/Copper, sincerest apologies; I take back what was said.

No bother. I've never seen, heard or tasted anything which couldn't be argued about, although it's probably not a good idea to question whether people should be in their profession, even if they're talking crap; other people could get offended!

Rambling explanation:

I think the confusion lies with P = VI, where current and voltage are inversely proportional. But that only applies when the power is constant, and here the power is not constant; the resistance is, as Hoagy has shown.

So when they say a lamp dissipates 100W at 230V, it doesn't mean it will dissipate 100W at all voltages, only at 230V. Otherwise, if you connected that same lamp to a 1V supply it would draw 230 times its normal current! What about 0.001V then: 230,000 times its normal current? 0.000001V? You're getting into crazy amperages.

If that were true, I'm not sure what would happen; the universe would probably unravel into chaos or something, because it makes no sense at all. I just asked my brother, and he's studying this for his Junior Cert science at the moment.
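
A minimal sketch of the two models side by side, assuming an ideal fixed-resistance lamp rated 100W at 230V (the numbers and model are illustrative only):

```python
# A "100W at 230V" lamp: contrast the correct fixed-resistance model
# with the mistaken fixed-power model as the supply voltage falls.

RATED_P, RATED_V = 100.0, 230.0
R = RATED_V ** 2 / RATED_P  # 529 ohms, derived from the rating

for volts in (230.0, 115.0, 1.0, 0.001):
    i_fixed_r = volts / R        # fixed resistance: current falls with voltage
    i_fixed_p = RATED_P / volts  # "always 100W": current explodes as voltage drops
    print(f"{volts:>8}V  fixed-R: {i_fixed_r:.6f}A   'fixed-P': {i_fixed_p:.1f}A")
```

At 1V the fixed-resistance lamp draws about 2mA; the "always 100W" mistake would have it drawing 100A.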
 
Hoagy/Copper/SineWave, at the risk of boring everyone with a further dose of electrical theory, I think you're all mistaken: I doubt there's any domestic dimmer for lighting that works simply by adjusting the voltage. That's not how they work (if they did, they'd have to dissipate all the excess energy as heat). Hence, simply quoting Ohm's law isn't that relevant to what's going on.

Dimmers work by switching on and off over the period of the AC signal from the mains. It is this switching, and the consequent severe distortion of the AC signal, that can cause transformers to operate outside their ratings (the science bit: due to the high harmonic content and out-of-phase components in the signal, the transformer may saturate). As an aside, it's this effect that can cause regular bulbs to "sing" sometimes when dimmed.

Another science bit: dimmers can also introduce a DC offset into the signal. The energy represented by this has nowhere else to go in a circuit with a transformer other than to be dissipated (as heat) by the transformer. Although transformers may have a thermal cut-out, the transformer will get hot, and potentially very hot, in these circumstances. Hence the danger of overheating, and the consequent fire risk.
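
As a rough illustration of both effects (a numeric sketch only; the waveform, the 90°/95° firing angles and the choice of harmonic are assumptions for the example, not measurements of any real dimmer):

```python
import math

# Sketch: sample one cycle of a phase-cut waveform whose firing angles
# differ slightly between half-cycles (the asymmetry that produces a DC
# offset), then estimate the DC component and the 3rd harmonic with a
# simple Fourier sum.

V_PEAK = 230 * math.sqrt(2)
N = 100_000

def sample(n, fire_pos_deg, fire_neg_deg):
    deg = 360.0 * n / N  # one full mains cycle, 0..360 degrees
    theta = math.radians(deg)
    if deg < 180.0:  # positive half-cycle
        return V_PEAK * math.sin(theta) if deg >= fire_pos_deg else 0.0
    else:            # negative half-cycle
        return V_PEAK * math.sin(theta) if (deg - 180.0) >= fire_neg_deg else 0.0

wave = [sample(n, 90.0, 95.0) for n in range(N)]

dc = sum(wave) / N
a3 = 2.0 / N * sum(v * math.cos(3 * 2 * math.pi * n / N) for n, v in enumerate(wave))
b3 = 2.0 / N * sum(v * math.sin(3 * 2 * math.pi * n / N) for n, v in enumerate(wave))
print(f"DC offset ~ {dc:.2f}V, 3rd-harmonic amplitude ~ {math.hypot(a3, b3):.1f}V")
```

Even that small 5° mismatch between half-cycles leaves a DC component of a few volts, which a transformer winding can only dissipate as heat, alongside a substantial 3rd harmonic from the chopped waveform.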

My reason for persisting with this by the way is that Hoagy's statement "Either it will work or it won't, you won't damage anything" is dangerous in my opinion.
 