$3.45 for a very nice, ~2000lm 97 CRI LED, about 99 lm/W. (Efficiency goes up quite a bit if you settle for 90 CRI.)
So that gives about 2000lm at about 25W, for <$30.
Wikipedia gives about 16 lm/W for incandescent, so 125W. At 10 hours per day, the LED option pays for itself quickly even at national average prices. In CA, it's very fast.
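As a rough back-of-the-envelope check on the payback claim, here's a sketch with the figures from above. The electricity prices are illustrative assumptions, not quoted from the comment:

```python
# Payback estimate using the wattages and cost from the comment above.
# The $/kWh figures are assumed for illustration.
led_watts, incandescent_watts = 25, 125
hours_per_day = 10
system_cost = 30.0  # dollars, "<$30" from above

for label, price_per_kwh in [("national avg", 0.15), ("California", 0.30)]:
    saved_kwh_per_day = (incandescent_watts - led_watts) * hours_per_day / 1000
    dollars_per_day = saved_kwh_per_day * price_per_kwh
    payback_days = system_cost / dollars_per_day
    print(f"{label}: pays for itself in ~{payback_days:.0f} days")
```

At the assumed prices that works out to a payback measured in months, not years, consistent with "quickly".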
To be fair, for high-end LEDs like this, the balance of the system is more expensive, because you need a heat sink. Incandescent lamps run very hot and don’t need heat sinks.
I think this is potentially promising, but I don’t think you can buy it:
This is just an anecdote, but I’ve had multiple “10 year” LED bulbs fail after just a year or two. I suspect much of the claims for these bulbs are theoretical as they just don’t hold up, probably for reasons the grandparent poster is pointing out.
Almost all LED bulb failures are because the power supply died due to overheating, not the LEDs themselves. I harvest the LEDs out of dead bulbs to use in hobby projects.
With Edison-style bulbs, anyway, the orientation they're mounted in makes a huge difference. They last a lot longer if they're oriented upright (base down) than in any other orientation, because that reduces the heat buildup in the power supply.
This is the frustrating thing about LEDs that I don't know if we can change.
If there was a "DC" light socket in the house we could have LEDs outlasting owners, and for cheap. Nearly all the expense of LED bulbs is the power supply. Everything else is dirt cheap. A single home DC power supply with ~200W of output could light an entire house, flicker free.
What's even more frustrating is I think we could fix it. A national regulation for DC light sockets would do it. Mandate a voltage, shape, and max amperage and BAM, you'd get 1000 different manufacturers making standards-compliant bulbs and home power supplies that would last an eternity.
I am designing an off-grid cabin with a solar panel array charging a bank of batteries, with a propane generator as backup. I run Ethernet as the power distribution, with a custom-designed PCB that terminates at the outlet side, where it exposes a 20-watt USB charging port and an Ethernet port.
The lights are all basically cut-up 12V light strips inside old light fixtures, with a custom controller that also terminates PoE. The 48 volts that most PoE standards specify is more than enough to push power down the line for runs under 100 meters.
The advantage of PoE here is that anything under 50 volts is considered low voltage and does not need to follow the same rules as normal house wiring. I did not like that everything hinged on a beefy PoE switch, so I made it passive PoE by design instead.
If you're willing to share your design, I'm sure there are other folks like myself who think this is a cool idea. I've wanted to do PoE (or passive PoE) for lights for a while now...
I am going to open source it. The goal was to have all the SMD parts available at JLCPCB so you can just send it off to be fabbed (with some through-hole components you'd solder yourself), or I would also sell them at cost + 10%. My brother has designed some 802.3at chips, and I was going to have him review my work first, as I don't want to send a poorly designed power system out into the world (there are enough of those out there, unfortunately).
We've had 2 standard DC outlets for a while now: the 12V cigarette lighter and 5V USB. You do often see them in odd places. But the voltage and wattage of those specs are too low to be useful, so they haven't evolved into DC power distribution.
USB-C PD is at a useful voltage & wattage level, and so is Ethernet POE. I wouldn't be surprised to see them start to be used for general power distribution in niche applications, like RV's and off-grid cabins.
I don't think we're going to ever get a bulb standard, though.
Cars are starting to move to 48V DC. My under-cabinet lighting in the kitchen is powered by DC from a power supply in the basement.
I could definitely see this becoming more common. Powering the ~100 watts of fixed lighting spread across my whole house with ten different 15A 120V circuits, each with its own arc-fault breaker and 12-gauge copper running back to the panel, is fabulously expensive compared to what could be done with a bunch of CAT5 on each floor running to some conveniently located "PoE injector" type devices.
You would want to be able to take a standard fixture and just push DC through it and use special bulbs with a standard A19 base, but that’s problematic when the next owner tries to screw in a standard bulb - what happens when it sees 48V DC?
I would guess if for safety reasons it has to be a non-A19 connector, then your light fixture choices get cut down to almost nothing and no one will make the switch?
It’s really interesting to think about, most everything I’m plugging into AC outlets in my house, the first step is converting it to DC. A lot of my outlets I’ve switched to include USB ports so I don’t need the wall warts. If you have solar and battery backup even more-so you start to question why we are wasting so much money moving everything back and forth between DC/AC/DC within a house.
This is a fair point: breaking mechanical compatibility will at least stop any electrically exciting goofs from plugging a low-voltage DC lamp into a (comparatively) high-voltage AC socket.
> but that’s problematic when the next owner tries to screw in a standard bulb - what happens when it sees 48V DC?
If by "standard" you mean an incandescent tungsten filament bulb, nothing at all.
For a true LED driver power supply, it would be constant current, so the tungsten filament would see 25mA (or whatever the constant current is set for) of DC, and nothing bad would happen (the filament also would not likely illuminate either).
Screwing in an LED bulb with integrated power supply, the external supply will still feed the constant current value, so what happens depends upon the design of the LED bulb's integrated power supply. If 25mA is enough to drive everything, the LED bulb might light up. If 25mA is not enough to drive everything, most likely nothing lights up.
48V without a current limit wouldn't be nothing, but you should expect less than 10% brightness.
For constant current, you'd need to drive at least 9 watts so it would be more like 250mA if not higher.
A 1600 lumen LED module might take as much or more current than a 60w incandescent. If your constant current supply can output between 0 volts and input volts, and it's set for a bulb with such a module, it would be able to power an incandescent bulb.
I suspect the results would be quite poor. Incandescent filaments increase their resistance when they get hotter, so driving them at constant RMS voltage means that the power will decrease as they heat up, which will give them a degree of stability. At constant current, though, the power will increase with increasing temperature.
(Of course, they’re quite hot and radiative cooling increases like T^4, so this isn’t necessarily a show stopper. But it’s probably not helpful.)
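The stability argument can be illustrated with a toy calculation. The cold/hot filament resistances below are made-up round numbers, not measurements, but the direction of the effect is what matters:

```python
# At constant current, P = I^2 * R grows as the filament heats up and its
# resistance rises (runaway direction); at constant voltage, P = V^2 / R
# shrinks as R rises (self-stabilizing). Resistances are illustrative.
r_cold, r_hot = 20.0, 240.0  # ohms, rough cold vs. hot filament resistance

i = 0.5    # amps, a fixed constant-current drive
print("constant current:", i**2 * r_cold, "->", i**2 * r_hot, "W")  # rises

v = 110.0  # volts, a fixed constant-voltage drive
print("constant voltage:", v**2 / r_cold, "->", v**2 / r_hot, "W")  # falls
```

So at constant current, heating increases power, which heats the filament further, exactly the destabilizing feedback described above.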
I wondered for a long time why we don't have standard built-in DC in building codes that could power our lights, and most electronic devices. Really the few things in my house that require full line voltage are all in the kitchen. Everything else has a transformer attached.
Me too - standard 12V and 5V rails run throughout the house would be great. I even thought about a wallpaper with conductive strips so the power could be invisibly delivered to any part of any room and "tapped" with a push-into-the-wall socket.
The usual counterargument is: voltage drop can become a problem. Trying to use one big power supply and use DC as your only distribution mechanism probably isn't a good idea.
But choosing a DC system for part of the house can make a lot of sense.
For one residential new construction room, it can be practical to have one shared power supply rather than one per LED. Say you have a 12 V, 5 A DC power supply. Using a star wiring topology, this can serve 10 lights (at 500 mA) fine with 16 AWG.
But how far can the power go before wire resistance causes too much Vdrop? Maybe one good transformer+rectifier per room? AC to the room and DC in the room. Those DC runs would be <5m each.
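A quick sanity check of the star-wired 12V example: the 16 AWG resistance below is an approximate handbook figure, and the 5 m run length follows the "<5m" estimate above.

```python
# Voltage drop for one 5 m star run at 500 mA on 16 AWG copper.
AWG16_OHMS_PER_M = 0.0132  # approximate resistance of 16 AWG copper
run_length_m = 5           # one-way distance, per the "<5m" figure above
current_a = 0.5            # per-light current from the example
supply_v = 12.0

loop_resistance = AWG16_OHMS_PER_M * run_length_m * 2  # out and back
v_drop = loop_resistance * current_a
print(f"drop: {v_drop * 1000:.0f} mV ({100 * v_drop / supply_v:.1f}% of 12 V)")
```

Well under 1%, so short in-room DC runs really are fine; it's the long whole-house runs where the drop bites.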
Until I see someone defining a representative example and running the numbers, I'm skeptical of their DC vs AC commentary.
I say this, because I was guilty of this exact shortcut thinking (in another comment). But I paused and thought to myself "I should run the numbers before just repeating the usual voltage drop criticism".
So I compared scenarios and it depends a lot on the topology, lengths, costs, and situation (new vs renovation).
Sure, a whole house system doesn't typically make sense, but I don't think that's what people are really talking about. I think people are interested in hybrid systems; e.g. DC power supply for each room.
I don't know if you meant it, but the sentence about "any sophomore level electrical engineering student can solve this" can easily come across as dismissive. I also think it gives too much credit to sophomore students. :)
I would have more confidence in an electrician apprentice on this one. I think they'd have more practical experience when it comes to figuring out what are the right questions to ask.
I did EE in college and do a fair bit of hands on residential electrical work.
P.S. How many sophomore level engineering students learn to do a sensitivity analysis?
>Sure, a whole house system doesn't typically make sense, but I don't think that's what people are really talking about. I think people are interested in hybrid systems; e.g. DC power supply for each room.
I completely disagree. Where exactly are you going to put a power supply in a room? Make a special electrical box for it? Won't it be unsightly in many rooms, or need some huge special panel that looks like a breaker panel? The comments I see seem to be advocating a whole-house solution, where a power supply is mounted in the breaker panel to supply LVDC to the whole unit. But this makes no sense for several reasons, especially the voltage drop.
>I don't know if you meant it, but the sentence about "any sophomore level electrical engineering student can solve this" can easily come across as dismissive. I also think it gives too much credit to sophomore students. :)
It's supposed to be dismissive, because this whole discussion is a bunch of software people trying to make up solutions for a perceived problem when they obviously don't know one of the most basic things about electrical theory, which makes all of their solutions unworkable. It's like a bunch of people trying to make a new kind of personal vehicle to replace cars when they don't even understand Newton's Laws. It's really annoying, because I see this kind of discussion pop up every so often, over many many years.
I have another comment here I don't feel like copy-and-pasting, but basically this whole discussion is silly because people are trying to make a solution using a very expensive power supply to fix a problem they see because they're buying cheap $2 light bulbs that burn out quickly, instead of just buying light fixtures that were properly engineered in the first place. With modern SMPSs, you're not going to get any kind of benefit by centralizing the power supply to drive individual LEDs, you're only going to get problems. LEDs need a driver circuit to provide constant current, and that means the power supply needs to be matched to the emitters and kept very close to it.
Where exactly are you going to put a power supply in a room? Make a special electrical box for it?
Switched-mode power supplies can be as small as your average Arduino board. They can fit inside the space used for wall outlets or light fixtures. Or you can put the DC converter inside the light switch.
> I completely disagree. Where exactly are you going to put a power supply in a room? Make a special electrical box for it? Won't it be unsightly in many rooms, or need some huge special panel that looks like a breaker panel?
This sounds like a non-issue, especially considering the pervasive use of "unsightly" installations like air ducts, heating vents, radiators, electrical sockets, telecommunication service panels, routers, and even light fixtures.
If you intentionally dismiss obvious solutions, of course you only end up with problems without obvious solutions.
> It's supposed to be dismissive, because this whole discussion is (...)
It's not about DC vs AC, it's high-voltage vs low-voltage. The power dissipation by wire resistance scales with the square of the current ($P=RI^2$), and low line voltage means that you need large currents to transmit the same amount of power.
Whether or not that's feasible is going to depend a lot on the application. I don't think we'd ever fully rid homes of AC sockets, it's too useful for things like vacuum cleaners or space heaters.
But what about the sub 100W or even 200W applications? That's where I think something like 48VDC would start to shine. Every light in a home, phone chargers, tablet chargers, computer monitors, televisions, computers? (maybe not gaming rigs, but certainly laptops and nucs).
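The scaling behind the high-voltage vs. low-voltage point above can be sketched in a few lines. The wire resistance is an arbitrary illustrative value; only the ratios matter:

```python
# For the same delivered power over the same wire, loss = R * (P/V)^2,
# so loss scales with 1/V^2. The 0.2 ohm figure is an assumed round-trip
# wire resistance for illustration.
def line_loss_w(power_w: float, volts: float, wire_ohms: float) -> float:
    current = power_w / volts
    return wire_ohms * current ** 2

WIRE_OHMS = 0.2
for v in (120, 48, 12):
    print(f"{v:>3} V -> {line_loss_w(100, v, WIRE_OHMS):.2f} W lost per 100 W delivered")
```

Going from 120V to 12V multiplies the loss by 100, while 48V sits in between, which is why 48V looks attractive for sub-100W loads.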
>But what about the sub 100W or even 200W applications? That's where I think something like 48VDC would start to shine.
How so? Exactly what benefit does it have over the current AC mains? With 48VDC, you'd still need to use DC-to-DC converters to power everything. I fail to see how that's any kind of improvement over the current switch-mode power supplies used. Instead, it'll just be less efficient because you'll get higher line losses in the power lines in the walls and all the way from wherever that 48VDC is coming from. If that's from a big SMPS in a closet somewhere, that's going to have its own losses. Overall, the entire system will have lower efficiency compared to the current system.
Exactly what problem are you trying to solve with this idea? If you think you're going to eliminate SMPSs in all your electronic equipment, you're not; that's a fantasy. Everything needs a power supply because electronics only work at very low voltages (5V, 3.3V, even 1.8V in places, now 20V with USB-C PD) and most equipment has some kind of peculiar voltage requirements, and usually multiple different requirements inside the same device. There's no improvement in efficiency by running a computer, for instance, from 48VDC vs. 120VAC or 240VAC, in fact it's probably worse.
Also, DC and AC have differences in power transmission independent of resistance, some due to first principles (reactance), and others related to devices for stepping voltage up or down (e.g. transformers).
Trying to cram all the infrastructure for an LED lamp into the shape of a light bulb is a bad idea, even if the input power is DC. Good designs for LED lighting have larger surface areas for heat dissipation and some physical/thermal separation between the LEDs and the power supply. A quality power supply does not produce flicker. As other comments have noted, dimming, or even predictable output requires some sort of power regulation even with DC input.
I think the way to change it is to replace sockets with hardwired LED fixtures. This is easy for something like a standalone ceiling light. It may be harder for other devices like ceiling fans that integrate a light bulb socket, but converting those devices to take DC power as in your proposal isn't easy either (most would just get discarded and replaced).
Doing it well is more expensive in the short-term than screw-in bulbs. A quick look on Amazon suggests integrated ceiling lights are about 10x the price of LED bulbs, though I suspect the longer service life pays for itself.
> Trying to cram all the infrastructure for an LED lamp into the shape of a light bulb is a bad idea, even if the input power is DC.
Absolutely. Incandescent light bulbs have that shape for a reason: the screw base is small because there is nothing to put in it and it doesn't heat up, and the bulb is large to radiate all the light and heat it generates. LED light bulbs have exactly the opposite problem: almost all of the heat is generated near the screw base while the bulb itself generates almost none, and the light emitter doesn't even need a bulb that large around it. Oh, and the casing around the screw is plastic, so the thermal conductivity is horrible. Honestly, it's a profoundly terrible form factor which we're now stuck with.
It's also helpful to recognise that existing lighting fixtures and lamps were designed around the constraints of incandescent bulbs. The first generation of LED bulbs and lamps largely conform to these. As LEDs mature, both fixtures and lamps which address the limitations and requirements of the technology (transformers, perhaps dedicated 12v circuits, heat dissipation for the transformer rather than lighting elements themselves, and better light-temperature and intensity regulation) should emerge.
We're presently in the somewhat-messy half-emerged state. Think horseless carriages, wireless, and the days of dual gas/electric lighting and lamping systems (yes, these existed, and yes, the failure modes were ... much as you might imagine).
Already happens in New Zealand: lighting is usually low-current 1 mm² wiring, and everything else is heavier gauge. Circuit breakers mostly care about amps (all breakers could be rated for mains voltage if you wanted to avoid "weird").
Also low voltage wiring can legally be done by anyone in NZ (a bonus when doing your own work, and a pitfall when buying a house?)
Maybe for you, but I have been considering just this. I would love to have dedicated 24V for lighting and charging devices. My house already has various systems for lighting, such as xenon under-cabinet lights throughout the kitchen and also in the basement. Both are driven from separate transformers. Then there's the rest of the house with can lights using BR30 bulbs that are just a waste of 12 AWG. The one place I was able to replace with a dedicated LED fixture, I had to overpay for a decent product that would've been better off as a basic 24V LED light. When you consider that most HVAC systems operate at 24V, there is some real potential to create a decent standard serving multiple purposes.
And besides, I don't know if you have ever pulled 12-gauge wire, but it's a pain. I don't know any electrician who would object to cutting back on the heavy wire and pulling light 22 AWG instead.
Lighting (on AC 110v / 220v circuits) also typically is specced for a lower peak amperage than utility or appliance outlets. For US codes, generally 15A rather than 20A. Lighting may use 20A, but isn't required to.
Other circuits must be 20A, e.g., kitchen outlets serving appliances.
This is what I want. A standard 48VDC socket would be a game changer for lighting.
Heck, with such a standard you could have 120VAC -> 48VDC converters and you'd be in the same position we are today with Leds, only better because you'd just have to replace the converter and not the whole bulb.
Not extremely thick. Wire losses remain similar at 12V to what they were at 110V (replace a 100W bulb with a 10W bulb at 12V and the current remains ~1A, so wire losses stay about the same as they were). Wire losses might be, say, 1W for 1 mm² cabling. 240V example: https://ausinet.com.au/voltage-drop/
Agree that it is worth upping voltage to chase a few more percent savings, but still need to consider other constraints.
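The "current stays about the same" claim above checks out numerically, assuming the 100W incandescent and 10W LED are the equal-brightness pair from the comment:

```python
# Equal-lumen bulbs: 100 W incandescent on 110 V vs. 10 W LED on 12 V.
# I^2 * R loss in the wire depends only on current, so similar current
# means similar wire loss despite the 10x lower voltage.
i_incandescent = 100 / 110  # ~0.91 A
i_led = 10 / 12             # ~0.83 A
print(f"incandescent: {i_incandescent:.2f} A, LED at 12 V: {i_led:.2f} A")
```

Both land near 0.9 A, so dropping to 12V does not blow up wire losses as long as the load's power drops by roughly the same factor as the voltage.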
There are also these type of "ceramic substrate" bulbs which claim to give longer life. I suspect other compromises in the construction may negate that.
I don't think we're exactly stuck with the old form factor. We can start phasing them out. Replacement of screw sockets with modern fixtures is well within the capabilities of the average DIYer (though perhaps some places it's illegal for anyone but a professional electrician to touch anything hardwired).
Well, one of the main sales point of the LED bulbs was compatibility with existing E14/E27/etc sockets: no need to change the wiring, or the fixtures, just buy a new, better light bulb and screw it right in! It will also serve longer and be better for the environment, what's not to like? We'll even ban the sales of 100W and higher incandescent light bulbs to help you make the right choice!
That's also the pitch of the smart bulbs: a sane way would be to make a smart light switch but what if you can't do that (e.g., you rent the apartment)? So we'll shove the controller chip into a disposable light bulb, that's still perfectly fine for the environment.
By the way, I don't know how things turned out in your part of the world, but over here, after the ban went into force, the manufacturers of incandescent light bulbs started selling 95W light bulbs 8D
Probably going to sound crazy, but we could start running water pipes in front of the walls and under the ceilings and mounting the LEDs directly on the pipes for cooling. Creativity, thinking holistically... the entire contemporary Western house design frankly needs a rethink, from DC circuits to electrification to modular, mass-producible utility drop-in pods, all with an eye toward integrated systems design paired with scalable modularity.
One of the problems is that in some countries like the US, ceiling lamps are hard-wired and not "user-replaceable", so people have to resort to using those stupid bulbs in their old fixtures.
I live in Japan, and instead of just a pair of wires coming out of the ceiling, there is a standardized "ceiling socket" [0] which can also support the weight of a lamp. This means that swapping out light fixtures is plug and play, so the standard LED lamp is something like this [1] where you have a nice big flat metal plate backing the hardware is mounted to for heat-sinking.
I don't own any LED bulbs at all - all our lamps are of this type so I wouldn't have anywhere to put one.
It was the same when I lived in Sweden - a standard ceiling light outlet (IIRC there is a EU standard for this now called DCL) so that replacing light fixtures was easy. Moving into an apartment, often they wouldn't even come with light fixtures, you'd bring your own.
In the Netherlands we have just a pair of wires coming out of the ceiling but everyone replaces their own lamp fixtures anyway. Most people should be able to manage clamping or screwing down the brown or black wire to the L and the blue wire to N.
You can't have a low-voltage DC power supply feeding the entire home: the voltage drop between the supply and the LED would be huge. There's a reason we use higher voltages for long wire lengths: to increase efficiency and reduce line losses, since losses increase with the square of the current (P = I²R). Higher voltage means proportionally lower current, and quadratically lower losses.
And since we need high voltage (at least 100V) to keep line losses very low and allow the use of thinner-gauge copper wiring, we need a switching power supply at every light fixture, so it really doesn't matter if it's AC or DC, since modern SMPS (switch-mode power supplies) work equally well with either.
Finally, on top of all that, LEDs are current-driven devices, and need a constant-current power supply. So the power supply must be very close to the diodes, or else fluctuations in supply voltage will have very negative effects.
Low voltage DC lighting is a thing that has existed for a very, very long time. That most houses don't have it is more cultural than anything else, in my opinion.
That means it's totally fixable. You can install such a system in existing buildings right now, and it's not crazy expensive unless you want to run the wires inside the walls.
If we could shift cultural expectations around this, adding a LV system in new construction would not significantly increase the construction costs. It will start to be done if buyers start demanding it.
12V requires quite a lot of amps to deliver enough light, so low-voltage DC is not optimal. Also, LEDs are current-driven devices, i.e. they are sensitive to voltage changes (even with a current-limiting resistor).
Low-voltage doesn't necessarily mean 12V. I think it's anything below about 50, although lighting systems currently marketed as "low voltage" are usually 12 or 24 volts.
The constant current thing is true, but that's not a terribly difficult problem.
For example: the stairwell shin-height lights in this 90s house are 12 VDC. There's a transformer plugged into a wall outlet in the nearby storage closet.
That works OK because the transformer is relatively close to the lights. If it were a reasonably-large house, and the transformer were on the opposite side of the house, you'd have a problem with a noticeable voltage drop. All these ideas people are throwing out here involve a single whole-house power supply. If it were for 48VDC, it would probably be fine, but 12V would result in significant line losses.
We already see transformers for a run of e.g. track lights, low voltage lights on tension wires, and so on. That's been a thing ever since halogens came to market.
Having multiple transformers is perfectly doable and commercially viable -- though I would appreciate more product availability for something easy to stash in the hollow space of a ceiling, like recessed lighting is installed.
I still don't see the point of all this. If you have a handful of lights in a room, and drive them with a single power supply, you're still going to have big problems: the line lengths to each fixture will be different, resulting in different voltages. You can't drive LEDs that way with good results: they need fixed current. And you can't daisy-chain them either: if one emitter dies, then the remaining ones will suddenly have different current, and probably die quickly. The proper way to drive LEDs is with a power supply very close to the emitters and designed specifically for those emitters and the (short) wire length to them, not 4 meters away and not with some variable-length wire that can't be designed for.
Everyone here is complaining about ultra-cheap LEDs that don't last very long because they're poorly engineered, but that's exactly what you're all trying to do here by using a separate, shared power supply. You could get away with that in the 1980s using incandescent bulbs, but you can't do it now unless you want the same crappy lifespan and reliability you're all complaining about.
The solution is very simple: buy fixtures that are engineered well. Switch-mode power supply electronics are not expensive at all, but when mfgs cheap out or do a crappy job designing them, you get bad results, usually short lifetime of either the power supply or the LED. What you're trying to do here is buy a really expensive power supply, which has to be engineered to a far greater degree and for a far wider range of operating conditions (since they don't know what you're going to connect to it), just because you had a bad experience buying some $2 light bulb that had a crappy power supply built-in. This really makes no sense.
LEDs must be powered by a constant-current supply, and distribution does not work well at constant-current, and is always constant voltage. So no matter what you will need some sort of switching power supply.
LEDs are like 15% efficient and power supplies are >95%. They just need to be separated slightly so the LEDs aren't heating the power supply. Most recessed LED lighting now has a separate junction box with the power supply.
> LEDs must be powered by a constant-current supply, and distribution does not work well at constant-current, and is always constant voltage. So no matter what you will need some sort of switching power supply.
I think the biggest problem is that many cheap power supplies switch at low frequencies that cause flicker which is perceptible, at least subconsciously. A modern switch-mode power supply might operate in the 50-500 kHz range, which will not cause perceivable flicker.
The really cheap stuff actually doesn't even have a power supply!
There's a breed of LEDs that takes straight AC and rectifies it using the LEDs themselves. By using a large number of tiny LEDs in series (typically in COB form), you can easily reach close to 110v or even 220v, and then you add a small current limiting controller in series that's dirt cheap compared to magnetics...
These are super cheap, and appear bright, but they flicker at 120hz, which can be annoying when there's motion or if you're sensitive to it.
I'd say it's a very bad choice for a bedroom or living room light, but I have nothing against it for the outdoor lights, signage and a bunch of other applications where cost is king.
I have a serious problem with it for outdoor lighting and signage: it gives me a headache. Enough exposure will make me feel actively sick. The effect is not subtle.
Branding matters. If your brand is a light that flickers, you might want to consider the old adage penny wise, pound foolish. As a consumer, why would I choose to shop at an establishment that has flickering lights when I could shop at a different one that did not? Unless of course, I had no choice.
But then, a wise entrepreneur would recognize paying extra to have non-flickering signage would attract some customers.
Flickering lights can induce migraines in susceptible people, so literally, saving a penny here actively drives away business.
I think this falls apart in the details. LEDs want constant current power supplies, and their owners frequently want them to dim. So you will still need a power supply.
You can fudge it with resistors, like in an LED strip, but you lose efficiency and dimming quality.
That being said, I expect that power supplies with 48VDC input or so would be cheaper.
>Maybe this could be a prosumer retrofit thing, where the AC voltage gets converted to DC in the junction box, and then DC is sent down to the fixture.
The problem is that in 99.99% of homes outlets are on the same circuits as light fixtures, you would need to do some major rewiring.
No, I'm saying you put a module into the junction box that the light fixture is attached to that serves as an AC/DC adapter, current limiting driver, and possibly a dimming sensor that would then provide downstream DC voltage to retrofit A19 bulbs.
Those bulbs would then have no internal switching systems to burn out and rely entirely on the module hidden behind the wall to handle their power needs.
> This is the frustrating thing about LEDs that IDK we can change.
I think that non-bulb LED fixtures are relatively common. For example, a style exists where you cut a hole in the ceiling and friction-fit the LEDs with the power supply up in the attic (presumably with infinite convective airflow): https://www.lowes.com/pd/Utilitech-Canless-Color-Choice-Inte...
These power supplies aren't going to die from overheating because the power supply is nowhere near the heat-producing LEDs. And, it's not like $30 for your entire light fixture is going to break the bank.
Yes. For the commercial DC lighting installations I've seen they were using power over ethernet. That's not necessarily the only way to deliver DC power but whatever you do it's going to be wired differently from 120 VAC.
I used to do electrical installs in commercial buildings, and this was starting to catch on, mainly because the practice of running Ethernet (including the 8P8C aka RJ45 connector, patch paneling, etc.) is already established. It always felt very roundabout, though, and it requires expensive networking equipment just to run lights, which I don't personally like because it will just cause confusion.
> Almost all LED bulb failures are because the power supply died due to overheating, not the LEDs themselves.
It's true that the power supply versions are so poorly designed and inefficient that heat is a problem. Design and quality control effort could reduce heat generated by the entire assembly to a fraction of what the socket, fixture, and wiring can sink.
It's more common now to find bulbs that have no power supply at all. They're literally a rectifier made of LEDs in series. If the bulb flashes at 2x the mains frequency, that's likely what you have. They die quickly because the LED strings add up to a maximum voltage a bit over mains voltage, but that's RMS, not peak. It's a natural outcome: using enough LEDs to accommodate the peak voltage reduces light output by underdriving them, increases obvious flicker from the dwell time below minimum voltage, and increases cost.
Hotwired LED strings are cheaper to design, source, and assemble; bad parts fail fast and consistently, with no effort wasted on quality control; and the market is so flooded and volatile that there's no room for consumer-side quality awareness effective enough to make the negative outcomes matter. Power supplies in these bulbs are going away. Ubiquitous, short-lived, hotwired LED bulbs strobing at 2x the mains frequency are where the home LED lighting market is taking us.
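The RMS-vs-peak point above can be checked with a little arithmetic. The 3 V forward voltage per LED and the 42-LED string size are my assumptions for a string sized "a bit over" 120 V RMS:

```python
import math

V_RMS = 120.0                  # US mains
V_PK = V_RMS * math.sqrt(2)    # ~169.7 V peak
VF = 3.0                       # assumed forward voltage per white LED
N = 42                         # assumed string size: 42 * 3 V = 126 V
V_TH = N * VF

# Fraction of each half-cycle the string conducts, i.e. time |v(t)| > V_TH
frac = (math.pi - 2 * math.asin(V_TH / V_PK)) / math.pi
print(f"threshold {V_TH:.0f} V vs peak {V_PK:.0f} V")
print(f"lit about {frac:.0%} of the time, dark the other {1 - frac:.0%}")
```

With those numbers the string is dark more than half of every cycle, which is exactly the obvious 120 Hz flicker the comment describes.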
>> I harvest the LEDs out of dead bulbs to use in hobby projects.
This is a great idea and I would love it if you would post a Youtube how-to video. It might encourage a bunch of hobbyists to do something useful with those dead bulbs.
I've had a number of LEDs fail after only a year or two, in fact more quickly than the average incandescent bulb. It seems to defeat the whole purpose of "upgrading" and may in fact be more of a downgrade.
They're remarkably heat sensitive, especially cheap ones. Some bulbs would gladly run for 10 years in a room slightly above freezing temperature, but put them in a semi-enclosed fixture in a normal living space, and they're dead in a few months. Fully enclosed fixtures destroy them in no time, unless you buy really exotic bulbs with truly massive aluminum heatsinks, rated for high temp operating environments. I can't even find domestic suppliers for those, and had to order from China.
The LEDs are surface mount (although big surface mount components, so not particularly difficult to work with). I desolder them with hot air (although you can totally do it with a soldering iron), then use them later as any other surface mount LED. I don't have access to YouTube right now, so can't search for you, but there are tons of videos covering how to desolder and solder surface mount components. I'd be willing to bet there are multiple videos covering this for LEDs specifically, too.
>Almost all LED bulb failures are because the power supply died due to overheating, not the LEDs themselves.
I have the exact opposite experience: in virtually every bulb I have torn down, one LED (they're all in series) has a black dot, and if I short it out, the bulb will "work" again. The bulbs I've seen tend to drive the LEDs so hard that some of them eventually fail; the power supplies might have huge ripple but generally don't fail catastrophically.
Edit: now that I think about it, it could be a US thing, with the mains voltage being ~120 V. Lower AC voltage means worse efficiency for the power supply (they all tend to be universal-input, unless the maker totally cheaps out with a 250 V primary capacitor for the US market). Generally speaking, low AC voltages have mostly disadvantages.
It could very well be more of a US problem. In the bulbs that I've torn down, they all have used a capacitive dropper power supply, and it's usually been the capacitor that failed.
I have had lamps that lived long enough to see LED failures (the "black dot of death") but that's not the most usual failure mode that I've personally encountered.
I've been considering following in the footsteps of Big Clive and modifying new LED bulbs to stop them from overdriving the LEDs, but my interest in doing that hasn't yet overcome my inherent laziness.
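The capacitive-dropper supplies mentioned above are worth a quick calculation, because the series capacitor is what sets the LED current, which is why that capacitor failing kills the bulb. The 1 uF value is my assumption:

```python
import math

# Rough current through a capacitive-dropper supply.
V_RMS, FREQ = 120.0, 60.0            # US mains
C = 1.0e-6                           # assumed 1 uF series dropper capacitor
X_C = 1 / (2 * math.pi * FREQ * C)   # capacitive reactance, ~2.65 kOhm
I = V_RMS / X_C                      # ignoring the (smaller) LED string voltage
print(f"reactance {X_C:.0f} ohm -> roughly {I * 1000:.0f} mA through the string")
```

As the capacitor degrades, its effective capacitance drops and the bulb dims before dying, which matches the slow-fade failure mode people report with these.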
You're right, though I want to mention that the LEDs are also damaged by the heat: their color temperature will wander, lifetime will be reduced, and brightness per watt will also be reduced. Still useful for projects and areas where perfect lighting isn't as important.
Indeed. And, as another commenter pointed out, LED bulbs often overdrive the LEDs in order to maximize light output -- but doing so significantly decreases the lifespan of the LEDs themselves.
Sometimes it is the power supply, but I've also had a number that died simply because one LED burned out and failed open. Because they are wired in series it only takes one failed LED to take out the entire bulb. If you're a cheapskate you can sometimes get a bulb working again by testing the circuit and bypassing the burned out LED with a jumper wire.
If the bulb dies but you notice that all of the elements are still just barely on (like a dim spot of light in the middle of each one) then that's a good indication that you have a dead LED.
Are they in fixtures designed for incandescents? "Boob lamps" for example are highly efficient LED bulb destroyers, since they don't breathe, and the bulbs overheat.
More anecdata, but I had ~20 lights, of various quality (many Hue, some cheaper Home Depot specials) in boob lamps that survived at least 11 summers in a New England house without AC. Still there probably, but I moved out so I can only vouch for 11 years.
That seems especially lucky. I'd guess they're slightly different in design or installation from the ones in my previous apartment (also northeast), which killed several LED bulbs of various kinds... They stopped dying when I had the idea to shim the cover and create some ventilation. Mine were fairly heavy glass domes and had some insulating material against the ceiling.
I wish I could say that. I have a 3-year-old house that is about 3k square feet with a lot of bulbs. Every bulb installed was LED and I have replaced most of them at this point, some more than once. I have even had the electrical company come out, thinking there was something wrong with the power in my house or the breaker box. Nothing...
It usually comes down to the brand, factors you can't be aware of like component choice, and the light fixture itself. A lot of LED edison socket lights die quickly in recessed lighting or other tight fixtures because the heat is death to them. Manufacturers build the worst technically functional capacitors into the power supplies with a low temperature rating, meaning they really can't handle anything above ambient.
This is also the same industry and the same players that were perfectly fine with illegally agreeing not to improve incandescent lamps past 1,000 hours of lifetime. I have no doubt that there is a tacit agreement not to make good lighting, as that would severely disrupt the industry.
I'm repeating myself a lot in this thread, so I'm sorry. What fixtures are the bulbs in? Are they a generic design meant for incandescents? A huge number of fixtures out there don't allow for enough heat to convect away and the bulbs overheat.
I've had some generally good experiences with LEDs as well. The only places that I've had somewhat higher failure rates for LEDs were places where I wanted a lot of light but the existing fixture had the bulbs trapped deep inside an enclosed fixture. I ended up buying a different brand than I normally do since it seemed the bulbs I had been going with just couldn't survive that hotbox, but since trying another brand the bulbs have lasted a couple of years so far.
Otherwise, for probably at least 40 or so bulbs swapped for LEDs over the years, I've experienced maybe 4 or 5 failures. The vast majority of my bulbs have been Feit and GE. I never buy smart bulbs. My best experiences have usually been to just buy LED fixtures though, I replaced a lot of my flush mount ceiling fixtures and ceiling fans for ones with integrated LEDs and have not had a single failure so far after a few years, knock on wood.
I had some problems with my old dimmer switches, but upgrading to newer dimmers that advertised good LED dimming, and making sure I had bulbs that stated dimming compatibility, eliminated my noise and flicker issues. There's a recent standard out there, NEMA SSL 7A, which seeks to ensure good compatibility. I set my dimmers to this SSL 7A mode and I've had no problems since.
I think there might be something about the wiring in some homes. Some of my LED bulbs have been going for a decade now without issues, yet I have a few fixtures where bulbs keep failing in specific sockets after a few months. I have one fixture in my bathroom where a bulb was fine for a few months, then two replacement bulbs failed instantly, and the third one failed again after a year or so. Maybe the voltage is wrong and keeps breaking the power supply?
It could be sensitive power circuitry failing due to power quality, but it is more likely heat buildup. LED bulbs fail rapidly without good convective cooling, particularly in locations where the bulb is on for long stretches.
My understanding is that the quality of LED bulbs has been going down over time. In other words, newer bulbs are less likely to stand the test of time than older bulbs.
Similarly, I started buying Philips Hue bulbs back around 2015 and none of those have failed in the time since, even being used every night since then.
They're all in freestanding floor lamps installed in a horizontal orientation, which might have something to do with it. That seems like it'd dissipate heat a lot better than e.g. a pot light housing in the ceiling.
* bulbs with the UK-standard bayonet fitting in light sockets that are suspended from cables from the ceiling with lampshades -- these I don't think I've ever had fail on me yet
* 4.6W bulbs with a GU10 fitting in recessed spotlights -- these fail on me more frequently (perhaps every few years to every five years)
My assumption is that this is all down to the spotlight-fitting bulbs being in a confined space and getting a lot hotter. I use Philips bulbs in both cases.
- Older LED house bulbs were much worse than newer ones; far more prone to failure from "things". I had many of them fail after only a few months because our power was "flickery" and their power supplies could not handle it. That's _far_ less common now.
- The power supply / controller circuitry is not a fan of heat. Don't mount them upside down (so the heat rises into the circuitry) and never mount them in a recessed mount; the heat buildup will destroy them a lot quicker. That said, this advice can be ignored if you're paying attention: mounts that give heat a way to escape, bulbs that are designed to go in upside-down mounts (maybe?), etc.
- While you certainly don't want to always buy the most expensive bulb, you also don't want to buy the cheap ones. They are far more likely to be made from poor, failure prone components.
The bulbs not being suited to certain uses doesn't make them bad bulbs, it makes them more limited. There are bulbs that are good for external use and ones that are not; but that doesn't make the external ones "better", it makes them different. There's tradeoffs. And the tradeoffs for LEDs have gotten better over time, but are still there.
You don't buy an offroad vehicle, then complain that it doesn't drive as comfortably on the highway and say it's an objectively worse vehicle. It was designed for a different goal than the 4-door sedan you're comparing it to. It does better at that goal, and worse at others. And, over time, offroad vehicles have gotten better on highways; they'll just never be as good.
3 year warranty instead of 10, but I've had a lot of problems with Philips LED Flicker-Free Dimmable BR30 Indoor Light Bulb. They consistently die and need to be replaced within a year or so. I replaced a couple under warranty but just gave up after the hassle involved. I've tried other brands without success and would love to know what a good reliable alternative would be.
10 years isn't a minimum threshold for every item. Any individual item could fail any time, and the overall distribution will have a shape somewhere between a bell curve and a long tail.
I don't know if there are any regulations around the 10-year claim, but if there are then I'd expect that it's either an average or something like a one-standard-deviation threshold, like 68% last past that but 32% don't.
"Guaranteed 10 years" doesn't actually say anything about expected lifetime at all, just that they'll do a warranty replacement if it fails sooner.
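One way to picture the point about distributions: a hypothetical lognormal lifetime model whose *median* is 10 years still produces some early failures. Both the lognormal choice and the spread parameter below are my assumptions for illustration, not anything from a bulb spec:

```python
import math

# Hypothetical lognormal lifetime model: median 10 years, shape sigma = 0.5.
median, sigma = 10.0, 0.5
mu = math.log(median)

def surviving(t):
    # P(T > t) for lognormal(mu, sigma), via the complementary error function.
    z = (math.log(t) - mu) / (sigma * math.sqrt(2))
    return 0.5 * math.erfc(z)

for t in (2, 5, 10, 15):
    print(f"{surviving(t):.0%} of bulbs still alive at {t} years")
```

By construction exactly half the bulbs outlive the 10-year mark; a fatter-tailed (larger sigma) distribution would put noticeably more failures in the first couple of years, which is consistent with the anecdotes in this thread.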
I have had at least 10 bulbs die on me within months, while others have lasted much longer, but the average lifespan on bulbs in our house can't be over 18 months. So I don't think people are complaining to be snobs, just noting that led bulbs don't last nearly as long as claimed. I have no idea why you needed to fall into personal attacks rather than concluding that bulbs readily available 11 years ago might be made better than those readily available now, and that most led bulbs are a lot newer than yours.
When I moved into my current place 5 years ago a lot of the lighting was 12V MR16 halogen bulbs. I replaced most of them with high CRI Philips Master LEDspots (specifically marketed as having a longer lifetime aimed at commercial installations but they weren't significantly more expensive vs "consumer" versions at the time if you were looking for high CRI anyway) and kept the transformers in place. I've had one fail out of probably 50 or so bulbs in that time, which feels about par for the course to me.
All the big name brand(Cree etc.) bulbs I've bought have been going strong for 5-12 years. Out of dozens the only ones I've ever had fail were off brand or special purpose like LIFX wifi bulbs.
Your comment is just regurgitating tech specs. In reality, the bulbs actually at hand vary so much in quality that tech-spec discussions are almost useless. The flickering is a real issue. I'm not aware of any standard way of rating the flicker of LED bulbs; they can vary from really bad (literally dark 50% of the duty cycle because of one stupid diode) to decent (bidirectional diodes) to very good (a full voltage regulator).
First, this driver actually specifies flicker, and it has a credible number. Second, I own several and have tested them. Performance is excellent. It dims well, too. If you want a crappy driver, you don’t need to spend $25 for it :)
Third, this LED chip is a serious one, with a serious data sheet, intended for people building their own fixtures.
Thanks, your posts were illuminating. I'm not looking forward to replacing my ancient quartz floor lamp, but I'm not sure I'll be able to buy a 3rd replacement bulb, when it finally goes out.
Sadly, with San Francisco rates anywhere from 4.5x what I pay here in Quebec, or more, and with LED products lasting barely longer than incandescent bulbs, it is typically a loss for me.
Maybe a 5 year warranty on LED bulbs should be a law, to ensure better quality control and build. The competitors can compete around that requirement.
Where exactly would you mount a light like the one you linked?
10 hours per day sounds like a crazy amount of time to use a light. I think we use some lights in our house maybe 4 hours per day on average max. Maybe I just have a lot of windows and don't live in Alaska in the winter.
> 10 hours per day sounds like a crazy amount of time to use a light.
I think the Alaska point is close. Yet even in (for example) southern Canada, the sun just doesn't get high over the horizon in winter. So you have 7 hour days, but those days are mostly dim and dark.
I'm in Oregon, just about on the 45th parallel. Not nearly as far north as some people, but winters can be pretty hard light-wise, and SAD is a bitch. I really should move to Arizona or Mexico in the winter.
In a screw base, maybe. But compare:
https://www.digikey.com/en/products/detail/luminus-devices-i...
$25 for an excellent 700mA driver, 86% efficient.
https://www.digikey.com/en/products/detail/bridgelux/BXRH-30...
$3.45 for a very nice, ~2000lm 97 CRI LED, about 99 lm/W. (Efficiency goes up quite a bit if you settle for 90 CRI.)
So that gives about 2000lm at about 25W, for <$30.
Wikipedia gives about 16 lm/W for incandescent, so 125W. At 10 hours per day, the LED option pays for itself quickly even at national average prices. In CA, it’s very fast.
To be fair, for high-end LEDs like this, the balance of the system is more expensive, because you need a heat sink. Incandescent lamps run very hot and don’t need heat sinks.
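Putting that payback arithmetic into a tiny script (the electricity rates are my assumptions for illustration; check your own bill):

```python
# Rough payback for the DIY driver + LED above vs an equivalent incandescent.
LED_COST = 30.0               # driver + LED, from the figures above
LED_W, INC_W = 25.0, 125.0    # ~2000 lm each way
HOURS_PER_DAY = 10
RATES = {"US avg": 0.17, "CA": 0.30}   # assumed $/kWh

for name, rate in RATES.items():
    saved_per_day = (INC_W - LED_W) / 1000 * HOURS_PER_DAY * rate
    print(f"{name}: saves ${saved_per_day:.2f}/day, "
          f"payback in about {LED_COST / saved_per_day:.0f} days")
```

At 100 W saved for 10 hours a day, the payback lands well under a year at either rate, before counting replacement incandescent bulbs.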
I think this is potentially promising, but I don’t think you can buy it:
https://tlo.mit.edu/technologies/high-efficiency-incandescen...