There's a lot wrong with LEDs in general and retrofit (E27 bulbs) in particular. In no particular order:
- LED emitters, driven hard for cost reasons, age and fail quickly
- Power supplies, driven hard for cost reasons, age and fail quickly
- Poor CRI and SSI
- Flickering
- Dim-to-warm is uncommon
- The same light quality is vastly more expensive to achieve with LEDs, even if you account for high electricity prices. Good indoor lighting is now something only people with plenty of disposable income can afford.
- It is quite difficult to even buy high quality LEDs as a mere mortal
- Retrofits generally work poorly on principle
- LEDs mix exceptionally poorly, making things even more expensive
$3.45 for a very nice, ~2000lm 97 CRI LED, about 99 lm/W. (Efficiency goes up quite a bit if you settle for 90 CRI.)
So that gives about 2000lm at about 25W, for <$30.
Wikipedia gives about 16 lm/W for incandescent, so 125W. At 10 hours per day, the LED option pays for itself quickly even at national average prices. In CA, it's very fast.
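A rough sketch of that arithmetic (the electricity rates are my assumptions, roughly $0.17/kWh for the national average and $0.30/kWh for CA):

```python
# Back-of-the-envelope payback for a ~2000 lm LED system vs incandescent.
# Assumed: 25 W LED system at $30 total, 125 W incandescent at $2,
# 10 hours/day of use. The $/kWh rates below are rough assumptions.

def payback_days(usd_per_kwh, led_w=25, inc_w=125, led_cost=30.0,
                 inc_cost=2.0, hours_per_day=10):
    saved_kwh_per_day = (inc_w - led_w) / 1000 * hours_per_day
    saved_usd_per_day = saved_kwh_per_day * usd_per_kwh
    return (led_cost - inc_cost) / saved_usd_per_day

print(f"~$0.17/kWh (national average): {payback_days(0.17):.0f} days")
print(f"~$0.30/kWh (California): {payback_days(0.30):.0f} days")
```

About five and a half months at average rates, three months in CA.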
To be fair, for high-end LEDs like this, the balance of the system is more expensive, because you need a heat sink. Incandescent lamps run very hot and don’t need heat sinks.
I think this is potentially promising, but I don’t think you can buy it:
This is just an anecdote, but I've had multiple "10 year" LED bulbs fail after just a year or two. I suspect many of the claims for these bulbs are theoretical, as they just don't hold up, probably for reasons the grandparent poster is pointing out.
Almost all LED bulb failures are because the power supply died due to overheating, not the LEDs themselves. I harvest the LEDs out of dead bulbs to use in hobby projects.
With Edison-style bulbs, anyway, the orientation they're mounted in makes a huge amount of difference. They last a lot longer if they're oriented upright (base down) than in any other orientation, because it reduces the heat buildup in the power supply.
This is the frustrating thing about LEDs that IDK if we can change.
If there was a "DC" light socket in the house we could have LEDs outlasting owners, and for cheap. Nearly all the expense of LED bulbs is the power supply. Everything else is dirt cheap. A single home DC power supply with ~200W of output could light an entire house, flicker free.
What's even more frustrating is I think we could fix it. A national regulation for DC light sockets would fix it. Mandate a voltage, shape, and max amperage and BAM, you'll get 1000 different manufacturers making standards-compliant bulbs and home power supplies that will last an eternity.
I am designing an off-grid cabin with a solar panel array charging a bank of batteries, with a propane generator backup. I run power over Ethernet with a custom-designed PCB that terminates at the outlet side, where it exposes a 20 watt USB charging port and an Ethernet port.
The lights are all basically cut 12v light strips inside of old light fixtures with a custom controller that also terminates PoE. The 48 volts that most PoE standards specify is more than enough to push power down the line for < 100 meter runs.
The advantage of PoE here is that anything under 50 volts is considered low voltage and does not need to follow the same rules as normal house wiring. I did not like that everything hinged on a beefy PoE switch, so I made it passive PoE by design.
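If anyone wants to sanity-check the headroom, here's a rough voltage-drop sketch (assuming 24 AWG CAT5 conductors at ~0.084 ohm/m and power on two pairs as in standard PoE; the 13 W load is just an example figure):

```python
# Ballpark voltage drop for a passive-PoE lighting run over CAT5.
# 24 AWG copper: ~0.084 ohm per meter per conductor. Standard PoE
# carries power on two pairs, so each direction is two conductors
# in parallel (half the resistance).

def poe_drop(length_m=100, load_w=13.0, v_in=48.0, ohm_per_m=0.084):
    loop_r = 2 * length_m * (ohm_per_m / 2)  # out and back, paralleled pairs
    i = load_w / v_in                        # approximate load current
    drop_v = i * loop_r
    return drop_v, 100 * drop_v / v_in

drop_v, pct = poe_drop()
print(f"~{drop_v:.1f} V (~{pct:.1f}%) drop over a 100 m run at 13 W")
```

About 2.3 V of drop on a worst-case 100 m run, which is why 48 V works here where 12 V wouldn't.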
If you're willing to share your design, I'm sure there are other folks like myself who think this is a cool idea. I've wanted to do PoE (or passive PoE) for lights for a while now...
I am going to open source it. The goal was to be able to get all the SMD stuff available at JLCPCB so you can just send it to be fabbed (with some thru-hole components you would just solder yourself), or I would also sell them at cost + 10%. My brother designed some 802.3at chips and I was going to have him review my work first, as I don't want to send a poorly designed power system out into the world (there are enough of those things out there unfortunately).
We've had 2 standard DC outlets for a while now: 12V cigarette lighter and 5V USB. You do often see them in odd places. But the voltage and wattage of those specs is too low to be useful, so they haven't evolved into DC power distribution.
USB-C PD is at a useful voltage & wattage level, and so is Ethernet POE. I wouldn't be surprised to see them start to be used for general power distribution in niche applications, like RV's and off-grid cabins.
I don't think we're going to ever get a bulb standard, though.
Cars are starting to move to 48V DC. My under-cabinet lighting in the kitchen is powered by DC from a power supply in the basement.
I could definitely see this becoming more common. Powering the ~100 watts of fixed lighting spread across my whole house on ten different 15A 120v circuits, each with their own arcfault breaker and 12 gauge copper electrical lines running back to the panel is fabulously expensive for what could be done with a bunch of CAT5 in each floor running to some conveniently located “POE injector” type devices.
You would want to be able to take a standard fixture and just push DC through it and use special bulbs with a standard A19 base, but that’s problematic when the next owner tries to screw in a standard bulb - what happens when it sees 48V DC?
I would guess if for safety reasons it has to be a non-A19 connector, then your light fixture choices get cut down to almost nothing and no one will make the switch?
It’s really interesting to think about: for most everything I’m plugging into AC outlets in my house, the first step is converting it to DC. A lot of my outlets I’ve switched to ones that include USB ports so I don’t need the wall warts. If you have solar and battery backup, even more so: you start to question why we are wasting so much money moving everything back and forth between DC/AC/DC within a house.
This is a fair point; breaking mechanical compatibility will at least stop any electrically exciting goofs when someone plugs a low voltage DC lamp into a (comparatively) high voltage AC socket.
> but that’s problematic when the next owner tries to screw in a standard bulb - what happens when it sees 48V DC?
If by "standard" you mean a incandescent tungsten filament bulb, nothing at all.
For a true LED driver power supply, it would be constant current, so the tungsten filament would see 25mA (or whatever the constant current is set for) of DC, and nothing bad would happen (the filament would most likely not illuminate, either).
Screwing in an LED bulb with integrated power supply, the external supply will still feed the constant current value, so what happens depends upon the design of the LED bulb's integrated power supply. If 25mA is enough to drive everything, the LED bulb might light up. If 25mA is not enough to drive everything, most likely nothing lights up.
48V without a current limit shouldn't produce nothing, but you should expect less than 10% brightness.
For constant current, you'd need to drive at least 9 watts so it would be more like 250mA if not higher.
A 1600 lumen LED module might take as much or more current than a 60w incandescent. If your constant current supply can output between 0 volts and input volts, and it's set for a bulb with such a module, it would be able to power an incandescent bulb.
I suspect the results would be quite poor. Incandescent filaments increase their resistance when they get hotter, so driving them at constant RMS voltage means that the power will decrease as they heat up, which will give them a degree of stability. At constant current, though, the power will increase with increasing temperature.
(Of course, they’re quite hot and radiative cooling increases like T^4, so this isn’t necessarily a show stopper. But it’s probably not helpful.)
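Out of curiosity, here's a toy numeric model of that equilibrium (the T^1.2 resistance scaling and pure-radiation cooling are crude simplifications, and the parameters are just fit to a nominal 120 V / 60 W bulb, so treat the outputs as illustrative only):

```python
# Toy model: tungsten filament fed by a constant-current supply.
# Assumptions (illustrative only): resistance scales ~T^1.2,
# cooling is purely radiative (~T^4), constants fit to a nominal
# 120 V / 60 W bulb (240 ohm hot at ~2700 K).

R0, T0, P0 = 240.0, 2700.0, 60.0
k = P0 / T0**4                      # radiation constant fit to the rating

def equilibrium_temp(i_amps, t=1000.0):
    # Fixed-point iteration; converges since the net exponent 1.2/4 < 1.
    for _ in range(200):
        r = R0 * (t / T0) ** 1.2
        t = (i_amps**2 * r / k) ** 0.25
    return t

for i_ma in (25, 250):
    t = equilibrium_temp(i_ma / 1000)
    print(f"{i_ma} mA -> ~{t:.0f} K, ~{k * t**4:.2f} W dissipated")
```

On this toy model, 25 mA leaves the filament barely above room temperature (no light at all), and even 250 mA only reaches a dull glow well below normal incandescence, which lines up with the comments above.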
I wondered for a long time why we don't have standard built-in DC in building codes that could power our lights, and most electronic devices. Really the few things in my house that require full line voltage are all in the kitchen. Everything else has a transformer attached.
Me too - standard 12V and 5V rails run throughout the house would be great. I even thought about a wallpaper with conductive strips so the power could be invisibly delivered to any part of any room and "tapped" with a push-into-the-wall socket.
The usual counterargument is: voltage drop can become a problem. Trying to use one big power supply and use DC as your only distribution mechanism probably isn't a good idea.
But choosing a DC system for part of the house can make a lot of sense.
For a single room in new residential construction, it can be practical to have one shared power supply rather than one per LED. Say you have a 12 V, 5 A DC power supply. Using a star wiring topology, this can serve 10 lights (at 500 mA each) fine with 16 AWG.
But how far can the power go before wire resistance causes too much Vdrop? Maybe one good transformer+rectifier per room? AC to the room and DC in the room. Those DC runs would be <5m each.
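Here's the kind of quick number-run I mean (assuming ~13.2 ohm/km for 16 AWG copper and the 12 V / 500 mA per-fixture figures above):

```python
# Per-run voltage drop for star-wired 12 V fixtures on 16 AWG copper.
OHM_PER_M_16AWG = 0.0132   # ~13.2 ohm/km, assumed round figure

def drop_pct(run_m, i_amps=0.5, v_supply=12.0):
    loop_r = 2 * run_m * OHM_PER_M_16AWG   # out and back
    return 100 * i_amps * loop_r / v_supply

for run_m in (5, 10, 20):
    print(f"{run_m:>2} m run: ~{drop_pct(run_m):.1f}% drop")
```

Under a percent at 5 m, so a per-room supply with short star runs is comfortably fine; it's the long whole-house runs where 12 V falls apart.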
Until I see someone defining a representative example and running the numbers, I'm skeptical of their DC vs AC commentary.
I say this, because I was guilty of this exact shortcut thinking (in another comment). But I paused and thought to myself "I should run the numbers before just repeating the usual voltage drop criticism".
So I compared scenarios and it depends a lot on the topology, lengths, costs, and situation (new vs renovation).
Sure, a whole house system doesn't typically make sense, but I don't think that's what people are really talking about. I think people are interested in hybrid systems; e.g. DC power supply for each room.
I don't know if you meant it, but the sentence about "any sophomore level electrical engineering student can solve this" can easily come across as dismissive. I also think it gives too much credit to sophomore students. :)
I would have more confidence in an electrician apprentice on this one. I think they'd have more practical experience when it comes to figuring out what are the right questions to ask.
I did EE in college and do a fair bit of hands on residential electrical work.
P.S. How many sophomore level engineering students learn to do a sensitivity analysis?
>Sure, a whole house system doesn't typically make sense, but I don't think that's what people are really talking about. I think people are interested in hybrid systems; e.g. DC power supply for each room.
I completely disagree. Where exactly are you going to put a power supply in a room? Make a special electrical box for it? Won't it be unsightly in many rooms, or need some huge special panel that looks like a breaker panel? The comments I see seem to be advocating a whole-house solution, where a power supply is mounted in the breaker panel to supply LVDC to the whole unit. But this makes no sense for several reasons, especially the voltage drop.
>I don't know if you meant it, but the sentence about "any sophomore level electrical engineering student can solve this" can easily come across as dismissive. I also think it gives too much credit to sophomore students. :)
It's supposed to be dismissive, because this whole discussion is a bunch of software people trying to make up solutions for a perceived problem when they obviously don't know one of the most basic things about electrical theory, which makes all of their solutions unworkable. It's like a bunch of people trying to make a new kind of personal vehicle to replace cars when they don't even understand Newton's Laws. It's really annoying, because I see this kind of discussion pop up every so often, over many many years.
I have another comment here I don't feel like copy-and-pasting, but basically this whole discussion is silly because people are trying to make a solution using a very expensive power supply to fix a problem they see because they're buying cheap $2 light bulbs that burn out quickly, instead of just buying light fixtures that were properly engineered in the first place. With modern SMPSs, you're not going to get any kind of benefit by centralizing the power supply to drive individual LEDs, you're only going to get problems. LEDs need a driver circuit to provide constant current, and that means the power supply needs to be matched to the emitters and kept very close to it.
> Where exactly are you going to put a power supply in a room? Make a special electrical box for it?
Switched-mode power supplies can be as small as your average Arduino board. They can fit inside the space used for wall outlets or light fixtures. Or you can put the AC-to-DC converter inside the light switch.
> I completely disagree. Where exactly are you going to put a power supply in a room? Make a special electrical box for it? Won't it be unsightly in many rooms, or need some huge special panel that looks like a breaker panel?
This sounds like a non-issue, especially considering the pervasive use of "unsightly" installations like air ducts, heating vents, radiators, electrical sockets, telecommunication service panels, routers, and even light fixtures.
If you intentionally dismiss obvious solutions, of course you only end up with problems without obvious solutions.
> It's supposed to be dismissive, because this whole discussion is (...)
It's not about DC vs AC, it's high-voltage vs low-voltage. The power dissipated in wire resistance scales with the square of the current (P = I^2 * R), and low line voltage means that you need large currents to transmit the same amount of power.
Whether or not that's feasible is going to depend a lot on the application. I don't think we'd ever fully rid homes of AC sockets, it's too useful for things like vacuum cleaners or space heaters.
But what about the sub 100W or even 200W applications? That's where I think something like 48VDC would start to shine. Every light in a home, phone chargers, tablet chargers, computer monitors, televisions, computers? (maybe not gaming rigs, but certainly laptops and nucs).
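The scaling is easy to put numbers on; a minimal sketch (the 0.5 ohm loop resistance is just an assumed example run):

```python
# Wire loss delivering 100 W over the same wire at different voltages.
# Loss scales as I^2 * R, with I = P_load / V.

LOOP_R = 0.5  # ohms, assumed example run

for v in (12, 48, 120):
    i = 100 / v
    print(f"{v:>3} V: I = {i:.2f} A, wire loss ~ {i**2 * LOOP_R:.1f} W")
```

On that example run, 12 V wastes ~35 W in the wiring just to deliver 100 W, 48 V loses about 2 W, and 120 V about a third of a watt. That quadratic scaling is why 48 V looks like the practical floor.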
>But what about the sub 100W or even 200W applications? That's where I think something like 48VDC would start to shine.
How so? Exactly what benefit does it have over the current AC mains? With 48VDC, you'd still need to use DC-to-DC converters to power everything. I fail to see how that's any kind of improvement over the current switch-mode power supplies used. Instead, it'll just be less efficient because you'll get higher line losses in the power lines in the walls and all the way from wherever that 48VDC is coming from. If that's from a big SMPS in a closet somewhere, that's going to have its own losses. Overall, the entire system will have lower efficiency compared to the current system.
Exactly what problem are you trying to solve with this idea? If you think you're going to eliminate SMPSs in all your electronic equipment, you're not; that's a fantasy. Everything needs a power supply because electronics only work at very low voltages (5V, 3.3V, even 1.8V in places, now 20V with USB PD) and most equipment has some kind of peculiar voltage requirements, usually multiple different requirements inside the same device. There's no improvement in efficiency by running a computer, for instance, from 48VDC vs. 120VAC or 240VAC; in fact it's probably worse.
Also, DC and AC have differences in power transmission independent of resistance, some due to first principles (reactance), and others related to devices for stepping voltage up or down (e.g. transformers).
Trying to cram all the infrastructure for an LED lamp into the shape of a light bulb is a bad idea, even if the input power is DC. Good designs for LED lighting have larger surface areas for heat dissipation and some physical/thermal separation between the LEDs and the power supply. A quality power supply does not produce flicker. As other comments have noted, dimming, or even predictable output requires some sort of power regulation even with DC input.
I think the way to change it is to replace sockets with hardwired LED fixtures. This is easy for something like a standalone ceiling light. It may be harder for other devices like ceiling fans that integrate a light bulb socket, but converting those devices to take DC power as in your proposal isn't easy either (most would just get discarded and replaced).
Doing it well is more expensive in the short-term than screw-in bulbs. A quick look on Amazon suggests integrated ceiling lights are about 10x the price of LED bulbs, though I suspect the longer service life pays for itself.
> Trying to cram all the infrastructure for an LED lamp into the shape of a light bulb is a bad idea, even if the input power is DC.
Absolutely: incandescent light bulbs have that shape for a reason. The screw is small because there is nothing to put in it and it doesn't heat up, and the bulb is large to dissipate all the light and heat it generates. LED light bulbs have exactly the opposite problem: almost all of the heat is generated near the screw, the bulb itself generates almost none, and the light emitter doesn't even need a bulb that large around it. Oh, and the casing around the screw is plastic, so the thermal conductivity is horrible. Honestly, it's a profoundly terrible form factor which we're now stuck with.
It's also helpful to recognise that existing lighting fixtures and lamps were designed around the constraints of incandescent bulbs. The first generation of LED bulbs and lamps largely conform to these. As LEDs mature, both fixtures and lamps which address the limitations and requirements of the technology (transformers, perhaps dedicated 12v circuits, heat dissipation for the transformer rather than lighting elements themselves, and better light-temperature and intensity regulation) should emerge.
We're presently in the somewhat-messy half-emerged state. Think horseless carriages, wireless, and the days of dual gas/electric lighting and lamping systems (yes, these existed, and yes, the failure modes were ... much as you might imagine).
Already happens in New Zealand: lighting is usually low-current 1 mm² wiring, and everything else is heavier gauge. Circuit breakers mostly care about amps (all breakers could be rated to mains voltage if you wanted to avoid "weird").
Also low voltage wiring can legally be done by anyone in NZ (a bonus when doing your own work, and a pitfall when buying a house?)
Maybe for you, but I have been considering just this. I would love to have dedicated 24v for lighting and charging of devices. My house already has various systems for lighting, such as xenon throughout the kitchen under the cabinets and also the basement. Both are driven from separate transformers. Then I've got the rest of the house with can lights utilizing BR30 bulbs that are just a waste of 12 AWG. The one place I was able to replace with a dedicated LED fixture, I had to overpay for a decent product that would've been better off as a 24v basic LED light. When you consider most HVAC systems operate at 24v, there is some real potential to create a decent standard serving multiple purposes.
And besides, idk if you have ever pulled 12 ga wire, but it's a pita. I don't know any electrician who would call it a pain to cut back on heavy wire and pull light 22 AWG instead.
Lighting (on AC 110v / 220v circuits) is also typically specced for a lower peak amperage than utility or appliance outlets. For US codes, generally 15A rather than 20A. Lighting may use 20A, but isn't required to.
Other circuits must be 20A, e.g., kitchen outlets serving appliances.
This is what I want. A standard 48VDC socket would be a game changer for lighting.
Heck, with such a standard you could have 120VAC -> 48VDC converters and you'd be in the same position we are today with LEDs, only better, because you'd just have to replace the converter and not the whole bulb.
Not extremely thick. Wire losses remain similar at 12V as they were at 110V (replace a 100W bulb with a 10W bulb at 12V: current remains ~1A, so wire losses stay the same as they were). Wire losses might be, say, 1W for 1 mm² cabling. 240V example: https://ausinet.com.au/voltage-drop/
Agree that it is worth upping voltage to chase a few more percent savings, but still need to consider other constraints.
There are also these type of "ceramic substrate" bulbs which claim to give longer life. I suspect other compromises in the construction may negate that.
I don't think we're exactly stuck with the old form factor. We can start phasing them out. Replacement of screw sockets with modern fixtures is well within the capabilities of the average DIYer (though perhaps some places it's illegal for anyone but a professional electrician to touch anything hardwired).
Well, one of the main selling points of LED bulbs was compatibility with existing E14/E27/etc sockets: no need to change the wiring or the fixtures, just buy a new, better light bulb and screw it right in! It will also serve longer and be better for the environment, what's not to like? We'll even ban the sales of 100W and higher incandescent light bulbs to help you make the right choice!
That's also the pitch of the smart bulbs: a sane way would be to make a smart light switch, but what if you can't do that (e.g., you rent the apartment)? So we'll shove the controller chip into a disposable light bulb; that's still perfectly fine for the environment.
By the way, I don't know how things turned out in your part of the world, but over here, after the ban went into force the manufacturers of incandescent light bulbs started selling 95W light bulbs 8D
Probably going to sound crazy, but we could start running water pipes in front of the walls and under the ceilings and mounting the LEDs directly on the pipes for cooling. Creativity, thinking holistically... the entire contemporary western house design needs a rethink frankly, from DC circuits to electrification to modular, mass-producible utility drop-in pods, all with an eye towards integrated systems design paired with scalable modularity.
One of the problems is that in some countries like the US, ceiling lamps are hard-wired and not "user-replaceable", so people have to resort to using those stupid bulbs in their old fixtures.
I live in Japan, and instead of just a pair of wires coming out of the ceiling, there is a standardized "ceiling socket" [0] which can also support the weight of a lamp. This means that swapping out light fixtures is plug and play, so the standard LED lamp is something like this [1] where you have a nice big flat metal plate backing the hardware is mounted to for heat-sinking.
I don't own any LED bulbs at all - all our lamps are of this type so I wouldn't have anywhere to put one.
It was the same when I lived in Sweden - a standard ceiling light outlet (IIRC there is an EU standard for this now called DCL) so that replacing light fixtures was easy. Moving into an apartment, often they wouldn't even come with light fixtures; you'd bring your own.
In the Netherlands we have just a pair of wires coming out of the ceiling but everyone replaces their own lamp fixtures anyway. Most people should be able to manage clamping or screwing down the brown or black wire to the L and the blue wire to N.
You can't have a low-voltage DC power supply supplying the entire home: the voltage drop between the supply and the LED would be huge. There's a reason we use higher voltages for long wire lengths: to increase efficiency and reduce line losses, since losses increase with the square of the current (P = I^2 * R). Higher voltage means proportionally lower current, and quadratically lower losses.
And since we need high voltage (at least 100V) to keep line losses very low and allow the use of thinner-gauge copper wiring, we need a switching power supply at every light fixture, so it really doesn't matter if it's AC or DC, since modern SMPS (switch-mode power supplies) work equally well with either.
Finally, on top of all that, LEDs are current-driven devices, and need a constant-current power supply. So the power supply must be very close to the diodes, or else fluctuations in supply voltage will have very negative effects.
Low voltage DC lighting is a thing that has existed for a very, very long time. That most houses don't have it is more cultural than anything else, in my opinion.
That means it's totally fixable. You can install such a system in existing buildings right now, and it's not crazy expensive unless you want to run the wires inside the walls.
If we could shift cultural expectations around this, adding a LV system in new construction would not significantly increase the construction costs. It will start to be done if buyers start demanding it.
12V requires quite a lot of amps for enough light, so low-voltage DC is not optimal. Also, LEDs are current-driven devices, i.e. they will be sensitive to voltage changes (even with a current-limiting resistor).
Low-voltage doesn't necessarily mean 12V. I think it's anything below about 50, although lighting systems currently marketed as "low voltage" are usually 12 or 24 volts.
The constant current thing is true, but that's not a terribly difficult problem.
For example: the stairwell shin-height lights in this 90s house are 12 VDC. There's a transformer plugged into a wall outlet in the nearby storage closet.
That works OK because the transformer is relatively close to the lights. If it were a reasonably-large house, and the transformer were on the opposite side of the house, you'd have a problem with a noticeable voltage drop. All these ideas people are throwing out here involve a single whole-house power supply. If it were for 48VDC, it would probably be fine, but 12V would result in significant line losses.
We already see transformers for a run of e.g. track lights, low voltage lights on tension wires, and so on. That's been a thing ever since halogens came to market.
Having multiple transformers is perfectly doable and commercially viable -- though I would appreciate more product availability for something easy to stash in the hollow space of a ceiling, like recessed lighting is installed.
I still don't see the point of all this. If you have a handful of lights in a room, and drive them with a single power supply, you're still going to have big problems: the line lengths to each fixture will be different, resulting in different voltages. You can't drive LEDs that way with good results: they need fixed current. And you can't daisy-chain them either: if one emitter dies, then the remaining ones will suddenly have different current, and probably die quickly. The proper way to drive LEDs is with a power supply very close to the emitters and designed specifically for those emitters and the (short) wire length to them, not 4 meters away and not with some variable-length wire that can't be designed for.
Everyone here is complaining about ultra-cheap LEDs that don't last very long because they're poorly engineered, but that's exactly what you're all trying to do here by using a separate, shared power supply. You could get away with that in the 1980s using incandescent bulbs, but you can't do it now unless you want the same crappy lifespan and reliability you're all complaining about.
The solution is very simple: buy fixtures that are engineered well. Switch-mode power supply electronics are not expensive at all, but when mfgs cheap out or do a crappy job designing them, you get bad results, usually short lifetime of either the power supply or the LED. What you're trying to do here is buy a really expensive power supply, which has to be engineered to a far greater degree and for a far wider range of operating conditions (since they don't know what you're going to connect to it), just because you had a bad experience buying some $2 light bulb that had a crappy power supply built-in. This really makes no sense.
LEDs must be powered by a constant-current supply, and distribution does not work well at constant-current, and is always constant voltage. So no matter what you will need some sort of switching power supply.
LEDs are only around 30% efficient and power supplies are >95%. They just need to be separated slightly so the LEDs aren't heating the power supply. Most recessed LED lighting now has a separate junction box with the power supply.
> LEDs must be powered by a constant-current supply, and distribution does not work well at constant-current, and is always constant voltage. So no matter what you will need some sort of switching power supply.
I think the biggest problem is that many cheap power supplies cycle at low frequencies that cause flickering which is perceptible subconsciously. A modern switch-mode power supply might operate in the 50-500 kHz range, which will not cause perceivable flicker.
The really cheap stuff actually doesn't even have a power supply!
There's a breed of LEDs that takes straight AC and rectifies it using the LEDs themselves. By using a large number of tiny LEDs in series (typically in COB form), you can easily reach close to 110v or even 220v, and then you add a small current limiting controller in series that's dirt cheap compared to magnetics...
These are super cheap, and appear bright, but they flicker at 120 Hz, which can be annoying when there's motion or if you're sensitive to it.
I'd say it's a very bad choice for a bedroom or living room light, but I have nothing against it for the outdoor lights, signage and a bunch of other applications where cost is king.
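You can estimate how hard these flicker from the conduction window alone; a quick sketch (assuming a plain unfiltered string with ~90 V of total forward drop on 120 VAC; real parts vary):

```python
# Fraction of time a rectified, capacitor-less LED string is lit.
# The string conducts only while |v(t)| exceeds its total forward voltage.
import math

def lit_fraction(vf_string=90.0, v_rms=120.0):
    v_pk = v_rms * math.sqrt(2)
    if vf_string >= v_pk:
        return 0.0
    theta_on = math.asin(vf_string / v_pk)   # turn-on angle in each half-cycle
    return 1 - 2 * theta_on / math.pi

print(f"lit ~{lit_fraction():.0%} of each half-cycle, dark the rest, at 120 Hz")
```

So roughly a third of every half-cycle is fully dark, which is exactly the kind of 120 Hz strobing people notice with motion.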
I have a serious problem with it for outdoor lighting and signage: it gives me a headache. Enough exposure will make me feel actively sick. The effect is not subtle.
Branding matters. If your brand is a light that flickers, you might want to consider the old adage penny wise, pound foolish. As a consumer, why would I choose to shop at an establishment that has flickering lights when I could shop at a different one that did not? Unless of course, I had no choice.
But then, a wise entrepreneur would recognize paying extra to have non-flickering signage would attract some customers.
Flickering lights can induce migraines in susceptible people, so literally, saving a penny here actively drives away business.
I think this falls apart in the details. LEDs want constant current power supplies, and their owners frequently want them to dim. So you will still need a power supply.
You can fudge it with resistors like in an LED strip, but you lose efficiency and dimming quality.
That being said, I expect that power supplies with 48VDC input or so would be cheaper.
>Maybe this could be a prosumer retrofit thing, where the AC voltage gets converted to DC in the junction box, and then DC is sent down to the fixture.
The problem is that in 99.99% of homes, outlets are on the same circuits as light fixtures; you would need to do some major rewiring.
No, I'm saying you put a module into the junction box that the light fixture is attached to that serves as an AC/DC adapter, current limiting driver, and possibly a dimming sensor that would then provide downstream DC voltage to retrofit A19 bulbs.
Those bulbs would then have no internal switching systems to burn out and rely entirely on the module hidden behind the wall to handle their power needs.
> This is the frustrating thing about LEDs that IDK we can change.
I think that non-bulb LED fixtures are relatively common. For example, a style exists where you cut a hole in the ceiling and friction-fit the LEDs with the power supply up in the attic (presumably with infinite convective airflow): https://www.lowes.com/pd/Utilitech-Canless-Color-Choice-Inte...
These power supplies aren't going to die from overheating because the power supply is nowhere near the heat-producing LEDs. And, it's not like $30 for your entire light fixture is going to break the bank.
Yes. For the commercial DC lighting installations I've seen they were using power over ethernet. That's not necessarily the only way to deliver DC power but whatever you do it's going to be wired differently from 120 VAC.
I used to do electrical installs in commercial buildings and this was starting to catch on, mainly because the practice of running ethernet (including the 8P8C aka RJ45 connector, patch paneling, etc.) is already established. This always felt very roundabout and requires expensive networking equipment just to run lights, which I do not personally like because it will just cause confusion.
> Almost all LED bulb failures are because the power supply died due to overheating, not the LEDs themselves.
It's true that the power supply versions are so poorly designed and inefficient that heat is a problem. Design and quality control effort could reduce heat generated by the entire assembly to a fraction of what the socket, fixture, and wiring can sink.
It's more common now to find bulbs that have no power supply at all. They're literally a rectifier made of LEDs in series. If the bulb flashes at 2 * mains frequency, that's likely what you have. They die quickly because the LED strings add up to a maximum voltage a bit over mains voltage, but that's RMS, not peak. It's a natural outcome, as using enough LEDs to accommodate peak voltage reduces light output by underdriving them, increases obvious flicker from dwell time below minimum voltage, and increases cost.
Hotwired LED strings are cheaper to design, source, and assemble; bad parts fail fast more consistently, with no effort wasted on quality control; and the market's so flooded and volatile that there's no room for consumer-side quality awareness effective enough to make the negative outcomes matter. Power supplies in these bulbs are going away. Ubiquitous 2 * mains frequency strobing, short-lived, hotwired LED bulbs are where the home LED lighting market is taking us.
>> I harvest the LEDs out of dead bulbs to use in hobby projects.
This is a great idea and I would love it if you would post a Youtube how-to video. It might encourage a bunch of hobbyists to do something useful with those dead bulbs.
I've had a number of LEDs fail after only a year or two, in fact more quickly than the average incandescent bulb. Seems like it defeats the whole purpose of "upgrading" and in fact may be more of a downgrade.
They're remarkably heat sensitive, especially cheap ones. Some bulbs would gladly run for 10 years in a room slightly above freezing temperature, but put them in a semi-enclosed fixture in a normal living space, and they're dead in a few months. Fully enclosed fixtures destroy them in no time, unless you buy really exotic bulbs with truly massive aluminum heatsinks, rated for high temp operating environments. I can't even find domestic suppliers for those, and had to order from China.
The LEDs are surface mount (although big surface mount components, so not particularly difficult to work with). I desolder them with hot air (although you can totally do it with a soldering iron), then use them later as any other surface mount LED. I don't have access to YouTube right now, so can't search for you, but there are tons of videos covering how to desolder and solder surface mount components. I'd be willing to bet there are multiple videos covering this for LEDs specifically, too.
>Almost all LED bulb failures are because the power supply died due to overheating, not the LEDs themselves.
I have the exact opposite experience: in virtually every single light bulb I have torn down, one LED (all in series) has a black dot, and if I short it, the bulb will 'work' again. The bulbs I have seen tend to drive the LEDs so hard that some of them fail; the power supplies might have huge ripple but generally don't fail catastrophically.
Edit: now thinking, it can be a US thing, with the voltage being ~120. Lower AC voltages mean worse efficiency for the power supply (and all of them tend to be universal, unless they totally cheap out on the primary capacitor [250V] for the US market). Generally speaking, low AC voltages have mostly disadvantages.
It could very well be more of a US problem. In the bulbs that I've torn down, they all have used a capacitive dropper power supply, and it's usually been the capacitor that failed.
I have had lamps that lived long enough to see LED failures (the "black dot of death") but that's not the most usual failure mode that I've personally encountered.
I've been considering following in the footsteps of Big Clive and modifying new LED bulbs to stop them from overdriving the LEDs, but my interest in doing that hasn't yet overcome my inherent laziness.
You're right, though I want to mention the LEDs are also damaged by the heat: their color temperature will wander, lifetime will be reduced, and brightness per watt will also be reduced. Still useful for projects and areas where perfect lighting isn't as important.
Indeed. And, as another commenter pointed out, LED bulbs often overdrive the LEDs in order to maximize light output -- but doing so significantly decreases the lifespan of the LEDs themselves.
Sometimes it is the power supply, but I've also had a number that died simply because one LED burned out and failed open. Because they are wired in series it only takes one failed LED to take out the entire bulb. If you're a cheapskate you can sometimes get a bulb working again by testing the circuit and bypassing the burned out LED with a jumper wire.
If the bulb dies but you notice that all of the elements are still just barely on (like a dim spot of light in the middle of each one) then that's a good indication that you have a dead LED.
Are they in fixtures designed for incandescents? "Boob lamps" for example are highly efficient LED bulb destroyers, since they don't breathe, and the bulbs overheat.
More anecdata, but I had ~20 lights, of various quality (many Hue, some cheaper Home Depot specials) in boob lamps that survived at least 11 summers in a New England house without AC. Still there probably, but I moved out so I can only vouch for 11 years.
That seems especially lucky. I'd guess they're slightly different in design or installation from the ones in my previous apartment (also northeast) that killed several and various LEDs... They stopped dying when I had the idea to shim the cover and create some ventilation. Mine were fairly heavy glass domes and had some insulating material against the ceiling.
I wish I could say that. I have a 3 year old house that is about 3k square feet with a lot of bulbs. Every bulb installed was LED and I have replaced most of them at this point, some more than once. I have even had the electrical company come out thinking there was something wrong with the power in my house or the breaker box. Nothing...
It usually comes down to the brand, factors you can't be aware of like component choice, and the light fixture itself. A lot of Edison-socket LED lights die quickly in recessed lighting or other tight fixtures because the heat is death to them. Manufacturers build the worst technically functional capacitors into the power supplies, with low temperature ratings, meaning they really can't handle anything above ambient.
This is also the same industry and the same players that illegally agreed not to improve incandescent bulb life past 1,000 hours. I have no doubt that there is a tacit agreement not to make good lighting, as that would severely disrupt the industry.
I'm repeating myself a lot in this thread, so I'm sorry. What fixtures are the bulbs in? Are they a generic design meant for incandescents? A huge number of fixtures out there don't allow for enough heat to convect away and the bulbs overheat.
I've had some generally good experiences with LEDs as well. The only places that I've had somewhat higher failure rates for LEDs were places where I wanted a lot of light but the existing fixture had the bulbs trapped deep inside an enclosed fixture. I ended up buying a different brand than I normally do since it seemed the bulbs I had been going with just couldn't survive that hotbox, but since trying another brand the bulbs have lasted a couple of years so far.
Otherwise, for probably at least 40 or so bulbs swapped for LEDs over the years, I've experienced maybe 4 or 5 failures. The vast majority of my bulbs have been Feit and GE. I never buy smart bulbs. My best experiences have usually been to just buy LED fixtures though, I replaced a lot of my flush mount ceiling fixtures and ceiling fans for ones with integrated LEDs and have not had a single failure so far after a few years, knock on wood.
I had some problems with my old dimmer switches, but upgrading to newer dimmers which advertised good LED dimming, and ensuring I had bulbs which stated dimming compatibility, eliminated my noise and flicker issues. There's a recent standard out there, NEMA SSL 7A, which seeks to ensure good compatibility. I set my dimmers to this SSL 7A mode and I've had no problems since.
I think there might be something about the wiring in some homes. Some of my LED bulbs have been going for a decade now without issues. I have a few fixtures where bulbs keep failing in specific sockets after a few months. I have one fixture in my bathroom where a bulb was fine for a few months, then two replacement bulbs failed instantly, and a third one failed again after a year or so. Maybe the voltage is wrong and keeps breaking the power supply?
It could be sensitive power circuitry failing due to power quality, but is more likely heat buildup. LED bulbs fail rapidly without good convective cooling, particularly in locations where the bulb is on for great lengths of time.
My understanding is that the quality of LED bulbs has been going down over time. In other words, newer bulbs are less likely to stand the test of time than older bulbs.
Similarly, I started buying Philips Hue bulbs back around 2015 and none of those have failed in the time since, even being used every night since then.
They're all in freestanding floor lamps installed in a horizontal orientation, which might have something to do with it. That seems like it'd dissipate heat a lot better than e.g. a pot light housing in the ceiling.
* bulbs with the UK-standard bayonet fitting in light sockets that are suspended from cables from the ceiling with lampshades -- these I don't think I've ever had fail on me yet
* 4.6W bulbs with a GU10 fitting in recessed spotlights -- these fail on me more frequently (perhaps every few years to every five years)
My assumption is that this is all down to the spotlight-fitting bulbs being in a confined space and getting a lot hotter. I use Philips bulbs in both cases.
- Older LED house bulbs were much worse than newer ones; far more prone to failure from "things". I had many of them fail after only a few months because our power was "flickery" and their power supplies could not handle it. That's _far_ less common now.
- The power supply / controller circuitry is not a fan of heat. Don't mount them upside down (so the heat floats up to the circuit) and never mount them in a recessed mount. The heat buildup will destroy them a lot quicker. That being said, this advice can be ignored if you're paying attention... mounts that have a way for heat to escape, bulbs that are designed to go in upside-down mounts (maybe?), etc.
- While you certainly don't want to always buy the most expensive bulb, you also don't want to buy the cheap ones. They are far more likely to be made from poor, failure prone components.
The bulbs not being suited to certain uses doesn't make them bad bulbs, it makes them more limited. There are bulbs that are good for external use and ones that are not; but that doesn't make the external ones "better", it makes them different. There's tradeoffs. And the tradeoffs for LEDs have gotten better over time, but are still there.
You don't go buy an offroad vehicle, then complain it doesn't drive as comfortably on the highway and say it's an objectively worse vehicle. It was designed for a different goal than the 4-door sedan you're comparing it to. It does better at that goal, and worse at others. And, over time, offroad vehicles have gotten better on highways; they'll just never be as good.
3 year warranty instead of 10, but I've had a lot of problems with Philips LED Flicker-Free Dimmable BR30 Indoor Light Bulb. They consistently die and need to be replaced within a year or so. I replaced a couple under warranty but just gave up after the hassle involved. I've tried other brands without success and would love to know what a good reliable alternative would be.
10 years isn't a minimum threshold for every item. Any individual item could fail any time, and the overall distribution will have a shape somewhere between a bell curve and a long tail.
I don't know if there are any regulations around the 10-year claim, but if there are then I'd expect that it's either an average or something like a one-standard-deviation threshold, like 68% last past that but 32% don't.
"Guaranteed 10 years" doesn't actually say anything about expected lifetime at all, just that they'll do a warranty replacement if it fails sooner.
I have had at least 10 bulbs die on me within months, while others have lasted much longer, but the average lifespan on bulbs in our house can't be over 18 months. So I don't think people are complaining to be snobs, just noting that led bulbs don't last nearly as long as claimed. I have no idea why you needed to fall into personal attacks rather than concluding that bulbs readily available 11 years ago might be made better than those readily available now, and that most led bulbs are a lot newer than yours.
When I moved into my current place 5 years ago a lot of the lighting was 12V MR16 halogen bulbs. I replaced most of them with high CRI Philips Master LEDspots (specifically marketed as having a longer lifetime aimed at commercial installations but they weren't significantly more expensive vs "consumer" versions at the time if you were looking for high CRI anyway) and kept the transformers in place. I've had one fail out of probably 50 or so bulbs in that time, which feels about par for the course to me.
All the big name-brand (Cree etc.) bulbs I've bought have been going strong for 5-12 years. Out of dozens, the only ones I've ever had fail were off-brand or special purpose, like LIFX wifi bulbs.
Your comment is just regurgitating tech specs. In reality, the bulbs that are at hand vary so much in quality that tech spec discussions are almost useless. The flickering is a real issue. I'm not aware of any standard way of rating the flickering of LED bulbs; they can vary from really bad (literally dark 50% of the duty cycle because of one stupid diode) to decent (bidirectional diodes), to very good (full voltage regulator).
First, this driver actually specifies flicker, and it has a credible number; I own several and have tested them. Performance is excellent. It dims well, too. If you want a crappy driver, you don’t need to spend $25 for it :)
Second, this LED chip is a serious one, with a serious data sheet, intended for people building their own fixtures.
Thanks, your posts were illuminating. I'm not looking forward to replacing my ancient quartz floor lamp, but I'm not sure I'll be able to buy a 3rd replacement bulb, when it finally goes out.
Sadly, with San Francisco electricity prices anywhere from 4.5x what I pay here in Quebec, or more, and with LED products lasting barely longer than incandescent bulbs, it is typically a loss.
Maybe a 5 year warranty on LED bulbs should be a law, to ensure better quality control and build. The competitors can compete around that requirement.
Where exactly would you mount a light like the one you linked?
10 hours per day sounds like a crazy amount of time to use a light. I think we use some lights in our house maybe 4 hours per day on average max. Maybe I just have a lot of windows and don't live in Alaska in the winter.
> 10 hours per day sounds like a crazy amount of time to use a light.
I think the Alaska point is close. Yet even in (for example) southern Canada, the sun just doesn't get high over the horizon in winter. So you have 7 hour days, but those days are mostly dim and dark.
I'm in Oregon, just about on the 45th parallel. Not nearly as far north as some people, but winters can be pretty hard light-wise, and SAD is a bitch. I really should move to Arizona or Mexico in the winter.
As an amateur EE, I analysed some LED bulbs when I wanted to light up my kitchen counters. I wanted flicker-free LEDs with high CRI and a color temperature matching the rest of my apartment.
I ordered samples of a lot of LEDs and found that almost all of them run their parts, especially capacitors, well above spec.
Driving caps well above their specs, at high temperature, basically ensures speedy failure. Not only that, but an undersized smoothing capacitor causes visible 100Hz flicker.
What's even more interesting is that at this price point, putting in better caps would have been almost inconsequential to the cost of the product. I ordered the capacitors that should have been there in the first place and replaced the original ones. Not only are the LEDs flicker-free now, I suspect they will be serving me for much longer.
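For a sense of scale, the ripple across the smoothing cap of a full-wave rectified driver is roughly I / (2 * f_mains * C); a sketch with assumed example values (20 mA string current, 50 Hz mains):

```python
# Approximate peak-to-peak ripple on the smoothing capacitor of a
# full-wave rectified LED driver: dV ~ I / (2 * f_mains * C).

def ripple_v(cap_f, i_load=0.020, f_mains=50.0):
    return i_load / (2 * f_mains * cap_f)

for c_uf in (1, 4.7, 22):
    print(f"{c_uf:>4} uF: ~{ripple_v(c_uf * 1e-6):.0f} V of 100 Hz ripple")
```

An undersized 1 uF cap leaves a couple hundred volts of 100 Hz ripple on the string, while a properly sized one brings it down to single digits, which is why the cap swap killed the flicker.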
>almost all of them run their parts, especially capacitors, well above spec
Well, the LEDs themselves will last up to 20 years, so they have to make something in the bulb fail before that. Can't have people only buying replacement bulbs every other decade.
Not sure why you are being downvoted, when that is exactly what happened with incandescent light bulbs a century ago:
> How exactly did the cartel pull off this engineering feat? It wasn’t just a matter of making an inferior or sloppy product; anybody could have done that. But to create one that reliably failed after an agreed-upon 1,000 hours took some doing over a number of years. The household lightbulb in 1924 was already technologically sophisticated: The light yield was considerable; the burning time was easily 2,500 hours or more. By striving for something less, the cartel would systematically reverse decades of progress.
In some situations you want planned obsolescence, such as in parts which are critical yet should not be used beyond a certain time period (a filter in a medical device); it is standard practice to design them to stop working in a controlled way before they become dangerous, so that they will be replaced on a timely basis.

I've also heard that one reason the Soviet Union failed is that they relied heavily on standardized components. This meant that everything was easily replaceable: if your washing machine motor failed you could replace it with the same electric motor taken from an old car. But it also meant Soviet engineers had trouble designing new products, since new products with new parts could not compete with those made from cheap, massively manufactured standardized parts. If you needed an in-between motor size you were stuck with just the standard motor sizes. A higher-performing motor that used less energy and lasted longer would have to compete on price with standardized mass-produced motors. To some extent this limited the development of new technologies and products.

The article linked above mentions that customers also drive product design: if people will always buy whatever is cheaper and pay no attention to product longevity, then it is difficult for a manufacturer to compete with a long-lasting product; the benefit is not immediately apparent to the purchaser, and claims about longer life are hard to prove for the seller (many sellers lie). A lot depends on the specific type of product and people's perceptions. Many people are willing to spend more on tools that last, because they have seen poorly made tools wear out or had it demonstrated how much better a properly made tool works. They are not willing to pay more for long-lasting LED light bulbs, because the experience with incandescent bulbs is that they always wear out, so people are used to replacing them and are not going to track the individual lifetimes of each bulb type/maker. That is starting to change as people notice LED bulbs not having the claimed lifetime (hence this discussion).
So some things need to have a limited lifetime, some things are more efficient in terms of manufacturing cost versus lifetime when designed with a limited lifetime, sometimes a limited lifetime leaves room for invention and improvement, and sometimes a longer lifetime uses fewer resources and makes life easier. Longevity and standardization can work both ways, for and against the minimization of resource use. Capitalism has flaws, and many of them are tied to the profit motive, but it does improve some efficiencies and encourage invention.

A lot of it is up to people who decide how much they are willing to pay for things. Not everyone can pay the price for longevity: a cheap screwdriver can be used to fix things right now, while an expensive screwdriver may mean not also having the use of a cheap hammer right now. Do you live with the house falling apart, or buy the cheap tools? Cheap cellphones meant everyone could have one, and replacing them every few years meant the design of cellphones could advance quickly. Once cellphones reached a plateau in design (remember when each new model had more sensors and cheap models had fewer sensors?), the focus should have shifted to longevity.
However, after saying all that, and considering the climate crisis, society and corporations need to be leaning more towards making things last than they are currently. Making things more easily recyclable, making parts reusable, making products last longer. It has to be approached on a product by product basis though, and affect designs where it makes sense. Bring back bumpers on cars that actually prevent damage to the body:
> There's a lot wrong with LEDs in general and retrofit (E27 bulbs) in particular. In no particular order
If all that's true, it explains my experience that LEDs have totally failed to live up to their promise. Sure, they use less power than incandescents, but they're far more expensive and also more finicky. They were supposed to last a decade, but I'm lucky if I get a year or two out of them. I wonder what the environmental impact is when you factor in e-waste and manufacturing costs.
About the only clear win for me is they run much cooler, which is nice when you have underpowered AC (or no AC).
> - LEDs mix exceptionally poorly, making things even more expensive
I'm not sure exactly what you mean here, but (compared to incandescents) different models of LED differ significantly in light characteristics and start-up time. More than once I've had to replace all the bulbs in a fixture because I couldn't buy an equivalent replacement for one that failed.
They've started marking LED bulbs as "not for enclosed fixtures" which is .... 90% of existing fixtures?
They overheat and die really fast if used in something that's not vented/cooled. You need fixtures that fully expose the bulb so it doesn't burn itself out.
Amusing that LED bulbs, the energy savers, die from excess heat.
I've taken to just replacing fixtures instead of trying to make bulbs work with existing, though I'm --> <-- this close to just throwing them all away and going back to kerosene lanterns and some incandescents.
It's because despite being far more efficient than incandescents, they are still only about 30% efficient at producing visible light. The rest is heat, and unlike an incandescent, an LED does not want to be hot. The device must be cooled or it becomes less efficient and wears out faster. There is no good way to get the heat out of the front of the device, because that's the side you are supposed to see, so in practice all the heat is removed from the back, i.e. the part inside the fixture.
Other solutions to this include using much larger devices, but that costs proportionally more and has application issues because people want their light bulbs to act like either line or point sources, not as areal sources. So most lights on the market use a single small LED, unless they are targeted to a buyer demanding high efficiency and long life, like a city streetlight.
So while having a minimum CRI of 80-90 is a good starting point, there are issues with the CRI measure itself:
> Ra is the average value of R1–R8; other values from R9 to R15 are not used in the calculation of Ra, including R9 "saturated red", R13 "skin color (light)", and R15 "skin color (medium)", which are all difficult colors to faithfully reproduce. R9 is a vital index in high-CRI lighting, as many applications require red lights, such as film and video lighting, medical lighting, art lighting, etc. However, in the general CRI (Ra) calculation R9 is not included.
It is interesting you cite that the cost of lighting has gone up. Where I live, the government-owned utility often raises its rates per kWh. One of the reasons they cite is the increase in efficiency leading to a drop in revenue each year.
The same utility pays for those efficiency projects.
My network and kWh costs are billed separately, and the network costs have risen far less than the kWh costs. The network costs are also flat. Looking at what those pay for, the main cost is peak usage capacity, which efficiency actually lowers.
This would make much more sense, but if they billed by network cost I think people would quickly figure out there is no real reason to conserve electricity here. The cost of generation is so low (in fact negative sometimes) that it doesn't make a bit of difference.
On first read, this makes sense though. The infrastructure costs are the same regardless of use: those power lines still need maintenance even if LEDs are making homes more efficient.
Burgers and loaves of bread compete on all their costs, different competing outlets actually differ in those costs, and people choose which ones to use.
Utilities don't have that. Everyone is stuck paying for both infrastructure and use, so it makes sense to charge them separately.
If each neighborhood had exactly one restaurant that everyone used for all their food, maybe it would make sense to split the burger flipper costs evenly.
What's an example of what you consider to be a high-quality LED? I'm pretty happy with everything that I have in my home, but I'm curious what you're talking about.
I can't imagine paying $150 for a six-pack when Home Depot's private-label LEDs are $12 for a six-pack. I could replace them 10 times before I hit that mark; not sure it's worth that.
And this is the exact reason that the market for good quality LEDs is so small. You care about price but not light quality (primarily CRI but also flicker and dimmability). That's fine, it's totally your decision to make. But the two products are incredibly far from equivalent.
I mean, there's gotta be some reasonable price limit where you stop blaming the consumer. If each supposedly higher-quality LED bulb is $50,000, are you still blaming the consumer? Especially when the consumer can't realistically even know whether they're getting a higher-quality bulb or just a $5 bulb resold for $50,000.
If you told me you could make one room of my house consistently color-balanced with LED lighting that I would have no reason to hate, I would ball up a couple hundred dollar bills and throw them at you.
(Edit: I’m also coming from buying Philips Hue bulbs for precisely this reason, so in fairness, it’s not as big a price jump.)
I have a few Hues and they are great, and last much longer. But in this house I cannot justify even that. In my kitchen/breakfast nook alone I have 10 lightbulbs plus an overhead flushmount.
Keep in mind: all LEDs leak blue light, even the warm ones (color is achieved by averaging, so you can have high red and high blue and it looks balanced visually). These blue-leaking LEDs are great during the day, especially those with high CRI and R9 values, but not at night when you're trying to go to bed! Switch to incandescent-only in the evenings until they figure out blue-less dimmable LEDs.
Not all, only white LEDs. You can use any cheap LED RGB strip without white components and set it to yellow/orange light with blue completely off. It has poor CRI though.
That’s the problem - sometimes they are great. They are more complex devices that use parts from random Chinese suppliers.
Incandescent bulbs were very simple. Cheap vs name brand. White vs soft white. 120v vs 130v. LEDs have at least 7 attributes, some of which are not documented well or at all.
Tell me what LED bulb is ideal to produce 1500 lumens of output in an outdoor semi-enclosed fixture? It will take about 10 minutes of googling around if you are in the know. The average consumer doesn't have a chance.
In the 6 years I've been in my current house, having installed these everywhere upon moving in (around 80 bulbs total), I've had fewer than ten flicker or otherwise go bad. Maybe I'm just lucky.
I would prefer incandescents though. The only thing I don't miss about them is the heat. Everything else was superior.
I hear you, but I bought two Waveform bulbs just to try them out, and they're absolutely incredible. It's a shockingly better light than the $2 Home Depot light. It's the equivalent of going from an underpowered computer to one that's up to the job: you don't really notice how bad the old one was until you get something up to the task.
I bought many of the Home Depot private label LEDs...and had to replace every single one of them. Outright failure, buzzing, flickering. They're just terrible. I've replaced them with Philips.
I had one room that had a flicker problem. Replaced the fixture (was going to do this anyway because The Wife Said So) and it went away. I guess that's why some of the complaints seem foreign to me.
In contrast to the other child comment of this... thank you!! Any additional suggestions for high-quality LEDs would be super appreciated. I'm still on mostly halogen lighting in my home. I keep trying to switch to LEDs, but for some reason, with my vision, the low CRI of even "decent" LED bulbs makes me feel like I can't actually see anything.
I recently replaced all my bulbs with Waveform Lighting bulbs. They're good but IMO overrated and overpriced, and their shipping prices are absurd. You can get high-CRI bulbs at Home Depot, it's just a matter of trying a couple until you find one that doesn't flicker (use your phone's slow motion camera) if you don't want to go the route of specialty bulbs. My Cree bulbs all flickered but my Philips bulbs did not; to my eye, there's no difference between the cheaper Philips bulbs and the much more expensive Waveform Lighting bulbs.
One of my Waveform Lighting bulbs arrived defective, and it flickers all the time. I couldn't detect the Cree flicker with the naked eye but the defective Waveform bulb flickers visibly. Not sure if Waveform's QA is up to snuff.
Ketra was good: smart bulbs like Hue with an open API, but far better than Hue. Lutron bought them, killed the API, and proceeded to require inferior and costly proprietary controls.
I didn't hit many of these issues (our house has 100% LED bulbs, from different manufacturers).
I made sure they were all the same color temperature, and also all >> 90 CRI.
The main issue I've seen is that dimmer switches are usually not compatible with the electronics in high-end fixtures, and that high-end fixtures often take a long time to power on. (Like, walk across the room and open the fridge amounts of time.)
They should choose a standard way of dimming bulbs that doesn't result in noticeable 60 Hz flicker, and that dictates a max 100 ms turn-on latency, then ban the sale of "dimmer compatible" LED bulbs, or "LED compatible" dimmer switches, that are not compliant with that standard.
Also, bulb reliability should be tracked, and any product with a > 5% failure rate in the first 5 years should either be banned, or the company should have to put replacement funds into escrow.
(Current bulbs have a ~ 5-10% failure rate from what I've seen.)
Yeah, I love Alec's Technology Connections video on some bulbs with that feature, but he pointed out that some of the few bulbs offering it seemed to be getting phased out.
It's much like a bunch of other points on the list. There are a fair few that would add only a small amount of additional cost, but because the companies can save money by not doing it, they don't.
It does not actually cost all that much more to add a few more diodes, to avoid severely overdriving the ones on the board, or to improve the power supply circuitry so that it will likely last longer.
But it really sucks that even if you chose to buy the more premium tier bulbs being offered at the big box store, they often don't fix some of these issues. They may have a better CRI, but are still often overdriven, with questionable power supply designs.
Primarily it is the E27 bulbs that are the problem: they were designed 10+ years ago to ease people into simple replacement in old light sockets. Now, in 2023, new LED products with well-designed power supplies work much better and more efficiently. The author mentions renovations but is still using ancient fixtures, wiring, and switches. A new house, or a partial renovation, should now be wired with 24 V for all wall and ceiling lights.
The same topic about LEDs has had many entries on HN in recent years. I have posted about it a lot. To add to the list:
- low power factor (usually 50%)
- cheap passives, caps/coils
- terrible heat dissipation; E27/E14 are no good target, but see overdriving again
- close to no input protection (see power supplies, again), so motors totally wreck them with their inductive kickback
OTOH, constantly (not over-) driven LEDs with dedicated power supplies (preferably isolated, so safer) on decent-area aluminum PCBs can last a long time.
A cheap piece of advice if you have to buy a retrofit LED bulb: buy the heaviest one, i.e. get a scale with (at least) gram precision and weigh them. More mass means better heat dissipation and better passives.
We're mostly on the same page, but there are some caveats to buying the heavier bulbs, even assuming the weight is all heat sink, because that won't matter if the heated air has nowhere to go!
An expensive bulb with a nice heat sink will fail just as quickly as a cheap one when you put it in a well-sealed can light or something else that traps all the hot air.
If a manufacturer bothers to care about heat sinking, they might actually have tested the thing and opted not to overdrive it. It's just a good, totally layman-friendly indicator.
Funnily enough, most mains (~350 V DC) driver chips tend to have a limiting resistor setting the drive current: lower resistance = higher current. Most (if not all) use two resistors in parallel (for finer control and less power per resistor), and desoldering one would greatly improve the lifespan for a minimal luminosity loss. So by picking a larger heat sink, they might have picked a slightly larger value for the resistors as well.
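As a sketch of that arithmetic, assuming a linear driver where LED current = reference voltage / sense resistance (a typical topology; all component values below are invented):

    # Effect of desoldering one of two parallel current-sense resistors.
    V_ref = 0.6                     # driver sense voltage, V (assumed)
    R1, R2 = 10.0, 10.0             # the two parallel resistors, ohms (assumed)

    R_both = (R1 * R2) / (R1 + R2)  # 5 ohms with both fitted
    I_before = V_ref / R_both       # 0.12 A
    I_after = V_ref / R1            # 0.06 A with R2 removed
    print(I_before, I_after)

With two equal resistors the current halves; since LED efficacy improves at lower current, the light output drops by less than half while the thermal stress drops dramatically. If the pair is unequal, which resistor you remove determines how big the current drop is.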
Simpler and better light, but bad energy efficiency.
Although, as was pointed out to me at some point, because LEDs are more efficient, people feel less guilty about having more of them; replacing 1x40 watt bulb with 8x5 watt LEDs means the net result is the same. I've got like 7 cute LED spotlights in my TV closet for example, I wouldn't have had that setup if I was forced to use incandescent lamps.
Maybe I am missing something from this conversation, but all of my LED bulbs produce far _less_ heat than incandescent bulbs. The lamp in my bedroom no longer keeps me warm!
Yes, but incandescent bulbs don't need to be cooled at all because they're just tungsten and glass. High powered LEDs require a heat sink to not damage the diode.
On cooling: the issue is that LEDs and the driving electronics become damaged by heat, while an incandescent has simple parts which can easily survive inside an oven. This matters when used in fixtures like recessed ceiling cans where there's no ventilation. The LEDs just cook themselves while incandescents don't care.
When my garage+office was built a few years ago, the electrician used a bunch of faux-recessed LED fixtures (the brand name is "I Can't Believe It's Not Recessed!", which is certainly memorable). They surface-mount over standard ceiling junction boxes, but appear similar to recessed lights once installed. We have ~20 of these fixtures, both interior and exterior. They're quiet, flicker-free, and have a great dimming curve (with the standard Z-wave dimmers I've used). We've had no failures so far after almost four years, so they've passed the leading edge of the bathtub curve.
I think it's much easier to design entire fixtures than retrofit bulbs, as there's much more control over heat dissipation and so on. Finding trusted manufacturers (and supply chains that resist counterfeits) is also extremely important.
The faux-recessed LED fixtures are a really interesting case, because they're either going to be the most reliable LEDs in your house or some of the least! This is because heat is the killer of LED diodes (and of the power supplies driving them).
Can lights have historically been an issue for the insulation of houses, as they provide a channel for warm ceiling air to enter the plenum space between floors or the attic. That's bad for insulation, but actually amazing for a retrofitted LED light, because it's the only fixture that provides airflow to cool it!
On the other hand, faux-recessed LEDs can also be installed directly on top of the ceiling drywall, without any penetration. That's the worst-case scenario for heat buildup, as heat rises and is completely trapped by the dish of the light and the ceiling.
You'll find that you need to search a bit harder for an LED light that is rated to work in enclosed fixtures. Enclosed fixtures don't allow the same amount of cooling as a normal lamp.
I've moved three times with one set of cheap LED bulbs without having to replace a single one. I'd have gone through dozens of incandescent bulbs in the same time period. I'd have also burned my hand on a few.
LEDs are definitely simpler than incandescent from a user perspective.
- LED emitters driven hard for cost reasons, age and fail quickly
- Power supplies driven hard for cost reasons, age and fail quickly
- Poorly designed power supplies that age and fail quickly
These are all features from the producers' POV. Planned obsolescence.
- Poor CRI and SSRI
This is true for all cheap lights, you gotta pay for that.
I replaced nearly all bulbs with WiZ bulbs more than 2 years ago.
- I don't see flicker on any of my cameras. The light is actually really nice for filming too.
- The price is way less than Hue (I own a few, and don't think they are any better).
- I get way more light per watt than with any other bulb type. Not sure what you mean by luxury.
- I love that each of them has an independent API on its own IP. Works perfectly for my smart home design (see the sketch below).
I've had several cheap (in build quality, not price) bulbs fail on me meanwhile. Not a single WiZ has failed so far. As I said, I own some Hue too, but I wasn't willing to spend over $1000 just on bulbs for my house, and then I found the WiZ brand.
Not sure if I am just lucky, but I really enjoy multicolour plus warm and cold LEDs in the whole house.
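For anyone curious what "an independent API on its own IP" looks like in practice: WiZ bulbs are commonly controlled over a local UDP/JSON protocol on port 38899 (the same protocol libraries like pywizlight use). A minimal sketch, with a hypothetical bulb IP:

    import json, socket

    def set_pilot(ip, **params):
        """Send a WiZ 'setPilot' command and return the bulb's JSON reply."""
        msg = json.dumps({"method": "setPilot", "params": params}).encode()
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.settimeout(2.0)
        s.sendto(msg, (ip, 38899))
        reply, _ = s.recvfrom(1024)
        return json.loads(reply)

    # Warm white at 40% brightness (the IP is hypothetical):
    print(set_pilot("192.168.1.50", temp=2700, dimming=40))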
There are YouTube channels dedicated to repairing non-functional LED bulbs. In almost every case the issue is that one of n LEDs has failed, and if you solder a bypass, the remaining LEDs work fine. After that, the only real problem is that all the adhesives used in the construction of the bulb more or less require that you destroy the bulb in order to get to the point where you can repair or bypass the one LED.
As a general answer, dimming. Incandescent bulbs are fantastically sensitive to applied voltage; Wikipedia's article on the subject (https://en.wikipedia.org/wiki/Lamp_rerating) notes that bulb lifespan is inversely proportional to the applied voltage to the fourteenth power or so.
In an ordinary home you can't directly reduce the supply voltage, but dimming a higher-rated bulb will get you somewhere in the ballpark through a reduction in the duty cycle.
However, this comes at the expense of luminous efficiency. Reducing the applied electrical power reduces the filament temperature, and the black-body spectrum of a lower-temperature filament has proportionally more output in the infrared region.
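Plugging in the commonly quoted rerating exponents (the ~14th-power life figure comes from the article cited above; the light and power exponents are ballpark values, treat them as assumptions):

    # Incandescent rerating: run a bulb at a fraction of its rated voltage.
    def rerate(v_ratio):
        life = v_ratio ** -14     # lifespan multiplier (per the cited page)
        light = v_ratio ** 3.4    # luminous output multiplier (ballpark)
        power = v_ratio ** 1.6    # power draw multiplier (ballpark)
        return life, light, power

    life, light, power = rerate(0.95)  # 95% of rated voltage
    print(f"life x{life:.1f}, light x{light:.2f}, power x{power:.2f}")
    # -> roughly 2x the life for ~16% less light

(Under the same exponent, a 130 V-rated bulb run on 120 V mains lasts roughly three times as long, at the cost of dimmer, warmer light.)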
That page is also why the popular depiction of the Phoebus cartel (as in: they intentionally made 1000 h bulbs but could have made 3000 h bulbs that were otherwise identical, like e.g. Veritasium's popular video) is wrong. For classic incandescent bulbs, color temperature and lifetime limit each other. It is not possible to make a 2500 K incandescent bulb that lasts longer than about 1000 hours, and a 3000-hour bulb will always have a dim and warm output. Because that's how it gets to 3000 hours.
I wanted to upgrade the super faint positional lights in my two garage openers, and I need to stay <= 10W, so I tried some LEDs. But they kill the 433 MHz remote signal, sadly. Tried 3 different brands, a couple of which don't actually fully turn off, or give off a loud hum to boot.
The openers use rear car light bulbs, for some reason (BA15s).
How much of this is driven by the actual cost of properly provisioning emitters and fielding a good power supply vs the inability of consumers to hold manufacturers accountable?
Does the average person remember the brand of light bulb they purchased at Walmart or the hardware store? I would hope the buyers at stores would have the sense to stock half-decent brands instead of the utter trash available through 3rd-party sellers online. Not much hope, though.
> - It is quite difficult to even buy high quality LEDs as a mere mortal
I'm going through this again now. At one point I found Philips EyeComfort bulbs on Amazon which checked all my boxes (2700-3000K, 60W, dimmable, almost non-existent flicker). I've had a couple of bulbs die on me now, and I cannot for the life of me find replacements; it's like they stopped manufacturing them. I have no clue what to replace them with now.
> The same light quality is vastly more expensive to achieve with LEDs, even if you account for high electricity prices. Good indoor lighting is now something only people with plenty of disposable income can afford.
Where? How? I can no longer buy quality LED lighting at any price. I have a bunch of Sylvania Ultra Sunset Effects bulbs purchased ~15 years ago that nothing since even comes close to.
At the limits of their ratings. They could make LED bulbs last many orders of magnitude longer and be more efficient, but they don't (unless forced to[1]) because they prefer planned obsolescence.
Or because, as with many products and services, many people go into Home Depot or wherever and buy whatever is cheapest--especially in a world where higher price does not necessarily equate to higher quality or longer life.
> especially in a world where higher price does not necessarily equate to higher quality or longer life.
There's the kicker; how can you tell when something is better quality anymore? Qualifiers like "is this device run at max capacity or is there leeway" are never listed on packaging or product features.
It's often hard to know, especially for items you're not going to individually research in great depth, whether you're actually paying for quality or for a name on the package even though it actually came off the same assembly line in China as any number of knock-offs. And, even if it is higher quality by some standard, does that really affect consumer outcomes?
I do not, nor will I ever, excuse penny-pinching by companies by agreeing that they’re forced to do it because people will always buy the cheapest thing they can. It’s trotted out as the lame excuse for bag check fees and other declining flying services, cheap consumer goods, cheap electronics, you name it.
To accept the premise is to believe that anything made of quality will never get bought or used, which is manifestly not the case. And it strangely ignores the incentives companies have to make things as shitty as possible, namely lower expenses and planned obsolescence.
>To accept the premise is to believe that anything made of quality will never get bought/used which is manifestly not the case.
I disagree. It is a ratio of quality to price. People have different opinions about what the acceptable minimum ratio is, and it varies by product, and by time. For example, many people find Costco to hit the right ratio most of the time.
For example, I have been using LEDs and dimmable LEDs from soft white (~2700K) to cool white (~4000K) with no problem, all purchased at Home Depot/Lowes/Costco. Some have failed earlier than anticipated, but nowhere near enough to cancel out the cost savings.
And it's a matter of individual consumer priorities.
Some consumers will happily pay for business class seating on planes. Others will generally overlook inconvenience and less comfort if they can save $50.
Yes, another example is clothing. I have no interest in buying high quality clothing that I have to spend time taking care of. I want whatever lasts longest, while still being able to throw in the washer and dryer on default settings without having to separate colors.
The problem is evaluating quality before purchase. There's not a great way of expressing the sorts of factors that differentiate between good and bad LED bulbs that consumers can easily understand, let alone anything to encourage different manufacturers to use the same measurements. If the consumer can't tell what is quality, what's to get them to spend the money for it?
More recently Philips has started selling the Dubai-style bulbs worldwide, branded as the "Ultra Efficient" range. They're expensive though, as you'd expect.
I believe BigClive did a video on those, and while they did indeed use more LEDs for improved efficiency, they also had a more complex and thus failure-prone driver than the original Dubai ones.
In other words, more efficient, but not longer lifespan.
Are you sure those are the same bulbs? The ultra efficient versions sold on Amazon have reviews going back to 2017. Supposedly the Dubai-style bulbs were only available in Dubai even a few years ago.
I've been installing Kauf brand smart-bulbs, which come pre-flashed with ESPHome for integration with Home Assistant. Some of my earlier bulbs failed, and I recently noticed that the founder of the company commented on the issue and said he specced a more robust capacitor after early failures: https://github.com/KaufHA/kauf-rgbww-bulbs/issues/31#issueco...
I haven't contacted them for replacements yet, but seeing their comment makes me much more likely to purchase them in the future, despite my early issues.
Thank you for making me aware those exist. I have a mix of Hue & cheap Walmart color WiFi bulbs. The Hue bulbs are undoubtedly much higher quality (in both output & reliability), but you pay for it. The Walmart ones are 1/5 the cost, but very hit & miss on whether you can easily flash Tasmota/ESPHome, and there is no way of knowing until you try, because newer firmwares are shipped in the same packaging.
Placed under operating conditions very close to their specified limits. It's like driving your car on the freeway in 2nd gear at 6000 rpm: the engine would wear out much quicker than if you drove in 5th gear at 1500 rpm.
100 years of training make most people think of light bulbs as a trivial purchase. And now a product that cost $0.50 twenty years ago is $10, and often performs worse for its purpose.
So the economics just drive cost down no matter what. And even a picky consumer is hard-pressed to get what he wants in the bulb aisle at Lowe's. They literally went from 10 SKUs to 250, with no meaningful standards.
This is my biggest barrier to finding decent bulbs. The search engines on sites like homedepot.com offer very little help, especially since they always show promoted items higher in the results, even if they don't match any keywords I put in my search. Then, if I do find what looks like the right thing, they're invariably out of stock everywhere.
I agree on most points, but dim-to-warm is pretty undesirable from my point of view. I'd like to be able to set the color temperature independently of brightness, which I can do with my Zigbee lights.
I have every single light bulb in my house (and outside my house) as an LED RGB Alexa-addressable light, and I love it.
"Set all the lights to red" and every single bulb in my house and porch and walkway and garage etc, all turn red.
"Turn on/off all the light"
Set kitchen to firebrick...
Etc.
I LOVE IT.
During the day I rarely have any lights on at all - but at night I have precise control over every bulb in my house with alexa voice.
I initially would never have put Alexa in my home, but now that I have it and all bulbs on it, as well as several Alexa-fied power outlets, it's just a very nice thing to have.
I'm not too concerned over "lighting quality", as I get exactly what I want.
The bulbs I bought were from Costco, where they had them on sale at $5 for a 2-pack, so I replaced all CFLs with RGB WiFi LEDs with Alexa, and it was ~$70 to do the whole house (27 bulbs).
EDIT: Dimmability: "Alexa, set kitchen to 10%" --> I can dim or brighten all the lights at once, "Alexa, set house to 100%", etc.
Add to that the spiky spectrum. Incandescent bulbs give black-body radiation, a continuous spectrum. Regular LED lights spike in RGB to achieve a neutral color. This can cause metamerism in photos and just looks bad, IMO.
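To see the difference, compare a smooth 2700 K black-body curve to a crude three-peak "RGB" model (the Gaussian centers and widths below are invented for illustration):

    import math

    def planck(wl_nm, T=2700):
        """Spectral radiance of a black body, arbitrary units."""
        wl = wl_nm * 1e-9
        h, c, k = 6.626e-34, 3.0e8, 1.381e-23
        return (2 * h * c**2 / wl**5) / (math.exp(h * c / (wl * k * T)) - 1)

    def rgb_led(wl_nm):
        """Three narrow Gaussian peaks standing in for an RGB LED spectrum."""
        peaks = [(450, 20), (530, 30), (630, 25)]  # center nm, width nm (invented)
        return sum(math.exp(-((wl_nm - c) / w) ** 2) for c, w in peaks)

    for wl in range(400, 701, 50):
        print(wl, f"{planck(wl) / planck(600):.2f}", f"{rgb_led(wl):.2f}")

The black-body column varies smoothly across the band, while the RGB column dips sharply between the peaks; objects whose reflectance falls in those valleys are the ones that shift color under LED light.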
A bunch of this is driven by power-efficiency requirements, creating a flickering, low-quality mess. No-flicker LED lights usually have a worse power factor, for example.
A bulb gets a burst of power 120 times a second, so it would only use about 0.075 joules per cycle. Half the time is spent above 110 V and half under 110 V, so we need to store less than 1/240th of a second's worth of power to get a perfect output and a perfect power factor.
Let's put a capacitor before the regulator to store that power, and design the regulator to compensate for how the voltage varies over each cycle. Since we don't want to drain the capacitor entirely, let's spec it for 0.05 joules at 100 V, which means 10 µF.
Digikey says a 10 µF, 200 V capacitor costs ten cents.
If there's flicker, I blame the voltage regulator (or lack thereof), not the requirement of power efficiency.
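Checking that arithmetic (the 0.075 J per cycle implies a ~9 W bulb, which is assumed below):

    # Hold-up capacitor sizing for a ~9 W bulb on rectified 60 Hz mains.
    P = 9.0                     # bulb power, W (implied by 0.075 J at 120 Hz)
    E_cycle = P / 120           # 0.075 J drawn per half-cycle
    E_holdup = P / 240          # ~0.0375 J to ride through the low half-cycle

    V = 100.0                   # usable capacitor voltage
    E_cap = 0.05                # specced storage, J
    C = 2 * E_cap / V**2        # from E = C * V^2 / 2
    print(f"{C * 1e6:.0f} uF")  # -> 10 uF, matching the figure above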
Everybody already knows these things. Like, why do Democrats (not politicians, I mean you people) feel a need to enact new regulations every time they get a chance?
A misspelling of "SSI", Spectral Similarity Index, another color accuracy metric.
Basically, the industry figured out how to win at the CRI game without actually recreating the same underlying spectral distribution of light. So they came up with another metric to optimize, called SSI (also TLCI, etc.). SSI is mostly relevant in the digital cinema space, where the observer is a digital camera, not a human eye; cameras can't be tricked the same way because they have different underlying RGB spectral sensitivities.