I have a theory that cheap LED lights with low quality drivers are bad for dogs. LEDs with low quality drivers very often have a high amplitude flicker at 50/60Hz, which is about the flicker fusion rate for humans so we don't perceive it (at least usually), but dogs are known to have a substantially higher flicker fusion rate and probably perceive the flicker. Probably worth considering, especially if you have a dog with epilepsy.
(Incandescents also flicker at 50/60Hz of course, but the thermal inertia of the filament makes this a lower amplitude flicker.)
They also flicker really badly if your power is not perfect, like if you have a decent sized training rig on a different circuit.
Incandescents are basically little light inductors, and I would imagine the luminosity curve would be sinusoidal vs whatever hell an LED driver chip puts out.
I have a theory that they're bad for humans too, and I think the same is true for computer screens. Something about the flickering and the "unnatural" color spectrum messes with people's heads (anecdotal/subjective). Maybe our brains do extra processing work that detracts from other systems.
The newer phosphor-coated "filament" LEDs also have this "inertia", which is nice. And there are no other electronics to fail, since they're just series-connected LEDs.
The phosphors do have some inertia, but I'm not sure if it's really enough, since fluorescents have phosphor coatings too and those are notorious for noticeable flicker. Different kinds of coatings though, different thickness and quantities of phosphors, maybe different composition too. I don't know how comparable they really are.
Besides the epilepsy, what do you mean by "bad for dogs"?
Sidenote: my partner and I both "feel" the difference with cheap LEDs. It's not something we can pin down, but it got way better with Hue lights.
We live with multiple dogs and I am really curious. One of the dogs we often had around had epilepsy so severe that very strong medication was needed, and she still died far too early, around the age of 2. She had fewer seizures with us than in her original home, which of course might be unrelated to the lighting. But your thought is interesting.
When you notice flicker, your iris is trying to expand and contract at that rate to compensate. It was one of the things people pointed fingers at for sick building syndrome, back when fluorescent tube lighting was popular. People, and I assume dogs, have different thresholds for when this problem kicks in and flicker becomes noticeable.
Sibling comments have no clue what they're talking about. I actually engineer these things.
The short answer is cost and heat. What you're describing is a linear constant-current driver, which is the gold standard for flicker-free operation. The drawback is a much more complex circuit and a transistor burning off a ton of waste heat.
A constant-current driver requires sensing the current through the LED, which involves sensitive analog circuitry. A PWM drive requires essentially nothing more than a FET tied directly to your microcontroller.
There are switch-mode constant current drivers, best of both worlds. Essentially a buck-boost converter in current mode instead of voltage mode. These are slightly more expensive, so they don't appear in consumer lighting products.
All that aside, the reality is that if your PWM frequency is high enough, it doesn't matter. Above several kHz it's imperceptible. The reason this still isn't done universally is that it's a third of a cent more expensive to use a controller that can switch above 1 kHz. All hail the glorious race to the capitalistic bottom.
At work I just finished up a constant-current driver circuit. At home I'm building a custom lighting system with bespoke driver circuits. Despite having a CC driver I can pull off the shelf, I still chose 10 kHz PWM. It's easier and more efficient, and neither you nor I could tell the difference in the quality of the output light.
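(For anyone curious about the arithmetic behind "fast enough PWM": a rough, illustrative sketch below. The 10 kHz figure is from the comment above; the duty cycle and the ~100 Hz sensitivity figure are assumptions for illustration, not anything from the parent's actual design.)

    # Rough arithmetic for "fast enough PWM" dimming.
    pwm_freq_hz = 10_000      # switching frequency mentioned above
    duty_cycle = 0.40         # 40% brightness, arbitrary example

    period_us = 1e6 / pwm_freq_hz
    on_time_us = duty_cycle * period_us

    print(f"PWM period: {period_us:.0f} us")
    print(f"LED on for {on_time_us:.0f} us of every cycle")

    # Human flicker sensitivity falls off steeply above roughly 100 Hz,
    # so a 10 kHz square wave is ~100x past the point where it matters.
    print(f"Margin over a ~100 Hz sensitivity region: {pwm_freq_hz / 100:.0f}x")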
Dimming. And even without that, the "continuous DC" is usually provided by switch-mode power supplies that themselves have a PWM ripple. How much depends on the quality of the supply.
Since rectified AC is pulsating, you need to store some energy to get continuous DC, usually in a smoothing capacitor. That capacitor is relatively big, and a durable one is not cheap. It also brings further complications (like limiting inrush current).
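A back-of-envelope for why that capacitor ends up big and pricey; all numbers are illustrative assumptions, not from any particular bulb:

    # Crude smoothing-capacitor sizing: C = I * dt / dV
    mains_freq_hz = 50       # EU mains
    bulb_power_w = 10        # roughly an "800 lm" LED bulb
    bus_voltage_v = 325      # ~peak of rectified 230 V AC
    allowed_ripple_v = 30    # ~10% ripple target, arbitrary

    # Full-wave rectification gives ripple at twice the mains frequency,
    # so the capacitor carries the load for about half a mains cycle.
    hold_time_s = 1 / (2 * mains_freq_hz)
    load_current_a = bulb_power_w / bus_voltage_v

    cap_uf = load_current_a * hold_time_s / allowed_ripple_v * 1e6
    print(f"Load current: {load_current_a * 1000:.0f} mA")
    print(f"Required capacitance: {cap_uf:.0f} uF (at a ~400 V rating)")

A ~10 uF, 400 V electrolytic is one of the physically largest and shortest-lived parts in a cheap bulb, which is presumably why it's a popular place to cut corners.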
You know, it's funny - there's a couple of places where this kind of thing seems to have come up. There's research around crop nutrition levels that shows decreases in nutrient levels as yield per acre goes up, there's research on supplements and vitamins that shows synergistic effects between seemingly unrelated substances, and we've seen surprising effects from adding or removing species from an ecosystem.

One begins to suspect that the "reductive" method of science - the sort of physics- or mathematics-type "reduce the variables of the problem until we can isolate effects" approach - isn't particularly well-suited to dealing with biological systems. You see it in bioinformatics as well - we've sequenced the genomes of many organisms, and have learned a lot from doing so, but we're also learning the limits of that approach pretty strongly - the organism isn't defined just by its 'code', but by its environment; the presence, distribution, and concentration of various chemicals; etc.

I suspect as we move more toward the "biological" century here we're going to have to readjust how we approach things, to start trying to find those synergistic effects earlier in the process, rather than pulling everything down to its constituent parts and then experimenting pairwise with various combos. I get the difficulties in doing that, but I feel like the stuff we've discarded as irrelevant to the problem ("things that are not in the visible wavelengths the eye perceives") has repeatedly wound up being relevant (wider full-spectrum light has effects outside the mere spatial perception of objects).
You are basically hitting on what has been referred to as high modernism, which promotes a level of confidence in science and technology that can only be maintained by eschewing all the inherent complexity of the world. The scientific method can really only study systems by modifying a handful of variables at a time and keeping the rest fixed, and isn't really capable of handling hundreds of interacting variables. Rather than acknowledge this limitation, high modernism embraces simplification even to the detriment of its products.
> The scientific method can really only study systems by modifying a handful of variables at a time and keeping the rest fixed
Not true. Statistical measures of large systems are a routine thing in the natural sciences. However that's higher effort and tends to make it more difficult to communicate the results to others so it's avoided whenever possible.
Also high dimensional models carry a distinct risk of overfitting.
Could it be the pendulum swinging hard before settling in the middle?
Once we learn from our mistakes we can find the frequencies that do yield the best outcome and at the same time consume (say) ¼ the energy of an incandescent bulb.
Or the minimum set of species yielding optimal outcomes, without the answer being "all of them"
Sounds like a holistic approach will be the next big swing of scientific endeavor.
That was the original point to begin with, right? Learn what makes things tick and then build on that. We've got enough of it down to the atomic level that maybe we should zoom back to the supermolecular.
There is a 15-30% difference between the groups at baseline (fig 8c-9c, 8d-9d), about the same magnitude as the claimed effect of the experimental condition.
I think the result would be much stronger if these baselines were comparable, showing that they have accounted for other variables like time of day and light history. I am also skeptical of any effect in the retina lasting 6 weeks with no fading.
Consider that people are often exposed to much more infrared light outdoors, so "worked under a relatively dim incandescent lamp" is not a particularly novel stimulus. Imagine that any of these people spent time outdoors during the six weeks - thousands of times more infrared light there.
The push toward LED seems to be primarily for emission target related reasons. It is very hard to buy incandescent bulbs in the UK; even for those of us that accept the cost implications. Also, many less expensive LEDs flicker at twice the mains frequency (i.e. 100 or 120 Hz). This is very annoying, and it comes down to the near-instantaneous response of an LED versus the thermal averaging of an actual glowing hot filament. It is interesting to read about the development of blue and white LED technology.
In the EU this was indeed done for energy efficiency/emissions reasons. Incandescent bulbs were gradually banned from normal sale, starting with the most energy-hungry (diffused 100W) and gradually expanding until only low-wattage and special-purpose bulbs were left. Special-purpose bulbs cover a large variety of uses where switching didn't make sense, like machine shops or historic buildings. LEDs aren't mandated per se, but they are the most attractive alternative. And because this all happened before Brexit, the UK has the same rules, unless they revised any of them post-Brexit.
For the most part this was a very positive step. Prices for LED bulbs plunged when they went from the "premium" energy-efficient alternative to the default option. But you also get a lot of crap on the market, and stuffing LEDs in form factors designed for incandescent bulbs makes good electrical and thermal design challenging. Even for those brands that actually try
> LEDs aren't mandated per se, but they are the most attractive alternative.
Yeah, basically what the EU did was to say: for every watt of electricity consumed, at least a certain number of lumens of light has to be produced. And this threshold was gradually raised. Since old-school light bulbs are quite inefficient at producing light, they slowly had to be phased out.
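Roughly how that plays out with typical efficacy figures (ballpark numbers for illustration, not the actual Ecodesign thresholds):

    # Watts needed for a "60 W equivalent" ~800 lm bulb, using rough
    # typical luminous efficacies (lm/W); none of these are regulatory values.
    target_lumens = 800
    efficacy_lm_per_w = {
        "incandescent": 15,
        "halogen": 20,
        "CFL": 60,
        "LED": 100,
    }
    for tech, lm_per_w in efficacy_lm_per_w.items():
        print(f"{tech:>12}: {target_lumens / lm_per_w:5.1f} W for {target_lumens} lm")
    # Any minimum-efficacy rule much above ~20 lm/W effectively bans filament bulbs.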
> The push toward LED seems to be primarily for emission target related reasons
Is this true? I’ve got LEDs in my house because they cost vastly less to run, and because I rarely have to replace the bulbs.
Some cheap LEDs do flicker (at 50 or 60 Hz). But that’s fairly easily solved. I don’t think I’ve noticed the flicker since some cheap bulbs I bought in 2014 or so.
Well… (Sorry, let me put my tinfoil hat on.) Yeah, that "noticed" part is what is worrisome to me. I do worry that there is some effect on our brains even though we might not perceive the flicker.
As an analogy, I got into those supposedly audiophile "Class D" (or "Class T" [1]) amplifiers over a decade ago. Every day I turned on the music in my office and coded with the T-amp playing. I would have told you at the time that, indeed, it sounded amazing.
Some time later I built a tube amplifier (The Darling [2], in case anyone cares—I've since built perhaps a dozen more).
When I brought it into the office and swapped it out for the T-amp, the change was subtle but immediately noticeable. I hate to fall back on audiophile terminology, but it's the best I have for the experience: I was suddenly aware of "listening fatigue" that had been a component of the T-amp. I hadn't even known it had been fatiguing until I heard the tube amp in its place for days on end.
With the loss of color fidelity and the flickering issue, I'm embarrassed to say that incandescent is starting to look good to me again.
I might, as an experiment, replace only those lights that we turn on in the evening when we are relaxing, reading.
>Is this true? I’ve got LEDs in my house because they cost vastly less to run, and because I rarely have to replace the bulbs.
At least in the EU it's true. Citing from Wikipedia: "The 2005 Ecodesign directive covered energy-using products (EuP), which use, generate, transfer or measure energy, including consumer goods such as boilers, water heaters, computers, televisions, and industrial products such as transformers. The implementing measures focus on those products which have a high potential for reducing greenhouse gas emissions at low cost, through reduced energy demand."
I have not found that LED bulbs last noticeably longer than incandescents. I'm still replacing bulbs, and though I don't keep records it feels about the same.
LEDs are just terrible in every way except electrical consumption.
It does seem like an easy win for governments trying to conform to emissions targets.
I buy the ones that are suitable for dimmer switches (even though I don't have dimmers) because there is discernible flicker with most other LED bulbs if you, e.g., wave your arm through the air or make a saccade. There is a certification (I think) for LED bulbs that are closer to sunlight in their emission spectrum.
LED bulbs, even though cheaper in the long term, used to have high enough shelf prices that most households wouldn't have switched without a government push. Incandescents are literally banned now for most uses, while economies of scale have helped drive LED prices down.
It costs less to run because less energy is used; I'm pretty sure incandescent bulbs aren't emitting anything by themselves, the emissions come from generating the electricity.
"The push" is from the government, perhaps consumer demand is "the pull".
The flickering is solely a result of cost cutting in the power supplies of these LED lights. The problem is totally solvable with a constant current switching power supply. But the filtering circuitry adds cost.
The problem is that consumers usually cannot know this (or much else) about a particular light at the point of purchase, so even if you are willing to pay a premium for it, you cannot.
I would pay a premium for longer life, and at least in some cases (e.g. lights I read by) for better quality. How do I do so? I would love to be pointed at sources of better ones (in the UK).
> The push toward LED seems to be primarily for emission target related reasons. It is very hard to buy incandescent bulbs in the UK; even for those of us that accept the cost implications.
Can you even buy them without buying new old stock? In the US they're banned and there's zero production.
I recall there was a guy in the EU who tried to get around the regulations by selling "heat bulbs" that were exactly the same as traditional incandescent bulbs but marketed as a heat source, but I think he was slapped down.
At least in Germany you can still fairly easily get 20W incandescent lamps. Sold as lamps for fridges and ovens, but they are available with standard sockets.
If you look around a bit you can also get 60W or 100W lamps, sold as "industrial lamps" or "extreme temperature lamps", labeled as unsuitable for household use. But those are specialty lamps that you won't find in your local supermarket. Not sure if those are new old stock or imported
Certain size/watt combos are still available for things like appliances and nightlights, but I think that includes 20W E26/A-something bulbs, and the bulbs for plug-in night lights. I can still find them on the Home Depot site and some other places. No idea about quality, but I still prefer how they look. There are so many other horrible energy-efficiency problems with heating my home that the inefficiency of a few incandescents in key places doesn't bother me compared to the enjoyment I get from the nice light.
If I were able to see the flicker of mains supplied LED lighting (which I cannot), then I would be very tempted to install low-voltage DC LED lighting, which presumably does not flicker.
An AC/DC power converter works the same, either built into the bulb or in a separate unit. But yes, a separate power converter is almost certainly going to do a much better job of removing the 50/60Hz voltage drop. Not sure if it would be cheaper, given the economies of scale on AC bulb manufacture. Higher quality AC bulbs may come out ahead for flicker free lighting.
It only doesn't flicker if there's no power driving circuitry - eg just LEDs and a resistor.
Otherwise, if there is a power IC present, there is flicker, though fast enough for most humans to not perceive normally (you can still check it by waving your hand in front of the light and seeing the strobed afterimage.)
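Rough numbers behind that hand-waving test (the hand speed and frequencies are illustrative assumptions):

    # Afterimage spacing = hand speed / flicker frequency
    hand_speed_m_s = 2.0   # a brisk wave
    for flicker_hz in (100, 120, 1_000, 10_000):
        spacing_mm = hand_speed_m_s / flicker_hz * 1000
        print(f"{flicker_hz:>6} Hz -> bands every {spacing_mm:.1f} mm")
    # 100-120 Hz gives bands ~17-20 mm apart, easy to see as a strobed
    # afterimage; 10 kHz gives ~0.2 mm, which just blurs together.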
Very interesting. I've always thought that there was something a bit "off" about LED torches and car headlamps; the brightness is there, but something about the light just doesn't seem to illuminate as well as an old dim incandescent or even fluorescent tube.
It's usually the Color Rendering Index (a measure of how well the spectrum the light puts out renders colors). Incandescent bulbs more or less mimic the Sun's spectrum; they are "black body radiators". Cheap LEDs tend to be missing a lot of the red end of the spectrum.
However, you can get LEDs that do this well. Look for one with a CRI of 95 or higher.
There's a massive difference between the 2600K of regular incandescent bulbs and the 6000K of sunlight. That's why Hollywood used HMIs until they migrated to LED.
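For a sense of how different those two temperatures are, Wien's displacement law gives the peak wavelength of a black body (only the filament/sunlight side of the comparison behaves like a black body; an LED doesn't):

    # Wien's displacement law: peak wavelength = b / T
    WIEN_B_UM_K = 2898  # Wien's constant, um*K
    for temp_k in (2600, 6000):
        peak_nm = WIEN_B_UM_K / temp_k * 1000
        print(f"{temp_k} K black body peaks at ~{peak_nm:.0f} nm")
    # ~1100 nm (near infrared) vs ~480 nm (blue-green): a tungsten filament
    # puts most of its output outside the visible range entirely.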
They're saying that the visual performance is indirectly affected by invisible wavelengths somehow. Not that you can see the difference between two types.
They are saying that, and most real world LED lighting uses very cheap diodes, like, 99.9999% of them, which create very poor colour compared with incandescent bulbs, which create perfect colour representation.
It's a big thing, and you can buy LEDs which produce a better colour range, but they're much more expensive and not as energy efficient, because creating bold reds costs real energy and no diode trick will ever get around that.
I get that they're more efficient in some sense, but man, the LED streetlights and other big lamps are so irritating and make things look like such ass compared to mercury vapor or even sodium lights.
True. Yet, somehow more and more cities install them blindly because efficiency. I remember when I moved to Odense, Denmark in 2013 - they had LED street lights all over the place. I thought: this is the future, compared to my underdeveloped post-Soviet Latvia. And yet I remember when I moved back, the streets at night looked so yellow because the city still relied on sodium lights, and my eyes felt much more comfortable. At the time I wrote it off to nostalgia or something, and here we are.
Yes lots of them use cheap LEDs with poor CRI, high color temperature, and a huge blue spike in the spectrum. All of that leads to a very bright looking light that also doesn't let you see detail very well.
Just to note for anybody who comes here directly: the article has no relation at all to perceived illumination, color fidelity, or anything else people usually complain about with LEDs.
It's an interesting niche topic that you may want your workplace to take note of if you work indoors.
I’ve always been mildly bothered by the LED lighting in my home, as if it’s simultaneously bright but not illuminating. In simple consumer terms, if I wanted to shop for a variant that more closely replicated incandescent lighting, what exactly am I looking for on the packaging? Or does this not exist?
It’s called SSI, spectral similarity index. SSI is specified for a color temperature, eg 3200 or 5600. 100 is identical to tungsten or sunlight. Values above 85 are good.
In the UK I've not been able to find high wattage (10-20W) LED lightbulbs with high CRI, some don't even mention it in listings, let alone SSI, which I have never seen.
Where are you seeing these? Is this industrial/commercial suppliers?
I buy the "warm" light LEDs, which look (to my eye) closer to incandescents.
Standard LED bulbs are bright white, almost bluish, and yes, "bright but not illuminating" describes them well. I feel many modern car headlights have the same issue.
The human eye doesn't focus the blue end of the spectrum very well.
There is no such thing as a “standard LED lamp”. LED lamps come in a huge variety of shapes, various bases, power usage/lumen output, color rendering index, and color temperature.
Lots of companies sell cheap crappy A19 E26 base 5000K lamps, that doesn’t make them the ‘standard’.
Nothing on the box really means anything; plenty of bulbs claim high CRI and everything but in reality have a terrible spectrum. So you can only go off actual real-life testing from a third party.
No mention of CRI which seems kind of odd. LEDs for lighting are increasingly graded by how natural their emission spectrum is. Older lights are quite bad, newer ones sacrifice a tiny bit of performance for more uniform spectrum.
They use Rf numbers, which is a newer standard, so that's probably good.
However, the experimental group (extra light sources) got Rf 91 bulbs, and the control ("LED lighting") got Rf 85 bulbs.
The two scales are not exactly comparable, but they both max out at 100. The only source I could find that discusses both says that CRI > 90 is "excellent" and just below that is "very good". It says Rf > 85 is "very good", which tells me it's comparable to a mid-80s CRI bulb.
If I accidentally buy a mid-80 CRI bulb, I either return it to the store, or just throw it away.
So, I'd say this study's experimental setup doesn't support any useful conclusions. They showed that so-painfully-bad-California-won't-subsidize-them LEDs are worse than passable LEDs with supplementation from another light source.
The passable LEDs in the study are probably comparable to the cheap ones at our local hardware store, but worse than the ones that cost $10-20 on Amazon ten years ago.
This would have been much more interesting if they'd compared high-end LEDs with and without supplementation, and found a difference. (And by "high-end", I mean "still much cheaper than the electricity they save".)
I think CRI is not important here, as that's a measure in the visual spectrum. The paper talks about all the missing wavelengths outside of the visual spectrum.
CRI is a pretty bad rating system. They are showing the full spectrum graphs which is what you'd want anyway. Spectral Similarity Index (SSI) is the better number
Sure, but I don't see them mention what they're actually using for LEDs at all. They mention a "colour fidelity index" but I'd expect a manufacturer part number or something so I can pull the datasheet.
Funny enough, the best evidence for this study is that they should probably move somewhere with more sunlight if they can't spell "color" right... /s
What is the relationship between CRI and how broad (or narrow) the spectrum output by the LED is? Is CRI automatically better for broader-spectrum LEDs? Or is that too simplistic?
Slightly too simplistic, because a broader-spectrum LED could be broad but spiky in its output, resulting in light that is broad-spectrum but has a bad CRI (because it's e.g. really blue).
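A toy illustration of "broad but spiky" (this is not how CRI or SSI are actually computed; the spectra and the RMS metric are made up purely to show that covering the range isn't the same as covering it evenly):

    import numpy as np

    wavelengths = np.linspace(400, 700, 301)
    reference = np.ones_like(wavelengths)  # idealised flat spectrum

    smooth = 0.8 + 0.2 * np.cos((wavelengths - 550) / 150 * np.pi)  # gentle hump
    spiky = 0.3 + 0.7 * np.sin((wavelengths - 400) / 15) ** 2       # broad but peaky

    for name, spd in [("smooth", smooth), ("spiky", spiky)]:
        spd = spd / spd.mean()  # normalise total output
        rms_dev = np.sqrt(np.mean((spd - reference) ** 2))
        print(f"{name:>6}: RMS deviation from flat reference = {rms_dev:.2f}")
    # Both cover 400-700 nm, but the spiky one deviates far more from the
    # reference, loosely analogous to why it renders colours worse.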
Psychological or physiological unease at least; I assume this from the way rapidly flickering, dying fluorescent lights make me feel.
But also note that not all LED lights flicker.
Nope. Lots of people see the strobing. It causes headaches if you focus on the lights.
Further reading: https://en.wikipedia.org/wiki/Seeing_Like_a_State
https://news.ycombinator.com/item?id=46764382
[1] https://en.wikipedia.org/wiki/Class-T_amplifier
[2] https://www.diyaudio.com/community/threads/darling-1626-amp.... and https://imgur.com/gallery/oh-darling-tube-amplifier-Lq2Sx
It's the same thing. If it uses less electricity, it both reduces the cost to you and reduces emissions from generating the electricity.
I think most people would have switched over gradually anyway, but effectively banning incandescents sped it up.
https://en.wikipedia.org/wiki/Ecodesign_Directive
https://www.thesmarthomehookup.com/25-soft-white-led-light-b...
a) How do Philips Hue bulbs stack up?
b) Did Philips update them generationally and assuming they are decent now, how recently?