Why does every small appliance or useful home electronics item have the BRIGHTEST LEDs in it?
I bought a new fan for our bedroom on Sunday. It has 4 speed settings and LEDs to display which setting you're on.
Just like every other electrical device in our bedroom, I had to cover the LEDs with electrical tape because they are TOO DAMN BRIGHT. That one indicator alone was more than bright enough for me to see in the room with all the lights off.
I can't sleep well if there's a lot of light like that, especially blue light, and it's like every fucking electronics manufacturer used the same extra-bright blue LEDs.
All of our power strips have them. Same brightness.
The fans have them.
Don't even get me started on digital clocks and the plague of bright LEDs they bring with them.
Many charging plugs have them built into the plug itself.
Even some fucking light switches have them now!
I have about 6 different things in our bedroom that have electrical tape over their completely unnecessary LEDs.
Why has this become such a common thing? Is this really something most people want? To have a room that is never actually dark even with the lights turned off?
I design electronics sometimes. Generally, people want an indicator light on their product, since it’s a cheap way to show the state of a system.
The main problem is that the human eye adapts to darkness. You can still clearly see an LED in a dark room with only a few microamperes through it, but at that current it's invisible in a brighter room. There's no single current that produces light bright enough under room lighting yet not too bright in the dark.
I can fix that by occasionally turning off the LED and measuring the voltage across it (LEDs detect light in addition to emitting it), then dimming it if the room is dark. However, this is quite complicated to do: it requires a capable microcontroller and a pretty ninja embedded systems programmer. Most product developers I know won't think to do this.
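If you're curious what that looks like, here's a rough sketch. To be clear, this is illustrative only: I'm assuming an ATmega328P with the LED's anode on PC0/ADC0 and its cathode to ground, and the pin choice and the "100" threshold are made up for the example.

```c
/* Illustrative only: ATmega328P, LED anode on PC0/ADC0, cathode to
 * ground. The threshold below is a made-up number. */
#define F_CPU 1000000UL
#include <avr/io.h>
#include <util/delay.h>

static uint16_t read_led_as_sensor(void)
{
    DDRC  &= ~(1 << PC0);            /* stop driving the pin (high-Z input) */
    PORTC &= ~(1 << PC0);            /* pull-up off, so we see the LED's own voltage */
    ADMUX  = (1 << REFS0);           /* AVcc reference, channel ADC0 */
    ADCSRA = (1 << ADEN) | (1 << ADSC) | 7; /* enable ADC, start, clk/128 */
    while (ADCSRA & (1 << ADSC)) ;   /* wait for the conversion to finish */
    return ADC;                      /* photovoltage: higher = more ambient light */
}

int main(void)
{
    for (;;) {
        uint16_t light = read_led_as_sensor();
        DDRC |= (1 << PC0);          /* pin drives the LED again */
        if (light > 100) {           /* made-up threshold for "room is lit" */
            PORTC |= (1 << PC0);     /* full brightness */
            _delay_ms(200);
        } else {
            for (uint8_t i = 0; i < 200; i++) { /* ~2% duty at ~1 kHz: looks dim */
                PORTC |= (1 << PC0);
                _delay_us(20);
                PORTC &= ~(1 << PC0);
                _delay_us(980);
            }
        }
    }
}
```

A real product would smooth the readings and tune the threshold per LED, since the photovoltage varies a lot from part to part.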
Finally, I can save 0.1 cents (plus board space and assembly complexity, which cost more) by connecting an LED directly to a microcontroller pin instead of limiting the current with a resistor. Some microcontrollers specifically allow this, up to 10 or 20 milliamperes, which is already enough to be too bright in some contexts. Margins in hardware manufacturing are extremely thin, so shaving even 1 cent off a board is pretty important.
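For scale, with made-up but typical numbers: on a 3.3 V supply, a blue LED dropping about 2.9 V and aimed at 2 mA would need R = (3.3 V − 2.9 V) / 2 mA = 200 Ω. Dropping that resistor removes one part, one placement, and a little board area from every single unit, which is exactly the kind of saving that survives a cost review.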
All of this together leads to a lot of LED proliferation, which I don't like either. The stuff I build for myself often has a way to control the LED brightness, although as a general rule this would be too expensive to add to a consumer product. In my small devices, there's a tilt switch inside that turns off the indicator LEDs if you flip the device upside down and hold it there for a few seconds. That way you can just reach over at night and fix it without fiddling with switches or controls.
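In case anyone wants to copy the idea, the logic really is about this simple. Again a sketch, not real firmware: I'm assuming an ATmega328P, a tilt switch on PD2 that closes to ground when inverted, the LED on PB0, and a ~2 second hold time, all arbitrary choices.

```c
/* Sketch of the hold-it-upside-down trick. Pins and the 2 s hold time
 * are arbitrary choices for the example. */
#define F_CPU 1000000UL
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB  |= (1 << PB0);             /* indicator LED output */
    DDRD  &= ~(1 << PD2);            /* tilt switch input... */
    PORTD |= (1 << PD2);             /* ...with the internal pull-up */

    uint8_t leds_on = 1;
    uint16_t held_ms = 0;
    for (;;) {
        uint8_t inverted = !(PIND & (1 << PD2)); /* switch closes to GND */
        held_ms = inverted ? held_ms + 10 : 0;
        if (held_ms >= 2000) {       /* held upside down ~2 s: toggle */
            leds_on = !leds_on;
            held_ms = 0;
            while (!(PIND & (1 << PD2))) ; /* wait until it's righted again */
        }
        if (leds_on) PORTB |= (1 << PB0);
        else         PORTB &= ~(1 << PB0);
        _delay_ms(10);
    }
}
```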
I love Lemmy for bringing back the old informative internet with comments like this.
A photoresistor would be handy for adjusting indicator LED brightness.
Sure – and that's an easy way to do it. However, if I'm going to make it automatic, I like the elegance of using an LED as its own sensor for how bright it should be. It also uses fewer microcontroller pins: for example, I can use pulse-width modulation to give the LED a default brightness, then during the OFF part of the cycle reconfigure the pin as an ADC input, measure the ambient light, and adjust the duty cycle as needed.
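Roughly, the loop looks like this. Still just a sketch with the same imaginary hardware as my earlier snippet (ATmega328P, LED on PC0/ADC0), and the scaling constants are arbitrary:

```c
/* Single-pin software PWM with the OFF phase doubling as a light
 * measurement. Same imaginary hardware as before; constants arbitrary. */
#define F_CPU 1000000UL
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    ADMUX  = (1 << REFS0);               /* AVcc reference, channel ADC0 */
    ADCSRA = (1 << ADEN) | 7;            /* ADC on, clk/128 */

    uint8_t duty = 50;                   /* percent of a ~10 ms period */
    for (;;) {
        DDRC  |= (1 << PC0);             /* ON phase: drive the LED */
        PORTC |= (1 << PC0);
        for (uint8_t i = 0; i < duty; i++) _delay_us(100);

        PORTC &= ~(1 << PC0);            /* OFF phase: release the pin... */
        DDRC  &= ~(1 << PC0);            /* ...and let the LED sense light */
        ADCSRA |= (1 << ADSC);
        while (ADCSRA & (1 << ADSC)) ;
        uint16_t ambient = ADC;          /* near 0 in the dark, higher when lit */

        duty = 2 + (ambient > 558 ? 93 : ambient / 6); /* maps to ~2..95 % */

        for (uint8_t i = duty; i < 100; i++) _delay_us(100);
    }
}
```

The nice part of this shape is that the measurement costs nothing extra: the OFF phase has to exist anyway for the PWM, so the sensing rides along for free.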
It's the kind of optimization I enjoy! Another neat trick is using the watchdog timer and counting CPU cycles to get really low duty cycles for lights you want to keep very dim, without using a resistor to limit current (instead you rely on the IV curve from the datasheet and a little math). I use this, plus magnets and coin cells, to make little lights I can stick to things to avoid hitting my head on them, usually doorframes (I'm very tall and live in Southeast Asia). They run for 3+ years off the cell and have configurable brightness!
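For the curious, the skeleton of one of those coin-cell lights looks something like this. A rough sketch, not the exact firmware: I'm assuming an ATtiny85 with the LED directly on PB0, and the ~0.5 s period and 50 us flash are placeholder values standing in for the datasheet math.

```c
/* Rough sketch: ATtiny85, LED directly on PB0 with no resistor - the
 * on-time is so short the average current stays tiny. Flash length and
 * wake period are placeholders. */
#define F_CPU 1000000UL
#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/sleep.h>
#include <util/delay.h>

ISR(WDT_vect)
{
    WDTCR |= (1 << WDIE);            /* stay in interrupt mode (no reset) */
}

int main(void)
{
    DDRB |= (1 << PB0);              /* LED pin */
    WDTCR = (1 << WDCE) | (1 << WDE);                /* timed sequence... */
    WDTCR = (1 << WDIE) | (1 << WDP2) | (1 << WDP0); /* ...interrupt every ~0.5 s */
    set_sleep_mode(SLEEP_MODE_PWR_DOWN);
    sei();

    for (;;) {
        PORTB |= (1 << PB0);         /* a tens-of-microseconds flash reads */
        _delay_us(50);               /* as a dim glow in a dark room */
        PORTB &= ~(1 << PB0);
        sleep_mode();                /* a few uA until the watchdog wakes us */
    }
}
```

With these placeholder numbers the average current lands in the single-digit microamp range, which is how a coin cell stretches to 3+ years.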
If the device already has a microcontroller, then I agree the "high tech" method is more appealing; for something like a desk fan, I think the analog route might be more elegant, or at least more robust.
Yeah, I know what you mean. These days, though, I can generally get a microcontroller for a lower price than a CdS photoresistor, with a 100-year expected lifetime – and it usually consumes less power too.
I could do it more easily with a phototransistor than a photoresistor. That would be a solid competitor to an MCU in terms of cost, performance, and power consumption in a simple system!
Anyway, in practice I rarely get to use analog or discrete components professionally. The MCUs are just too damn good.
Good points. I didn’t realize even using a dedicated MCU just for that would be the better option.
This struck me as super weird too. It still ‘feels’ wrong to use a whole CPU instead of a few logic gates or a 555.
It took some getting used to. Maybe soon I'll dive into the world of one-time-programmable Chinese MCUs (the ones I normally use have rewriteable flash). Those are 9 cents apiece!
If they get any cheaper I’ll start using them as ballast!
Thank you for this informed input :-)
There’s a whole amazing secret world where our devices come from! I’m glad just to have a little window in on it.