Maker Pro

Strange problem with low energy light bulb

  • Thread starter Seán O'Leathlóbhair
  • Start date
Albert Manfredi
Dave Plowman (News) said:
Err, isn't that what I wrote? It's the colour temperature that matters
rather than the source.

Well, you wrote many things, including this:

"Lighting which is used to replace daylight - like that most of us have
at home for use when daylight fades - ideally shouldn't give such a
sudden change in temperature that it is noticeable. In the same way as
lighting used to supplement daylight - like in say an office - should
also be an approximate match to that daylight. It's common sense,
really."

I do agree that if we are supplementing daylight, e.g. in work spaces
with large windows during the day, rather than providing lighting at
night, a cooler light (hotter temp) is probably preferable. But for
night time lighting, I think what we are looking for is the color of
flame.

I'm saying, it's not that we are conditioned to the color of tungsten,
it's that we are looking for something close to 2000 K at night. Much
cooler light than that (higher temp) is stark and generally unpleasant.

By the way, this also applies to xenon headlights in some cars. They are
superbly obnoxious at night, to other drivers. Even if they aren't
brighter than halogens, the bluish color is very distracting.
Fortunately, there seem to be fewer of the really annoying ones around
these days. Maybe the auto makers got too many complaints.

Bert
 
Lostgallifreyan
[email protected] (Don Klipstein) wrote in
Most of the output of an incandescent is IR.

What I meant was, might more heat be carried away by the convection in the
argon fill, and be either conducted or radiated away at far longer
wavelengths? I mentioned convection specifically to be clear I'm not
talking about directly radiated energy.
I was only mentioning figures of lumens per watt of visible light output
to explain that an incandescent achieving 17.1 lumens per input watt is
nearly 7% efficient.

Put 100 watts into an incandescent that achieves 17.1 lpw. You get 1710
lumens. Each lumen is about 1/250 watt of "white light", not the 1/683
watt assumed by those claiming incandescents are only 1-2% efficient.

I've managed unintentionally to get you to say that three times now. :)

I'm not always quick on the uptake, but I try... what I'm getting at, is
can any other evaluation result in that lower figure? I'm not convinced
that taking only the lumens at 555 nm accounts for this. Lumens seem
slippery enough if they depend on spectra and photopic sensitivity anyway.

Wikipedia again:
"In photometry, luminous flux or luminous power is the measure of the
perceived power of light. It differs from radiant flux, the measure of the
total power of light emitted, in that luminous flux is adjusted to reflect
the varying sensitivity of the human eye to different wavelengths of
light."

I guess that much can be relied on. So try it this way:

Take a 100W incandescent, and a large ellipsoidal mirror to gather as much
of its radiant flux as you can, throwing it to the other focus of the
ellipse where a black painted thermopile awaits. The incoming light is
passed through a dichroic filter at 700 nm to send the IR elsewhere and
pass only the visible light to the thermopile. Assuming you get close to
ideal light gathering for the visible wavelengths (and IR rejection), how
many watts will be read from the thermopile?

I understand that photometric measurements abound, and radiometric ones are
rarer, but that's what I want to look at, as without that grounding the
rest seems most insecure.
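Don's 1/250-watt-per-lumen point can be sketched numerically. A minimal Python sketch, where the 250 lm per visible watt broadband figure is the thread's approximation for an incandescent spectrum (683 lm/W holds only for monochromatic 555 nm light):

```python
# Convert a lamp's lumen rating into an estimate of watts of visible
# (400-700 nm) radiation.  683 lm/W is the photopic peak at 555 nm;
# a broadband incandescent spectrum averages far fewer lumens per
# visible watt (~250 lm/W, the approximation used in this thread).

LM_PER_W_555NM = 683.0   # peak photopic efficacy, by definition
LM_PER_W_WHITE = 250.0   # approximate, for a ~2800 K filament spectrum

def visible_watts(lumens, lm_per_visible_watt=LM_PER_W_WHITE):
    """Estimate watts of visible radiation from a lumen rating."""
    return lumens / lm_per_visible_watt

lamp_input_w = 100.0
lamp_lumens = 1710.0     # 17.1 lm/W, a typical 100 W A19

vis_w = visible_watts(lamp_lumens)
print(f"Visible output: {vis_w:.2f} W "
      f"-> {100 * vis_w / lamp_input_w:.1f}% efficient")

# Dividing by 683 lm/W instead is what produces the misleading
# "only 1-3% efficient" figure:
naive_w = visible_watts(lamp_lumens, LM_PER_W_555NM)
print(f"Naive (555 nm) estimate: {naive_w:.2f} W "
      f"-> {100 * naive_w / lamp_input_w:.1f}%")
```

The difference between the two divisors is exactly the disagreement in the thread: the same 1710 lumens reads as roughly 6.8% or 2.5% efficient depending on which efficacy figure is assumed.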
 
Don Klipstein
On Wed, 11 Jul 2007 15:10:28 +0000 (UTC) in sci.electronics.basics,
[email protected] (Don Klipstein) wrote,

Do you put orange gels on your windows?

No, I don't.

But if the light level is neither in kilolux levels nor a recent uptick
from something lower, I find daylight to usually have no warmth or "cheer".

If the ceiling is dark but the windows are bright, then things can look
cheerful - sometimes - for some reason.

- Don Klipstein ([email protected])
 
Lostgallifreyan
I'm saying, it's not that we are conditioned to the color of tungsten,
it's that we are looking for something close to 2000 K at night. Much
cooler light than that (higher temp) is stark and generally unpleasant.

It is conditioning, but it's worth thinking about what the conditioning is.

First, how can a hotter temperature be cold?? The only way to account for
that is to look at the environment. Blue sky accepts radiant heat from the
earth so nights chill faster on clear evenings, pale light from the moon or
stars accompanies those same conditions within hours, and those sources are
so pale that we have scotopic vision to cope with them.

The whole thing is based on comfort. There is one exception to the usual
perception of cool faint lights. The whole midsummer night's dream idyll is
based on this, the almost magical inversion that allows a warm night to
make perception of these 'cold' lights seem something other than
threatening to our health.

I bet we could get used to 'cold' light plenty fast so long as we weren't
actually cold ourselves. Conversely, Dickens and many others have commented
on the bleakness of a small flame when there isn't enough heat to warm the
people who need it. It really has to do with our ambient conditions, not
direct colour perceptions at all.
 
Don Klipstein
[email protected] (Don Klipstein) wrote in


What I meant was, might more heat be carried away by the convection in the
argon fill, and be either conducted or radiated away at far longer
wavelengths? I mentioned convection specifically to be clear I'm not
talking about directly radiated energy.

As far as I understand what goes on there, around 10-15 watts is
convected from the filament in a 100 watt "USA-usual" "standard" A19.
I've managed unintentionally to get you to say that three times now. :)

I'm not always quick on the uptake, but I try... what I'm getting at, is
can any other evaluation result in that lower figure? I'm not convinced
that taking only the lumens at 555 nm accounts for this. Lumens seem
slippery enough if they depend on spectra and photopic sensitivity anyway.

Wikipedia again:
"In photometry, luminous flux or luminous power is the measure of the
perceived power of light. It differs from radiant flux, the measure of the
total power of light emitted, in that luminous flux is adjusted to reflect
the varying sensitivity of the human eye to different wavelengths of
light."

I believe some who are not aware that the lumen is a unit of luminous
and not radiant flux, or not aware of visible wavelengths other than 555
nm having less than 683 lumens per watt, divided a lumen/watt efficacy
figure by 683 to come up with incandescents being only 1-2% or 2.6%
efficient.
I guess that much can be relied on. So try it this way:

Take a 100W incandescent, and a large ellipsoidal mirror to gather as much
of its radiant flux as you can, throwing it to the other focus of the
ellipse where a black painted thermopile awaits. The incoming light is
passed through a dichroic filter at 700 nm to send the IR elsewhere and
pass only the visible light to the thermopile. Assuming you get close to
ideal light gathering for the visible wavelengths (and IR rejection), how
many watts will be read from the thermopile?

I expect about 6.7 watts in the case of a 1710 lumen 100W incandescent,
if the ellipsoidal mirror is a whole ellipsoid and 100% reflective and the
dichroic filter passes all 400-700 nm light.

As for the rest, approximately or "educated guesses":

UV passing through the glass: .12%
UV absorbed by the glass: .02%

Heat conducted/convected from the filament: ~13%

IR passing through the glass: ~60%
IR absorbed by the glass: ~20.16% ("rounded oddly" to make figures add to
100%)
I understand that photometric measurements abound, and radiometric ones are
rarer, but that's what I want to look at, as without that grounding the
rest seems most insecure.

- Don Klipstein ([email protected])
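Don's percentage estimates can be tallied to check that they close at 100%. A quick sketch using only the figures quoted above, with the visible share taken as the 6.7 W thermopile estimate:

```python
# Tally of the estimated energy budget for a 100 W incandescent,
# using the percentages from the post (visible light taken as the
# 6.7 W thermopile estimate; IR-absorbed was "rounded oddly" by the
# original author so that everything sums to 100%).

budget = {
    "visible (400-700 nm)":      6.7,
    "UV through glass":          0.12,
    "UV absorbed by glass":      0.02,
    "conducted/convected heat": 13.0,
    "IR through glass":         60.0,
    "IR absorbed by glass":     20.16,
}

total = sum(budget.values())
for channel, pct in budget.items():
    # At 100 W input, each percent is also a watt.
    print(f"{channel:26s} {pct:6.2f} %  ({pct:.2f} W)")
print(f"{'total':26s} {total:6.2f} %")
```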
 
Lostgallifreyan
[email protected] (Don Klipstein) wrote in
But if the light level is neither in kilolux levels nor a recent uptick
from something lower, I find daylight to usually have no warmth or "cheer".

If the ceiling is dark but the windows are bright, then things can look
cheerful - sometimes - for some reason.

I mentioned contrast earlier, and I wonder if it might be this. I have an
odd colour scheme on my monitor, a kind of inversion of usual practice. I
call it 'panel lights'. It has black window objects, white and orange text
on them; the text background is a blue-biased mid grey, text black. Desktop
is deep blue with a pattern like dark water seen from a boat at twilight,
but saturated strongly, icon text there is like blue flame. Title bars are
green like plastic backlit by fluorescent light with red text. I like
programs with buttons that use colours well and illuminate like lights on
the black toolbars. Menus are yellow on a grey brown background. When
working on a full-screen text edit, it looks like a monochrome TV framed by
illuminated panels.

What I'm getting at is that this thing has both kinds of colour, 'hot' and
'cold', and most of all, strong contrasts. Some would find it as garish as
a fairground. I find it comforting the same way I find firelight
comforting. It keeps me calm yet aware for long periods while working.
Similar lighting tricks keep air pilots awake on night flights. (That's
partly the basis of the name I give that scheme).

Most colour schemes I see on computers are varieties of dark text on pale
backgrounds. I don't care if they're warm flamelike backgrounds or cool
fern greens and icy blues, I find them ALL distracting and stressful, and
the executive-class adlanders' white pages with thin grey text and pastel
shades are the very worst.

Ok, so I'm weird, but that's still a natural take on lighting. It shows
that there's a lot more to this than colour temperature. Contrast is
important too, as is the ratio of light to dark, and of object to space,
and suggestion plays a big part. It's very hard to be scientific about such
things, so maybe we shouldn't be trying too hard.

I'm still having a hard time adjusting to the fact that an SI unit, the
lumen, is based on a statistical consensus, yet is placed alongside
hallowed units like the amp and the volt and the watt which seem as
immutable as 2+2 equalling four. Trying to get objective about what colours
are 'right' for us to accept and discussing it as if it is a hard science
is more weird to me than suggesting that the lightbulb is a form of magic.
 
Lostgallifreyan
[email protected] (Don Klipstein) wrote in
I expect about 6.7 watts in the case of a 1710 lumen 100W incandescent,
if the ellipsoidal mirror is a whole ellipsoid and 100% reflective and
the dichroic filter passes all 400-700 nm light.

As for the rest, approximately or "educated guesses":

UV passing through the glass: .12%
UV absorbed by the glass: .02%

Heat conducted/convected from the filament: ~13%

IR passing through the glass: ~60%
IR absorbed by the glass: ~20.16% ("rounded oddly" to make figures add
to 100%)

I was trying to keep the lumens out of this entirely, but I'll buy it. :)
It makes me wonder what the fuss is about actually. While it's better to
get more efficiency, it seems that incandescents aren't so bad we need to
consider banning them, we just need to think more about what source we use
for a given task. As for the case to ban all but halogen types, how much
might be gained? With IR reflection to make them keep the tungsten hotter
for a given input, we get more light, but even so, is there that much
difference? Enough to say that they stay and standard incandescents go?

If LED's ever get a spectral match for a small efficient low-volt halogen,
at least the choice will be easy.
 
Arfa Daily
Lostgallifreyan said:
[email protected] (Don Klipstein) wrote in


I was trying to keep the lumens out of this entirely, but I'll buy it. :)
It makes me wonder what the fuss is about actually. While it's better to
get more efficiency, it seems that incandescents aren't so bad we need to
consider banning them, we just need to think more about what source we use
for a given task. As for the case to ban all but halogen types, how much
might be gained? With IR reflection to make them keep the tungsten hotter
for a given input, we get more light, but even so, is there that much
difference? Enough to say that they stay and standard incandescents go?

If LED's ever get a spectral match for a small efficient low-volt halogen,
at least the choice will be easy.

OK, I'm following all this - just about, I think. So let me now throw in a
slightly new set of questions. Back to LED halogen substitutes. Some
distance back up the thread, consideration was being given to losses in the
control circuitry for the LEDs. So, the first question is, just exactly how
are these things ballasted ? The reason that I ask this is that I was in an
electrical cash and carry warehouse tonight, and I picked up a couple of
LED-based GU10 replacements to have a look at. I didn't count the actual
LEDs, but I'm guessing at about 15 or so - let's say 15. Let's also say that
they are bluish types and let's guess at a forward drop of 4 volts. With
them all in series, that's going to be around 60v DC that's needed to run
them.
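The series-string arithmetic above can be sketched in a few lines. The LED count, the 4 V forward drop, and a 20 mA string current are guesses from the paragraph, not measured values:

```python
# Back-of-envelope check on running ~15 series blue/white LEDs from a
# 240 V AC supply.  All input figures are the post's guesses.

n_leds = 15
vf = 4.0                    # assumed forward drop per LED, volts
v_string = n_leds * vf      # total drop across the series string
print(f"Series string drop: {v_string:.0f} V DC")

# If the string were run from rectified 240 V AC through a simple
# dropper resistor, the resistor would burn most of the power:
v_peak_dc = 240 * 2 ** 0.5  # ~339 V peak after rectification
i_led = 0.020               # 20 mA, typical for 5 mm LEDs
p_resistor = (v_peak_dc - v_string) * i_led
p_leds = v_string * i_led
print(f"Resistor dissipation ~{p_resistor:.1f} W "
      f"vs {p_leds:.1f} W in the LEDs")
```

The several-watt resistor result supports the point that a simple dropper wouldn't fit or survive in a GU10 shell. Small mains LED lamps of this kind were commonly ballasted with a capacitive dropper instead, which limits current reactively with very little dissipation and fits in the base; that is one plausible answer to the ballasting question.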

Now, these lamps were of exactly the same dimensions as a standard GU10
lamp, with the same 'nail head' pins, set in the identical ceramic base.
240v AC rating, stated on the packet. The glass 'cone' was exactly the same
as on a standard GU10, and it appeared, as far as I could see, that for the
most part, it was filled with the LEDs, which looked like 5mm types, and
their support plate. So that leaves very little space for any drive
electronics - certainly not a switch mode PSU, or even for a smoothing cap
on the end of a simple reccy / resistor combination. Not that there would
have been room even, for a resistor of a sufficient power rating to handle
this kind of drop.

Next question. There were two types on offer, one rated at 1 watt, and one
at 1.3 watts, both with a quoted lifetime of 50k hours. So what exactly is
being said here ? Is that 1 watt input from the mains supply, or 1 watt used
by the LEDs or 1 watt of visible luminous output power ? A website that I
looked at quoted the output of a 0.62 watt one, at 20-30 l - I'm assuming
that to be 'lumens'. If correct, and not a misprint, that seems to be a
piddling amount compared to the 950 lumens quoted for an incandescent 240v
50 watt GU10, and yet the text suggests that they are only 'slightly
dimmer'. It also says that these lamps give off almost no heat, and that
they consume only around 10% of the energy of a conventional equivalent
halogen GU10. So for a 50 watt type, that's about 5 watts, suggesting that
around 4 watts is lost in ballasting ??
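Taking the quoted numbers at face value, the "slightly dimmer" claim can be checked with a few lines; all figures below are as quoted in the paragraph above:

```python
# Comparing the quoted figures: a 0.62 W LED lamp at 20-30 lm versus
# a 50 W GU10 incandescent at 950 lm.  Taken at face value.

led_w, led_lm = 0.62, 25.0          # midpoint of the 20-30 lm claim
halogen_w, halogen_lm = 50.0, 950.0

led_efficacy = led_lm / led_w
halogen_efficacy = halogen_lm / halogen_w
print(f"LED:     {led_efficacy:.0f} lm/W")
print(f"Halogen: {halogen_efficacy:.0f} lm/W")

# Per watt the LED wins, but in absolute output it is nowhere near
# "slightly dimmer":
print(f"LED output is {100 * led_lm / halogen_lm:.1f}% "
      f"of the halogen's lumens")
```

So the LED lamp is roughly twice as efficacious per watt, yet delivers under 3% of the halogen's total light, which bears out the "piddling amount" reading rather than the marketing text.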

Setting aside the issues of colour temperature and CRI, which I am sure will
shortly be overcome, it seems to me that these halogen replacement lamps are
even now on their way to bettering CFLs in that they are already exactly the
same pattern as the lamps that they are replacing, so must have sorted the
ballasting problem. And yet there are no plans to phase out the incandescent
version. This flies directly in the face of the proposals to ban standard
incandescents, when the advocated replacement technology (CFLs) is far from
being a satisfactory replacement, on several counts.

Arfa
 
Arfa Daily
Albert Manfredi said:
Well, you wrote many things, including this:

"Lighting which is used to replace daylight - like that most of us have at
home for use when daylight fades - ideally shouldn't give such a sudden
change in temperature that it is noticeable. In the same way as lighting
used to supplement daylight - like in say an office - should also be an
approximate match to that daylight. It's common sense, really."

I do agree that if we are supplementing daylight, e.g. in work spaces with
large windows during the day, rather than providing lighting at night, a
cooler light (hotter temp) is probably preferable. But for night time
lighting, I think what we are looking for is the color of flame.

I'm saying, it's not that we are conditioned to the color of tungsten, it's
that we are looking for something close to 2000 K at night. Much cooler
light than that (higher temp) is stark and generally unpleasant.

By the way, this also applies to xenon headlights in some cars. They are
superbly obnoxious at night, to other drivers. Even if they aren't
brighter than halogens, the bluish color is very distracting.
Fortunately, there seem to be fewer of the really annoying ones around
these days. Maybe the auto makers got too many complaints.

Bert

Don'cha just hate the way they swing from blue through stark white to green,
when they come round a bend in front or behind you ... Also, having sat
behind some, the perceived ability to light the road does not seem to be
any better than halogens, which may again come down to colour temperature
and the human vision comfort zone.

Arfa
 
Mr.T
Dave Plowman (News) said:
In which way are they 'inaccurate'? They will look wrong to the eye on a
'cut' but as with real life if all shots are matched the eye will
accommodate.

Not so. They ARE wrong. The relative densities of the individual film layers
will be quite inaccurate when exposed with the wrong light.
The monitor you're reading this on is unlikely to match
*exactly* another one in colour temperature but will look ok to the
individual. The eye compensates, as I said, as it must do given that
daylight changes. Unless it has a reference to match to.

Which is everything else within your field of view. Only if *everything*
changes will the *brain* correctly compensate.
Err, yes. That's what I said. But it doesn't react instantly. Hence it
notices a sudden change in colour temperature. Like switching on 4500K
lights in a house when it gets dark.;-)

So a couple of minutes' readjustment is abhorrent to you?
Doesn't bother me too much.
Have you never wondered why most prefer the colour temperature of tungsten
for domestic lighting?

No, as I already stated it was simply conditioning from fires, candles, oil
lamps and tungsten filament globes.
Have you ever wondered why people aren't bothered by the change from
daylight, or in fact are able to wear coloured sun glasses, but can readily
pick an off balance color photo?

MrT.
 
Mr.T
Don Klipstein said:
Moonlight's color temperature at its highest is about 4000.

I haven't seen a reference for this, but even assuming it is so, it's still
higher than many people here prefer it would seem.
Meanwhile, at illumination level so low that color vision does not work
well, color temperature matters less. At illumination levels an order of
magnitude or two or three above that of moonlight, most people like it
warm (lower color temperature).

Which is my point. People simply prefer something, then try to introduce
pseudo scientific rationalisation to claim anybody who disagrees with them
is wrong.

MrT.
 
Don Klipstein
Not so. They ARE wrong. The relative densities of the individual film layers
will be quite inaccurate when exposed with the wrong light.


Which is everything else within your field of view. Only if *everything*
changes will the *brain* correctly compensate.


So a couple of minutes' readjustment is abhorrent to you?
Doesn't bother me too much.


No, as I already stated it was simply conditioning from fires, candles, oil
lamps and tungsten filament globes.
Have you ever wondered why people aren't bothered by the change from
daylight, or in fact are able to wear coloured sun glasses, but can readily
pick an off balance color photo?

An off-balance color photo has its surroundings as a color reference.
It would be like having colored sunglasses coloring only a small portion
of your field of vision.

- Don Klipstein ([email protected])
 
Lostgallifreyan
I haven't seen a reference for this, but even assuming it is so, it's
still higher than many people here prefer it would seem.

Nice way to test: Take a camera and tripod, do a long exposure shot of a
moonlit scene. Then view the photo on a monitor in a context you know. I
haven't done this but I think it will bear out the claim that the moon's
light is brownish, as it looks when you look directly at it. The blue comes
from a combination of scattered light and scotopic sensitivity to the blue
part of its spectrum.
 
Arny Krueger
Dave Plowman (News) said:
Err, isn't that what I wrote? It's the colour temperature
that matters rather than the source.

There are many kinds of preferences. One is the preference for that which is
traditional and familiar, and another is the preference for that which is
most effective for the purpose at hand.

I've found that if the goal is reading accurately with limited light, then
higher temperatures even 5000 degrees and up, can be preferable. I've read
far into many a dark fall or winter evening in a tent, using a pretty blue
LED headlamp.

I did some tests of people reading Bibles and hymnals which tend to small
print, in a congregational setting with fairly high light levels, and found
that my readers were most comfortable with color temperatures in the 3200
degree range.

I suspect that preferences for color temperatures below 3200 degrees are
heavily influenced by tradition and past experience.
 
Lostgallifreyan
There are many kinds of preferences. One is the preference for that which
is traditional and familiar, and another is the preference for that
which is most effective for the purpose at hand.

I've found that if the goal is reading accurately with limited light,
then higher temperatures even 5000 degrees and up, can be preferable.
I've read far into many a dark fall or winter evening in a tent, using
a pretty blue LED headlamp.

I did some tests of people reading Bibles and hymnals which tend to
small print, in a congregational setting with fairly high light
levels, and found that my readers were most comfortable with color
temperatures in the 3200 degree range.

I suspect that preferences for color temperatures below 3200 degrees
are heavily influenced by tradition and past experience.

Yes. Been saying similar stuff here last night. Also, quite apart from
preference and convention and all that, there is a stark fact that we use
shortwave light to resolve fine detail without strain. That's a basic
physical fact. So it makes NO sense at all to suggest that reading is best
done in a low colour temperature. Same goes for any other detailed small
scale activity such as most indoor hobbies involve.

The only reason we need bright incandescent to read by is that it is the
ONLY way we can get enough shortwave light. I've found that so long as you
have a decent continuum such as the newer Cree Xlamps have, and a tint that
favours the long end, such as the WG tint, you can be comfortable with much
lower lumen counts than when using low colour temperatures. This is exactly
what many here said was 'dreary' or similar, but I tried it last night. I
went outside to see the orange light in the clouds over the city, the many
tungsten lamps all around in windows, waited till I was thoroughly
adjusted, then went inside. Far from looking dreary, it was invitingly
bright and easy to see things by, and this was ONE single emitter aimed at
the ceiling. It had the same cosy quality that a pressurised paraffin
(kerosene) lamp has in a country kitchen during a power cut. I remember
that well enough, and this new light was similarly pleasing, if a little
different, sharper perhaps.
 
Dave Plowman (News)
Not so. They ARE wrong. The relative densities of the individual film
layers will be quite inaccurate when exposed with the wrong light.

You conveniently snipped the part about video. And a daylight film can
also look 'wrong' when taken in daylight of the wrong colour temperature.
Which can be corrected by filters when taking the pic or processing it.
Which is everything else within your field of view. Only if *everything*
changes will the *brain* correctly compensate.

Not so - do you change the colour temperature of your TV or monitor
according to the ambient light? The brain focuses on the important part
after time - within reason.
So a couple of minutes' readjustment is abhorrent to you?
Doesn't bother me too much.

Fine - but you're in a minority if you like cold domestic lighting.
No, as I already stated it was simply conditioning from fires, candles,
oil lamps and tungsten filament globes.

Fluorescent lights have been around for a long, long time. And early ones
were all cold compared to tungsten. People could easily have got used to
them for domestic light, but very few chose to.
Have you ever wondered why people aren't bothered by the change from
daylight, or in fact are able to wear coloured sun glasses, but can
readily pick an off balance color photo?

Can they? Depends on their skills. Have you never noticed how many people
are happy with a TV where the grey scale is miles out?
 
Dave Plowman (News)
I suspect that preferences for color temperatures below 3200 degrees are
heavily influenced by tradition and past experience.

Perhaps if it were only working light. But at home it's usually comfort
light.
 
Lostgallifreyan
I could see, that for the most part, it was filled with the LEDs,
which looked like 5mm types, and their support plate. So that leaves
very little space for any drive electronics - certainly not a switch
mode PSU, or even for a smoothing cap on the end of a simple reccy /
resistor combination. Not that there would have been room even, for a
resistor of a sufficient power rating to handle this kind of drop.

There might be. First, it needs to control a fixed current, and an efficient
power converter can be tiny, flat, like this: http://tinyurl.com/ypenut
That's 95% efficient at converting voltage ranging from 5~32 VDC into a
current source that can manage up to 7 LED's in series. A converter from
240 VAC to low volt DC can be had with similar efficiency (I hope), to feed
what I already have. Ideally I can find a single module that does the
entire power conversion at 95% or better.
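Chaining two 95%-efficient stages gives the overall figure the post is hoping for; a quick sketch, where both stage efficiencies are the hoped-for values rather than anything measured:

```python
# Two-stage supply sketch: a 240 V AC -> low-voltage DC front end
# feeding a DC-DC constant-current LED driver.  Both 95% figures are
# the post's hoped-for values; overall efficiency is their product.

eta_acdc = 0.95   # assumed AC-DC front-end efficiency
eta_dcdc = 0.95   # assumed LED driver module efficiency

eta_total = eta_acdc * eta_dcdc
print(f"Overall conversion efficiency: {100 * eta_total:.1f}%")

# For an illustrative 3 W LED load, wall draw and conversion loss:
p_led = 3.0
p_wall = p_led / eta_total
print(f"Wall draw {p_wall:.2f} W, loss {p_wall - p_led:.2f} W")
```

Cascading stages multiplies their efficiencies, which is why a single module doing the whole conversion at 95% would beat two 95% stages in series (about 90% overall).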
Next question. There were two types on offer, one rated at 1 watt, and
one at 1.3 watts, both with a quoted lifetime of 50k hours. So what
exactly is being said here ? Is that 1 watt input from the mains
supply, or 1 watt used by the LEDs or 1 watt of visible luminous
output power ? A website that I looked at quoted the output of a 0.62
watt one, at 20-30 l - I'm assuming that to be 'lumens'. If correct,
and not a misprint, that seems to be a piddling amount compared to the
950 lumens quoted for an incandescent 240v 50 watt GU10, and yet the
text suggests that they are only 'slightly dimmer'. It also says that
these lamps give off almost no heat, and that they consume only around
10% of the energy of a conventional equivalent halogen GU10. So for a
50 watt type, that's about 5 watts, suggesting that around 4 watts is
lost in ballasting ??

That one sounds like marketing hype. The first thing is that it has lots
of standard 5mm LED's. Avoid like the PLAGUE, seriously. All that voltage
drop, and no cooling to speak of; what kind of thermal coupling can be had
for a 5mm LED?

The ones to look for are the Cree and Luxeon types. The easiest way to look
for them is a single emitter, or at least very few of them, with high
output claims. Look into one (unlit!, they WILL damage your eyes if you do
that to lit ones at close range), amd you'll see a distinctive fluorecent
dayglo green yellow cast to the phosphor unlike the chalky phosphors of
weaker white LED's.

Re wattage claims, it's hard to say, without evaluating all the evidence
you can find together. In short, a lamp that needs several emitters to
manage 30 lumens is a joke, when you can cheaply get a single emitter that
puts out >200 lumens with 1 amp pushed through a voltage drop of around 3
volts.
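The comparison can be put in lumens per watt; the single-emitter numbers (>200 lm at 1 A through a ~3 V drop) are the round figures from the paragraph above:

```python
# Several 5 mm LEDs totalling ~30 lm at ~1 W, versus one high-power
# emitter doing >200 lm at 1 A through a ~3 V forward drop.  All
# figures are the post's round numbers.

cluster_lm, cluster_w = 30.0, 1.0   # multi-5mm lamp at its 1 W rating
single_lm = 200.0
single_w = 1.0 * 3.0                # 1 A x ~3 V forward drop

print(f"5 mm cluster:   {cluster_lm / cluster_w:.0f} lm/W")
print(f"Single emitter: {single_lm / single_w:.0f} lm/W")
print(f"Total light ratio: {single_lm / cluster_lm:.1f}x")
```

By these numbers the single emitter is roughly twice as efficacious and delivers several times the total light, which is the substance of the "a joke" verdict above.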
Setting aside the issues of colour temperature and CRI, which I am
sure will shortly be overcome, it seems to me that these halogen
replacement lamps are even now on their way to bettering CFLs in that
they are already exactly the same pattern as the lamps that they are
replacing, so must have sorted the ballasting problem. And yet there
are no plans to phase out the incandescent version. This flies
directly in the face of the proposals to ban standard incandescents,
when the advocated replacement technology (CFLs) is far from being a
satisfactory replacement, on several counts.

I think the ban is a 'being seen to be done' kind of reaction. It's got more
to do with trashing an icon known for inefficiency, but there are better
ways to make people change than all-stick-no-carrot.

If governments really want to reduce power consumption I think they should
be subsidising the public to buy computer mainboards based on Nehemiah
CPU's and such. Turning a domestic computer into a fan heater just to run
Windows Vista as a private office is a sick joke! Far more worrying than a
few lightbulbs.
 
Lostgallifreyan
Have you never noticed how many people
are happy with a TV where the grey scale is miles out?

Actually that kind of refutes a point that many, including you, claim.
People WERE happy, for the most part, with b/w TVs when that's all they
had. With colour they liked a sharp clean white and a bright vivid image,
and they were happy; they'd even fall asleep in front of them with the
other lights out at times. It's an iconic movie thing, often seen, often
shared. Funny behaviour, don't you think, given the high colour
temperatures involved?


People tell themselves they don't like stuff the way kids tell themselves
they don't like their greens, or the way they tell themselves they need
heavy clothes on winter days even when those days are warmer and drier than
many summer ones. They even tell themselves that what they read in the
newspapers must be true.

Back to lights: I refer again to the point that reading and detailed indoor
hobbies need shortwave light to avoid eye strain, and the only reason
people turn up the tungsten is because that's the only way they actually
get enough of the shortwave light they need.
 