
OT Fahrenheit

Discussion in 'Electrical Engineering' started by Terry, Nov 8, 2006.

  1. krw

    krw Guest

    Certainly it is. It may not be MKS, nor purely SI, but it is
    metric. K == kilometers (1E3 meters), H == hours (3.6E3 seconds),
    both of which are metric units. KPH is then a "derived unit" and
    perfectly acceptable.
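
    A minimal sketch in Python (illustrative numbers only) reducing KPH
    to the base SI unit m/s:

        # KPH as a derived unit, reduced to base SI (m/s).
        KM_IN_METERS = 1e3       # 1 km == 1e3 m
        HOUR_IN_SECONDS = 3.6e3  # 1 h  == 3.6e3 s

        def kph_to_mps(kph: float) -> float:
            """Convert kilometers per hour to meters per second."""
            return kph * KM_IN_METERS / HOUR_IN_SECONDS

        print(kph_to_mps(100.0))  # 100 KPH ~= 27.78 m/s
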
    Who cares? <snip>
     
  2. Guest

    I am not sure why people think computer rooms should be that cold. The
    official IBM spec for a data center was 75F at 50% RH.
     
  3. Stephen B.

    Stephen B. Guest

    "Doug Miller" wrote
     
  4. T

    T Guest

    In our case the costs of power, heating and cooling are fixed for the
    next ten years in the square footage cost.

    We do run half lighting most times though, to save energy. :)
     
  5. T

    T Guest

    Luckily it has been far warmer than usual here in the northeast. Daytime
    temps have been low 50's to high 60's. Night temps are getting down
    there but with day temps so high, the heat hardly runs and the place
    stays near 70F.
     
  6. Guest

    We took the expected latent heat load into account when the system
    was sized. That still does not address what you set the thermostat and
    humidistat to. The Weksler chart in a computer room is 2 concentric
    rings when things are running right.
     
  7. In most places I've been, the rooms run at 21C (70F).
    The RH is then adjusted to stop the room feeling too cold to
    humans in cases where they have to work in there for long
    periods. (A dry 21C with significant wind chill feels very
    much colder than 21C.)
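
    A one-liner in Python backing the 21C ~ 70F figure:

        # Celsius to Fahrenheit: F = C * 9/5 + 32
        def c_to_f(celsius: float) -> float:
            return celsius * 9.0 / 5.0 + 32.0

        print(c_to_f(21.0))  # 69.8F, i.e. roughly 70F
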
     
  8. Guest


    Maintaining RH at 50% is mostly to make the paper (cards in the olden
    days) more machine-friendly. At lower RH static is a problem, and at
    higher RH the paper stiffness suffers and it swells, impairing feeding.
     
  9. krw

    krw Guest

    Not true at all. A high RH contributes to failures in electronics
    as well. Even recent equipment is specified from 40-60% RH, over a
    fairly narrow temperature range.
     
  10. Guest

    A production laser printer will be wadding up paper long before the
    electronics start complaining. If the paper is too wet it will curl
    when it goes through the fuser. A big printer shoving that paper out
    at 3 pages a second will turn the stacker into something that looks
    like a carnation.
     
  11. krw

    krw Guest

    That may be true, but it doesn't mean it's only the printers and
    card readers that are in controlled environments.
     
  12. Guest

    I think people are far too concerned with the rest of the electronics.
    DASD in a data center is just the same drives you have in your PC
    piled in a big box these days, and the processors are not that much
    different than your PC's. The packaging is certainly similar. I have
    PCs running in totally unconditioned space in SW Florida with no
    problems. In fact one survived a fire. 3 are running in vehicles that
    see 130-140F in the daytime and wide swings in RH.
    IBM started saying in the 80s that if the people could handle the
    environment, the computer could. 4300 mainframes and the AS/400
    midrange were "office environment" machines. It was really the big
    paper pushers that needed conditioned space.
     
  13. krw

    krw Guest

    Mainframes are *not* specified for office environment (rather
    "Class A") though. There is a difference between a "departmental
    server" and a data center mainframe.
     
  14. Guest

    I am not sure what machines you are talking about but 4300s and AS400s
    were office space rated. These were around before most people had ever
    heard of a server or a LAN.
     
  15. krw

    krw Guest

    Ok, let me try again, slower. AS/400 and 4300s are/were what we
    now call "departmental servers". /370, ES/9000s were relegated to
    data centers and are rated for a "class-A" environment only. Note
    that "office space" rating isn't exactly harsh either.
     
  16. Guest

    I wouldn't exactly call the 4331, 4341 & 4381 class machines department
    servers. They were the replacement for the 370 M138-158 class machines.
    The AS/400 actually outperformed that series in black box form.
    The word mainframe became fairly ambiguous anyway when they became
    nothing more than a rack of RISC cards. It is one reason I left. The
    computer business got very boring for a hardware guy. When the CPUs
    pumped water and the disk drives pumped oil it was fun to do. The
    hardware job became pluck and chuck. The physical planning rep job
    pretty much just went away too. What passes for mainframes these days
    would run fine in a warehouse.

    BTW, offices are still FCC class A environments; B is residential.
     
  17. krw

    krw Guest

    That's exactly how they were used. BTW, the replacement for the
    3138-3158 class was the 3031.
    Is an xSeries a "mainframe"? Is it a "rack of RISC cards"?
    You were a CE? Hardware development is still interesting.
    I don't believe I said anything about the FCC. I didn't even know
    they cared about temperature or humidity.
     
  18. ehsjr

    ehsjr Guest

    You guys are in semi-violent agreement.
    Keith's first response was:
    "Not true at all. A high RH contributes to failures in electronics
    as well. Even recent equipment is specified from 40-60% RH, over a
    fairly narrow temperature range."

    I call the "not true at all" part complete bullshit.
    What Greg said was 100% true. And the gratuitous
    "let me try again, slower" is another detractor.

    Bottom line: human comfort and "equipment comfort"
    are roughly the same, with the "equipment comfort"
    range being wider than the human comfort range.
    Think about it - humans operate the equipment, and
    would not be willing to work in the thousands upon
    thousands of "normal" datacenters if the machinery
    could not function in office-like temperature and
    humidity. (Sorry - if you're in the military, you
    work where they tell you - but even then, if it's
    in a datacenter, it's likely to be comfortable.)
    In fact, humans usually get uncomfortable outside
    the 68-72F range, on average. Datacenter machinery
    functions well outside of that range. The farther
    you depart from that 68-72F, the more extensive the
    steps a human needs to take. Machines can't take
    those steps, so they will fail when the conditions
    are too far from nominal. What would be interesting
    is some real discussion of the specific numbers.
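
    As a starting point, a minimal sketch in Python; the 68-72F comfort
    band and 40-60% RH spec come from this thread, but the wider
    equipment limits below are made-up placeholders, not a real spec:

        # Compare a room reading against "human comfort" vs "equipment"
        # envelopes. Equipment limits are placeholders for illustration;
        # only 68-72F and 40-60% RH come from this thread.
        HUMAN_TEMP_F = (68.0, 72.0)   # comfort band discussed above
        EQUIP_TEMP_F = (50.0, 90.0)   # hypothetical wider machine band
        EQUIP_RH_PCT = (40.0, 60.0)   # RH spec cited earlier

        def in_range(value: float, band: tuple) -> bool:
            lo, hi = band
            return lo <= value <= hi

        def check_room(temp_f: float, rh_pct: float) -> str:
            if not (in_range(temp_f, EQUIP_TEMP_F)
                    and in_range(rh_pct, EQUIP_RH_PCT)):
                return "outside equipment envelope"
            if not in_range(temp_f, HUMAN_TEMP_F):
                return "equipment OK, humans uncomfortable"
            return "OK for both"

        print(check_room(75.0, 50.0))  # equipment OK, humans uncomfortable
        print(check_room(94.0, 50.0))  # outside envelope (cf. example 1 below)
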

    I'll give you five examples:
    1) Peat Marwick Mitchell datacenter, early 70's
    An air conditioner failure caused DASD (2314) data
    errors at exactly 94 degrees on their wall thermometer.
    Ran fine at 93.
    2) Manufacturers Hanover Trust datacenter began losing
    equipment (power down) when temperature went above 90
    during a blackout. (Early 80's) They had emergency
    power to keep the data processing equipment running,
    but nothing to power the conditioners.
    3) Bloomingdales (now part of Federated) datacenter,
    mid-late 70's. Red light checks on CPU (3138) whenever
    a metal cart carrying cards would touch the CPU;
    random red light CPU checks when loading paper in 1403.
    Relative humidity was 16%. Raising it to 40% fixed
    the problem. No hardware was damaged. Interesting -
    with the lights off, when a new box of 1403 paper was
    opened and fanned out, you could see the discharge.
    4) Divco Wayne had a building heat failure over the
    weekend. On Monday morning, the computer room was
    30 degrees F. The damn system powered up and ran,
    with no problems - but the 1416 print train ran
    audibly slow. (Early-mid 70's)
    5) IBM datacenter, early 80's. A disk pack was
    transported in the trunk of a car, properly packed,
    but in sub-zero temperature. Upon arrival it was
    immediately placed in a 2314. The idiot who did it
    moved the pack to subsequent drives when it didn't
    work. 180 heads, 5 VCMs and several days later,
    full service was restored. I guess by the 6th pizza
    oven, he moved the pack soon enough that the VCM
    was not destroyed.

    Specifically, the relative humidity spec is for
    static/paper "fatness". The equipment couldn't
    care less. It will run happily outside the range.
    But if the RH is too low, static discharge can
    occur, and that discharge can interfere with
    equipment operation. The equipment does not mind
    the low humidity, but it does mind the discharge.
    "Wet" paper, due to high humidity, does not do
    well in paper handling machinery in the datacenter.
    Feed the equipment "dry" paper & it performs flawlessly.
    I do not have statistics on "wet" paper - perhaps
    one of you can discuss that in more detail.

    Ed
     
  19. T

    T Guest

    I remember the S/36's and the RS/6000's. Never got to deal with either
    of the above, but I did like the RS/6000's.
     
  20. T

    T Guest

    Ah - we run something comparatively smaller in our office with a pretty
    even mix of *nix to Windows servers. In total there are roughly 50
    servers.

    Room is supplied with power by an APC Symmetra that gives us nominally
    15 minutes of backup power. That Symmetra also has a kill switch for
    emergencies, and it's wired into the fire alarm system so that when the
    sprinklers go off, all power to the room is cut.

    The Symmetra also powers the cubes in the IT space. Right now we get 40
    minutes time out of it, but that's only because two of our employees
    like to have their heaters going full tilt. Otherwise it's over an hour.
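
    A back-of-the-envelope sketch in Python of why a few kW of heaters
    eats that much runtime; the battery capacity and load numbers are
    assumptions for illustration, not our actual Symmetra figures:

        # Idealized UPS runtime: minutes = stored energy / load. Real
        # batteries derate at high draw, so treat this as a rough model.
        # All numbers below are assumptions, not the actual installation.
        STORED_KWH = 5.33        # assumed usable battery energy
        BASE_LOAD_KW = 5.0       # assumed IT + cube load
        HEATERS_KW = 2 * 1.5     # two 1.5 kW space heaters

        def runtime_minutes(load_kw: float) -> float:
            return STORED_KWH / load_kw * 60.0

        print(round(runtime_minutes(BASE_LOAD_KW)))               # ~64 min
        print(round(runtime_minutes(BASE_LOAD_KW + HEATERS_KW)))  # ~40 min
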

    Overhead lighting and air conditioning are not on the UPS. However, there
    is a 125kW natural gas fired generator out back that backs up the UPS
    and also supplies power not only to the overheads, but to the HVAC
    system. We even ran a line out to the MDF in the building so Cox
    could take advantage of our generator in the event of a building-wide
    power failure. We weren't being altruistic, we just wanted to make sure
    our network connection stays up.

    We also do quarterly tests of the power system, and the system is set
    to do regular exercise runs on the generator.

    That data center was my baby. And the redundancy built in shows it.
     