OT Fahrenheit

snipped-for-privacy@aol.com says...

Mainframes are *not* specified for office environment (rather "Class A") though. There is a difference between a "departmental server" and a data center mainframe.
--
Keith



I am not sure what machines you are talking about but 4300s and AS400s were office space rated. These were around before most people had ever heard of a server or a LAN.
snipped-for-privacy@aol.com says...

Ok, let me try again, slower. AS/400 and 4300s are/were what we now call "departmental servers". /370, ES/9000s were relegated to data centers and are rated for a "class-A" environment only. Note that "office space" rating isn't exactly harsh either.
--
Keith


I wouldn't exactly call the 4331, 41 & 81 class machines departmental servers. That line was the replacement for the 370 M138-158 class machines, and the AS/400 actually outperformed that series in black-box form. The word "mainframe" became fairly ambiguous anyway once they became nothing more than a rack of RISC cards. It is one reason I left; the computer business got very boring for a hardware guy. When the CPUs pumped water and the disk drives pumped oil it was fun to do. The hardware job became pluck and chuck, and the physical planning rep job pretty much just went away too. What passes for mainframes these days would run fine in a warehouse.
BTW, offices are still FCC Class A environments; Class B is residential.
snipped-for-privacy@aol.com says...

That's exactly how they were used. BTW, the replacement for the 3138-3158 class was the 3031.

Is an xSeries a "mainframe"? Is it a "rack of RISC cards"?

You were a CE? Hardware development is still interesting.

I don't believe I said anything about the FCC. I didn't even know they cared about temperature or humidity.
--
Keith


I was region support for the 4300 and the 138/148. I don't know of ONE 370 138/148 customer who went for the 3031. It basically WAS a 158 (as were the service directors), so there was no advantage in going from a 158 to a 3031. I was also trained on both the 158 and the 3031.

After my time but I bet it is.

CE, Support Specialist then later IPR and Contract Services.
snipped-for-privacy@aol.com says...

^^^^^^^ channel

The channel directors off loaded all the I/O microcode. The 3031 was significantly faster than a 3158 because of the director. IIRC they were pretty cheap too.

It's not after mine. ;-) Nope. /360 is hardly RISC. The processor complex is an MCM.

--
Keith

krw wrote:

You guys are in semi-violent agreement. Keith's first response was: "Not true at all. A high RH contributes to failures in electronics as well. Even recent equipment is specified from 40-60% RH, over a fairly narrow temperature range."
I call the "not true at all" part complete bullshit. What Greg said was 100% true. And the gratuitous "let me try again, slower" is another detractor.
Bottom line: human comfort and "equipment comfort" are roughly the same, with the equipment's comfort range being the wider of the two. Think about it - humans operate the equipment, and would not be willing to work in the thousands upon thousands of "normal" datacenters if the machinery could not function in office-like temperature and humidity. (Sorry - if you're in the military, you work where they tell you - but even then, if it's in a datacenter, it's likely to be comfortable.) In fact, humans usually get uncomfortable outside the 68-72 degree range, on average. Datacenter machinery functions well outside of that range. The farther you depart from that 68-72, the more extensive the steps a human needs to take. Machines can't take those steps, so they will fail when conditions are too far from nominal. What would be interesting is some real discussion of the specific numbers.
I'll give you five examples:
1) Peat Marwick Mitchell datacenter, early 70's. An air conditioner failure caused DASD (2314) data errors at exactly 94 degrees on their wall thermometer. It ran fine at 93.
2) Manufacturers Hanover Trust datacenter began losing equipment (power down) when the temperature went above 90 during a blackout (early 80's). They had emergency power to keep the data processing equipment running, but nothing to power the air conditioners.
3) Bloomingdales (now part of Federated) datacenter, mid-late 70's. Red-light checks on the CPU (3138) whenever a metal cart carrying cards would touch the CPU; random red-light CPU checks when loading paper in the 1403. Relative humidity was 16%. Raising it to 40% fixed the problem. No hardware was damaged. Interesting - with the lights off, when a new box of 1403 paper was opened and fanned out, you could see the discharge.
4) Divco Wayne had a building heat failure over the weekend (early-mid 70's). On Monday morning, the computer room was 30 degrees F. The damn system powered up and ran with no problems - but the 1416 print train ran audibly slow.
5) IBM datacenter, early 80's. A disk pack was transported in the trunk of a car, properly packed, but in sub-zero temperature. Upon arrival it was immediately placed in a 2314. The idiot who did it moved the pack to subsequent drives when it didn't work. 180 heads, 5 VCMs and several days later, full service was restored. I guess by the 6th pizza oven, he moved the pack soon enough that the VCM was not destroyed.
Specifically, the relative humidity spec is for static and paper "fatness". The equipment couldn't care less; it will run happily outside the range. But if the RH is too low, static discharge can occur, and that discharge can interfere with equipment operation. The equipment does not mind the low humidity, but it does mind the discharge. "Wet" paper, due to high humidity, does not do well in paper-handling machinery in the datacenter. Feed the equipment "dry" paper and it performs flawlessly. I do not have statistics on "wet" paper - perhaps one of you can discuss that in more detail.
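Ed's point - that the RH spec protects against static and paper behavior rather than the electronics themselves - can be sketched as a simple envelope check. This is purely illustrative: the 40-60% RH figure comes from Keith's earlier post, the temperature bounds are invented for the example, and none of it is an official class spec.

```python
def check_envelope(temp_f, rh_pct, temp_range=(50, 90), rh_range=(40, 60)):
    """Return a list of conditions outside an assumed operating envelope.

    temp_range and rh_range are illustrative defaults, not real specs.
    """
    problems = []
    if not temp_range[0] <= temp_f <= temp_range[1]:
        problems.append("temperature out of range")
    if not rh_range[0] <= rh_pct <= rh_range[1]:
        problems.append("humidity out of range")
    return problems

# The Bloomingdales case: thermally fine at 16% RH, but static-prone.
print(check_envelope(72, 16))   # ['humidity out of range']
# The Peat Marwick case: DASD errors appeared at 94 degrees.
print(check_envelope(94, 45))   # ['temperature out of range']
```

Note the point the anecdotes make: a reading outside the envelope doesn't mean the hardware fails, only that a secondary effect (static, paper jams, thermal checks) becomes likely.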
Ed

Ah - we run something comparatively smaller in our office, with a pretty even mix of *nix and Windows servers. In total there are roughly 50 servers.
The room is supplied with power by an APC Symmetra that gives us nominally 15 minutes of backup power. That Symmetra also has a kill switch for emergencies, and it's wired into the fire alarm system so that when the sprinklers go off, all power to the room is cut.
The Symmetra also powers the cubes in the IT space. Right now we get 40 minutes of runtime out of it, but that's only because two of our employees like to have their heaters going full tilt. Otherwise it's over an hour.
Overhead lighting and air conditioning are not on the UPS. However, there is a 125 kW natural-gas-fired generator out back that backs up the UPS and also supplies power not only to the overheads but to the HVAC system. We even ran a line out to the MDF in the building so Cox could take advantage of our generator in the event of a building-wide power failure. We weren't being altruistic; we just wanted to make sure our network connection stays up.
We also do quarterly tests of the power system, as well as having the system set to do regular exercise runs on the generator.
That data center was my baby. And the redundancy built in shows it.
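Those runtime figures (40 minutes with two space heaters running, over an hour without) behave roughly like a UPS whose runtime scales inversely with load. A back-of-the-envelope sketch; the 2000 Wh capacity and wattage figures below are invented for illustration, and the model ignores inverter efficiency and real battery discharge curves:

```python
def ups_runtime_minutes(stored_wh, load_w):
    """Rough runtime estimate: stored energy divided by load draw.

    Real batteries deliver less usable energy at high loads, so treat
    the result as an optimistic upper bound, not a promise.
    """
    return stored_wh / load_w * 60.0

# Hypothetical 2000 Wh bank feeding the IT cubes:
print(ups_runtime_minutes(2000, 1800))  # ~67 min, heaters off
print(ups_runtime_minutes(2000, 3000))  # 40 min with the heaters added
```

The shape matters more than the numbers: adding a kilowatt or so of space heater load is enough to cut runtime from "over an hour" to 40 minutes, which is why the heaters stood out.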
wrote:

If you bring a pallet of paper in from outside in Florida (80-90% RH) and put it in a 3800, it will wad up so badly you can't stack more than about 200 pages without taking it out. Forget trying to run it through the burster. They usually tried to keep it in the A/C for several days before using it.
[...]

Reminds me of an incident that occurred in the late 80s/early 90s when I worked for the Navy. I managed a Tandem TXP system that shared a computer room with a Honeywell 66. One holiday weekend, the air conditioning system failed in the wee hours of Saturday morning after the second shift operators had gone home. (There was no third shift.) Monday being a holiday, the problem wasn't discovered until the first shift operators arrived at about 6am Tuesday to find the data center at about 110 degrees. The Honeywell had gone down only about three hours after the air conditioning did... but the Tandem was still up. The DASD cabinets were painfully hot to the touch, and one of the drives had gone down -- but since Tandem uses mirrored drives, and the mirror was still ok, it did no harm. I measured the exhaust air at the back of the processor cabinet at 134 degrees... but the Tandem was still up.
--
Regards,
Doug Miller (alphageek at milmac dot com)
snipped-for-privacy@milmac.com says...

Back in 1993 I was responsible for managing a Data General MV9600U running AOS/VS II. Loved that machine, and still remember a lot about it.
In any case, these were machines that could take abuse. We knew of one located in a non-ventilated closet that just continued to run until decommissioning day.
snipped-for-privacy@aol.com says...

I remember the S/36's and the RS/6000's. Never got to deal with either of the above, but I did like the RS/6000's.
snipped-for-privacy@cox.nospam.net says...

Those were all designed for office environments. Mainframes most certainly were not, and it had nothing to do with paper (I/O was seldom in the same room).
--
Keith



Any of the air-cooled machines could run damn near anywhere. When I got to Florida (from the glass-house data centers of DC) I saw it happening. A "computer room" was a bay in a strip mall or industrial center. That was also the first time I ran into red-leg delta power, and the first time I saw "no raised floor" since the 1401 and Mod 30 days.
snipped-for-privacy@aol.com says...

We stipulated a raised floor because we could, and it has come in handy - from snaking a power whip over to the telephone switch (an Avaya Prologix) to running network cabling to server racks, etc.
http://blip.tv/file/67664
wrote:

You can recommend anything you like, but the salesman is not going to leave money on the table if the customer says no, particularly when the sales manual says it is not required.
snipped-for-privacy@att.bizzzz says...

Actually we had lots of big iron at the Retro Computing Society of Rhode Island. The KL10 was a big beast. Interestingly the collection seriously lacked IBM big iron.
wrote:

We had a bunch of them, but I tried to stay away from them. When they merged GSD and FE it got harder to do. I ended up working on the 3x line and was trained on the RS/6000 and AS/400. I was the 7800 (TP support) guy, so mostly I did the communication end. All of those boxes were basically solid except the DASD, and that was just a software nightmare, not a hardware problem. Once they started using RAID 5 they were a no-brainer. My boss had a real sense of humor and sent me to Series/1 school the week after I got back from 3090 support school. It was the only school I walked out of.
wrote:

I've been in some places, like computer rooms, which were kept like that. Also the generator room at Glen Canyon Dam was kept at 50F. Interestingly, I didn't find it that cold. That house (where 65F was too cold) may have had too many leaks.
--
44 days until the winter solstice celebration

Mark Lloyd
