Whirlpool Fridge lock up during an apparent power glitch

On Thursday, April 21, 2016 at 10:19:41 AM UTC-4, Mike Duffy wrote:

A wise man once said:
"Code Tells You How, Comments Tell You Why
I hang out in an Excel forum and often write macros for other people. I use descriptive variables, add comments, etc. for 2 reasons:
1 - To help the requester understand what the code is doing (Many of them want to learn how to write macros, so the comments help "teach" them.)
2 - Months, and even years later, I often get 2nd requests (either from the original requester or from someone who found the code in the archives) asking for a modification to the code.
If I didn't include the comments when I first wrote it, I'd often have no clue what the original purpose was or why I wrote it that way. Heck, I don't even recognize some of the code that I use at work everyday. :-) Some of my macros are 5 - 10 years old. Luckily they still work for their original purpose and rarely need updating.
The worst part of the "forum job" comes from those who ask for a macro to do a specific task and then keep adding on requirements. I fulfill their original requirements and they come back with "Hey, thanks! That does exactly what I wanted. Now can you make it to 'this' also?"
This often happens more than once which means I sometimes need to start from scratch to avoid just bolting on a bunch of haphazard instructions and ending up with bloated, inefficient code.
When it gets out of hand, I will politely ask the requester what they think would happen if they had signed a contract for a "product" based on a set of requirements and then kept changing the requirements every time the product was delivered.
Of course, that assumes that I get requirements that can actually be used. Sometimes it's not much more than "I need macro to create an invoice from my data sheet. Can someone please post some code?" Uh...no.
On 4/21/2016 7:19 AM, Mike Duffy wrote:

That approach doesn't work when your product processes large amounts of CASH; tells a patient they have/don't have a particular disease process; controls a mechanism moving at a high rate of speed (or large masses); produces a product that people INGEST; pilots a vessel; is relied upon to produce tens of thousands of dollars of saleable (vs SCRAPPED!) product each hour; etc. Folks tend to want designs that can be PROVEN to work. And/or, go through formal (legally required) certification/validation processes.
It also falls down when "your part" has to interface seamlessly with a part that another developer in another part of the world is producing in parallel with your activities.
Or, when the prototype hardware costs $1M and you can, at best, *borrow* a few hours per month on which to "test" your designs.
Of course, you also have scheduling and cost factors to consider. So, you REALLY want to be able to pull something "proven" off a shelf and use it as is -- without RE-writing it or RE-bugging it. "How do I know this works? *HOW* does it work? etc."

Or, you will forget that there is some inherent limitation (or assumption) embodied in the implementation (hardware or software) when you opt to modify/repurpose it for some later use.

I have a distinctive "style" to the way I write code, design hardware and design "systems". But, I have been frequently complimented: "Once I realized how you do things, everything was OBVIOUS!" There is value to consistency and thoroughness.
(Do you write "++x;" or "x++;"? Are you consistent in your choice?)

And, chances are, there are holes large enough to drive a truck through in the thoroughness of your documentation, test suite, etc. Because you weren't PLANNING on reusing it. You succumb to the "just throw something together" thinking.
This is even worse with PHBs -- most of whom have never written anything of even 10 KLoC complexity (yet, somehow, feel they have The Inside Track on how to manage 1 MLoC projects -- cuz they read about some "technique du jour"). They bow to marketing, manufacturing, stockholder, etc. pressures to "get something out the door". Then, are perpetually playing "catch-up" and "whack-a-mole" with bugs that COULD have been caught in the specification stage.
But, they thought spending the time to write formal specifications would let too much calendar time slip away in which they "aren't being productive".
[No, a marketing document is not a specification; it's a mealy-mouthed wish list that says absolutely nothing about what the product will do and how it will be EXPECTED to perform]
There's this attitude that "We're not sure how well this product will be received; so, we aren't keen on making a big commitment to it. If the market likes it, we'll go back and fix all these things -- or design Model 2". Of course, if they cut too many corners, the product is crap and no one *wants* it! OTOH, if the market sees a value and embraces it, they find themselves too busy trying to shoehorn in the upgrades (because now competitors see an opportunity!) and the product quality eventually suffers -- and no one wants it.
For a fun exercise, try to track down any formal documentation for ANY product (software and/or hardware) that you've been involved with. Big difference when you have a financial/contractual arrangement with someone (client/provider) and that document DEFINES the work to be done!
[Employees can rarely DEMAND such a document; and employers rarely have to provide it -- they can just change their mind "on a whim" -- knowing the employees will somehow have to make it work (and, of course, all of the work they've done up to that point should be magically usable in the new project definition! :> ]
What happens if the pointer I pass to free(3C) is NOT a pointer previously obtained from malloc(3C)? [I'll wait here while you see if you can find a description of the EXPECTED outcome -- for that library that you PURCHASED :> ] Does your test suite check for this condition? Does your code/development environment ENSURE you CAN'T encounter that situation? Or, if it can't prevent it, how does your code/system respond WHEN that happens? Or, is this just the same old bug that will rear its ugly head 2, 6 or 10 months down the road?
[For fun, try passing a pointer to nonexistent memory and see how long it takes AFTER that for your code to crash. Or, a "legitimate" pointer that you'd already previously free(3C)-ed!]
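[Editor's illustration, not code from the thread: one minimal sketch of how you could ENSURE this class of mistake gets caught at the point of the error is to wrap the allocator and keep a registry of live allocations. The names checked_malloc/checked_free and the fixed-size table are assumptions for illustration only.]

#include <stdio.h>
#include <stdlib.h>

#define MAX_LIVE 1024

static void *live[MAX_LIVE];                /* registry of outstanding blocks */

void *checked_malloc(size_t n)
{
    void *p = malloc(n);
    if (p == NULL)
        return NULL;
    for (int i = 0; i < MAX_LIVE; i++) {
        if (live[i] == NULL) { live[i] = p; return p; }
    }
    free(p);                                /* registry full: fail loudly, not silently */
    return NULL;
}

void checked_free(void *p)
{
    if (p == NULL)
        return;                             /* free(NULL) is defined: a no-op */
    for (int i = 0; i < MAX_LIVE; i++) {
        if (live[i] == p) { live[i] = NULL; free(p); return; }
    }
    /* Not from checked_malloc(), or already freed: report it NOW,
       instead of corrupting the heap and crashing months from now. */
    fprintf(stderr, "checked_free: bad pointer %p\n", p);
    abort();
}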
I'm always amused at how many misconceptions folks have regarding floating point data types; ignorance of overflow/underflow, cancellation, etc. As if "simply" using floats absolves them from understanding how their equations (and data) perform. "Where did this outrageous number come from? Clearly that's not the result of the equation that I wrote!" (really? did you think the machine just fabricated random numbers of its own free will??)
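[A tiny illustration -- mine, not from the post -- of the sort of surprise being described, using nothing more exotic than decimal fractions that have no exact binary representation:]

#include <stdio.h>

int main(void)
{
    double sum = 0.1 + 0.2;                 /* neither 0.1 nor 0.2 is exact in binary */

    printf("0.1 + 0.2 = %.17g\n", sum);     /* prints 0.30000000000000004 */
    printf("equals 0.3? %s\n", (sum == 0.3) ? "yes" : "no");   /* "no" */
    return 0;
}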
And, of course, folks who are ignorant of the underlying iron won't see any difference in these two code fragments:
for (row = 0; row < MAX_ROW; row++)          /* outer loop walks rows...            */
    for (col = 0; col < MAX_COL; col++)      /* ...inner loop walks along one row   */
        data[row][col] = (row == col) ? 1 : 0;   /* sequential, cache-friendly accesses */

for (col = 0; col < MAX_COL; col++)          /* outer loop walks columns...         */
    for (row = 0; row < MAX_ROW; row++)      /* ...inner loop strides down a column */
        data[row][col] = (row == col) ? 1 : 0;   /* jumps a whole row per access (C is row-major) */
And, folks who are ignorant of the underlying OS's capabilities will be even MORE clueless!
When you encounter this IN some code, will there be a comment drawing attention to why the particular idiom was employed? Will you be expected to know? Will the original author have known??
When you "lift" the code for some other similar application, will you think about whether or not the assumptions in place in the code BEFORE you lifted it are still valid in the new application? Will YOU leave a comment indicating that?
And, when you go to reuse a piece of software, will you KNOW (or, be able to accurately determine) which of those issues are pertinent to the reuse effort?
If it's all wrapped up in a library, will you even have a CHOICE (will the library document this or will it be another detail that you stumble on after the fact -- and then try to purchase sources for the libraries so you can FIX the problem, as well as identify other similar problems that are lurking and yet to bite you?) Will the library that the NEXT compiler vendor supplies behave similarly?

(some) New refrigerators do inventory control. It's not hard to go from that to an expert system that knows how long particular products (or classes of products) can be kept in particular storage conditions. [This would actually be relatively simple to do with a small set of productions] It would also be easy to know the EXACT conditions incurred by the items in the 'frig (YOUR particular temperature setting, humidification in the crisper drawer, etc.). No more "does this smell spoiled, to you?".
And, from that, to advertisers pitching coupons to you based on their observations of your usage ("He's almost out of milk. Let's arrange for a SERIALIZED coupon for soy milk to appear in the next 'mailing' and we can later see if he actually uses it -- by tracking the serial number on the coupon")
How much is that worth? Or, is it a nuisance? If I could check the contents of my refrigerator while I *happen* to be at the store, can I save myself an extra trip back there, later, to pick up something that I didn't yet know I needed? Can the refrigerator tell me about the latest "recalls" (Trader Joes seems to have a new one each week!) and whether it applies to any items that I currently have in the refrigerator/cupboard? If it keeps me from spending a weekend praying to the porcelain god, how much is that worth to me?
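[For what it's worth, here is a hedged sketch of the "small set of productions" idea mentioned a couple of paragraphs up: a rule table mapping an item class and the storage conditions it actually experienced to a keep-time. The classes and numbers are invented for illustration, not taken from any real appliance.]

#include <string.h>

typedef struct {
    const char *item_class;     /* e.g. "milk", "leafy greens"       */
    double      max_temp_c;     /* rule fires at or below this temp  */
    int         keep_days;      /* how long the item can be kept     */
} production_t;

static const production_t rules[] = {
    { "milk",          4.0,  7 },
    { "milk",         10.0,  1 },
    { "leafy greens",  4.0,  5 },
    { "leftovers",     4.0,  4 },
};

/* Return the keep-time for the first rule matching the item and the
   conditions the refrigerator actually logged, or -1 if no rule fires. */
int keep_days(const char *item_class, double storage_temp_c)
{
    for (size_t i = 0; i < sizeof rules / sizeof rules[0]; i++) {
        if (strcmp(rules[i].item_class, item_class) == 0 &&
            storage_temp_c <= rules[i].max_temp_c)
            return rules[i].keep_days;
    }
    return -1;
}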
On Thu, 21 Apr 2016 10:47:56 -0700, Don Y wrote:

True, true, true, etc. My app was analysing data to produce summary results & graphics for publication in scientific journals. A fault in my programming would not cost or hurt anything but the pride of those who trusted me.

True again. In my case though, a publishing deadline was usually coming up, and the scientist(s) realized that simple equations can be complicated to program, and Excel graphics can be a little short of publication quality without putting a lot of time into learning how to program using VBS.

Very true, especially with language and time zone barriers. I was lucky to be the 'lone wolf' on all such projects.

Money talks. In my case, there was never any hard 'deliverable' that used any of my code. Otherwise there WOULD be legal requirements for reliability and all those things you mentioned previously.

Like I said, deadlines were approaching and there was neither money nor time to let a contract specifying everything in detail. But there was a guy on staff whose job description actually included a section about programming to aid scientific research, and that guy (me) actually had a degree majoring in physics, unlike all of the other computer people who had IT degrees.

This depends a LOT on the semantic context of the surrounding code.

Guilty. But then again, the main onus was to get the job done ASAP.
I found that I could save a LOT of time just by doing bulk QC on the inputs before looking at the real task at hand.

Like I said earlier, I started to assume that the code would be re-visited despite the assurances that it was a quick & dirty request. I did this just to make my own future more pleasant.

The stories I could tell would probably not surprise you. In a previous job, I was involved with contract monitoring for major (>$M) government projects. Also, I was on the team that developed operational code for all the 24hr Canadian Government GOES ground stations for meteorological satellite images. As far as I know, the software ran okay for decades with no problems, even transiting the dreaded Y2K apocalypse.

I did take a "Numerical Approximation" course at college. (I have a CS minor as well.) True, what you say. Some people have no idea. Scientists seem to have a blind spot regarding their own misunderstanding of things they have never studied in detail. In general, this is because they are actually more intelligent than others. But intelligence combined with ignorance is a pretty good match for stupidity.
If you are at all interested in the sort of code I'm happy to put my name on (i.e. not the crappy 'throwaway' code I described earlier), check out the sources available in the 'Programs' section of my personal not-for-profit website referenced in my signature. The Javascript for the gauge demo is in 'wlib.js', the other sources are downloadable.
--
http://mduffy.x10host.com/index.htm

On 4/21/2016 12:50 PM, Mike Duffy wrote:

You (or your "user") also gets a chance to check your results: "Gee, that graph doesn't seem to correlate with my IMPRESSION of the raw data. Let me spot check a few values..."
If my device is monitoring the production of a pharmaceutical product (e.g., tablets) -- at rates approaching 1,000,000/hour -- you can't even BEGIN to think you can sort through even a minute's worth of production (at 200/second) to verify the device hasn't screwed up somewhere in that batch.
If I'm accepting dollar bills from a user (gambler) and screw up and pay out thousands of dollars, erroneously, do you really think the user is going to "complain to management"? :> Wanna *bet* "Management" would complain to *me*! (once they examine the logs and notice the "rate of return" for certain machines is "way down"!)

Yes. We had to go through that exercise with some health data just last night. "I'm SURE there is a way to get the program to produce the graphs we want; why don't they look right?"

As I'm writing this, I'm discussing with a colleague in Germany the draft of a specification that I wrote. Not only do I have to ensure there are no errors in the design of the specification, but I also have to ensure it is clear and unambiguous. *And*, as folks in many nations will be using it as "Gospel", I have to hope I haven't relied on some particular language idioms that don't make sense in some other culture. Or, that the design itself won't be "impractical"/unacceptable in other parts of the world.
I designed a system with a partner firm abroad. Made perfect sense in technological terms and economic terms -- *here*. But, pricing in the US is considerably different for bits of tech than it is in other places! Things that were cheap/affordable here were essentially Unobtanium, there. And, in still other markets, actually subject to export restrictions (how many folks have sat down with their corporate lawyer to determine what aspects of their products might not be legally exportable?)
Some years ago, a friend designed a video game with graphics/characters suitable for the US market. Among his implementation choices was the use of skull & crossbones to signify "player death". (Hey, it's not a REAL death! Chill out!) The game had to be redesigned with different graphics as some markets found the symbol offensive.
You don't learn these things as a lone wolf. Or, in MOST "teams", regardless of size.

I "build THINGS". My code only runs "in a workstation" for test purposes (or under a simulator). It ultimately runs *in* a piece of equipment. There's usually no keyboard or (general purpose) display. No way for me to tell a user that the software "divided by zero" -- or tried to free() unallocated memory, etc.
A software "bug" is interpreted as "IT is broken" -- not "there's a bug in the software that makes it what it is"

I was part of a ~$5M project for a software-driven device... that hadn't considered the need for a "programmer"! They eventually assigned the task to a technician -- because he tinkered with software, at home!
[Typical PHB scenario -- clueless as to what the real issues in a particular project were!]

And that's what happens in most organizations. "We don't have time to do it right -- but, we'll have time to do it OVER!"
I spend about 40% of my "effort" writing specifications -- BEFORE I write a line of production code! I may hack together some code fragments to test the effectiveness and viability of particular ideas before committing them to the specification; but, that code is all disposable, back-of-the-napkin stuff. In most shops, that code finds its way into the product -- because they consider discarding it to be a "waste" of the time that it took to hack together!
(would you draw a sketch of your "dream house" -- and expect to find the final blueprints drawn OVER that sketch??)

Every function/module that I build begins with a stanza of invariants. These formally (re)declare the interface requirements (for example, "denominator != 0") AND ENFORCE THEM! So, if some other piece of code violates that contractual interface, the code "tells me". EVERY TIME the function is invoked!
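[A minimal sketch, assuming C and <assert.h>, of what such a stanza can look like; the function and its contract are made up for illustration:]

#include <assert.h>

double ratio(double numerator, double denominator)
{
    /* Interface requirements, restated and ENFORCED on every call: */
    assert(denominator != 0.0);     /* the caller's half of the contract */

    return numerator / denominator;
}

[Violate that contract anywhere in calling code and the assertion fires immediately, naming the file and line, instead of letting a bad value propagate quietly.]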

Exactly. Employers don't care about how comfortable you are in your job. Nor how effective you are (as a consequence of THEIR policies). You work for them. You put up with your work conditions in exchange for a paycheck. If you ever find the balance unfavorable, you leave!
[This is what drove me to work on my own; the apparent willingness of employers to waste "my life" (i.e., my time) doing and redoing things that they COULD HAVE done right in the first place. But were afraid to do so or unable to recognize. With clients, each job is an agreed upon mutual arrangement. If I don't like what they are asking me to do -- or if they are unhappy with the way I do it -- then we simply choose not to enter into an Agreement. No hard feelings.]
It has been fun to "rib" old employers and clients regarding the bad decisions they made. You don't have to explicitly SAY "I told you so" for them to understand the intent! :>
OTOH, it's disheartening to see them repeat the same mistakes despite acknowledging the truth of that reality! :<

Software (esp prepackaged libraries) is sold like automobiles; all of the "detail" is missing. You're just supposed to think of it as a magical turn-key product. Despite the fact that it all has definite usage constraints. Those constraints are never spelled out (at least you might be able to find them in a vehicle's workshop manual!)
Notice how many "undefined behaviors" are listed in interface specs. And, how many of the nitty-gritty details are just "omitted"!
If I provide a clean heap and perform a series of allocations and releases, can you tell me what state the heap *will* be in? (it's a deterministic machine; you SHOULD be able to predict this!) Ah, "little" details missing from the description of that subsystem, eh? How can you know, at design time, that it WILL meet your needs AT RUNTIME? If you can't predict how it operates (because you don't have the sources; or, if you have the sources but not the test suite; or, the test suite but no temporal performance data; or... :> )
By claiming something is "undefined", the spec writer has made his (and the implementor's) job easier -- but YOURS, harder!
And, the subsystem covered by the specification is "designed and implemented ONCE; reused (in theory) endlessly!" Why make a one-time job easier at the expense of a recurring activity??

"People don't know what they DON'T know". The few managers/clients that I've respected were aware of their limitations. Most are too insecure to admit same.
[Goldberg (I forget his first name) has an excellent treatment of the sorts of issues that folks writing floating point code should be aware of. I can probably find the exact title...]
I have to provide a "numerical capability" to users in my home automation system that doesn't require them to understand the limitations of the implementation. I.e., N * 1/N should be 1 for all N != 0 (among other guarantees). It would be arrogant to expect homeowners (business owners) to have to understand that the order of operations bears on the result of the calculation, etc.
So, my implementation (written ONCE) takes care of that so the users' applications (written MANY times) don't have to!
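[One possible way to honor a guarantee like N * 1/N == 1 -- a hedged sketch of the general idea, not necessarily how Don's system does it -- is to keep user-visible values as exact integer ratios rather than floats:]

#include <stdio.h>
#include <stdlib.h>

typedef struct { long num, den; } ratio_t;      /* value = num/den, den > 0 */

static long gcd(long a, long b)
{
    a = labs(a); b = labs(b);
    while (b != 0) { long t = a % b; a = b; b = t; }
    return (a != 0) ? a : 1;
}

static ratio_t make(long num, long den)         /* assumes den != 0 */
{
    if (den < 0) { num = -num; den = -den; }
    long g = gcd(num, den);
    ratio_t r = { num / g, den / g };
    return r;
}

static ratio_t mul(ratio_t a, ratio_t b) { return make(a.num * b.num, a.den * b.den); }
static ratio_t inv(ratio_t a)            { return make(a.den, a.num); }

int main(void)
{
    ratio_t n = make(7, 1);
    ratio_t p = mul(n, inv(n));                 /* exactly 1/1 -- no rounding */
    printf("7 * (1/7) = %ld/%ld\n", p.num, p.den);
    return 0;
}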

Thanks, I'll have a peek (probably later tonight as I'm running out of "workday hours" with my German colleague).
[To rephrase my comments in light of your last comment, here, I have to be "happy to put my name on" EVERY line of code that I write. Others will judge the quality of my work from it -- after all, *it* is the PRODUCT that I ultimately "deliver"!]

<https://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html> *EXCELLENT* resource! (It's easier to understand if you've actually written a floating point package)
The comments on the quadratic formula should have damn near everyone rethinking how they've "solved that" in the past! :>
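[For anyone who wants to see the quadratic-formula issue Goldberg discusses without digging into the paper, here is a small C illustration -- mine, not Goldberg's code. When b*b dwarfs 4*a*c, the textbook formula cancels catastrophically for the root of smaller magnitude, while the rearranged form keeps it. Compile with something like: cc quad.c -lm]

#include <math.h>
#include <stdio.h>

int main(void)
{
    double a = 1.0, b = 1.0e9, c = 1.0;    /* roots are roughly -1e9 and -1e-9 */
    double sqrt_disc = sqrt(b * b - 4.0 * a * c);

    /* Textbook form: fine for the large root, catastrophic for the small one. */
    double naive_small = (-b + sqrt_disc) / (2.0 * a);

    /* Rearranged form: compute the large root first, then use x1 * x2 = c/a. */
    double q  = -(b + copysign(sqrt_disc, b)) / 2.0;
    double x1 = q / a;                     /* large-magnitude root                  */
    double x2 = c / q;                     /* small-magnitude root, no cancellation */

    printf("naive  small root: %.17g\n", naive_small);   /* typically 0     */
    printf("stable small root: %.17g\n", x2);            /* about -1e-9     */
    printf("large root       : %.17g\n", x1);
    return 0;
}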
On Thursday, April 21, 2016 at 1:48:24 PM UTC-4, Don Y wrote:

And it's your experience and your opinion that companies in those product areas don't value and re-use their existing code base? That with each new product or each new version of a product, they don't typically start with the existing code, and use much of it for the new product? That instead, they pretty much write all new code, each time? That's my take on what you're arguing here.

That's not my experience. Is that how they build products in all those product areas you listed, copied below, from your experience?
"product processes large amounts of

Show us one example. How exactly does this miracle inventory system work? How does it know I just put a lettuce in the fridge, or a carton of eggs, or how many eggs are left, etc?
It's not hard to go from that

<stuff snipped>

As the expert system said "I were one, two." I think I hurt Don's feelings when I said I wouldn't hire him but I wouldn't hire me either! At least not back then when I was a "sole practitioner" and that's *because* I was a sole practitioner.

Some of my most profitable gigs were converting stand-alone versions of programs I had written to networkable ones. Obtuse coding got me the repeat business because they shopped the upgrade elsewhere and were told "it would take us weeks to figure out what it does, let alone make it do something else." It was "ahead of its time" coding - outputting dBase II in neatly formatted boxes using ASCII line commands. We burned out at least one dot matrix a month because the client spec'ed 30 active jobs but there were really 300.
If you're dealing with a limited data entry pool you can skip a lot of bulletproofing, I agree. For something that doesn't face hostile actors time is money and I worked 16 hours a day as a lone cowboy. Making it bulletproof wasn't as important as making it work.
To top it off, big jobs seemed to come in all at the same time so I really had to work as fast as I could all the time because I might lose a big job because I was too busy.

Even if the code isn't great, a second pass is never a bad thing. "Build one to throw away - you will anyway." -- The Mythical Man-Month
I do agree that the first time you have to update your own quick and dirty code is the time you learn to be a little slower and a little cleaner.

Ain't it great to have to rediscover an issue that you solved months or years ago and work around and around it until you remember why you had to do it the way you did?

Just opened a tub of in-date yogurt that had a nice big green mold spot where the foil didn't seal. That would be hard to catch, I suspect. A true self cleaning fridge would be great, though. I've taken to writing the date I open pickle jars and such on the bottom of the jar with a sharpie. Sometimes it shocks me how old some stuff is. (0:
--
Bobby G.



<stuff snipped>

I know it's painful for microcoders to see all the new (usually dirt cheap) chips/chipsets but not be able to use them because the rewrite and retesting effort would be massive. I've only seen it happen when the device is so popular that people think up additional things it could do and there's a potential for greater sales. Even then, the bean counters have been known to say no - better the devil we know.
--
Bobby G.



<stuff snipped>

Or his cat or dog was sitting on the keyboard. Google just spit out an error message about such an incident: it echoed back the long string of nonsense characters and said, simply, "Malformed request."
I once had an operator who consistently missed the shift key and hit CTRL. Took the installation of a keystroke recorder to figure that out because as you might imagine, CTRL plus any number of other keys can cause real havoc.
--
Bobby G.




On 4/18/2016 12:12 PM, Robert Green wrote:

The point is, you don't let a user decide what *you* will accept. You make allowances for what you would reasonably expect. Then, *enforce* those limits -- rather than letting the user exceed them and wreak havoc on your UNSUSPECTING software.
A colleague was just about finished with a project he'd been working on for more than a year. I looked over his shoulder, in passing, and asked what he was working on, at that instant. Then, typed in <something> and hit ENTER -- crashing his program instantly! (keep in mind, he thinks he's pretty much DONE) "What the hell did you just do???" "I typed <whatever>!" "But, you're not supposed to do that!" "Then why did your program LET ME?"
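[In the spirit of "enforce those limits", here is a minimal, hedged C sketch of the idea: decide what you will accept, validate it, and refuse everything else before it reaches the rest of the program. The field -- a quantity from 1 to 99 -- is invented for illustration.]

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Accept an integer quantity between 1 and 99; reject anything else. */
static int read_quantity(const char *text, int *out)
{
    char *end;
    errno = 0;
    long v = strtol(text, &end, 10);

    if (end == text || *end != '\0') return -1;     /* not (only) a number  */
    if (errno == ERANGE)             return -1;     /* overflowed a long    */
    if (v < 1 || v > 99)             return -1;     /* outside OUR limits   */

    *out = (int)v;
    return 0;
}

int main(void)
{
    char line[32];
    int  qty;

    if (fgets(line, sizeof line, stdin) == NULL) return 1;
    line[strcspn(line, "\r\n")] = '\0';             /* strip the newline    */

    if (read_quantity(line, &qty) != 0) {
        fprintf(stderr, "Rejected: \"%s\" is not a quantity from 1 to 99\n", line);
        return 1;
    }
    printf("Accepted: %d\n", qty);
    return 0;
}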
A "potential client" flew me out to look at a system they were developing. Basically, an electronic "lock" (for hotels, etc.). Gave me the grand tour which ended with a presentation of their prototype.
My host gave me a two or three minute demo of how the system worked. I asked if I could play with it; "Go right ahead!"
(keep in mind I have never seen this before two or three minutes earlier)
Standing sideways (so my host could see everything that I was doing), I proceeded to instruct the system to make me a "Grand Master" key (one that would allow me into any lock). I'm sure my host thought this was just the sort of thing someone "playing" might want to do...
I went through all the steps, mimicking what he'd shown me a few minutes earlier. Then, just before actually MAKING the key, I unplugged one particular connector. My host was suddenly nervous. I just smiled...
...and made a Grand Master key.
Then, proudly held it up for him to see.
And kept smiling as I made two more!
Then, plugged in the connector and hit CANCEL. Stood aside for him to read the message on the screen: "Operation canceled. 0 keys made" as I held the three keys out for him.
"You're not supposed to do that!" "Then why did IT let me?"
People get too focused on trying to make something sort-of work and ignore how it *should* work. And, because they RARELY sit down and write a specification that they (and others) could review BEFORE implementation, they leave these sorts of gaping holes all over!

On Sunday, April 17, 2016 at 10:43:25 AM UTC-4, Ralph Mowery wrote:

display. And then I pushed the buttons to turn it on. Amazingly, it started and ran. After a few hours it

Some microcontrollers have a built-in watchdog timer, for use in critical apps. Once activated, the software has to write a certain data value to a certain I/O location to keep it from forcing a reset. Good chance that kind of feature would have saved Philo's food.
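[A hedged sketch of how that typically looks in firmware. The register names (WDT_CTRL, WDT_KICK) and the magic value are placeholders -- every microcontroller family does this differently -- but the shape is the same: enable the watchdog once, then "kick" it from the main loop; if the code hangs, the kicks stop and the part resets itself instead of sitting dead until someone pulls the plug.]

#include <stdint.h>

#define WDT_CTRL  (*(volatile uint32_t *)0x40000000u)   /* hypothetical enable register  */
#define WDT_KICK  (*(volatile uint32_t *)0x40000004u)   /* hypothetical service register */
#define WDT_MAGIC 0xA5u                                  /* hypothetical "pet" value      */

static void wdt_enable(void)  { WDT_CTRL = 1u; }
static void wdt_service(void) { WDT_KICK = WDT_MAGIC; }

int main(void)
{
    wdt_enable();                       /* once armed, it usually can't be disarmed */
    for (;;) {
        /* do_refrigerator_work();        normal control loop                      */
        wdt_service();                  /* must happen before the timeout expires  */
    }
}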
<stuff snipped>
<<Some microcontrollers have a built-in watchdog timer, for use in critical apps. Once activated, the software has to write a certain data value to a certain I/O location to keep it from forcing a reset. Good chance that kind of feature would have saved Philo's food.>>
Agreed. There are things they should do/could do to work around this problem and if it happens to enough people to warrant a recall, they probably will. It depends on how often it occurs and under what conditions.
A simple power blip should not cause a fridge to fail to restart. There could be compressor protection issues involved, but even those could be worked around. Even surges could be damped at the device itself.
If it was my refrigerator, I would be trying to duplicate the problem (on video) and if I could easily do so, I would be making life difficult for Whirlpool until they resolved the matter.
--
Bobby G.



On 4/17/2016 7:48 AM, Art Todesco wrote:

Wow, that is completely unacceptable. We bought a new Samsung French door and I may just cut the power to see what happens. To not reset is a major problem if you are away for a day.
I'd certainly try to find out what the problem was. Even though they have exclusions in the warranty, if it is a design problem you may be able to get something out of them for your losses. Refrigerators should come back on with no intervention.
wrote:

It evidently isn't just a Whirlpool problem. I did a search for refrigerator power glitch lockup. Samsung refrigerators can do the same thing. <http://support-us.samsung.com/cyber/popup/iframe/pop_troubleshooting_fr.jsp?idx2312&modelname=RM257ABRS/XAA> or http://alturl.com/e6ydi
--
Using Opera's mail client: http://www.opera.com/mail/

On 4/17/2016 10:16 AM, Dean Hoffman wrote:

That makes me feel just great. Bought a Samsung a few months ago. At least a couple of times a year the power goes out from something, even though it is usually just a couple of minutes.

Thanks for reporting the experience. Let us know how it turns out.
--
Web based forums are like subscribing to 10 different newspapers
and having to visit 10 different news stands to pickup each one.
On 4/17/16 7:48 AM, Art Todesco wrote:

Wonder if the problem is the microprocessor got zapped from a power surge that accompanied the power failure?
--
“I’m interested in making sure we get the maximum amount of revenue from
those who can well afford to provide it.”
On 4/17/2016 4:48 AM, Art Todesco wrote:

It's just a bad design.
Unfortunately, most people "writing code" are "programmers", nowadays. Writing software for a "computer" and writing software for an "appliance" (embedded system) are entirely different experiences and require very different skill sets. Just because there's a computer *in* it doesn't MAKE IT a computer!
[Amazing how many people are naive about this -- which is exemplified in your reported experience. Programmers and managers think "code is code". Their goal is to get something that *looks* like it's correct in the least amount of time and for the least number of dollars -- out of the greatest number of interchangeable brains.
Just like doing the books for a small Ma&Pa firm should be the exact same sort of experience as doing the books for a large multinational: ledgers are ledgers, right? Or working on a lawnmower engine the same as a big V8?]
And, of course, designing the *hardware* for an embedded system is an entirely different issue than designing (or *buying*, COTS) the hardware for your "computer"
Install a normally closed pushbutton switch in the power line to your PC. *Briefly* push it to interrupt the power to the PC and see what happens. Will it reboot? Or not? (I have machines that fall into both categories). The more oversized the power supply (i.e., UNDERSIZED your current utilization), the greater the chance that the computer will weather the outage -- until, of course, the outage is long enough that the power supply can't maintain the load!
If it reboots, it *won't* bring all your applications back and leave you in the exact same state that things were in when the power "glitched". It won't even remember whether you had NumLock and CapsLock on or off on your keyboard (Gee, how hard would it be to remember whether two *bits* were 0 or 1?), etc. The "computer" expects YOU to fix things to your liking.
An appliance, OTOH, is expected to REMEMBER what it was doing and continue as if nothing had happened!
As appliances have to be able to run 24/7/365, they have to address these sorts of problems (and others).
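[A hedged sketch of the "remember what it was doing" half of that: keep a small state record in nonvolatile memory and reload it at power-up. eeprom_read(), eeprom_write() and crc32() here are hypothetical stand-ins for whatever the real hardware and runtime actually provide.]

#include <stddef.h>
#include <stdint.h>

typedef struct {
    uint8_t  setpoint_c;          /* the user's temperature setting          */
    uint8_t  compressor_was_on;   /* what we were doing when power went away */
    uint32_t crc;                 /* integrity check over the fields above   */
} nv_state_t;

extern void     eeprom_read(void *dst, uint32_t addr, uint32_t len);        /* hypothetical */
extern void     eeprom_write(const void *src, uint32_t addr, uint32_t len); /* hypothetical */
extern uint32_t crc32(const void *p, uint32_t len);                         /* hypothetical */

#define NV_STATE_ADDR 0u

int restore_state(nv_state_t *s)          /* call early in power-up            */
{
    eeprom_read(s, NV_STATE_ADDR, sizeof *s);
    if (crc32(s, offsetof(nv_state_t, crc)) != s->crc)
        return -1;                        /* record corrupt: use safe defaults */
    return 0;
}

void save_state(const nv_state_t *s)      /* call whenever the state changes   */
{
    nv_state_t copy = *s;
    copy.crc = crc32(&copy, offsetof(nv_state_t, crc));
    eeprom_write(&copy, NV_STATE_ADDR, sizeof copy);
}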
When power is applied, the entire circuit must stabilize before the processor can "come out of reset". There may be several voltages that have to stabilize in order for it to be able to perform its intended function. In addition, there are usually very precise constraints on exactly HOW these voltages stabilize: X must be "up" to its tolerance before Y by at least Z milliseconds -- but no more than W!
When power *nominally* goes away, all of these supplies drop to 0 in a predictable manner. And, the process can repeat.
Exceptions to these "textbook" sequences can actually result in electronic devices "latching" (where nothing the processor will do CAN undo the effects, as it happens IN the semiconductors).
You can design to handle "blackouts" (total loss of power), "brownouts" (reduced power) and "dimouts" (brief outages). But, you have to do this deliberately: "how should I react in each of these cases?"
And, in each of these scenarios, there are "I/O's" that must be treated in specific ways. E.g., you can't just let the compressor turn on while you "gather your wits" -- thinking that you can turn it back OFF once you're refocused on the task at hand (what happens if you never get back to "normal"? Do you let the compressor stay in that running state? How do you stop it if you've crashed??)
Likewise, you can't just power up and decide: "Oh, refrigerator is warm, let me turn on the compressor, as intended!" Because you *don't* know what happened while power was off -- nor how long it was off (unless someone has deliberately taken extraordinary design measures to provide this information to you!), you can't know if the compressor WAS on just a few moments ago. In which case, your turning it on now will overstress the motor (trying to start into a large load).
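[One common defensive measure implied here, sketched in hedged form -- the delay value and the I/O helpers are assumptions, not any particular vendor's code: since the controller cannot know how recently the compressor was running before the outage, it enforces a minimum off-time after every power-up before the relay is allowed to close.]

#include <stdbool.h>
#include <stdint.h>

#define COMPRESSOR_MIN_OFF_SECONDS 300u    /* assumed 5-minute anti-short-cycle delay */

extern uint32_t seconds_since_boot(void);  /* hypothetical monotonic timebase  */
extern void     compressor_relay(bool on); /* hypothetical output driver       */

void compressor_request(bool cooling_needed)
{
    /* Power may have glitched moments after the compressor was running at full
       load; restarting into that pressure head can stall the motor.  So: stay
       off until a minimum off-time has provably elapsed since THIS power-up. */
    if (cooling_needed && seconds_since_boot() >= COMPRESSOR_MIN_OFF_SECONDS)
        compressor_relay(true);
    else
        compressor_relay(false);
}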
On top of all that, you have to hope that the software never encounters an anomaly that it isn't prepared to handle (in the simplest sense, this is a "bug"; but it might actually be some characteristic of the system that rarely occurs and the designer wasn't prepared to address -- e.g., an underdamped system "rings"): "Gee, I never saw that before! But, yeah, the theory SAYS it can happen...".
Finally, you have to hope the system never glitches due to random failures. Even "transient" failures. E.g., alpha particle radiation can "flip" bits in memory as can "cosmic rays". Likewise, "electrical noise" can cause signals to appear to be other than they truly are. Your PC probably has ECC memory in it to detect and correct these intermittent errors; most appliances don't! (cost constraints)
[Most "programmers" are clueless on all of these issues. "I stored a 27 in this memory location, so, there *will* be a 27 there when I go to look it up, later! Right?"]
My early 1970's avocado green GE is getting all frosted up again after 6 months or maybe a year... which is kind of to be expected since it wasn't a frost-free refrigerator to start with. Along these lines, I've got a fancy-arse Samsung big-screen smart TV with a great picture, but I don't use the sleep timer because it takes too many clicks in various places to set it with the remote and it's just too much trouble, so I let the damn thing stay on all night. But I have an older Emerson with a sleep button right on the remote that I can and do use.
On 4/17/2016 12:18 PM, My 2 Cents wrote:

SWMBO relies heavily on the sleep timer for her "stereo". But, it is implemented with ease of use in mind:
- turn on the power
- press the button repeatedly until the desired time is displayed
She knows that pressing the button 10 times will result in a 2:00 delay. The button has lots of tactile feedback so you *know* when it has been pressed (not true of all remotes). So, she doesn't even have to look at the display.
Biggest user interface "FAIL" I've encountered was a MUTE function that automatically UN-muted when you "did anything"! So, if you muted the TV to field a phone call (or, allow someone else in the room to do so), you couldn't channel surf -- without hitting the DOWN VOLUME key, repeatedly (the first press will unmute the TV!) before you start flipping channels.
