Whirlpool fridge locks up during an apparent power glitch

On 4/20/2016 8:02 AM, Vic Smith wrote:

Have you heard of iOS and Android? Together, they represent 93% of the mobile market. "Windows Phone" accounts for all of 2.5% of that market.
Do you understand that desktop sales are (and have been) in decline? To the tune of 5-10% annually? And, that in many countries, mobile devices now exceed the number of desktop devices? I.e., those old copies of Windows have been dispatched to the bitbucket -- along with the hardware on which they ran.
<http://www.computerworld.com/article/3050931/microsoft-windows/windows-comes-up-third-in-os-clash-two-years-early.html>
"In Gartner's current forecast, Windows will dip 3.4% in 2016 to 283 million devices shipped while Apple's OS X/iOS will climb 2.1% to 303 million. By the end of the forecast window -- the calendar year 2018 -- Windows will be even farther behind, shipping 298 million devices compared to Apple's 334 million."
(Note the use of the units "Million")
"Overall, Gartner's latest forecast continued the trend of pessimism not only for Windows, but for all device shipments. The researcher now believes 2.41 billion computing devices will ship in 2016 -- 80.4% of them smartphones -- compared to a same-year prediction of 2.46 billion made in the fall of 2015."
(Note the use of the units "Billion")
Sure looks like Windows has long passed its peak. With the "replacement" being iOS and Android.
On Wed, 20 Apr 2016 16:27:59 -0700, Don Y

I was referring to desktop/laptop OSes. I don't use other OSes. My wife has an Android phone though, and it seems to work.
On 4/20/2016 8:18 PM, Vic Smith wrote:

The fact that desktop/laptop OS's are in decline suggests the market has moved past them in favor of other implementations -- that the market considers "more attractive".
On Wed, 20 Apr 2016 20:44:49 -0700, Don Y

More like "If I can do what I do on a phone (browse the internet) why should I pay up for a PC?" Many people used to buy a PC solely to browse the internet and steal music. Now they can do that on a cheap phone - if their eyes are good enough. I built PC's for all my kids, but 4 of 7 now rely on mobile devices only. Desktops/laptops aren't going away. The market is simply adjusting. And Windows remains the king OS of desktops/laptops.
On Wed, 20 Apr 2016 23:48:33 -0500, Vic Smith wrote:

You named your kid after a Star Trek character?
--
http://mduffy.x10host.com/index.htm

On 4/20/2016 9:48 PM, Vic Smith wrote:

Or send email. Or, look at photos -- that they took on their CAMERA. Etc.
The phone offers the same sorts of capabilities without the bulk and lack of mobility that the desktop/laptop imposes.
And, you can make phone calls/SMS with your phone (a relatively rare activity for a laptop/desktop).

The office space is where desktops remain popular. If you look at usage patterns for work days vs. weekend, you can see this most prominently. On the weekend, folks aren't sitting down at their laptop to steal music, browse the internet, review their photos (taken on a different device), teleconference, etc.
You'll eventually see a "dock" for phones that lets you use the computational horsepower in the phone with the user interface of a desktop. This will offer a cheaper alternative to businesses TRAPPED into MS's perpetual hardware and software upgrade cycle. It will also allow employees to "take their desk with them" when they walk to another part of the building, GO HOME, etc.
"The Cloud" has just as much appeal to a mobile user as to a desktop user.
Windows has been in the phone market (and the PDA market before that). And, the phone market has opted to replace it -- with "something better". Despite the fact that Windows was there, first!
On Wed, 20 Apr 2016 22:06:34 -0700, Don Y

We do that - save the music stealing - all the time on our desktops at my house. My wife and MIL are on Skype teleconferencing with friends and relatives in Poland. But I spend quite a bit of time gaming too.

People are always upgrading their phones, more often than I undergo a PC upgrade. PC upgrade cycles have lengthened quite a bit. I think I've been upgrading every 5 years or so, due to the technological demands of games.

Don't use any cloud storage.

I guess they miscalculated somehow.
On 4/20/2016 10:56 PM, Vic Smith wrote:

We "visit" with people with whom we want to chat. Our long distance costs are in the single digits for any given year (e.g., a $20 phone card). SWMBO spends a few hours a year on the phone with her sister; usually just before or after one of the sister's trips ("China was great! I think I'll head back there, next year! But, I'm busy packing for Africa, next week...")
Most of the folks with whom I converse are local and/or I see them regularly. Clients are invariably out of town and contact is via email (phone calls take up too much time and usually result in either or both parties waiting for the other to think something through; email forces you to do your thinking up front!)
I use PC's to rip CD's and convert between file formats. And, to access and maintain my music archive (e.g., so I can copy selections onto particular PMP's). My music tastes haven't changed in decades so there is very little that is "available" that I don't already have (though I have many LPs that I still need to rip). But, I don't LISTEN to any of that music "on" a PC. Even when working *at* a PC, I'll defer to a nearby PMP, instead.

I see businesses replacing entire fleets of PC's every 3 years. I've known some to do so every 18 months! The local University regularly pushes reasonably new machines out as grants (typically a year or two) come to an end (the terms of the grant usually require any equipment purchased to be sold at auction -- you can't just "inherit" the equipment when the grant expires).
Friends and colleagues seem to similarly "upgrade" every few years (this is how I acquired most of my laptops).
I am slow to update simply because of the cost (in time) to do so (the effort required to install all of the applications, fetch new licenses, etc.) and the potential for losing capabilities that always ensues:
- lack of driver support for a particular piece of hardware in the new OS (rendering that hardware prematurely obsolete)
- lack of support for a particular application
- lack of particular hardware capabilities in the new machine (no bus slots, or not the right types of bus slots)
- the vendor not making new licenses available for a legacy application
I am contractually obligated to support each of my past clients/projects "indefinitely". As most folks aren't prudent enough to preserve their development environments, INTACT, moving forward (so they can revise an existing product "at a moment's notice"), it behooves me to make sure *I* can -- failing to do so leaves me reliant on *their* efforts.
[If a casino operator claims "guests" (gamblers) have found an exploit that is allowing them to STEAL hundreds of dollars every MINUTE -- from EACH machine -- I need to be able to defend *my* implementation: "Here's a copy of the code *I* delivered. Show me how the exploit affects *it*? Ah, so the exploit is only applicable to the codebase that YOU modified! And, how is this MY problem? Why would *I* be a logical choice to troubleshoot a problem introduced by YOUR staff?" (No, I'm not interested in that job, thankyouverymuch -- it's neither interesting nor rewarding, for ME)]
And, as most applications (esp anything that runs under Windows!) are incapable of supporting "previous file formats", implementing some "little change" might prove to be a major ordeal! E.g., I was asked to change the type of battery used on an old design. This is a no-brainer job -- replace one battery holder with another. There's no real "engineering" involved. And, there are only two "wires" (contacts) that the PCB layout would have to accommodate.
Plus, any client would KNOW that sort of thing. Refusing to "lend a hand with this simple change" (while not a contractual obligation) starts to look "pissy": "You can't afford to give us an HOUR of your time? Even if we pay you for the *day*?"
Do you explain to the client that you'll have to resurrect some ancient application -- and the OS on which it ran! -- in order to make that change? So, that "hour" is really more like a *day*?
Do you claim that you don't even HAVE that capability available? And, have the client thinking that you're "irresponsible" -- despite the fact that THEY ALSO have lost that capability?
Better business practice is to HELP the client, earn additional good will, be seen as "resourceful" and "reliable". And, with that in mind, take it on myself to make these sorts of activities "less expensive" -- to me, personally.
[I'll still have to bear the cost of being diverted from my CURRENT project to address this "more pressing issue". A day is easy to swallow. SEVERAL days? Not so much so. Current clients might grumble -- to which you reply: "If YOU contact me a year hence with a similar request, wouldn't you want me to make time for your 'pressing need'?"]
I don't play games online or on any of my PC's. I'd rather turn on an (arcade) pinball machine or video game -- why set aside the space for them if you aren't going to PLAY them? :> (there's also a certain nostalgia involved -- an entirely different sort of user experience)

Nor do I -- except to occasionally post photos for others to access in "public" discussions so I don't have to email to multiple recipients; I tend to guard my email accounts pretty jealously -- discarding them frequently (save for my business accounts)
But, the cloud has appeal to businesses. I see many adopting it -- even those folks who are paranoid about the security/privacy issues involved. People (esp businesses) have a hard time saying NO to something that is (relatively) "free".
[In this case, "free" pertains to the capability to access that data from a multitude of locations, simultaneously. I.e., we are repeating the perpetual "big-centralized-computer-with-terminal-clients vs. distributed-smart-workstations Flip-Flop that the industry has experienced since its inception. Now, the "terminals" are PC's (and phones) while the "big centralized computer" is The Cloud.]

They "missed" the mobile OS market AND appear to have missed the mobile application market! Just like they missed the "advertising" aspect that the 'net presented.
On Thursday, April 21, 2016 at 1:56:45 AM UTC-4, Vic Smith wrote:

I think what you really missed was when Microsoft was in the cell phone business with Windows before the cell phone companies had cell phones out that supported data, email, etc under their own software. I missed it too, because it never happened.
On Thursday, April 21, 2016 at 12:48:38 AM UTC-4, Vic Smith wrote:

I mostly agree. In the early days of the PC in the consumer market you typically had either no PC or one PC per home. As the market grew and costs came down, that shifted to many homes having multiple PCs. Smartphones and tablets have now changed that, with those replacing some of those multiple PCs. But I'd sure still want at least one real PC in my house. And Windows has about 10%+ market share in tablets, so there is some Windows presence in that market too. IDK what will ultimately become of Windows, but it's going to be here for a long time to come. And I also agree that for what it does, it works well for me and I have few complaints, nor do I see anything else that does as much, supports so many different hardware configurations, etc.

<stuff snipped>

That probably has nothing to do with software superiority or attractiveness and everything to do with a laptop not being able to fit in your shirt pocket or purse.
--
Bobby G.





wrote:


It's too massive an effort. Google Chromebooks and Android phones skim the email/browser end of the market but CAD/CAM, DTP and lots of other tasks will still be done on PC's.
MS didn't foresee the growth of the mobile end of the market and was too wedded to the Windows interoperability paradigm to come out with a clean, innovative OS for phones. Ironically, they did come out with a tablet version of Windows very early on, when tablets weren't quite portable enough to penetrate the market.
Apple's got it easy compared to Windows. iOS only has to support Apple's carefully controlled hardware. Windows has to support hundreds of machines and configurations. Android has it even easier, having only to support phones and tablets in limited ways. AFAIK, you still can't even hook up a scanner to a Chromebook, so they are not really Windows replacements, just cream-skimmers. (-:
--
Bobby G.



<stuff snipped>

I'm pretty sure that's firmware, not software. (-:
<stuff snipped about how many devices have CPUs of some sort in them>

But these are usually very close to trivial programs that operate with little user input and that interact with relatively few sensors - at least soldering irons and other items like that. And I would expect that as with new cars, updating will be more and more feasible because so many new appliances will be web-enabled.
It's still quite unfair to compare a microwave's program to something like Linux or Windows that has to host a nearly infinite variety of applications on hardware that's very, very variable. It's largely a function of the number of lines of code. As that rises, the likelihood of finding all the bugs drops in proportion.

How much extra care? If it's extra care, doesn't that imply extra cost? Which gets us back to the comments I took issue with. To be more careful requires more cost and more time. Companies reuse code and build code libraries to be competitive. If they had to write everything from scratch every time it would be prohibitively expensive. Can a "sole practitioner" afford to take more time and care? Yes, of course, but even then the more time you spend on one program the less you can spend on other programs (and sources of income). So making programs as perfect as you can make them cuts into your revenue stream. Is it worth the tradeoff? Maybe it is if your client reputation shines as a result and you get more business and clients who are willing to pay for the increased reliability.
--
Bobby G.



says...

As MS had about 15 years to fix the bugs in Win XP and never did seem to get them all, then mentioned that it was going to quit supporting it and go to a new operating system, it makes one wonder how bad Win 10 is going to be.
I remember when the calculator in WFWG or so had a big bug in it. Maybe it was DOS 6.x instead -- long time, and I don't recall -- but you could put in something like 6.1 and subtract 6.0, and whatever the numbers were, you got zero instead of .01. It stayed that way for a long time.
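For the curious, that kind of slip is easy to reproduce: neither 6.1 nor 0.1 has an exact binary representation, so the subtraction comes out a hair under 0.1, and a display routine that truncates instead of rounding will show 0 in the first decimal place. A minimal C sketch (purely illustrative -- not the actual calculator code):

#include <stdio.h>

int main(void)
{
    double diff = 6.1 - 6.0;

    /* Neither 6.1 nor 0.1 is exactly representable in binary,
     * so the difference lands just below 0.1. */
    printf("%.17f\n", diff);            /* 0.09999999999999964 */

    /* A display routine that truncates rather than rounds will
     * report the first decimal digit as 0, not 1. */
    printf("0.%d\n", (int)(diff * 10)); /* prints 0.0 */

    return 0;
}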
On 4/18/2016 3:35 PM, Ralph Mowery wrote:

Look through the descriptions of each patch ("update"). Dig deeper to see the actual underlying causes. Invariably, buffer overrun errors (or something similar).
Jeez, how many times do you have to get bitten by this same problem before you do *something* to prevent it from happening again? How many times are you going to get kicked in the nuts before you start wearing a cup?
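For anyone who hasn't bumped into the bug class first-hand, here is a minimal C sketch of the pattern: an unchecked copy into a fixed-size buffer versus the bounds-checked equivalent. (Illustrative only -- the function names are made up.)

#include <stdio.h>
#include <string.h>

/* Classic overrun: the caller controls how much lands in buf. */
void parse_unsafe(const char *input)
{
    char buf[16];
    strcpy(buf, input);    /* no length check -- writes past buf
                              whenever input exceeds 15 characters */
    printf("%s\n", buf);
}

/* Same job with an explicit bound -- the software equivalent of
 * "wearing a cup". */
void parse_safe(const char *input)
{
    char buf[16];
    snprintf(buf, sizeof buf, "%s", input);  /* silently truncates */
    printf("%s\n", buf);
}

int main(void)
{
    parse_safe("a string considerably longer than sixteen characters");
    /* Calling parse_unsafe() with the same input is undefined
     * behavior -- the essence of a buffer-overrun vulnerability. */
    return 0;
}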

(s.b., 0.1)
Exactly. They'll rewrite the same software (functionality) over (and over!) again and make the same mistakes over (and over!) again. Quality never improves. Because they start digging each hole *fresh* instead of finishing off the one they started, previously!
It's like having a serious illness and switching doctors after he's run countless tests -- and ruled out many potential problems -- just to start all over again with a new doctor and the same tests! (and dumping THAT doctor at the same point that you dumped the previous)

Clearly you've not worked on a large team programming effort. It's easy to spot because of how you say "I design, I program, I etc." I would NEVER hire you because of that. If you get sick, who fills in for you?
What you describe keeps happening because folks like MS *have* to reuse code. To fix some of the problems that have occurred in Windows or Linux often requires patching because starting from scratch would be prohibitively expensive. Change something in the kernel and everything dependent on the old structure could break.

That's certainly NOT true. Even though code size increases dramatically, major companies like Google, MS and Oracle keep very close track of the number of bugs they have to fix. We both know that buffer exploits are becoming more and more rare as coders learn to pay more attention to parsing input effectively. They learn that by looking at bug reports and trying to ensure they DON'T make the same mistakes.
https://www.blackhat.com/presentations/bh-usa-07/Ferguson/Whitepaper/bh-usa-07-ferguson-WP.pdf
claims that "traditional style overflow [have] become more and more rare" and that tracks with the exploit reports that I see from sites like:
https://www.qualys.com/research/top10/
Remote code execution and privilege elevation seem to be the winners nowadays because coders have finally learned to parse input correctly.
This article from Wikipedia notes that adding protection, especially via software, can add significant overhead - translation: there are real costs associated with armoring programs.
<<In computer security, executable space protection is the marking of memory regions as non-executable, such that an attempt to execute machine code in these regions will cause an exception. It makes use of hardware features such as the NX bit, or in some cases software emulation of those features. However technologies that somehow emulate or supply an NX bit will usually impose a measurable overhead; while using a hardware-supplied NX bit will impose no measurable overhead.>>
So it's not only coders that have to up their game, it's machine makers, too.

Hmmm. Seems like we're talking OS's now and not just refrigerator firmware. (-: It doesn't really matter for the points I've been trying to make, chief among them that taking extra care means taking extra time and costing extra money, and that corporations and most clients have resource and time constraints.
I've written lots of SW and managed lots of coders. Perfect software isn't a matter of just trying harder, it's like anything else in the world - a tradeoff between cost and time that is bound to affect the quality of the finished project. Did people stop using WFWG because of a calculator bug? Doubtful.

Just had this happen to me. The first CAT scan didn't reveal a problem but the second did. Sometimes a new doctor and the same tests can find a problem that the old doctor couldn't.

Which is why the idea that someone like MS will throw away less than perfect code to try again for perfection from scratch just doesn't fly. Once an OS is released, a lot of other programs depend on it being consistent. Every change or bug fix is likely to create (and often does create) a host of unexpected interactions. Who here *hasn't* been hosed by an update at one time or another?
It doesn't sound like much of the code you create gets exposed to legions of hackers (or even users) like IOS, Windows and Linux. Code you *think* is perfect (like the old Army adage about battle plans) rarely survives contact with the enemy.
That's why I make the claim I can break anything you've written (that's non-trivial), especially with access to the source code. Especially if it has to "face" the net. I've never seen a perfect coder. I've seen some very good ones, and that could be you, but I've also seen far more coders who *think* they are top-notch. However, a super-smart, super-careful and super-efficient coder is rare, and if he becomes "unavailable" for any reason, it's hell on wheels to find anyone who's smart enough to understand his code.
--
Bobby G.


On Wednesday, April 20, 2016 at 9:50:22 AM UTC-4, Robert Green wrote:

Never mind MS and Oracle. It's not true even for embedded applications, like the fridge or a piece of communications gear. Unless the existing code is a total wreck because it was done by amateurs or there is some specific need to redo the whole thing, the existing software is typically what's used as a starting point. Being able to keep and reuse that existing code is usually the top priority, over anything else. For example, hardware engineers might want to switch to a different microprocessor based on performance, cost, power, etc. Software managers, project managers, engineering management will overrule them because the cost and investment in new software, starting from scratch, exceeds those other considerations by an order of magnitude. Sure, there are exceptions, if it's some simple design, where the software is no big deal. But even for a fridge, the natural starting point would likely be the existing code, not starting with new code.
<stuff snipped re: code reuse v. starting from scratch>

It's actually that way two stories up, so to speak, because almost all programming languages I'm familiar with come with code libraries of some sort. OS's like Windows also have libraries as well. DLL stands for Dynamic Link Library. So reusability is important at multiple levels. I believe the opposite of what I think Don is saying. It's libraries that introduce stability to code and provide a similar user experience. It drives me crazy that phone/tablet apps are all over the map. It's apparently no longer trendy to put a close dialog X in the corner, where everyone in the world might look first to close a window. Sigh.

It may not be for a sole practitioner like I assume Don to be. If each project is a one-off deal, why not code with home-grown everything? The catch is that takes 10 times as long and not many people are willing to pay for that in either time or money.


When I read that, I know you've been involved in such a project. I have, too. It's very, very seductive to want to change to a new, more capable microprocessor, but in most instances I know of, if there was an upgrade it was to the same chip family so that old code could still run.

One thing that I think could trigger a "fresh start" migration would be to make such a fridge web-enabled where you had to switch to a controller that could interface with ethernet or WiFi. It's been my experience that the micro is spec'ed for the original task and adding WiFi support will overtax it. At least I can remember my microcoders complaining frequently about being out of interrupt lines, memory or time.
The fridge is one device that can make use of the net to report out of bounds conditions on the unit to someone who can come and investigate. There are already monitors to do that, so integrating them in the future is likely. Consumers like Art will discover they have saved a lot of money when the fridge texts their cellphone and says "I'm getting warm" and they can intervene in time.
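As a sketch of what that might look like, here is a small, hypothetical C monitoring loop; the sensor read and the alert hook are stand-ins for whatever hardware and WiFi/SMS stack a real unit would use.

#include <stdio.h>

#define ALARM_THRESHOLD_C     7.0  /* alert above roughly 7 C */
#define ALARM_HOLDOFF_SAMPLES 5    /* ride out brief door openings */

/* Stub sensor: pretend the cabinet is slowly warming up. */
static double read_cabinet_temp_c(void)
{
    static double t = 4.0;
    t += 0.5;
    return t;
}

/* Stub alert: a real unit would push this over WiFi/SMS. */
static void send_alert(const char *message)
{
    printf("ALERT: %s\n", message);
}

static void monitor_step(void)
{
    static int warm_samples = 0;

    if (read_cabinet_temp_c() > ALARM_THRESHOLD_C)
        warm_samples++;
    else
        warm_samples = 0;

    /* Only alert once the temperature has stayed high for several
     * consecutive samples, so a door held open for a minute doesn't
     * page anyone. */
    if (warm_samples == ALARM_HOLDOFF_SAMPLES)
        send_alert("Refrigerator cabinet is getting warm");
}

int main(void)
{
    for (int i = 0; i < 20; i++)
        monitor_step();
    return 0;
}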
--
Bobby G.



On Thursday, April 21, 2016 at 12:26:26 AM UTC-4, Robert Green wrote:

Because even there, if the "new" product is similar to the existing one, which is often the case, it's easier, faster, cheaper to use the existing code. Again that assumes that it's not a complete hack job, that it's documented, etc. If the new project shares little in common with the existing, then you start from scratch. But those are pretty much the same rules for any project. For example, if you have a driver for a communications interface, be it USB, Ethernet, Wifi, etc, why re-write that?

That would depend on who's paying the bill and what they are paying for. More typical is for anyone who owns the rights to the software to reuse what they have a right to reuse and charge what the market will bear. In other words, if I already have software for a project where it will save 80% of the time of development, I would use that, then come up with a price for the whole thing, including the new work.

Absolutely. In many markets, the cost of one CPU vs a competing alternative could be huge, but they won't port the code over, because the cost of doing that, the time, the validation, etc would be staggering. The cost of the CPUs is almost irrelevant. Now, if it's some simple app, on some non-critical consumer $10 item, then the rules obviously can change. But even there, if it's say some child's toy, I would expect Mattel or whoever builds it for Mattel to reuse what they have, provided it's similar.

Exactly. There the chipset that supports the WiFi may have a CPU that could also do the other simple control stuff to run the fridge. That's justification for starting a new software design.

That's a good point. Was talking with a friend the other day about "The Internet of Things" and WTF that meant. I cited hooking up thermostats and alarms as examples. I was having trouble thinking of another example; I did think of fridges, actually, but dismissed them, not thinking of the out-of-temp alarm possibility. So, yes, fridges are an example. I'd pay some small premium for a fridge that had a WiFi alarm feature.
On Thu, 21 Apr 2016 06:29:27 -0700 (PDT), trader_4 wrote:

A large part of my career was spent as what you call a "code cowboy": working completely by myself on small programs (a few days, a few weeks, occasionally a few months) which were ALWAYS specified as being one-shot deals. No need, then, to extensively test for inputs outside the range of the given datasets, no need to document for easy re-use by myself, and no need to collaborate with others.
Most of the time though, a request would eventually be made to resurrect the code with modified requirements, or input data that had not been subject to the same levels of quality control as previous runs, etc.
The only thing more vexing than code that is difficult to follow is when you can't even complain to the author about how shitty it is.
So I got into the habit of using descriptive variable names, adding comments, documenting formats, and most importantly, little messages to my future self about why a seemingly simpler method cannot be used.
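For illustration -- hypothetical code, not from any real project -- this is the sort of note in question: a comment that tells a future maintainer exactly why the "seemingly simpler method" is off-limits.

#include <stddef.h>
#include <stdio.h>

/* sum_payment_cents(): totals the month's payments.
 *
 * NOTE TO FUTURE SELF: do NOT "simplify" this into a single
 * double-precision accumulator.  The input files carry amounts as
 * integer cents precisely so that totals reconcile to the penny;
 * the floating-point version is the seemingly simpler method that
 * was rejected here.
 */
long sum_payment_cents(const long *payment_cents, size_t count)
{
    long total_cents = 0;

    for (size_t i = 0; i < count; i++)
        total_cents += payment_cents[i];

    return total_cents;
}

int main(void)
{
    long january[] = { 1999, 2500, 1250 };  /* $19.99, $25.00, $12.50 */
    printf("total: %ld cents\n",
           sum_payment_cents(january, sizeof january / sizeof january[0]));
    return 0;
}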

What we really need is a little robot that crawls around inside the fridge looking for stuff that hasn't been touched for a while or is emitting decay byproducts.
--
http://mduffy.x10host.com/index.htm

