Whirlpool Fridge lock up during an apparent power glitch

The fact that desktop/laptop OS's are in decline suggests the market has moved past them in favor of other implementations -- that the market considers "more attractive".

Reply to
Don Y

Hey facts are facts. I've hired lots of coders. Many talk a very good game but the proof is always in the code, the specs and the documentation.

You're only on the hook because both Trader and I agree that what you've said about "doing it right" and code re-use isn't right. For Trader and me to agree is pretty damn rare. I can tell from his comments that he's actually been involved in projects where people have to consider resources and make design tradeoffs.

You've set yourself up as a programmer that doesn't make the mistakes other people do. You're a smart guy Don. You know that's an invitation to "put up or shut up."

I forgot to add one more thing about lone cowboy/sole practitioner programmers. They usually don't tolerate criticism well at all. (-:

Reply to
Robert Green

It's actually that way two stories up, so to speak, because almost all programming languages I'm familiar with come with code libraries of some sort. OS's like Windows have libraries as well -- DLL stands for Dynamic Link Library. So reusability is important at multiple levels. I believe the opposite of what I think Don is saying: it's libraries that introduce stability to code and provide a similar user experience. It drives me crazy that phone/tablet apps are all over the map. It's apparently no longer trendy to put a close dialog X in the corner, where everyone in the world might look first to close a window. Sigh.

It may not be for a sole practitioner like I assume Don to be. If each project is a one-off deal, why not code with home-grown everything? The catch is that it takes 10 times as long and not many people are willing to pay for that in either time or money.

When I read that, I know you've been involved in such a project. I have, too. It's very, very seductive to want to change to a new, more capable microprocessor, but in most instances I know of, if there was an upgrade it was to the same chip family so that old code could still run.

One thing that I think could trigger a "fresh start" migration would be to make such a fridge web-enabled where you had to switch to a controller that could interface with ethernet or WiFi. It's been my experience that the micro is spec'ed for the original task and adding WiFi support will overtax it. At least I can remember my microcoders complaining frequently about being out of interrupt lines, memory or time.

The fridge is one device that can make use of the net to report out of bounds conditions on the unit to someone who can come and investigate. There are already monitors to do that, so integrating them in the future is likely. Consumers like Art will discover they have saved a lot of money when the fridge texts their cellphone and says "I'm getting warm" and they can intervene in time.

Reply to
Robert Green

More like "If I can do what I do on a phone (browse the internet) why should I pay up for a PC?" Many people used to buy a PC solely to browse the internet and steal music. Now they can do that on a cheap phone - if their eyes are good enough. I built PC's for all my kids, but 4 of 7 now rely on mobile devices only. Desktops/laptops aren't going away. The market is simply adjusting. And Windows remains the king OS of desktops/laptops.

Reply to
Vic Smith

You named your kid after a Star Trek character?

Reply to
Mike Duffy

Or send email. Or, look at photos -- that they took on their CAMERA. Etc.

The phone offers the same sorts of capabilities without the bulk and lack of mobility that the desktop/laptop imposes.

And, you can make phone calls/SMS with your phone (a relatively rare activity for a laptop/desktop).

The office space is where desktops remain popular. If you look at usage patterns for work days vs. weekend, you can see this most prominently. On the weekend, folks aren't sitting down at their laptop to steal music, browse the internet, review their photos (taken on a different device), teleconference, etc.

You'll eventually see a "dock" for phones that lets you use the computational horsepower in the phone with the user interface of a desktop. This will offer a cheaper alternative to businesses TRAPPED into MS's perpetual hardware and software upgrade cycle. It will also allow employees to "take their desk with them" when they walk to another part of the building, GO HOME, etc.

"The Cloud" has just as much appeal to a mobile user as to a desktop user.

Windows has been in the phone market (and the PDA market before that). And, the phone market has opted to replace it -- with "something better". Despite the fact that Windows was there, first!

Reply to
Don Y

We do that - save the music stealing - all the time on our desktops at my house. My wife and MIL are on Skype teleconferencing with friends and relatives in Poland. But I spend quite a bit of time gaming too.

People are always upgrading their phones, more often than I upgrade a PC. PC upgrade cycles have lengthened quite a bit. I think I've been upgrading every 5 years or so, due to the technological demands of games.

Don't use any cloud storage.

I guess they miscalculated somehow.

Reply to
Vic Smith

We "visit" with people with whom we want to chat. Our long distance costs are in the single digits for any given year (e.g., a $20 phone card). SWMBO spends a few hours a year on the phone with her sister; usually just before or after one of the sister's trips ("China was great! I think I'll head back there, next year! But, I'm busy packing for Africa, next week...")

Most of the folks with whom I converse are local and/or I see them regularly. Clients are invariably out of town and contact is via email (phone calls take up too much time and usually result in either or both parties waiting for the other to think something through; email forces you to do your thinking up front!)

I use PC's to rip CD's and convert between file formats. And, to access and maintain my music archive (e.g., so I can copy selections onto particular PMP's). My music tastes haven't changed in decades so there is very little that is "available" that I don't already have (though I have many LPs that I still need to rip). But, I don't LISTEN to any of that music "on" a PC. Even when working *at* a PC, I'll defer to a nearby PMP, instead.

I see businesses replacing entire fleets of PC's every 3 years. I've known some to do so every 18 months! The local University regularly pushes reasonably new machines out as grants (typ a year or two) come to an end (the terms of the grant usually require any equipment purchased to be sold at auction -- you can't just "inherit" the equipment when the grant expires)

Friends and colleagues seem to similarly "upgrade" every few years (this is how I acquired most of my laptops).

I am slow to update simply because of the cost (in time) to do so (the effort required to install all of the applications, fetch new licenses, etc.) and the potential for losing capabilities that always ensues:

- lack of a particular driver support in a new OS (rendering the associated hardware prematurely obsolete)

- lack of support for a particular application

- lack of particular hardware capabilities in the new machine (lack of bus slots or particular types of bus slots)

- vendor not making new licenses available for a legacy application

I am contractually obligated to support each of my past clients/projects "indefinitely". As most folks aren't prudent enough to preserve their development environments, INTACT, moving forward (so they can revise an existing product "on a moment's notice"), it behooves me to make sure *I* can -- failing to do so leaves me reliant on *their* efforts. [If a casino operator claims "guests" (gamblers) have found an exploit that is allowing them to STEAL hundreds of dollars every MINUTE -- from EACH machine -- I need to be able to defend *my* implementation: "Here's a copy of the code *I* delivered. Show me how the exploit affects *it*? Ah, so the exploit is only applicable to the codebase that YOU modified! And, how is this MY problem? Why would *I* be a logical choice to troubleshoot a problem introduced by YOUR staff?" (No, I'm not interested in that job, thankyouverymuch -- it's neither interesting nor rewarding, for ME)]

And, as most applications (esp anything that runs under Windows!) are incapable of supporting "previous file formats", implementing some "little change" might prove to be a major ordeal! E.g., I was asked to change the type of battery used on an old design. This is a no-brainer job -- replace one battery holder with another. There's no real "engineering" involved. And, there are only two "wires" (contacts) that the PCB layout would have to accommodate.

Plus, any client would KNOW that sort of thing. Refusing to "lend a hand with this simple change" (while not a contractual obligation) starts to look "pissy": "You can't afford to give us an HOUR of your time? Even if we pay you for the *day*?"

Do you explain to the client that you'll have to resurrect some ancient application -- and the OS on which it ran! -- in order to make that change? So, that "hour" is really more like a *day*?

Do you claim that you don't even HAVE that capability available? And, have the client thinking that you're "irresponsible" -- despite the fact that THEY ALSO have lost that capability?

Better business practice is to HELP the client, earn additional good will, be seen as "resourceful" and "reliable". And, with that in mind, take it on myself to make these sorts of activities "less expensive" -- to me, personally.

[I'll still have to bear the cost of being diverted from my CURRENT project to address this "more pressing issue". A day is easy to swallow. SEVERAL days? Not so much so. Current clients might grumble -- to which you reply: "If YOU contact me a year hence with a similar request, wouldn't you want me to make time for your 'pressing need'?"]

I don't play games online or on any of my PC's. I'd rather turn on an (arcade) pinball machine or video game -- why set aside the space for them if you aren't going to PLAY them? :> (there's also a certain nostalgia involved -- an entirely different sort of user experience)

Nor do I -- except to occasionally post photos for others to access in "public" discussions so I don't have to email to multiple recipients; I tend to guard my email accounts pretty jealously -- discarding them frequently (save for my business accounts)

But, the cloud has appeal to businesses. I see many adopting it -- even those folks who are paranoid about the security/privacy issues involved. People (esp businesses) have a hard time saying NO to something that is (relatively) "free".

[In this case, "free" pertains to the capability to access that data from a multitude of locations, simultaneously. I.e., we are repeating the perpetual "big-centralized-computer-with-terminal-clients vs. distributed-smart-workstations Flip-Flop that the industry has experienced since its inception. Now, the "terminals" are PC's (and phones) while the "big centralized computer" is The Cloud.]

They "missed" the mobile OS market AND appear to have missed the mobile application market! Just like they missed the "advertising" aspect that the 'net presented.

Reply to
Don Y

I mostly agree. In the early days of the PC in the consumer market you typically had either no PC or one PC per home. As the market grew and costs came down, that shifted to many homes having multiple PCs. Smartphones and tablets have now changed that, with those replacing some of those multiple PCs. But I'd sure still want at least one real PC in my house. And Windows has about 10%+ market share in tablets, so there is some Windows existent in that market too. IDK what will ultimately become of Windows, but it's going to be here for a long time to come. And I also agree that for what it does, it works well for me and I have few complaints, nor do I see anything else that does as much, supports so many different hardware configurations, etc.

Reply to
trader_4

I think what you really missed was when Microsoft was in the cell phone business with Windows before the cell phone companies had cell phones out that supported data, email, etc under their own software. I missed it too, because it never happened.

Reply to
trader_4

Because even there, if the "new" product is similar to the existing one, which is often the case, it's easier, faster, cheaper to use the existing code. Again that assumes that it's not a complete hack job, that it's documented, etc. If the new project shares little in common with the existing, then you start from scratch. But those are pretty much the same rules for any project. For example, if you have a driver for a communications interface, be it USB, Ethernet, Wifi, etc, why re-write that?

That would depend on who's paying the bill and what they are paying for. More typical is for anyone who owns the rights to the software to reuse what they have a right to reuse and charge what the market will bear. In other words, if I already have software for a project where it will save 80% of the time of development, I would use that, then come up with a price for the whole thing, including the new work.

Absolutely. In many markets, the cost of one CPU vs a competing alternative could be huge, but they won't port the code over, because the cost of doing that, the time, the validation, etc would be staggering. The cost of the CPUs is almost irrelevant. Now, if it's some simple app, on some non-critical consumer $10 item, then the rules obviously can change. But even there, if it's say some child's toy, I would expect Mattel or whoever builds it for Mattel to re-use what they have, provided it's similar.

Exactly. There the chipset that supports the WiFi may have a CPU that could also do the other simple control stuff to run the fridge. That's justification for starting a new software design.

That's a good point. I was talking with a friend the other day about "The Internet of Things" and WTF that meant. I cited hooking up thermostats and alarms as examples. I was having trouble thinking of another example; I did think of fridges, actually, but dismissed them, not thinking of the out-of-temp alarm possibility. So, yes, fridges are an example. I'd pay some small premium for a fridge that had a wifi alarm feature.

Reply to
trader_4

A large part of my career was spent as what you call a "code cowboy": working completely by myself on small programs (a few days, a few weeks, occasionally a few months) which were ALWAYS specified as being one-shot deals. No need, then, to extensively test for inputs out of range of the given datasets, no need to document for easy re-use by myself, and no need to collaborate with others.

Most of the time though, a request would eventually be made to resurrect the code with modified requirements, or input data that had not been subject to the same levels of quality control as previous runs, etc.

The only thing more vexing than code that is difficult to follow is when you can't even complain to the author about how shitty it is.

So I got into the habit of using descriptive variable names, adding comments, documenting formats, and most importantly, little messages to my future self about why a seemingly simpler method cannot be used.
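As a sketch of that habit (the function, the data-file quirk, and the field format here are all invented purely for illustration), such a note-to-self might look like:

    #include <stdlib.h>
    #include <string.h>

    /* Parse one numeric field from an input record.
     *
     * NOTE TO FUTURE SELF: don't "simplify" this to a bare strtod() call.
     * Some of the older data files use ',' as the decimal separator;
     * strtod() stops at the comma and silently returns just the integer
     * part, and the damage only shows up pages later in the summary stats. */
    static double parse_field(char *field)
    {
        char *comma = strchr(field, ',');

        if (comma != NULL)
            *comma = '.';              /* normalize "3,14" -> "3.14" */

        return strtod(field, NULL);
    }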

What we really need is a little robot that crawls around inside the fridge looking for stuff that hasn't been touched for a while or is emitting decay byproducts.

Reply to
Mike Duffy

I hang out in an Excel forum and often write macros for other people. I use descriptive variables, add comments, etc. for 2 reasons:

1 - To help the requester understand what the code is doing. (Many of them want to learn how to write macros, so the comments help "teach" them.)
2 - Months, and even years later, I often get 2nd requests (either from the original requester or from someone who found the code in the archives) asking for a modification to the code.

If I didn't include the comments when I first wrote it, I'd often have no clue what the original purpose was or why I wrote it that way. Heck, I don't even recognize some of the code that I use at work everyday. :-) Some of my macros are 5 - 10 years old. Luckily they still work for their original purpose and rarely need updating.

The worst part of the "forum job" comes from those who ask for a macro to do a specific task and then keep adding on requirements. I fulfill their original requirements and they come back with "Hey, thanks! That does exactly what I wanted. Now can you make it do 'this' also?"

This often happens more than once which means I sometimes need to start from scratch to avoid just bolting on a bunch of haphazard instructions and ending up with bloated, inefficient code.

When it gets out of hand, I will politely ask the requester what they think would happen if they had signed a contract for a "product" based on a set of requirements and then kept changing the requirements every time the product was delivered.

Of course, that assumes that I get requirements that can actually be used. Sometimes it's not much more than "I need macro to create an invoice from my data sheet. Can someone please post some code?" Uh...no.

Reply to
DerbyDad03

That probably has nothing to do with software superiority or attractiveness and everything to do with a laptop not being able to fit in your shirt pocket or purse.

Reply to
Robert Green

It's too massive an effort. Google Chromebooks and Android phones skim the email/browser end of the market but CAD/CAM, DTP and lots of other tasks will still be done on PC's.

MS didn't foresee the growth of the mobile end of the market and was too wedded to the Windows interoperability paradigm to come out with a clean, innovative OS for phones. Ironically, they did come out with a tablet version of Windows very early on, when tablets weren't quite portable enough to penetrate the market.

Apple's got it easy compared to Windows. iOS only has to support Apple's carefully controlled hardware. Windows has to support hundreds of machines and configurations. Android has it even easier, having only to support phones and tablets in limited ways. AFAIK, you still can't even hook up a scanner to a Chromebook so they are not really Windows replacements, just cream-skimmers. (-:

Reply to
Robert Green

That approach doesn't work when your product processes large amounts of CASH; tells a patient they have/don't have a particular disease process; controls a mechanism moving at a high rate of speed (or large masses); produces a product that people INGEST; pilots a vessel; is relied upon to produce tens of thousands of dollars of saleable (vs SCRAPPED!) product each hour; etc. Folks tend to want designs that can be PROVEN to work. And/or, go through formal (legally required) certification/validation processes.

It also falls down when "your part" has to interface seamlessly with a part that another developer in another part of the world is producing in parallel with your activities.

Or, when the prototype hardware costs $1M and you can, at best, *borrow* a few hours per month on which to "test" your designs.

Of course, you also have scheduling and cost factors to consider. So, you REALLY want to be able to pull something "proven" off a shelf and use it as is -- without RE-writing it or RE-bugging it. "How do I know this works? *HOW* does it work? etc."

Or, you will forget that there is some inherent limitation (or assumption) embodied in the implementation (hardware or software) when you opt to modify/repurpose it for some later use.

I have a distinctive "style" to the way I write code, design hardware and design "systems". But, have been frequently complimented: "Once I realized how you do things, everything was OBVIOUS!" There is value to consistency and thoroughness.

(Do you write "++x;" or "x++;"? Are you consistent in your choice?)
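For anyone following along at home, a small illustrative fragment (not from the original post) of where that choice is pure style and where it changes meaning:

    #include <stdio.h>

    int main(void)
    {
        int x = 5, a, b;

        a = x++;   /* post-increment: a gets the OLD value (5), then x becomes 6 */
        b = ++x;   /* pre-increment:  x becomes 7 first, then b gets 7           */
        printf("a=%d b=%d x=%d\n", a, b, x);       /* prints a=5 b=7 x=7         */

        /* As a standalone statement the two forms are interchangeable;
         * there the choice is purely a matter of consistency.                   */
        for (int i = 0; i < 3; ++i)
            printf("%d\n", i);

        return 0;
    }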

And, chances are, there are holes large enough to drive a truck through in the thoroughness of your documentation, test suite, etc. Because you weren't PLANNING on reusing it. You succumb to the "just throw something together" thinking.

This is even worse with PHB's -- most of whom have never written anything of even 10KLoC complexity (yet, somehow, feel they have The Inside Track on how to manage 1 MLoC projects -- cuz they read about some "technique du jour"). They bow to marketing, manufacturing, stockholder, etc. pressures to "get something out the door". Then they are perpetually playing "catch-up" and "whack-a-mole" with bugs that COULD have been caught in the specification stage.

But, they thought spending the time to write formal specifications would let too much calendar time slip away in which they "aren't being productive".

[No, a marketing document is not a specification; it's a mealy-mouthed wish list that says absolutely nothing about what the product will do and how it will be EXPECTED to perform]

There's this attitude that "We're not sure how well this product will be received; so, we aren't keen on making a big commitment to it. If the market likes it, we'll go back and fix all these things -- or design Model 2". Of course, if they cut too many corners, the product is crap and no one *wants* it! OTOH, if the market sees a value and embraces it, they find themselves too busy trying to shoehorn in the upgrades (because now competitors see an opportunity!) and the product quality eventually suffers -- and no one wants it.

For a fun exercise, try to track down any formal documentation for ANY product (software and/or hardware) that you've been involved with. Big difference when you have a financial/contractual arrangement with someone (client/provider) and that document DEFINES the work to be done!

[Employees can rarely DEMAND such a document; and employers rarely have to provide it -- they can just change their mind "on a whim" -- knowing the employees will somehow have to make it work (and, of course, all of the work they've done up to that point should be magically usable in the new project definition! :> ]

What happens if the pointer I pass to free(2c) is NOT a pointer previously obtained from malloc(2c)? [I'll wait here while you see if you can find a description of the EXPECTED outcome -- for that library that you PURCHASED :> ] Does your test suite check for this condition? Does your code/development environment ENSURE you CAN'T encounter that situation? Or, if it can't prevent it, how does your code/system respond WHEN that happens? Or, is this just the same old bug that will rear its ugly head 2, 6 or 10 months down the road?

[For fun, try passing a pointer to nonexistent memory and see how long it takes, AFTER the call, for your code to crash. Or, pass a "legitimate" pointer that you'd already previously free(2c)-ed!]
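A minimal sketch of the hazards being described, plus one cheap defensive habit (none of this code is from the original post):

    #include <stdlib.h>

    int main(void)
    {
        char on_stack[16];
        char *p = malloc(16);

        if (p == NULL)
            return 1;

        /* Each of the following is undefined behavior -- the library spec
         * simply doesn't say what happens, and the damage often surfaces
         * long after the offending call:
         *     free(on_stack);    pointer not obtained from malloc()
         *     free(p + 1);       not the exact pointer malloc() returned
         *     free(p); free(p);  double free                              */

        free(p);
        p = NULL;     /* free(NULL) is a defined no-op, so a stray second   */
        free(p);      /* free() becomes harmless instead of heap corruption */

        (void)on_stack;
        return 0;
    }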

I'm always amused at how many misconceptions folks have regarding floating point data types; ignorance of overflow/underflow, cancellation, etc. As if "simply" using floats absolves them from understanding how their equations (and data) perform. "Where did this outrageous number come from? Clearly that's not the result of the equation that I wrote!" (really? did you think the machine just fabricated random numbers of its own free will??)
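Two of the classic surprises, as a short hedged example (values chosen only to make the rounding visible):

    #include <stdio.h>

    int main(void)
    {
        /* 1.000001 is not exactly representable in binary; subtracting the
         * nearly equal 1.0 promotes that tiny representation error into a
         * roughly 5% relative error in the result.                         */
        float a = 1.000001f, b = 1.000000f;
        printf("a - b = %.9g (expected 1e-06)\n", a - b);

        /* Floating point addition is not associative, so "the same
         * equation" can give different answers depending on ordering.      */
        double big = 1e17, small = 1.0;
        printf("(big + small) - big = %g\n", (big + small) - big);  /* 0.0 */
        printf("(big - big) + small = %g\n", (big - big) + small);  /* 1.0 */

        return 0;
    }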

And, of course, folks who are ignorant of the underlying iron won't see any difference in these two code fragments:

[two nested-loop code fragments, truncated in the archive; the first begins "for (row=0; row ..."]
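What the two fragments presumably contrasted -- a reconstruction under that assumption, with the array name and dimensions invented -- is the same summation with the loop nesting swapped; the difference is invisible at the source level but very visible to the cache:

    #include <stdio.h>

    #define ROWS 2048
    #define COLS 2048

    static double a[ROWS][COLS];

    int main(void)
    {
        int row, col;
        double sum = 0.0;

        /* Fragment 1: the inner loop walks memory sequentially (C arrays
         * are row-major), so every byte of each fetched cache line is used. */
        for (row = 0; row < ROWS; row++)
            for (col = 0; col < COLS; col++)
                sum += a[row][col];

        /* Fragment 2: identical arithmetic, loops swapped; consecutive
         * accesses now stride COLS * sizeof(double) bytes apart, so on a
         * large array nearly every access misses the cache and the loop
         * can run several times slower.                                    */
        for (col = 0; col < COLS; col++)
            for (row = 0; row < ROWS; row++)
                sum += a[row][col];

        printf("%f\n", sum);
        return 0;
    }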

(some) New refrigerators do inventory control. It's not hard to go from that to an expert system that knows how long particular products (or classes of products) can be kept in particular storage conditions. [This would be actually relatively simple to do with a small set of productions] It would also be easy to know the EXACT conditions incurred by the items in the 'frig (YOUR particular temperature setting, humidification in the crisper drawer, etc.). No more "does this smell spoiled, to you?".

And, from that, to advertisers pitching coupons to you based on their observations of your usage ("He's almost out of milk. Let's arrange for a SERIALIZED coupon for soy milk to appear in the next 'mailing' and we can later see if he actually uses it -- by tracking the serial number on the coupon")

How much is that worth? Or, is it a nuisance? If I could check the contents of my refrigerator while I *happen* to be at the store, can I save myself an extra trip back there, later, to pick up something that I didn't yet know I needed? Can the refrigerator tell me about the latest "recalls" (Trader Joes seems to have a new one each week!) and whether it applies to any items that I currently have in the refrigerator/cupboard? If it keeps me from spending a weekend praying to the porcelain god, how much is that worth to me?

Reply to
Don Y

True, true, true, etc. My app was analysing data to produce summary results & graphics for publication in scientific journals. A fault in my programming would not cost or hurt anything but the pride of those who trusted me.

True again. In my case though, a publishing deadline was usually coming up, and the scientist(s) realized that simple equations can be complicated to program, and Excel graphics can be a little short of publication quality without putting a lot of time into learning how to program using VBS.

Very true, especially with language and time zone barriers. I was lucky to be the 'lone wolf' on all such projects.

Money talks. In my case, there was never any hard 'deliverable' that used any of my code. Otherwise there WOULD be legal requirements for reliability and all those things you mentioned previously.

Like I said, deadlines were approaching and there was no money nor time to let a contract specifying everything in detail. But there was a guy on staff whose job description actually included a section about programming to aid scientific research, and that guy (me) actually had a degree majoring in physics, unlike all of the other computer people who had IT degrees.

This depends a LOT on the semantic context of the surrounding code.

Guilty. But then again, the main onus was to get the job done ASAP.

I found that I could save a LOT of time just by doing bulk QC on the inputs before looking at the real task at hand.

Like I said earlier, I started to assume that the code would be re-visited despite the assurances that it was a quick & dirty request. I did this just to make my own future more pleasant.

The stories I could tell would probably not surprise you. In a previous job, I was involved with contract monitoring for major (>$M) government projects. Also, I was on the team that developed operational code for all the 24hr Canadian Government GOES ground stations for meteorological satellite images. As far as I know, the software ran okay for decades with no problems, even transiting the dreaded Y2K apocalypse.

I did take a "Numerical Approximation" course at college. (I have a CS minor as well.) True, what you say. Some people have no idea. Scientists seem to have a blind spot regarding their own misunderstanding of things they have never studied in detail. In general, this is because they are actually more intelligent than others. But intelligence combined with ignorance is a pretty good match for stupidity.

If you are at all interested in the sort of code I'm happy to put my name on (i.e. not the crappy 'throwaway' code I described earlier), check out the sources available in the 'Programs' section of my personal not-for-profit website referenced in my signature. The Javascript for the gauge demo is in 'wlib.js', the other sources are downloadable.

Reply to
Mike Duffy

You (or your "user") also gets a chance to check your results: "Gee, that graph doesn't seem to correlate with my IMPRESSION of the raw data. Let me spot check a few values..."

If my device is monitoring the production of a pharmaceutical product (e.g., tablets) -- at rates approaching 1,000,000/hour -- you can't even BEGIN to think you can sort through even a minute's worth of production (at 200/second) to verify the device hasn't screwed up somewhere in that batch.

If I'm accepting dollar bills from a user (gambler) and screw up and pay out thousands of dollars, erroneously, do you really think the user is going to "complain to management"? :> Wanna *bet* "Management" would complain to *me*! (once they examine the logs and notice the "rate of return" for certain machines is "way down"!)

Yes. We had to go through that exercise with some health data just last night. "I'm SURE there is a way to get the program to produce the graphs we want; why don't they look right?"

As I'm writing this, I'm discussing with a colleague in Germany the draft of a specification that I wrote. Not only do I have to ensure there are no errors in the design of the specification, but I also have to ensure it is clear and unambiguous. *And*, as folks in many nations will be using it as "Gospel", I have to hope I haven't relied on some particular language idioms that don't make sense in some other culture. Or, that the design itself won't be "impractical"/unacceptable in other parts of the world.

I designed a system with a partner firm abroad. Made perfect sense in technological terms and economic terms -- *here*. But, pricing in the US is considerably different for bits of tech than it is in other places! Things that were cheap/affordable here were essentially Unobtanium, there. And, in still other markets, actually subject to export restrictions (how many folks have sat down with their corporate lawyer to determine what aspects of their products might not be legally exportable?)

Some years ago, a friend designed a video game with graphics/characters suitable for the US market. Among his implementation choices was the use of skull & crossbones to signify "player death". (Hey, it's not a REAL death! Chill out!) The game had to be redesigned with different graphics as some markets found the symbol offensive.

You don't learn these things as a lone wolf. Or, in MOST "teams", regardless of size.

I "build THINGS". My code only runs "in a workstation" for test purposes (or under a simulator). It ultimately runs *in* a piece of equipment. There's usually no keyboard or (general purpose) display. No way for me to tell a user that the software "divided by zero" -- or tried to free() unallocated memory, etc.

A software "bug" is interpreted as "IT is broken" -- not "there's a bug in the software that makes it what it is"

I was part of a ~$5M project for a software-driven device... that hadn't considered the need for a "programmer"! They eventually assigned the task to a technician -- because he tinkered with software, at home!

[Typical PHB scenario -- clueless as to what the real issues in a particular project were!]

And that's what happens in most organizations. "We don't have time to do it right -- but, we'll have time to do it OVER!"

I spend about 40% of my "effort" writing specifications -- BEFORE I write a line of production code! I may hack together some code fragments to test the effectiveness and viability of particular ideas before committing them to the specification; but, that code is all disposable, back-of-the-napkin stuff. In most shops, that code finds its way into the product -- because they consider discarding it to be a "waste" of the time that it took to hack together!

(would you draw a sketch of your "dream house" -- and expect to find the final blueprints drawn OVER that sketch??)

Every function/module that I build begins with a stanza of invariants. These formally (re)declare the interface requirements (for example, "denominator != 0") AND ENFORCE THEM! So, if some other piece of code violates that contractual interface, the code "tells me". EVERY TIME the function is invoked!
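As a hedged sketch of what such a stanza might look like in C (the post doesn't show its actual form; assert() is just one common way to enforce the contract, and the function and checks here are invented):

    #include <assert.h>

    /* average(): divide an accumulated total by a sample count. */
    static double average(double total, int count)
    {
        /* Invariant stanza: restate -- and enforce -- the interface
         * requirements, so a caller that violates the contract is
         * reported immediately, every time the function is invoked.  */
        assert(count != 0);          /* "denominator != 0"            */
        assert(total >= 0.0);        /* example domain restriction    */

        return total / (double)count;
    }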


Exactly. Employers don't care about how comfortable you are in your job. Nor how effective you are (as a consequence of THEIR policies). You work for them. You put up with your work conditions in exchange for a paycheck. If you ever find the balance unfavorable, you leave!

[This is what drove me to work on my own; the apparent willingness of employers to waste "my life" (i.e., my time) doing and redoing things that they COULD HAVE done right in the first place. But were afraid to do so or unable to recognize. With clients, each job is an agreed upon mutual arrangement. If I don't like what they are asking me to do -- or if they are unhappy with the way I do it -- then we simply choose not to enter into an Agreement. No hard feelings.]

It has been fun to "rib" old employers and clients regarding the bad decisions they made. You don't have to explicitly SAY "I told you so" for them to understand the intent! :>

OTOH, it's disheartening to see them repeat the same mistakes despite acknowledging the truth of that reality! :<

Software (esp prepackaged libraries) is sold like automobiles; all of the "detail" is missing. You're just supposed to think of it as a magical turn-key product. Despite the fact that it all has definite usage constraints. Those constraints are never spelled out (at least you might be able to find them in a vehicle's workshop manual!)

Notice how many "undefined behaviors" are listed in interface specs. And, how many of the nitty-gritty details are just "omitted"!

If I provide a clean heap and perform a series of allocations and releases, can you tell me what state the heap *will* be in? (it's a deterministic machine; you SHOULD be able to predict this!) Ah, "little" details missing from the description of that subsystem, eh? How can you know, at design time, that it WILL meet your needs AT RUNTIME? If you can't predict how it operates (because you don't have the sources; or, if you have the sources but not the test suite; or, the test suite but no temporal performance data; or... :> )

By claiming something is "undefined", the spec writer has made his (and the implementor's) job easier -- but YOURS, harder!

And, the subsystem covered by the specification is "designed and implemented ONCE; reused (in theory) endlessly!" Why make a one-time job easier at the expense of a recurring activity??

"People don't know what they DON'T know". The few managers/clients that I've respected were aware of their limitations. Most are too insecure to admit same.

[Goldberg (I forget his first name) has an excellent treatment of the sorts of issues that folks writing floating point code should be aware of. I can probably find the exact title...]

I have to provide a "numerical capability" to users in my home automation system that doesn't require them to understand the limitations of the implementation. I.e., N * 1/N should be 1 for all N != 0 (among other guarantees). It would be arrogant to expect homeowners (business owners) to have to understand that order of operations bears on the resulting operation of the calculation, etc.
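A quick demonstration of why that guarantee takes actual work -- plain double arithmetic does not provide it. (This scan is only illustrative; it is not the implementation the post describes.)

    #include <stdio.h>

    int main(void)
    {
        int n, failures = 0;

        /* With naive doubles, 1.0/n is rounded once and the multiply is
         * rounded again, so for some n the product n * (1/n) lands one
         * ulp away from 1.0 rather than exactly on it.                   */
        for (n = 1; n <= 1000; n++)
            if ((double)n * (1.0 / n) != 1.0)
                failures++;

        printf("%d of the first 1000 values of n violate n * (1/n) == 1\n",
               failures);
        return 0;
    }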

So, my implementation (written ONCE) takes care of that so the users' applications (written MANY times) don't have to!

Thanks, I'll have a peek (probably later tonight as I'm running out of "workday hours" with my German colleague).

[To rephrase my comments in light of your last comment, here, I have to be "happy to put my name on" EVERY line of code that I write. Others will judge the quality of my work from it -- after all, *it* is the PRODUCT that I ultimately "deliver"!]

Reply to
Don Y

*EXCELLENT* resource! (It's easier to understand if you've actually written a floating point package)

The comments on the quadratic formula should have damn near everyone rethinking how they've "solved that" in the past! :>
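For anyone curious, a sketch of the issue: the textbook form vs. the usual numerically stable rearrangement, with coefficients chosen here only to make the cancellation obvious.

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* a*x^2 + b*x + c = 0 with b*b >> 4*a*c; roots near -1e8 and -1e-8. */
        double a = 1.0, b = 1.0e8, c = 1.0;
        double d = sqrt(b * b - 4.0 * a * c);

        /* Textbook form: (-b + d) subtracts two nearly equal numbers and
         * wipes out most of the small root's significant digits.             */
        double naive_small = (-b + d) / (2.0 * a);

        /* Stable form: compute the large-magnitude root without cancellation,
         * then recover the other root from the product of the roots (c/a).   */
        double q  = -0.5 * (b + copysign(d, b));
        double x1 = q / a;      /* large-magnitude root                        */
        double x2 = c / q;      /* small-magnitude root, no cancellation       */

        printf("naive  small root: %.17g\n", naive_small);
        printf("stable small root: %.17g\n", x2);
        printf("stable large root: %.17g\n", x1);
        return 0;
    }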

Reply to
Don Y

And it's your experience and your opinion that companies in those product areas don't value and re-use their existing code base? That with each new product or each new version of a product, they don't typically start with the existing code, and use much of it for the new product? That instead, they pretty much write all new code, each time? That's my take on what you're arguing here.

That's not my experience. Is that how they build products in all those product areas you listed, copied below, from your experience?

"product processes large amounts of

Show us one example. How exactly does this miracle inventory system work? How does it know I just put a lettuce in the fridge, or a carton of eggs, or how many eggs are left, etc?

It's not hard to go from that ...

Reply to
trader_4
