What use is WiFi on a Costco Vizio TV?

What use is WiFi on a TV screen?

A relative of mine called. Costco had told her "something" about their Vizio TVs having WiFi, and that she therefore wouldn't need the "box", whatever that is.

I don't have cable, nor even a TV, but I suspect that "box" is something that was added when they switched from analog to digital (or maybe it's a descrambler).

They said they have to pay the cable company for a second box (the first one is free), so it's not a modem (because you'd only need one modem).

Anyway, my basic question, for you, is "what use is WiFi in a TV"?

Note that I can easily see that bluetooth is useful, since you can then use that TV with a keyboard; but what good is WiFi in a TV screen at home?

Reply to
Ewald Böhm

so you can connect to the internet and watch Youtube, netflix, etc.

Reply to
taxed and spent

Maybe I don't understand. Actually, I don't understand.

To watch YouTube, you need a browser, which is usually a program compiled for a certain computer, one that runs a certain operating system and has a certain byte order, memory layout, and a whole bunch of other things associated with a "computer".

Is the TV acting as a "computer"? If so, what operating system is the TV?

What browser does it use? What architecture is that TV browser compiled for?

Reply to
Ewald Böhm

Yes, the "Smart" TV is a computer... Samsung uses Tizen OS (not sure about Vizio).

Reply to
bob_villa

The newer smart TVs have their own built-in interface. Maybe you have heard of devices like Roku or the one from Amazon. Anyway, it lets the TV connect to the internet, so if you have, say, DirecTV, you can get movies and other shows on demand by streaming off the internet. I don't know what system they use, but my TV lets me surf the web. It is awfully slow to do with the remote, but I think I could hook up a mouse and keyboard to it if I wanted to.

Reply to
Ralph Mowery

My Samsung will take a USB mouse/keyboard, but it is pretty clunky for searching the web. They already have most, if not all, of the streaming-service interfaces built into the software. Some still require you to go online with your PC to get the authorization code (HBO for sure). There may be another way to get it, but it is easy on a PC.

The local FIOS (CenturyLink) TV offering also has a WiFi interface to the TV box, but I am not sure a smart TV can access it.

Reply to
gfretwell

And a TV set has become a computer. They needed a CPU to handle the data conversion, so they might as well allow it to be used as a more general-purpose computer.

Both my DTV sets run Linux. A subset, but it's there.

My Blu-ray player runs Linux too, as does my TomTom One GPS. It's free, and yet provides a full OS for building on top of.

Michael

Reply to
Michael Black

It's not clear to me that you need a CPU to handle the conversion of the digital bitstream to analog. It would seem that a dedicated chip or chipset would be far more suited to the application.

You do need a CPU to handle the human interface and supervise the other chips. I'd think that's the CPU that's running the web/WiFi interface.

Reply to
trader_4

Everything is a CPU these days. It is cheaper to write software and use an off-the-shelf CPU chip than to design a purpose-built chip. Fixing mistakes is a lot easier too. That is why even things as mundane as a washing machine or a microwave timer run on a CPU. There is a processor in my "dumb" Samsung.

Reply to
gfretwell

Correct. It's actually more efficient to use dedicated hardware for that function. But, ...

Exactly. And, more to the point, you need a computer to decide what to *report* back to the provider!

TV's (and thermostats, soon refrigerators, etc.) have now become "spies" for their makers (or, whomever their makers want to sell that information). What show is he/she watching? How often (and *when*, exactly) do they turn AWAY from the broadcast? Did they watch that commercial? How many people are in the room? *Which* people? From this sort of information, with the help of Big Data, they can also make educated guesses about your voting habits, medical conditions, income/education level, etc.

And, the *computer* can now extract specific commercials from the data stream (it's no longer a single "broadcast stream!" like in days of old), buffer that and present it to you *when* it thinks it appropriate. You can even be watching a commercial while your neighbor is still watching the actual *program* (buffer the program while PLAYING the commercial).
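The buffer-the-program-while-playing-the-commercial trick described above is essentially a delay buffer. A toy Python sketch, purely illustrative (the function, frame strings, and ad names are all made up, not any real set-top implementation; it assumes the program is at least as long as the inserted ad):

```python
from collections import deque

def play_with_ad_insertion(program_frames, ad_frames):
    """Toy model of per-household ad insertion.

    While the inserted ad plays, incoming live program frames pile up
    in a buffer; afterwards this viewer sees the program delayed by the
    ad's length, while a neighbor without the insertion stays live.
    """
    buffer = deque()
    output = []
    frames = iter(program_frames)
    for ad in ad_frames:             # show the inserted ad first...
        output.append(ad)
        buffer.append(next(frames))  # ...while live frames accumulate
    buffer.extend(frames)            # rest of the broadcast arrives
    output.extend(buffer)            # play back the delayed program
    return output

print(play_with_ad_insertion(["p1", "p2", "p3", "p4"], ["ad1", "ad2"]))
# → ['ad1', 'ad2', 'p1', 'p2', 'p3', 'p4']
```

The neighbor watching the unmodified stream would simply see `p1, p2, p3, p4` in real time; this household sees the same frames shifted later by exactly the ad's duration.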

[For more than 30 years, the technology has existed and BEEN IN USE to selectively *replace* commercials in certain broadcast areas. Of course, this is a really coarse instrument -- EVERYONE sees the replacement. Imagine if you can target individual households -- or individual *viewers* -- with specific messages.... then *measure* how effectively the message was received! I.e., did Joe User actually *buy* the product that you pitched to him last night??]

Etc.

These sorts of things are hard to do with dedicated bits of hardware (unless you make that hardware "programmable" -- hey, like a computer!)

Reply to
Don Y

That depends on the functionality you intend in the product. Note that CPUs go out of production just like "dedicated chips"...

When was the last time you GOT an update to your microwave oven software? The GPS *software* (not MAPS) in your car? Any of the dozens of ECUs in your vehicle? The controller in your furnace? Your washing machine/dryer/dishwasher?

Fixing is a misnomer. *Changing* is a better description. Manufacturers make *changes* (going forward) which may (or may not) "fix" problems. But, folks *with* those problems end up living with them. I.e., the CPU doesn't buy the consumer anything!

There's a processor in your mouse. Another in your keyboard. Another in your CD/DVD drive. Another in your network interface. Etc.

(Welcome to *my* world! :> )

Reply to
Don Y

Not sure if the story is true, but a few years ago a department store sent a girl some baby information and wanted her to set up a baby wish list at the store. Her father got wind of this, called the store, and told them he did not like them sending out all that, as his daughter was not pregnant. The store said they were sorry, but based on the medication and vitamins she was buying, she was.

A few weeks later the man called up and said he was sorry for chewing them out, as his daughter was pregnant.

Now looking at items on certain web sites will bring up ads for that item on other web pages.

I understand Windows 10 is or can be set up to send back a lot of information like that.

Reply to
Ralph Mowery

Yes. And people don't understand that there is virtually nothing you can do to remain anonymous in your web searching. Disable cookies? Nope. Super-cookies? Nope. Beacons? Nope.

Your actual *browser* can be "fingerprinted" to (practically) uniquely identify you (it). What browser? What OS? What "options"? Is Java enabled? JScript? What range of IP addresses? What toolbar is installed? Etc.
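To see why that combination is identifying: each answer alone is common, but together they narrow you down fast. A minimal Python sketch of the idea -- not any real tracker's code, and every attribute name and value here is invented for illustration:

```python
import hashlib

def browser_fingerprint(attributes):
    """Combine browser-reported attributes into one stable hash.

    Sorting the keys makes the result deterministic: the same
    browser configuration always yields the same fingerprint.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical attributes a tracking script might read:
visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "screen": "1920x1080x24",
    "timezone": "UTC-5",
    "fonts": "Arial,Calibri,Comic Sans MS",
    "plugins": "PDF Viewer,Widevine",
    "java_enabled": False,
}
print(browser_fingerprint(visitor))
```

Change any one attribute (install a font, resize nothing else) and the hash changes -- but as long as your configuration stays put, every site running the same script sees the same 16-character tag, with no cookie involved.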

And, that assumes the browser isn't *deliberately* "tattling" on you!

W10 actually leaks a *lot* of information. MS has BELATEDLY realized that selling software is not where the *money* is! Selling *ads* is the cash cow! Knowing what people want, how they behave, what they search for, etc. Sell that information to others. I.e., your *users* are the "commodity" that you are "peddling". Implicit in all this is the Internet connection; if your machine isn't connected to the outside world, it can't tattle on how you are using it!

(You can't make a firewall smart enough to block this sort of information from flowing through clandestine tunnels, etc. There's no way a machine can know what's "bona fide" traffic and what is *undesirable* traffic.)

Think about that when you want Philips to control your LED lights; Google to control your thermostat; etc.

Reply to
Don Y

You can still buy 8080 chips but these people are using a standard PIC of some sort.

It is still easier to fix the ones on the line. There are flash updates for cars, and you certainly see a lot of firmware upgrades on a smart TV.

It makes the product cheaper to make. How much of that filters down to the customer is debatable.

My original point, thanks

Reply to
gfretwell

How about SC/MP's? 2650's? 8x300's? A bazillion *specific* 8051 derivatives? etc.

In the 70's and 80's, the MPU market was all about "second sources"... you wanted to have (at least one) backup vendors for the parts that you'd design into a product. Nowadays, I don't think there are any parts that are made by two different vendors that are "pin compatible".

Yes. But not much help for the folks who already purchased the "previous bug-set".

Only *smart* (network-connected) TVs. And only if you have it connected to the 'net. At the same time, you are at the mercy of the manufacturer to *not* leave you with a LESS DESIRABLE product than the one you purchased. Or to *not* install additional spyware, etc.

Imagine coming to work and finding your computer has been "upgraded" overnight, without forewarning. What does that do to productivity?

It's one thing if the upgrades fix broken behaviors. But, more often than not, they *change* behaviors -- often in BIG ways!

If you factor in the cost of added (user) complexity and dubious functionality, I wonder if there *is* a net improvement!

In the 70's, we embraced MPUs as a means of replacing dedicated logic to achieve comparable/improved performance at reduced cost. But this quickly got out of hand. "Feeping Creaturism" took over and folks started cramming *too* much functionality into things that weren't intended to have that level of complexity. E.g., our microwave has buttons that we've NEVER PRESSED! In 15+ years!! Likewise, the "probe" that allows the oven to monitor the interior temperature of its cooking... never been used; I doubt I could even tell you where it's *stored*! But the probe, the connector, the electronics, and the software were all added to the cost of the microwave.

New cars have support for XM built in. What if I never want an XM subscription? How do I get "credit" for the extra, unused, potential for bugs/failure/complexity increases that the "feature" has cost me?

What is the cost of providing those buttons (tangible hardware cost) and the software behind them? I.e., we've bought features that we'll never use -- and didn't really have a choice in the matter!

Reply to
Don Y

Who are "these people," and who says what they are using? Here is a link to several HDMI chips that do the HDMI-to-analog display function, from just one chip manufacturer. Google and you'll find plenty more. Also, not sure what a "PIC" is. If you used a general-purpose CPU, it's only part of the solution. You'd still need a separate D/A, for example. And trying to have one CPU do many things instead of a dedicated chip brings its own problems; how many times have you had video freeze on a PC, for example?

Reply to
trader_4

You are preaching to the choir here. I like hard-wired circuits vs. processors, but nobody listens to me. My spa controller is 4xxx CMOS, and my pool/solar controller is very old-school, with a 24-hour timer motor, 3 cams with microswitches, and very simple switch-and-relay logic to control 5 valve servos.

Reply to
gfretwell

I am not using FIOS, I still have POTS and Dish.

Reply to
gfretwell

From what I understand, windows 10 is going to do that.

Reply to
Ralph Mowery
