Avoid Monster Cable. Get Blue Jeans Cable instead. Here's why:
" I say this because my observation has been that Monster Cable typically operates in a hit-and-run fashion. Your client threatens litigation, expecting the victim to panic and plead for mercy; and what follows is a quickie negotiation session that ends with payment and a licensing agreement. Your client then uses this collection of licensing agreements to convince others under similar threat to accede to its demands. Let me be clear about this: there are only two ways for you to get anything out of me. You will either need to (1) convince me that I have infringed, or (2) obtain a final judgment to that effect from a court of competent jurisdiction. "
A great letter. I've saved a copy for "boiler plating" in the future. I really, really would like to see the outrageous and nonsensical claims Monster has been making for years dealt with in court, with expert witnesses who would tear them a new output port. Sadly, I have friends who believe that paying 10 times what a cable is worth somehow makes it ten times better. )-:
I'm guessing if Monster is stupid enough to proceed, they might very well end up having to admit, in court, that they are 98% hype and nothing more.
Wouldn't your electrons rather travel in luxury? Contented audio signals produce superior sound when they don't have to fight their way down an impure copper pathway and traverse anything but the most luxurious gold plated connections.
My brother Uncle Monster has had the name for about 25 years. It was given to him by a 4-year-old, and I help add to his collection of Uncle Monster's observations of the world, known as "Monsterisms". I wonder if Monster Cable could sue us? An example:
"Human females are genetically Machiavellian, they need little or no training" Monsterism by Uncle Monster.
Thanks to all for helpful comments. I'm leaning toward the Panasonic model suggested in this post from "Bob Vila". I don't want anything bigger than 32" for the bedroom. I've bought very few TVs over the years, and they have all been SONYs, but...all things must end...
In contrast with another comment on this thread about 720, Michael, the seemingly knowledgeable TV guy at Costco, said that 720 is just coming in on many channels (other than HBO & that ilk, which I don't get). He said 1080 as a universal is still a few years away.
Per yet another comment on this thread, Michael opined that an average viewer (I guess that's me!) for non-sports events wouldn't be able to tell the diff. between 720 and 1080 at the 32" size. I asked about 1080p and 1080i. He said that 1080i is basically 720; that the "i" means interlinear; that it doesn't refresh as fast as "real" 1080.
Costco's price, w/instant rebate, is $349 until Dec. 2. Maybe I could get it a few bux cheaper elsewhere, but Costco is good to deal with on many counts, including returns.
I like this line: " If you sue me, the case will go to judgment, and I will hold the court's attention upon the merits of your claims--or, to speak more precisely, the absence of merit from your claims--from start to finish. Not only am I unintimidated by litigation; I sometimes rather miss it."
You forgot about gold and silver-plating those crummy copper wires so the electrons can glide down the wires with even lower resistance and certainly travel in a higher class.
During WWII, copper was in such short supply that silver was used for magnet wire at Oak Ridge for use in enriching uranium. What a salvage job that was after the war. 8-)
Here's what CNET, which I've always found to be a credible authority on these issues, has to say on the subject:
formatting link
"9. Side by side, how do 720p and 1080p TVs match up in head-to-head tests?
We spend a lot of time looking at a variety of source material on a variety of TVs in our video lab here at CNET's offices in New York. When I wrote my original article over three years ago, many 1080p TVs weren't as sharp as they claimed to be on paper. By that, I mean a lot of older 1080p sets couldn't necessarily display all 2 million-plus pixels in the real world--technically speaking, they couldn't "resolve" every line of a 1080i or 1080p test pattern.
That's changed in the last few years. Virtually all 1080p sets are now capable of fully resolving 1080i and 1080p material, though not every
1080p TV is created equal. As our resident video guru, Senior Editor David Katzmaier explains in his HDTV resolutions feature, Blu-ray serves up another video format, 1080p/24, and not every TV properly displays 1080p/24. The 24 refers to the true frame rate of film-based content, and displaying it in its native format is supposed to give you a picture exactly as the director intended you to see it (for a full explanation, click here).
Whether you're dealing with 1080p/24 or standard 1080p/60 doesn't alter our overall views about 1080p TVs. We still believe that when you're dealing with TVs 50 inches and smaller, the added resolution has only a very minor impact on picture quality. In our tests, we put
720p (or 768p) sets next to 1080p sets, then feed them both the same source material, whether it's 1080i or 1080p, from the highest-quality Blu-ray player. We typically watch both sets for a while, with eyes darting back and forth between the two, looking for differences in the most-detailed sections, such as hair, textures of fabric, and grassy plains. Bottom line: It's almost always very difficult to see any difference--especially from farther than 8 feet away on a 50-inch TV. "
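CNET's 8-foot figure can be sanity-checked with some back-of-envelope trigonometry. A rough sketch (the 1-arcminute acuity figure for 20/20 vision is a standard approximation I'm supplying, not something from the article):

```python
import math

def pixel_arcmin(diag_in, horiz_px, distance_in, aspect=(16, 9)):
    """Angular size of one pixel, in arcminutes, seen from distance_in inches."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)   # screen width from the diagonal
    pitch_in = width_in / horiz_px              # width of a single pixel
    return math.degrees(math.atan2(pitch_in, distance_in)) * 60

# 50-inch screen viewed from 8 feet (96 inches)
p1080 = pixel_arcmin(50, 1920, 96)   # 1080p pixel, roughly 0.8 arcmin
p768 = pixel_arcmin(50, 1366, 96)    # 768p pixel, roughly 1.1 arcmin
print(f"1080p: {p1080:.2f}'  768p: {p768:.2f}'")
```

Since 20/20 vision resolves roughly 1 arcminute, a 1080p pixel at that distance is already below the threshold while a 768p pixel is only barely above it, which squares with the "very difficult to see any difference" conclusion.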
That's in line with what all the actual side-by-side reviews I've read have concluded, except for the sports part. The issue with sports is fast movement, which AFAIK is related not to resolution but to other display characteristics.
The "i" means interlaced, which has already been discussed many times in this thread. And interlaced 1080i is not basically 720 anything. Now here is a puzzlement that I never thought about before, but this thread got me thinking about it. Interlacing originated with broadcast TV as a way to reduce bandwidth. That made sense because with TV transmission you only have X bandwidth in the airwave spectrum, so it's advantageous to reduce the bandwidth of any given channel in order to accommodate more channels in the same frequency range. Consequently they interlaced the display, tracing odd-numbered lines on one pass and even-numbered lines on the next pass, relying on the persistence of the phosphor on the CRT to keep the previous pass visible long enough.
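The saving is easy to quantify. A rough sketch counting raw lines per second (ignoring blanking intervals and compression, which also matter in real broadcast):

```python
LINES = 1080

# 1080i sends 60 fields per second, each carrying half the lines.
interlaced_1080i = (LINES // 2) * 60
# 1080p sends 60 full frames per second.
progressive_1080p = LINES * 60

print(interlaced_1080i, progressive_1080p)   # 1080i moves half the raw line data
```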
But in the case of LCD or Plasma displays, AFAIK, there is exactly one pixel element for each point on the screen. So how could they actually do interlacing at all? Do they really interlace it, or do they just display everything progressively, taking whatever input signal they get and then processing and scaling it for the display? It would seem to me that if you had twice the rows on the LCD display, you would just use them all in one pass, because I don't see any advantage to displaying alternate rows. All that would do is cut down the bandwidth in the display driving circuitry, which certainly isn't a problem for modern semiconductors. So I would think that regardless of the source, it's always going to be displayed progressively.
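That's essentially right: fixed-pixel panels scan out progressively, so a deinterlacer inside the TV reassembles each pair of fields into a full frame first. The simplest version, a "weave", sketched in Python with frames as lists of rows (real sets use motion-adaptive deinterlacing, not this naive weave, to avoid combing artifacts on moving content):

```python
def weave(odd_field, even_field):
    """Interleave two half-height fields into one progressive frame.

    odd_field holds rows 0, 2, 4, ... and even_field holds rows 1, 3, 5, ...
    """
    frame = []
    for top, bottom in zip(odd_field, even_field):
        frame.append(top)
        frame.append(bottom)
    return frame

# Two 2-row fields weave into one 4-row progressive frame.
frame = weave(["row0", "row2"], ["row1", "row3"])
print(frame)   # ['row0', 'row1', 'row2', 'row3']
```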
I posted the same article, but it didn't seem to make any difference to some people here. Seems like a lot of people are making the argument for 1080 because they got sucked in and are trying to make themselves feel better because they wasted their money. If you spend enough money on a TV you will see whatever you want to see.
I love to have these "experts" do a blind video test.
Do you have basic/expanded cable with a box? If so it should be noted that the TV should be set to 4:3 mode (bars on the sides of the picture).
If you use one of the "stretch" modes it's going to make people look short and fat. If you use a zoom mode it's going to crop the picture and cut off content at the top and bottom of the screen. IOW, you might only see a person's eyeballs and part of their forehead instead of their entire head. If that doesn't bother you then fine, but if it does you might want to consider a slightly bigger TV.
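How much the zoom mode cuts off is easy to work out. A rough sketch, assuming the source exactly fills a 4:3 frame and the screen is 16:9:

```python
from fractions import Fraction

def zoom_crop_fraction(src=Fraction(4, 3), screen=Fraction(16, 9)):
    """Fraction of a narrow source's height lost when it is zoomed to
    fill a wider screen, cropped equally at the top and bottom."""
    # Zooming until the width fills the screen scales the picture by
    # screen/src, so only src/screen of the height still fits.
    return 1 - src / screen

print(zoom_crop_fraction())   # 1/4, i.e. 25% of the picture is cut off
```

That quarter of the picture, split between top and bottom, is exactly where heads and on-screen graphics tend to sit, which is why the 4:3 mode with side bars is the safer setting.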
Katzmaier, who is cited, didn't write the CNET article. The article just states that Katzmaier said it and it isn't a direct quote. It's the writer trying to convey what Katzmaier meant. Regardless, if you read the rest of the sentences in context around that one inaccuracy, it's clear what they meant.
I, and I'm sure Ron, would be happy to see any sources you have that have done actual side-by-side testing of 720 vs 1080 displays and say they can see a noticeable difference, particularly on screens around 32", which was the size from the original question.
HomeOwnersHub website is not affiliated with any of the manufacturers or service providers discussed here.
All logos and trade names are the property of their respective owners.