http://reviews.cnet.com/Akai_LCT37Z6TA_37_LCD_TV/4507-6482_7-32366995.html?tag=nav Can you guys tell me how the specs look on this, and whether it will suit what I'm looking for in a TV viewing experience? I mainly want to use my TV for watching HD sporting events through my HD DTV receiver. Thanks in advance.
It's overkill for what you want, especially since via satellite/cable you're not going to get 1080p. Also, on a 37" screen 1080p is a waste, as the screen is so small; most companies won't make 1080p at 37" anyway. And for sports you're not going to want to watch 1080i, so you'll be watching 720p most of the time. I'd get a cheaper set if you're not going to use the 1080p. You'll only use 1080p if you're watching Blu-ray or HD DVDs or have a PS3. You're better off getting a Sony or Samsung 720p set.
Wholeheartedly agree with DBM_Bones. I recently got a Samsung LCD and ESPN HD sports look great. Nothing but HD DVD, Blu-ray discs, and the PS3 put out true 1080p, and for that matter, there is very little 720p to watch on Cox cable at this time.
So maybe one of you guys could recommend a 720p set for me then. Can I get a 720p 37" or 42" for around $800-900?
I'm not in the USA, I'm in the UK, but just have a look at the Samsung LCDs that do 720p and get the best one in your budget. I'm not sure if they do a 37"; you may have to go a bit smaller, but I'd rather get a better-quality smaller TV than a low-quality bigger one, especially since LCD in some ways gets worse the bigger you go.
I take it, then, that the 32" Samsung will trounce the 37" Vizio? They're the same price, and I know PCMag gives VERY generous ratings to the Vizio.
Vinny, go to www.avsforum.com and do a search for the TV you're looking at, and you'll get real customer reviews. I never trust reviews from magazines.
Or find a shop that has both and compare them yourself; just make sure they use the same input. They normally only put the top-of-the-range TVs on the HD demo input, and the rest on SD TV.
Don't go by what you see in the store. Higher-priced TVs are always adjusted to look better than lower-priced TVs, and the signals are crap at best and adjusted, too. Unless you take the time to adjust every TV, you can't make any comparison. Besides, unless you already had an HDTV at home, you wouldn't really know what to look for anyway. Go by the numbers: native resolution can't be faked, and everything is converted to it on an LCD or plasma, otherwise the picture will really look like crap and possibly damage the set, too.

Most of the HD footage loops used in stores are old and have problems in the footage. Old HD cameras looked really good when things moved slowly but didn't have the response time to deal with a lot of action. Watch something new, and beware of salespeople trying to pass off different footage as a difference in TV quality.

720p was some crap they made up for LCDs' and plasmas' native resolution of 1,366 x 768 and tried to call HDTV. HDTV starts at 1080 lines; it's the lines that matter, not the pixel bandwidth. There are about two million pixels in a full 1,920 x 1,080 frame and about one million in a 1,280 x 720 frame. So 1080 has twice the resolution and 360 more lines than 720, and your eye can't see the interlacing. Nobody broadcasts in 720p anyway, because it uses the same bandwidth as 1080-line HDTV with less resolution and fewer lines.
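To put rough numbers on those pixel counts, here's a quick back-of-the-envelope check (a Python sketch, purely illustrative):

```python
# Back-of-the-envelope pixel counts for the resolutions discussed above.

def pixel_count(width, height):
    """Total pixels in one full frame."""
    return width * height

full_hd = pixel_count(1920, 1080)  # 1080-line frame
hd_720 = pixel_count(1280, 720)    # 720-line frame

print(f"1920x1080: {full_hd:,} pixels")   # 2,073,600 -> "about two million"
print(f"1280x720:  {hd_720:,} pixels")    # 921,600   -> "about one million"
print(f"ratio: {full_hd / hd_720:.2f}x")  # 2.25x the pixels per frame
```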
neoteny, I have to respectfully disagree with you on several points. I work at a Best Buy and can confirm that there is no tweaking done to the TVs (by professionals, anyway) to make them look better or worse. Most of the tweaking is done by customers trying to make them all look alike. They always fail, because every TV (literally) is different. The only way for a non-professional to calibrate a TV is to buy a calibration DVD like Digital Video Essentials (DVE). It's not about what you like; it's about what the producers/directors intend. There are many reasons for different video characteristics. The biggest is the difference a good video processor can make. Processors in products on a store shelf can range from $10 to $300 or more. Price isn't always a good indicator of performance, but it is a good place to start.

First, you're losing credibility with this statement. High definition comes in many flavors. It started with big rear-projection CRTs at the end of the '90s. They are 1080i sets (1920x1080 interlaced), meaning they draw 540 lines per field (two interlaced fields make up each frame, NTSC/ATSC). Then came the push for progressive scan. All the newer TV technologies are inherently progressive scan: LCD, DLP, PDP, and LCoS (including the 3-chip LCoS technologies SXRD and D-ILA) are all classified as fixed-pixel displays (FPDs). They have pixels that are active at all times. The number of fixed pixels plus the aspect ratio defines the set's native resolution. In most cases the native resolution is 720p (1280x720), which has 720 lines for an entire frame (instead of 540 per field). Put that together with progressive scan's ability to handle motion better, and you'll find that 720p gives people a higher perceived resolution than a 1080i set does. In the last couple of years there has been a push toward 1080p, as it combines the detail of a 1080i set with the speed handling of a 720p set.

With all that said, what you feed your TV and how you feed it will play a larger role in determining what the picture looks like than anything else. A 1080p TV with a decent video processor being fed a 1080i signal at a low bit-rate will look worse than that same TV being fed the same material at a high bit-rate. That is why Discovery HD Theater's Atlas series looks great but looks even better on HD DVD and Blu-ray!

Now you're just wrong. Yes, 1080p has twice the resolution of 720p (both at 60fps, anyway), but 1080i in broadcast and 1080i on a CRT TV are different. Sports broadcasters like Fox Sports and ESPN broadcast at 720p because it handles motion better than 1080i does (720p requires less video processing, but there is less overall detail). The majority of broadcasters, the ones that don't specialize in a lot of high-speed sports, broadcast at 1080i/30fps. They do so because they capture their sitcoms, dramas, or whatever at 1080p/24fps to make them more cinematic. To save bandwidth, they interlace the 1080p/24 signal into 1080i/30. They can get away with broadcasting at 1080i/30 because a decent video processor should be able to recognize the 3:2 movie cadence and have no problem reproducing 1080p/24 progressive frames from the 1080i/30 signal. Sadly, a lot of cheap video processors in TVs don't have the muscle to do 3:2 pull-down correctly (especially with high-def programming).

Feel free to ask any applicable question; nothing is taboo. Ced
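To make that 3:2 cadence concrete, here's a minimal illustrative sketch (Python, not any real TV's processor) of how 24 film frames map onto the 60 fields per second of a 1080i/30 broadcast:

```python
# Illustrative 3:2 pulldown: map 24fps film onto 60 fields/sec (1080i/30).
# Film frames alternately span 3 fields and 2 fields: 24 frames -> 60 fields.

def pulldown_32(film_frames):
    """Return the field sequence produced by 3:2 pulldown."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2  # the 3:2 cadence
        fields.extend([frame] * repeats)
    return fields

one_second = list(range(24))  # one second of film: 24 frames
fields = pulldown_32(one_second)
print(len(fields))  # 60 fields = the 1080i/30 broadcast rate

# A good deinterlacer detects this repeating 3-2-3-2... cadence and can
# reassemble the original 24 progressive frames from the 1080i signal;
# a cheap one fails and you lose detail, exactly as described above.
```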
What? Who said that? HD_Nut says something like that, but nowhere near that extreme, and that is why I disagree with him. You can't see any color gradients at 40x20p@6000fps. Frame rates do play a role, but you need about 24-30 to simulate motion. A picture displayed at a refresh rate that is an exact multiple of the source frame rate will look smoother than one displayed at a rate the source rate doesn't divide evenly. For example, movies are shot at 24fps, but most TVs refresh at 60fps (NTSC/ATSC). So some projectors and HDTVs can change their display refresh rate to 48 (24 times 2) or 72 (24 times 3) so that judder is not an issue. Similarly, 30fps content can be shown at 60Hz or 90Hz for increased realism. There are even a couple of 120Hz displays out there so that 60fps source material can be smoothed out. The best available resolution right now is 4K@48-60fps (4096x2160 at 48/60 frames per second). At 5K-6K your eyes begin to question reality! Ced
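The multiples rule is easy to check for yourself; here's a small illustrative sketch (Python, with an assumed list of common refresh rates):

```python
# Which display refresh rates show a given source without cadence judder?
# A rate works cleanly when it is an exact multiple of the source fps.

def clean_refresh_rates(source_fps, candidates=(48, 60, 72, 90, 120)):
    """Return the candidate refresh rates the source divides evenly."""
    return [hz for hz in candidates if hz % source_fps == 0]

print(clean_refresh_rates(24))  # [48, 72, 120] -> film without 3:2 judder
print(clean_refresh_rates(30))  # [60, 90, 120] -> 30fps video content

# 24fps on a plain 60Hz set needs the uneven 3:2 cadence (60/24 = 2.5),
# which is exactly the slight judder the multiples above avoid.
```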
I learned a lot reading this thread, but I look at it this way: whatever looks best to you is perfect for you; the rest is all crap. 720p/1080p/1080i, who cares? It's what you like that matters. Just sit back and enjoy; that's all that matters in the end. Kind of reminds me of those audiophiles who claim they can hear the hi-hat roll off too early at 14.526 kHz on a set of B&W 801s. Like BULL you can. Peace all
With regular TV, smaller is sharper and brighter and bigger is duller. It looks like HD would do the same. Maybe, maybe not.
"HD_nut Member 20. December 2006 @ 17:47 Link to this message To answer your question, if all signals were in 1080i I would still get the 768p set, because I would have to sit closer to the set to see the benefits of 1080p. I sit at an average distance, about 10 feet, because of the pixel size you should see a better picture with 768 at about 10 feet. I seen how the Sony XBR cross converts to 768p, it's amazing." True or False more picture quality is lost in a cross conversition than in "losing" some pixels because they are to small to see at 10 feet.
No, not really; it's what they call wasted resolution...

"The average 42-inch-diagonal, 1,280-by-720 plasma or LCD display has pixels that are roughly 0.029 inches wide. (Of course, each model has different inter-pixel spacing, but, for now, we'll assume they don't.) If the same size display had a resolution of 1,920 by 1,080, the pixels would be 0.019 inches wide. As you can see, in a 42-inch display at a distance of 10 feet, your eye can't discern the resolution available even with 720p. Even more resolution is 'wasted' at 1,920 by 1,080." http://blog.hometheatermag.com/geoff...n//index1.html

"For example, despite the fact that a 37-inch LCD with 'only' 1,366x768 pixels has to throw away a good deal of information to display a 1080i football game on CBS, you'd be hard-pressed to see more detail on a similar 37-inch LCD with 1,920x1,080 resolution."

Your eyes would never have seen it, so if you have a good conversion chip, you will see a nice 768p signal.
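Those pixel-pitch figures are easy to reproduce; here's a quick illustrative check (a Python sketch assuming a 16:9 panel and zero inter-pixel spacing, as the quoted article does):

```python
import math

# Reproduce the pixel-pitch figures quoted above for a 42" 16:9 panel,
# assuming zero spacing between pixels (as the quoted article does).

def pixel_width_inches(diagonal_in, h_pixels, aspect=(16, 9)):
    """Width of one pixel on a panel of the given diagonal and aspect."""
    w, h = aspect
    screen_width = diagonal_in * w / math.hypot(w, h)  # ~36.6" for a 42" 16:9
    return screen_width / h_pixels

print(f"{pixel_width_inches(42, 1280):.3f} in")  # ~0.029 in at 1280x720
print(f"{pixel_width_inches(42, 1920):.3f} in")  # ~0.019 in at 1920x1080
```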