ATI and NVIDIA: Quick Look at HDTV Overscan Compensation
by Andrew Ku on August 25, 2004 12:00 PM EST
It has been a while since ATI released their HDTV dongles, which provided HDTV output support for most of their Radeon line. In fact, we probably first experimented with their HDTV dongle back in July or August of 2002. Back then, HDTV output support was plagued by the overscan issue.
For those of you unfamiliar with "overscan", it is simply the part of the picture that is cropped. Depending on whom you ask, it is also described as the space that bleeds or "scans" beyond the edges of the visible area of the screen. Typical televisions can lose up to 20% of the image to this cropping. Technically speaking, the "lost picture" is not actually lost; it simply falls outside the visible area of your TV screen. A similar situation on the computer side is viewing a picture at 100% scaling on a monitor set to a lower resolution than the picture, e.g. a 1600 x 1200 picture on a 1280 x 1024 desktop. The difference is that on a computer, you can move the picture around to see the portions cut off by the visible area of the monitor.
We should clarify that overscan is not necessarily a bad thing. It is implemented deliberately on TV sets because of the different video input formats the TV needs to support: composite, S-video, and so on. If overscan were not implemented to accommodate these different formats, there would likely be underscanning of varying degrees on different TV sets, because of the different protocols and inherently different signals that the TV has to handle. (Underscanning is the opposite of overscanning: the image is smaller than the area on which it is displayed.) It would be tantamount to zooming out on a picture, though in the case of TV sets, the space that doesn't reach the edges of the display would be black. The deliberate use of overscanning allows the screen to be completely filled, as opposed to underscanning, which could leave underscanned margins of varying sizes.
The reason we notice overscan more on a computer is that we already know what the signal is supposed to look like: where the start menu button should be, where the clock should be, where the corners of the screen should be in relation to our desktop shortcuts. A DVD signal sent to a DTV, or even a regular TV, usually encounters some measure of overscan, though we hardly notice it because we aren't used to its native signal. One way that DVD player or TV manufacturers compensate for this is to provide a zoom-out function, which essentially tells the system to underscan. This is why we go crazy when we notice overscan from an X-Box rather than from a DVD signal: we know what the game menu is supposed to look like.
In theory, if an HDTV were designed only for HDTV signals, there would be no overscan from component computer video output. The main issue is that DTVs are built to handle more than just DTV signals. They accept many legacy sources: camcorders, S-video, composite, etc. All of this means there must be cross-format support, and the only way for that to occur is to either overscan or underscan. Underscanning would be more frustrating to the consumer, since the signal would be smaller than the displayed area, with black bars surrounding the image. Overscan ensures that the video signal always fills the screen, though this gets to be increasingly frustrating when you hook up an X-Box or output video from your computer and the signal is overscanned.
And as Keith Rochford (Chief Engineer of eVGA) explained, when you switch to a DTV, you are now talking about a high-resolution display, and backwards-engineering a pixel technology to a line-scan technology isn't a simple task. This backwards engineering, or transfer, is what leads to the large 10% to 15% margins of overscan that we have become accustomed to when outputting from a computer to a DTV. For those who own something like a plasma display that accepts direct computer video output via VGA or DVI, this obviously isn't an issue, since no backwards conversion is needed. Such a display is essentially a really big computer monitor, since it keeps the video card's native output.
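To put rough numbers on the compensation problem, here is a quick back-of-the-envelope sketch in Python. It is purely illustrative (ours, not how either vendor's driver actually works): shrink the rendered desktop by an assumed overscan margin so that everything stays inside the visible area.

```python
# Illustrative sketch: shrink a target HDTV mode by an assumed overscan
# margin so the whole desktop stays visible. The 10% figure is only an
# example; real margins vary from set to set, which is the whole problem.
def safe_resolution(width, height, margin=0.10):
    """Return a (width, height) reduced by the overscan margin on each axis."""
    w = int(width * (1 - margin))
    h = int(height * (1 - margin))
    # Snap down to multiples of 8, which display modes generally expect.
    return w - w % 8, h - h % 8

for mode in [(1280, 720), (1920, 1080)]:
    print(mode, "->", safe_resolution(*mode))
# (1280, 720)  -> (1152, 648)   <- the 1152x648 mode readers mention below
# (1920, 1080) -> (1728, 968)
```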
In the most practical sense, overscan is something you don't want to have, or at least want to minimize. Using your HDTV set as a substitute for your monitor can be awesome, but having part of the picture cropped gets to be a major deterrent, especially when you want to play games, surf the web, or watch videos on that nice big screen.
There are more than just ATI and NVIDIA cards on the market, but most of us are still going to end up with one or the other, in which case you are most likely going to get some degree of overscan. Keep in mind that we can't track down every, or even most, DTV sets and check their degree of overscan; and even if we could, overscan varies between TV sets because of each manufacturer's design, which has no bearing on the video card. For these practical reasons, we are going to focus primarily on how ATI and NVIDIA approach HDTV overscan compensation.
13 Comments
skearns2 - Monday, February 2, 2009 - link
I have an XFX GeForce 8800 GS graphics card and a 24-inch ViewSonic monitor. It's a VX2435wm (factory setting 1920x1200). When installing the NVIDIA driver, a major issue occurs. Windows, programs, games, the start menu, taskbar, clock - everything opens off screen. It acts as if the monitor were bigger than it is. I have the correct resolution. I have set the correct resolution through computer properties and the NVIDIA control panel. Lowering the resolution does lower the resolution, but everything is still off screen.
It's terrible because I cannot see the start menu, taskbar, or desktop icons, because they are off screen. I have tried using a 19-inch monitor and I have no problems; everything is normal. It seems to be a relationship error between the NVIDIA driver and the 24-inch monitor. But I only use the 24-inch... When I uninstall the NVIDIA driver, things work fine on the 24-inch monitor, but slow, and thus pointless. I even installed the original NVIDIA CD driver that came with the card, and the same problem occurs: things open off screen. Also, I used to have a BFG 8800GT overclocked AGP, and when I first got the VX2435wm monitor I had this issue initially, but it somehow went away. I fear that because I have used different drivers, it may be the monitor itself. Please give me a solution on how to fix this problem. I imagine it is some sort of special driver I need.
I contacted ViewSonic and they were not much help on this issue :(
anyone know the solution? thanks!
djtonium - Monday, August 30, 2004 - link
Eww... correction, it's 1152x648 (720p timings). As for LCD, anything other than native resolution results in interpolation of the pixels. Worse yet, the Sonys have a 1366x768 native resolution on their LCD TVs. At such a nonstandard resolution, that thing interpolates with everything.
djtonium - Monday, August 30, 2004 - link
#9 - The new ATI cards that support HDTV output use a custom connector that supports both S-video and HDTV output. Standard S-video has a 4-pin layout, while the extra pins on the video card's connector are meant for HDTV out. I love this idea, but it would be nice if the component video cables ATI supplied were thicker :-) I need to get away from my AIW card since I have no use for the extra bells 'n whistles.
#10 - Thanks for the response, Andrew. If you want some screen shots of my 4:3 CRT TV (a good candidate for overscan stuff) running at 640x432 (full screen) and 1152x768 (wide screen), I'll be more than happy to e-mail you photos.
AndrewKu - Sunday, August 29, 2004 - link
#7 - This article was written with the Catalyst 4.8 drivers.
#8 - This article was more about the out-of-box experience and the approaches taken by each company. Keep in mind that overscan margins vary from HDTV to HDTV.
We mentioned that there are ways to "unofficially" customize things in our Personal Cinema 5700 review. I hope that when I have a bit more free time, I can take a look at the games and share our experience with tweaking games for HDTV.
#9 - We were testing and explaining the approaches taken by the two companies. It doesn't matter which card you use, technically. Overscan is going to vary from HDTV set to HDTV set more than from video card to video card. Drivers are the issue on the computer side.
And since the hardware supports HDTV, there are certain cases where you don't even need to use drivers to get HDTV output.
As for the weird output scenario, the difference is that most LCD monitors aren't going to be in the same size class as plasma or HDTV sets.
magnusr - Sunday, August 29, 2004 - link
Why did they only test old cards? Why didn't they include the ATI X800 series and the NVIDIA 6800 series? My ATI X800 came with an S-VHS -> RGB output (I think it's for HDTV).
I use the DVI out to my Optoma H56 XGA projector. It supports HDTV, and I seem to get a perfect picture when playing HDTV. Do you mean that I am not getting HDTV out of my X800's DVI plug to my projector?
That sounds weird to me. It would be the same as connecting an LCD screen using a DVI cable...
Can't computer monitors show HDTV?
djtonium - Thursday, August 26, 2004 - link
I'm confused. I've had my AIW 9700P for over a year and I rarely (only a few times) have issues running games on my HDTV set. AFAIK, it is extremely easy to run DirectX games on HDTV because DirectX can easily support modified resolutions (driver support). None of the OpenGL games I have (like RTCW) fully support custom resolutions - at best, an attempt would cause an OpenGL rendering error.
Just for information (in case someone needs it) for ATI owners with HDTV output:
Unreal Tournament 2k3/2k4: It's been a long time since I've done this, but assuming you know what resolution you were running before, you can edit the UT2k3/2k4 INI file (search for the resolution value, and it'll take you to the width/height section of the file; see the INI sketch after this list). For 4:3 TVs, the lowest resolution you can run is 640x432 (480p). I recommend 864x648 (I forget this one, as I don't have a need for it) or 1152x648 (16:9) if the TV supports 720p/1080i.
Max Payne 2: If you output to TV as your primary display, the game will display a list of supported resolutions - even custom ones defined by the ATI drivers! Again, you can run at 640x432, 864x648, and 1152x648.
Halo: It's been a while since I've messed with this as well, but you can edit the INI file and put in a custom resolution. Custom resolutions aren't listed in the game even when outputting to HDTV.
Doom 3: There's advice out there regarding console commands for custom resolutions. You can add these to your autoexec.cfg file, because the custom resolutions won't be listed in the game (see the cfg sketch after this list).
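For reference, a sketch of the kind of INI edit meant for UT2k3/2k4, assuming the stock UT2004.ini layout (the section and key names below are from memory, so double-check your own file):

```ini
; UT2004.ini - force a custom fullscreen resolution for HDTV output
; (1152x648 is the 16:9 mode for sets that accept 720p/1080i)
[WinDrv.WindowsClient]
FullscreenViewportX=1152
FullscreenViewportY=648
```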
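And a sketch of the Doom 3 autoexec.cfg lines meant above, assuming the stock cvar names (again from memory; verify against your version):

```
// autoexec.cfg - force a resolution the in-game menu doesn't list
seta r_mode "-1"            // -1 tells the engine to use the custom values
seta r_customWidth "1152"
seta r_customHeight "648"
seta r_aspectRatio "1"      // 1 = 16:9 (0 = 4:3)
```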
I'm happy ATI has integrated HDTV output with their video cards, because it'll drive up sales significantly. Every time I play games or watch movies (which requires a DVD-region-free program to bypass Macrovision - otherwise, TV output is disabled), I always use my HDTV.
Daleon - Thursday, August 26, 2004 - link
I don't get it, was this article written before the 4.8s were released with HD support? If so, why not update it before posting it?
Wrath0fb0b - Wednesday, August 25, 2004 - link
This entire set-up is just a lie from ATI to sell more high-end video cards. Hear me out.
I have an original AIW Radeon (7500 core) with DVI-I (DVI + VGA) out. Unfortunately, I quickly found out that when I plugged the DVI into my Toshiba HDTV (H83), the ATI driver was shutting off the DVI port on bootup (it would display during the BIOS and boot, but cut out before getting to the Windows login... it's the only explanation).
ATI told me that there is no way to force the DVI port open if the monitor doesn't provide a proper EDID, which is total BS because I was going to set the resolution in PowerStrip anyway.
After a bit of research, I bought the DVI Detective (google it), which manages to fool the ATI driver into staying open, and I now have a perfect image running in both 480p and 1080i.
It's a total ripoff that ATI would cripple their driver like that.
Glad I didn't buy a newer card just for HDTV; saved myself more than $100.
forcemac101 - Wednesday, August 25, 2004 - link
What I can't figure out is that when using S-video (standard 480i), I get very little overscan, yet I can change the resolution via your standard right-click -> Properties route. I can use anything from 640x480 up to 1024x768 pixel resolutions, yet the HDTV maintains the 480i. Icons and text get bigger and smaller with the respective resolution, but the overscan doesn't change. It's as if the card sees the pixel resolution, but it does something to overlay it on the 480i from the S-video. Totally transparent.
It seems to me that it shouldn't be that hard to "program" the component out (I use ATI) to a custom resolution that produces no overscan, and yet can fit any pixel resolution to it.
I probably don't make sense, but anyone with TV-out and an HDTV, try it: use the S-video output, use ATI's TV-out tab in the driver to center the picture up (eliminating the 480i overscan), then just change your regular pixel resolution. Games run fine in whatever resolution, because the pixel res is translated to a 480i res via the S-video.
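A hypothetical sketch of the behavior described above: whatever the desktop resolution, the S-video encoder scales it down to the same fixed 480i raster, so the overscan the TV applies to that raster never changes (the 720x480 active-raster figure is an approximation).

```python
# Hypothetical sketch: the S-video encoder scales any desktop resolution
# down to one fixed 480i raster, so the TV-side overscan stays constant.
SVIDEO_RASTER = (720, 480)  # approximate active 480i picture size

def scale_to_raster(desktop_w, desktop_h, raster=SVIDEO_RASTER):
    """Return the horizontal/vertical factors the encoder must scale by."""
    return desktop_w / raster[0], desktop_h / raster[1]

for res in [(640, 480), (800, 600), (1024, 768)]:
    sx, sy = scale_to_raster(*res)
    print(f"{res}: scaled down {sx:.2f}x horizontally, {sy:.2f}x vertically")
```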
PrinceGaz - Wednesday, August 25, 2004 - link
It should be a doddle for them to add those resolutions if they're already included in the X-Box versions; after all, some ports from the X-Box seem like little more than a quick recompilation of the source code, as they didn't bother improving on controller options, etc. In any case, it's no problem at all for any game developer to add custom resolutions if there were any demand. I just wonder how many PC gamers really want to play games on an HDTV, or an ordinary TV for that matter (there are a lot more of them about), unless they're using a gamepad while sitting on the sofa, as you can't really use a keyboard, mouse, or joystick very well. In which case, they may as well just use the X-Box or PS2 instead.