The purpose of this post is not to compare graphics card companies, or to try and persuade you that one is better than another, but to show what different technologies are in use in the graphics card sector, and on your cards today.
I'm going to try and keep this post at a mid-level of understanding. I'm not aiming to go really in depth here, but to brush over each topic enough so you have an understanding of each technology. Some of the things I have written are oversimplified, so keep that in mind when replying about mistakes. For those of you that already know this stuff, I'll apologize now for boring you.
The role of drivers.
A driver is, in simple terms, a set of instructions that the computer uses to talk to the hardware devices within it. Pretty much everything in your PC will have a driver for it somewhere along the line, although Windows has made much of this transparent to the user.
Think of it like this. If I gave you a page of writing in a foreign language you didn’t know, it wouldn’t make any sense to you. If I first translated this into a language you did understand, and then gave it to you, reading it wouldn’t be a problem.
That’s basically what a driver does. Games and 3D apps give out instructions on what they want the graphics card to do (usually via either DirectX or OpenGL), and the drivers take this information and put it into a form that the graphics card will understand. Hence why you don’t have different versions of games for different graphics cards.
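To make that a bit more concrete, here is a minimal sketch (my own illustrative example, not from any particular game) of the kind of call a game makes, written in C against the OpenGL API mentioned above. Notice that nothing in it names a specific card or chip; the driver installed on the machine is what turns these generic calls into commands the actual GPU understands.

```c
/* Hypothetical example using legacy OpenGL + GLUT: the application only
   speaks the generic OpenGL API, and the installed driver translates these
   calls into commands for whatever GPU happens to be present. */
#include <GL/glut.h>

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);      /* "clear the screen" - generic API call */

    glBegin(GL_TRIANGLES);             /* "draw me a coloured triangle" */
    glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
    glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
    glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
    glEnd();

    glutSwapBuffers();                 /* show the finished frame */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("Same code, any card");
    glutDisplayFunc(display);
    glutMainLoop();                    /* the driver does the card-specific work underneath */
    return 0;
}
```

The same compiled program runs on cards from any manufacturer, because the card-specific translation lives entirely in the driver.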
In graphics cards today, the drivers not only do the above, but also have the ability to directly program the graphics card itself. This is very useful for developers when they want to take an intensive algorithm away from the CPU and run it on the graphics card.
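One common example of this (again, an illustrative sketch of mine rather than anything from a specific game) is a shader: the developer hands the driver a small program as source code, and the driver itself compiles it into instructions native to whichever GPU is installed. The snippet below assumes an OpenGL 2.0 context has already been created and an extension loader such as GLEW has been initialised.

```c
/* Hypothetical sketch: handing a small program (a "shader") to the driver.
   The driver compiles this source into the installed GPU's own instruction
   set - the application never needs to know what that instruction set is.
   Assumes an OpenGL 2.0 context exists and glewInit() has been called. */
#include <GL/glew.h>
#include <stdio.h>

/* A trivial per-pixel program: paint every pixel red. */
static const char *fragment_src =
    "void main(void)                              \n"
    "{                                            \n"
    "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); \n"
    "}                                            \n";

GLuint build_shader(void)
{
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &fragment_src, NULL);
    glCompileShader(shader);           /* the DRIVER compiles this for the GPU */

    GLint ok = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok)
        fprintf(stderr, "driver rejected the shader\n");
    return shader;
}
```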
It’s all about IQ, sunny Jim.
And no, that’s not the intelligence rating of your card, but the image quality it produces. While frames per second may well be important, image quality is something people often underestimate when buying.
A fairly big mistake people make when judging a card is to come out with comments like “The 6800’s image quality is pants”, or “ATI’s image quality is rubbish”. Image quality can vary from manufacturer to manufacturer, and to a very small extent, from card to card. When using an analogue signal, the signal leaves the GPU and then passes through several “filters” that turn the digital signal (1s and 0s) into an analogue waveform the monitor can understand. The varying quality of these filters causes the image quality to differ: if one manufacturer uses a better filtering technique, it will produce a better image. This affects both the sharpness of the image and the vibrancy / depth of its colours.
Once the signal has left the card, the GPU has no control over the final quality of what reaches your screen. If you want to test this out for yourself, go and rip a few of these filters off your card and check out the result.
This mainly affects analogue signals. Digital signals still have to go through a form of filtering, but it’s different from the one an analogue signal goes through. The beauty of DVI is that the signal remains in its digital form all the way to the monitor. Unfortunately, DVI does not have error correction, so it risks picking up noise from other electronic devices on its way to the monitor. This problem becomes apparent if you’re in the professional imaging industry, where it has been proved time and time again that low-quality cables + distance = problems.
The filters aren’t something that can be changed by drivers; they are physical devices attached to the PCB. With that in mind, let’s see what we can do at driver level.