There are a number of reasons why you may wish to upgrade your graphics card:
- problems with your current card
- better performance
- to drive a digital screen
- to support dual screens.
In truth, the performance of graphics cards has been more than adequate for office-type applications for several years now. Performance has only been an issue for users who are into multimedia applications such as video editing. What has really driven graphics technology in recent years, though, is the phenomenon of 3D gaming.
That’s not to say that 2D users have no reason to upgrade, though. With LCD monitors having finally fallen to an affordable price level, many conventional business users are finding themselves in the market for a graphics card upgrade, simply so they can swap their bulky, hot, power-hungry and potentially eyestrain-inducing CRT monitor for a more ergonomic and less health-hazardous flat-panel screen.
If you’re planning to replace your integrated video with a separate add-in graphics card, the first thing to do is cross your fingers, because it may not be possible. Motherboards with integrated video are notoriously difficult to upgrade; some don’t provide a suitable expansion slot for a graphics card at all. Indeed, what you’re often getting when you buy a motherboard with integrated video is a cheap, compact solution that isn’t designed to be upgradeable.
If your luck’s in, your motherboard’s User Guide will have instructions on how to disable the integrated video, either via a small switch or jumper on the board or via a setting in the system’s BIOS.
After you’ve installed your add-in graphics card, you should also check Device Manager for the presence of two display adapters. If you find two, delete the one that refers to your integrated graphics … and cross your fingers again.
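If you prefer a quick sanity check from the command line rather than clicking through Device Manager, the sketch below is one illustrative way to do it (it is not part of the original upgrade procedure). It uses Python’s ctypes module to call the standard Win32 EnumDisplayDevices API and list the display adapters Windows currently sees; two different adapter names after the upgrade suggest the integrated video is still active.

```python
# Minimal sketch (Windows only): list display adapters via the Win32
# EnumDisplayDevices API. Any adapter names printed are whatever Windows
# reports on your machine, not specific products.
import ctypes
from ctypes import wintypes


class DISPLAY_DEVICEW(ctypes.Structure):
    # Layout of the Win32 DISPLAY_DEVICEW structure.
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]


def list_display_adapters():
    """Return the unique adapter description strings Windows reports."""
    adapters = []
    index = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(DISPLAY_DEVICEW)
        # EnumDisplayDevicesW returns 0 once there are no more devices.
        if not ctypes.windll.user32.EnumDisplayDevicesW(
            None, index, ctypes.byref(dev), 0
        ):
            break
        adapters.append(dev.DeviceString)
        index += 1
    # A card with multiple outputs appears once per output, so deduplicate.
    return sorted(set(adapters))


if __name__ == "__main__":
    for name in list_display_adapters():
        print(name)
```

On a system where the integrated video has been disabled correctly, this should list only the add-in card; if the integrated adapter still shows up, revisit the jumper or BIOS setting described above.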