Note to moderator. This is a free-form monologue with almost no structure.
Edit it or trash it as you see fit.
This is a general discussion about how to install a new video card into a desktop computer that has been using its built-in VGA display.
When you get a new video card, the first thing you should do is read the instructions. Of course, I don't do that. I just love jamming it into the computer straight out of the box, turning it on, and expecting it to work. Now most of the time that works for me, so why shouldn't it work for you? Well, to tell the truth, it doesn't always work for me. And it won't always work for you either. So pay attention.
When you install a new piece of hardware into a Windows desktop, there are two general approaches you can take. Well, more like three.
Number one. Just remove the power cord, open the cover, stick the thing into the right slot, making sure it's properly seated in the socket, then turn the PC back on and see if it works. If everything looks okay, just put the cover back on. Now you're happy.
Number two. You carefully read the instructions that come with the computer and with your new graphics card. Somewhere they tell you that there is a potential conflict between the built-in graphics card and the new graphics card. Ideally, that shouldn't happen. But with the wide variety of motherboards and graphics cards out there, conflicts do arise unless you pay attention.
Sometimes the information is only found on the CD that came with the graphics card. You have to read that information first, before you stick the new graphics card in the computer. To make things confusing, in some cases you will be told to start up the CD only after you have the card already installed. But that method doesn't always work.
When the CD software comes up, it will check to see whether the card is already installed. Either way, it will tell you what the proper procedure is.
Number three. Now here's the procedure that can cause confusion. You are told to set up your computer to use the built-in video graphics display. So you do that, according to the manufacturer's instructions. You're supposed to use the driver that comes with the motherboard, or the one that is already on your computer if it was installed at the factory. Okay, so far so good.
Here you take a brief pause.
It seems like it should not be so difficult; it should be obvious what you are supposed to do. But it's not. Unless you've done it before, in which case it will be obvious because you already did it. If you have never done it, it seems rather convoluted.
Here is what they tell you. I mean, the maker of the new graphics card will tell you this. You remove the graphics driver that is already in your computer. No, wait, don't remove it, just disable it. You have to go into the Windows hardware Device Manager and figure out some way to disable the graphics adapter without actually stopping it from working. What that means is you force it back into a very generic VGA mode. That seems rather strange, but for some installations it is a requirement.
Now that you have the built-in video working in a very generic VGA mode, and not using the driver the OEM manufacturer said to use, you can proceed. What you do next is turn off the computer and stick the new video card into the proper slot. Power on. Wait for it. At this point, Windows will realize that there are two video cards in the system and will make some kind of choice about what to do. The trick here is that Windows has the opportunity to set up both video cards in some kind of configuration where they do not conflict with each other.
Now you can start up the CD that came with the new graphics card and go through the steps it tells you to do. The software will initialize the card and tell you to reboot the system. So you do that, and it works. Somewhere along the line Windows will pick up the driver for the built-in video and see to it that it gets installed in the proper place.
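Since the whole dance is easy to lose track of, here is a toy sketch in Python that just lists the steps in order, as I summarized them above (the step names are my own wording, not anything out of a vendor manual), and refuses to let you skip ahead:

```python
# Toy model of the "number three" install procedure described above.
# The step names are my own summary of the text, not vendor terminology.

STEPS = [
    "switch to the built-in VGA display (generic driver)",
    "disable the old graphics driver in Device Manager",
    "power off and seat the new card in its slot",
    "power on and let Windows detect both adapters",
    "run the CD software to initialize the new card",
    "reboot so the drivers settle into place",
]

def next_step(done_count):
    """Return the next step to perform, or None when all steps are done."""
    if done_count < 0 or done_count > len(STEPS):
        raise ValueError("invalid progress count")
    if done_count == len(STEPS):
        return None  # procedure finished
    return STEPS[done_count]

# Walk the whole procedure in order, printing each step.
progress = 0
while next_step(progress) is not None:
    print(f"Step {progress + 1}: {next_step(progress)}")
    progress += 1
```

The point of the sketch is just that the order matters: the built-in video has to be dropped to generic VGA before the new card ever goes in the slot.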
Does that sound weird? Surprise: that is, in essence, what they tell you to do for some of the super-duper graphics cards out there.
You might think it would be easier if I gave you a bunch of links to confirm all of the illogical things I've described above. No, it won't help, because each one will say something different, or say it in a different way, and you'll be absolutely confused unless you are so smart that you already know about it. But if you wish, you could search your favorite search engine for something like this: "How to install second video graphics card."
On some computers, you may have the option of disabling the onboard video device in the BIOS. That is one way to ensure the built-in video will not conflict with the new graphics card.
Wasn't that easy?
Any comments?