Hi,
First of all, the relevant information about my system: Windows XP, Alienware PC purchased in early 2005. Video card is a Nvidia 6600GT.
Recently we had some hot days and my brother played games on it for extended periods. Eventually the computer stopped booting normally: on startup it showed washed-out colors and strange vertical lines over the logos, then garbled white text on a black background, and finally a black screen, refusing to boot any further.
I took a look at it and managed to boot into safe mode so I could mess with settings. Safe mode is still really messed up (washed-out colors, tiny resolution, vertical lines still on the screen), but I can get around in it. Now, I'm thinking the issue is the video card - it probably just blew up or melted or whatever those things do.
In addition, he said he saw a "CPU Overheat error" or something at one point.
Anyway, I disabled the video card and rebooted. Now it boots up (still with everything all messed up) without safe mode; I don't have to press anything. The thing is, the functionality is wrecked. It won't let me play any games, and it won't let me change the resolution or color depth. It's basically useless with the video card disabled (it's not letting me use the internet either, but that's probably unrelated).
So, here's what I'm thinking: the video card is completely shot and I need a new one. Before I go and drop $100+ on a 7900GS, I need to verify that the issue is definitely the 6600GT. While I have it disabled now, the monitor is still physically connected to the back of the card, meaning the display is still interacting with the ruined card. I want to take that completely out of the equation by making the computer recognize the crappy integrated graphics as the display device. That way I can unplug the monitor cable from the broken 6600GT and plug it into the original integrated graphics port. Then the graphics card isn't affecting anything, so if everything goes back to "normal" it was the card's fault and I can safely buy a new one. Of course, if everything is still messed up it's back to square one, but I'll deal with that when I get to it...
My questions to you are:
[1] How do I make the computer start recognizing the integrated graphics as the display device? I can't just plug the monitor in there without doing this, or the screen stays black because the PC is still looking for the monitor on the 6600GT's port. I swear I used to be able to do this; it's probably something simple I forgot.
[2] Any further advice on this problem would be great.
[3] Kind of unrelated, I did some research and found that the 7900GS is significantly better than the older 6600GT. I'd just like to double-check and make sure this is true before I make the purchase (if I make the purchase).
The first question is the big, important one.
Thanks in advance! Hope I was clear.