You're right that I am unfamiliar with the specifics of how the hardware works, but I don't think you understood where I was going with that. The output obviously goes through the video card, but whether or not it is rendered on the video card was the bit I was angling toward with my previous assumption.
I assumed that if the CPU calculated the output and handed it to the video card, most of the video card's facilities (including any damaged ones) would be bypassed, and the video card would likely hand the output over to the monitor verbatim, with minimal calculation.
One way to "thought-experiment" your way to proving or disproving something like this is to ask: what would follow if that were actually the case?
If I understand your thought process, you are thinking that the video card merely acts as an "adapter" so that the BIOS can direct output to the monitor. If that were the case, though, there would have been no such thing as a video card failure until rather recently: older (ISA) video cards, being mere "pass-through" devices, would not need video memory. In reality, even they require video memory, and if that memory gets corrupted or damaged, you will see artifacts on the screen regardless of the screen mode.
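To make that concrete, here's a toy simulation (every name here is mine, invented for illustration; this is not any real driver or hardware API) of a VRAM chip with a "stuck" bit in one region. Anything stored in the damaged range comes back altered, no matter which screen mode wrote it there:

```python
# Hypothetical sketch: video memory with a damaged cell range where one
# bit is permanently forced high, the way a failing RAM chip can behave.
VRAM_SIZE = 4096
STUCK_START, STUCK_END = 1024, 1056   # damaged cell range (assumed)
STUCK_MASK = 0b0010_0000              # one bit stuck at 1

class FaultyVRAM:
    def __init__(self):
        self.cells = bytearray(VRAM_SIZE)

    def write(self, addr, data):
        for i, byte in enumerate(data):
            a = addr + i
            if STUCK_START <= a < STUCK_END:
                byte |= STUCK_MASK    # corruption happens on the chip itself
            self.cells[a] = byte

    def read(self, addr, length):
        return bytes(self.cells[addr:addr + length])

vram = FaultyVRAM()
frame = bytes([0x00] * 2048)          # a "blank" frame, from any render path
vram.write(0, frame)
readback = vram.read(0, 2048)
# Bytes that landed in the damaged region come back altered -> artifacts,
# regardless of whether the CPU or the GPU produced the frame.
assert readback[1024] == STUCK_MASK and readback[0] == 0x00
```

The point of the sketch is that the corruption lives in the storage, not in the rendering: whichever mode happens to place its data over the bad cells will show garbage there.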
It's not as though computers aren't used to doing this; even in something as 'recent' as Unreal Tournament, you could elect to have the CPU render the video output in software.
Ahh, here is the confusion. You are thinking that the video card only "does work" when it is being made to render things, such as when its 3-D acceleration features are in use. To be fair, 3-D acceleration is more likely to reveal corruption (particularly of video memory) simply because it uses that memory for storing textures. If you (as you noted) use software mode in a game, all texture, vertex, and lighting data are stored in system memory, and the CPU is essentially used to build a "bitmap" of each frame; the video card, however, still needs to store that frame data in video memory, and if it happens to store it in an area that is corrupted, you will see the results of that (in fact, you will see it everywhere).

Also, the Windows desktop uses basic 2-D features; while many of these are accelerated, the "acceleration" is done by the driver in many cases, and if the driver says "draw this line", either the video card draws the line or it doesn't. I wouldn't expect the video card to keep reporting "OK, drawn" for every single request the driver makes while actually leaving the screen blank. Usually, if the "higher" functions of the video card are dead, you simply get a hard freeze during boot or when you try to switch to a 3-D mode.
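A rough sketch of that software-rendering path (all names invented for illustration, not any real game or driver API): the CPU rasterizes the frame pixel by pixel into an ordinary bitmap in system memory, and the finished bitmap still has to be copied ("blitted") into video memory before it can reach the screen:

```python
# Toy software renderer: CPU builds the frame in system RAM, then blits it.
WIDTH, HEIGHT = 8, 4

def cpu_render_frame():
    """CPU-side rasterizer: build the frame as rows of pixels in system memory."""
    frame = [[0] * WIDTH for _ in range(HEIGHT)]
    for x in range(WIDTH):            # draw a horizontal line, pixel by pixel
        frame[1][x] = 255
    return frame

def blit_to_vram(frame, vram):
    """Copy the finished bitmap into (simulated) video memory, row-major."""
    for y, row in enumerate(frame):
        vram[y * WIDTH:(y + 1) * WIDTH] = row
    return vram

vram = [0] * (WIDTH * HEIGHT)         # the video card's memory, simulated
blit_to_vram(cpu_render_frame(), vram)
assert vram[WIDTH:2 * WIDTH] == [255] * WIDTH   # the line landed in VRAM
```

Note that the 3-D hardware never appears in this path, yet the frame still ends up stored in video memory, which is why damaged VRAM shows artifacts even in software mode.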
Thing is, your theory makes some sense, since higher resolutions do use more memory. Text mode, 80x25, is what the BIOS uses (although newer machines switch the card to a 640x480 graphics mode so they can draw the EPA or manufacturer logo/splash); Windows then switches to 640x480 to display its splash screen, and finally switches to your desktop resolution. If the video card itself were having problems, though, it wouldn't simply accept the various VGA mode-switching commands and then later accept control from the Windows driver; at some point during boot it would either freeze solid (a failing video card can hang the entire machine) or you would see a garbled display the entire way along.

As you have discovered, however, it was the monitor; at some point it stopped "supporting" the video modes the card wanted to switch to. I've had monitors mysteriously stop supporting modes, or in some cases start screaming in agony (a very high-pitched squeal) when switching to a given mode. I had an old monitor that supported 1024x768 out of the box but at some point decided to start screaming like a banshee in that mode, to the point where I had to switch to 800x600. Eventually the entire unit gave up the ghost when the vertical deflection coils died, leaving me with a very bright horizontal line in the center of the screen, which made things a tad difficult.