Bought a GeForce GTX 1050 the other day for $119.99, went to install it, and found out that my DVI-I to VGA adapter wouldn't fit the video card's DVI connection. Looked at the DVI port and saw it was DVI-D (digital only) and didn't have the DVI-I analog legacy option, so that wasn't going to work unless I also upgraded to a brand new monitor. Not planning on spending more than the $119.99, I brought the video card back and got a refund. Then I looked to see if there was a newer GTX card with legacy monitor support so that I could use my VGA-type monitor with a newer card. Looking at Newegg, it looks like the GTX 1050 only comes with DVI-D. So I guess I might stick it out with my older video card a little longer,
but it got me thinking: did DVI-I all of a sudden disappear from video cards, dropping legacy display support? I prefer my older square 19" display over modern widescreen displays, so I have been using my 10-year-old VGA Samsung SyncMaster 19" flat screen at 1600 x 1200 resolution. Currently I get OK frame rates with the video card in this system, a GT 730 2GB, but a newer GTX would perform way better in some games that the GT 730 visibly struggles with.
I have another, newer computer with an EVGA GeForce GTX 780 Ti 3GB GDDR5 384-bit graphics card (Dual-Link DVI-I, DVI-D, HDMI, DP, SLI Ready, model 03G-P4-2881-KR)
https://www.amazon.com/EVGA-GeForce-Dual-Link-Graphics-03G-P4-2881-KR/dp/B00GDIIIPW and it has the DVI-I port to support my legacy VGA connection. The GTX 1050 that I picked up isn't a power-hungry card like most GTX cards, so I was hoping to use it in my older Athlon II X4 620 (2.6 GHz quad-core) as a better video card than the GT 730, and at $120 it seemed like a good deal if it would work. But since it doesn't support DVI-I, I don't want to have to get a newer monitor and also change my setup, which uses a VGA-based 2-port KVM for switching between computers on the older square flat screen displays.
My setup: I have two 19" displays. The display on the right is KVM-switchable, and the one on the left is connected to the second display output, so one of the two computers can use both displays while sharing the right display with the other computer. I use this setup to keep info up on the left display, or play a video there, while working on the right display and switching between the two computers.
The power supply in the older gaming system doesn't have the auxiliary PCIe power connector that high-end video cards need, so I am sticking it out for now with whatever video cards will run powered directly from the PCI Express 2.0 x16 slot in the Biostar MCP6PB-M2+ motherboard. If that GTX 1050 card had a DVI-I port, all would be golden: better performance, and powered directly from the PCI Express slot like the GT 730. * I've seen some people get those Molex-to-PCIe adapters to power high-end video cards on power supplies not intended for that application, and I'm not sure that would be a good idea with this power supply, which came out of an HP computer and is rated at 400 watts.
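For anyone weighing the same slot-power question, here's a rough sanity check. The PCIe x16 slot is specified to deliver up to 75 W, which is why cards at or under that figure (like the GTX 1050) don't need an auxiliary connector. The wattage numbers below are nominal TDPs from memory and vary by board partner, so treat this as a back-of-the-envelope sketch, not a definitive compatibility check:

```python
# Back-of-the-envelope check: can a card run off the PCIe x16 slot alone?
# TDP figures are nominal and vary by card variant; actual draw also varies.
PCIE_SLOT_LIMIT_W = 75  # max power the x16 slot supplies per the PCIe spec

cards = {
    "GT 730": 38,       # low-power card, slot-powered
    "GTX 1050": 75,     # rated TDP sits right at the slot limit, no 6-pin needed
    "GTX 780 Ti": 250,  # needs 6-pin + 8-pin auxiliary power connectors
}

for name, tdp_w in cards.items():
    slot_only = tdp_w <= PCIE_SLOT_LIMIT_W
    print(f"{name}: ~{tdp_w} W TDP -> slot power alone is enough: {slot_only}")
```

By this rough measure the GTX 1050 is about the fastest card you can drop into a system like mine without touching the power supply, which is exactly why the missing DVI-I port stings.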