Yes, the drivers I listed were from NVIDIA's official website. When going through the manual driver-selection process, it appears they switched me over to the GeForce domain. I didn't catch that until you mentioned it.
Not a stupid question. Windows 10 drivers might be different, though I would expect them to be the same as, or very close to, the Windows 8.1 ones.
Awesome to hear you found some that work using the older drivers I linked. Right now it's just a process of figuring out where it breaks, or being happy with an older driver and running with it, saving the working driver for future use for when NVIDIA tombstones access to it. On systems I build, I create a folder that I call Vendor, a practice I learned from another professional in the field: all the software and drivers that go into a build are kept in this folder, so if needed, the drivers for a specific build are right there on that system. On an external hard drive I would name it something like HP_DC5000SFF_Vendor and keep all the drivers and software there. So if I need to rebuild clean without an image, I can do a fresh OS install, copy the folder from the external drive to the system (renaming it to Vendor or leaving it be), and install the known-good drivers for that build along with the software versions known to work. A sketch of what I mean is below.
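
Just to illustrate the idea (the model name and subfolders here are hypothetical placeholders; organize it however suits the build):

E:\HP_DC5000SFF_Vendor\
    Drivers\
        Chipset\
        Video\        <- the known-good NVIDIA installer lives here
        Audio\
        LAN\
    Software\         <- installers for the free/known-good versions
    readme.txt        <- notes on which versions worked and why

Rename (or copy) that top folder to Vendor on the rebuilt system and you have everything at hand offline.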
What really stinks is when you assume downloads will always be available online, so you don't save a copy of some free-use software; then you rebuild, go to install it, and all of a sudden they no longer provide it for free and have killed off the free version. So for some software, even though newer versions are available, I still run the older free version. Additionally, software created by hobbyist or sole-proprietor programmers comes and goes as they stop paying to keep their websites up, and those tools are lost forever unless someone else hosts the downloads, which are then questionable: are they in fact what you want, or bait to get you infected?

So I make a habit of this: when I find something cool, I download the free software and then take a copy of their website with HTTRACK, keeping that in a folder alongside the software. That way I also have an offline copy of the site, with all the info that might not seem important now but that I might be looking for in the future, once the website is gone and a Microsoft update breaks something. I can tell HTTRACK to grab only the first layer or two of a website, so it just skims the important stuff and avoids downloading externally linked sites. I then browse the offline copy, and if I see something missing I tell it to grab the third level, which hopefully is the info I want to archive, without pulling down gigs of data as can happen.
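
As a rough sketch of what I mean (the URL and output folder are just placeholders, and the flags are from memory, so double-check them against httrack --help), a depth-2, same-site-only grab from the command-line version of HTTRACK looks something like:

httrack "http://www.example.com/" -O "D:\Archives\example_site" -r2 "+www.example.com/*"

The -r2 limits the mirror depth to two levels, and the +filter keeps it from wandering off into externally linked sites; if the offline copy turns out to be missing something, rerun with -r3 to pull in that third level.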
I also use a tool that takes digital snapshots of websites as JPGs, one that not many know about: http://www.priyatna.org/webthumbnailer.php. This way I can open a picture of a webpage and zoom in if it's one that would normally scroll out of view. It's not 100% flawless, but it has worked for me in the past for getting pictures of websites and keeping an offline copy of their readable content.