I have been scraping web sites for information using automated mouse/keyboard macros that copy and paste the data into a database.
Does anyone know of an easier way to collect information for analysis than creating a separate mouse/keyboard macro for each web site?
Ideally I'm looking for a piece of software that reads the HTML source of the page itself and extracts the value that follows a specified marker. For example, if a site reports that it is 83 degrees out, its HTML source will contain a field showing 83 degrees, and I would like to copy that value directly from the source rather than from the rendered page.
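To illustrate the kind of extraction I mean, here is a minimal sketch in Python using only the standard library's html.parser. The page snippet, the `id="temp"` marker, and the `FieldExtractor` name are all hypothetical; a real page would be fetched with urllib.request and would use whatever markup the site actually exposes:

```python
from html.parser import HTMLParser

class FieldExtractor(HTMLParser):
    """Capture the text inside the first tag whose id matches a target marker."""

    def __init__(self, target_id):
        super().__init__()
        self.target_id = target_id
        self.capturing = False
        self.value = None

    def handle_starttag(self, tag, attrs):
        # Start capturing when we hit the tag carrying the marker id.
        if dict(attrs).get("id") == self.target_id:
            self.capturing = True

    def handle_data(self, data):
        if self.capturing and self.value is None:
            self.value = data.strip()

    def handle_endtag(self, tag):
        self.capturing = False

# Hypothetical page source; in practice this would come from urllib.request.urlopen().
page = '<html><body>Current temp: <span id="temp">83 degrees</span></body></html>'
parser = FieldExtractor("temp")
parser.feed(page)
print(parser.value)  # → 83 degrees
```

The point is that the value is pulled straight from the HTML with no mouse or keyboard automation at all, so one small script per site (or one script with per-site markers) replaces a fragile macro.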
Just to be clear: I have NO need to capture keystrokes or screenshots, which could be used maliciously. I simply want an easier way to read static information that these sites provide openly to the public, and I have no need for dynamic data such as user-entered input, which could be used for the wrong means.
Any suggestions greatly appreciated.