You will want to load test the system from every direction where a bottleneck might appear. To weigh down the CPU I usually use Super PI and have it calculate pi to a huge number of digits
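The same trick works without any extra tooling: spin up one worker per core and have each compute pi in a tight integer loop. This is a minimal sketch of the idea, not Super PI itself; the digit counts and worker count are made-up knobs you would tune to your machine.

```python
# Minimal CPU load generator in the spirit of Super PI: each worker
# process computes pi to many digits in a tight loop. "workers",
# "digits", and "rounds" below are illustrative values, not part of
# any real tool.
from multiprocessing import Pool

def pi_digits(n):
    """Compute pi to n digits using Machin's formula
    pi = 16*arctan(1/5) - 4*arctan(1/239), all in integer arithmetic
    scaled by 10**n (with 10 extra guard digits for rounding)."""
    scale = 10 ** (n + 10)

    def arctan_inv(x):
        # arctan(1/x) = 1/x - 1/(3x^3) + 1/(5x^5) - ...
        total = term = scale // x
        x2, k, sign = x * x, 3, -1
        while term:
            term //= x2
            total += sign * term // k
            k += 2
            sign = -sign
        return total

    pi = 4 * (4 * arctan_inv(5) - arctan_inv(239))
    return pi // 10 ** 10  # drop the guard digits

def burn(args):
    digits, rounds = args
    for _ in range(rounds):       # repeat to sustain the load
        result = pi_digits(digits)
    return result

if __name__ == "__main__":
    with Pool(4) as pool:         # one worker per core you want loaded
        results = pool.map(burn, [(1000, 2)] * 4)
    print(str(results[0])[:8])    # leading digits of pi: 31415926
```

Crank the digit count and round count up until the run takes minutes instead of milliseconds, and watch temperatures and clock speeds while it runs.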
http://en.wikipedia.org/wiki/Super_PI. But when I want to weigh the system down and get measured, recordable results, so I can establish a baseline, make changes, and then run the benchmark all over again after the hardware or software changes, I use PassMark. It is free for about 30 days, just enough time to fine-tune a server or system before rolling it out.
http://www.passmark.com/products/pt.htm

Another thing I do on the server side of testing, for servers that run custom databases and the like, is put the server through its paces by creating virtual machines on some workstations and running automated routines in those virtual environments to simulate users. I wrote my automation routines with Jitbit Macro Recorder, a very simple keyboard/mouse macro recorder that can compile macros into EXEs. You copy those EXEs to each of your virtual machines and run them in a looped mode, accessing, entering, and looking up information over and over again for as long as you want. You can increase the load on the server by increasing the number of virtual machines concurrently interacting with it, to see whether you can bring it to its knees and at what point it gives up the fight to multitask all the requests.

I have found many issues this way through burn-in testing of servers running SQL and MySQL databases. One issue was detected only under extreme user-session loads: a drive controller was not running at its full potential. It turned out that Microsoft's server installer had installed a generic Microsoft drive controller driver, which was not the most efficient. I went to the manufacturer's website for the HP server, downloaded the driver for that mainboard's SATA drive controller, and performance was instantly boosted by more than 200%.
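If you do not have spare workstations for VMs, the same "many simulated users" pattern can be sketched in a few lines: N worker threads each loop over insert-and-lookup cycles against one shared database and report how long their run took. This is only an illustration of the idea; sqlite3 stands in for the real SQL/MySQL server, and the table name, user count, and cycle count are invented for the example. In practice each worker would connect to the server under test instead.

```python
# Hypothetical load-simulation sketch: each thread plays one "user"
# hammering a shared database with writes and lookups, so you can
# raise USERS until response times degrade.
import random
import sqlite3
import threading
import time

DB = "loadtest.db"   # stand-in for the real server's connection string
USERS = 8            # raise this to increase concurrent load
CYCLES = 50          # operations per simulated user

def setup():
    con = sqlite3.connect(DB)
    con.execute("DROP TABLE IF EXISTS orders")
    con.execute("CREATE TABLE orders "
                "(id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
    con.commit()
    con.close()

def simulated_user(user_id, stats):
    # Each user gets its own connection; timeout makes writers wait
    # for the lock instead of erroring out under contention.
    con = sqlite3.connect(DB, timeout=30)
    start = time.perf_counter()
    for _ in range(CYCLES):
        con.execute("INSERT INTO orders (customer, total) VALUES (?, ?)",
                    (f"user{user_id}", random.random() * 100))
        con.commit()
        con.execute("SELECT COUNT(*) FROM orders WHERE customer = ?",
                    (f"user{user_id}",)).fetchone()
    stats[user_id] = time.perf_counter() - start
    con.close()

def run():
    setup()
    stats = {}
    threads = [threading.Thread(target=simulated_user, args=(u, stats))
               for u in range(USERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return stats

if __name__ == "__main__":
    stats = run()
    print(f"slowest simulated user: {max(stats.values()):.2f}s")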