I've seen lots of articles on the web about cryptocurrency, and with all the coins similar to Bitcoin out there, I knew Bitcoin itself was out of reach for my hardware, but what do the others need in processing power?
I found this neat calculator to check out if you're interested in mining and want to know whether you might turn a profit without wasting electricity and time:
https://www.cryptocompare.com/mining/calculator/

Just out of curiosity, I installed a Litecoin miner to see where my video card averaged for processing power. It was rather funny to see 19 kH/s. I knew it would be rather poor with a GT 730 with 2 GB of RAM on a 128-bit bus. After running it for a few minutes I stopped, since I had the benchmark I wanted for cryptocurrency. If this system with this video card ran for a week, I would be in the negative by $6.67 USD.
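As a sanity check on that weekly figure, here's a quick back-of-the-envelope calculation in Python. It assumes the rig draws its full 265 watts around the clock at my 15-cents-per-kWh rate (both figures detailed further down), and that revenue at 19 kH/s is effectively zero:

```python
# Electricity cost of running the rig for one week, assuming a
# constant 265 W draw at $0.15/kWh and negligible mining revenue.
power_kw = 0.265            # full-load system draw, in kilowatts
rate_usd_per_kwh = 0.15     # my electricity rate
hours = 24 * 7              # one week

weekly_cost = power_kw * hours * rate_usd_per_kwh
print(f"Weekly electricity cost: ${weekly_cost:.2f}")  # → $6.68
```

That lines up with the calculator's $6.67 loss to within a penny, which tells you the 19 kH/s was earning essentially nothing — the loss is almost pure electricity.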
I plugged in numbers to see where the break-even point between loss and gain is, and came up with the fact that this system, at 265 watts, would need to process 27 MH/s in order to be in the green. Of two laptops I have, one draws 50 watts maximum and the other 80 watts; they would need to crunch at 6 MH/s and 9 MH/s respectively, at my cost of 15 cents per kilowatt-hour.
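The break-even arithmetic the calculator is doing can be sketched like this. It's a simplified model: `usd_per_mhs_day` (what one MH/s earns per day) is a hypothetical input you'd read off the calculator for the coin in question, and this ignores pool fees and difficulty changes:

```python
def break_even_mhs(watts: float, usd_per_kwh: float, usd_per_mhs_day: float) -> float:
    """Hash rate (MH/s) needed so daily revenue covers daily electricity."""
    daily_cost = (watts / 1000) * 24 * usd_per_kwh   # USD per day
    return daily_cost / usd_per_mhs_day

# Back-solving from my numbers: 265 W breaking even at 27 MH/s implies
# each MH/s was earning about $0.035 per day at the time.
implied = (0.265 * 24 * 0.15) / 27

print(break_even_mhs(50, 0.15, implied))   # roughly 5 MH/s for the 50 W laptop
print(break_even_mhs(80, 0.15, implied))   # roughly 8 MH/s for the 80 W laptop
```

Pure linear scaling lands a bit under the 6 and 9 MH/s the calculator gave me, presumably because the calculator rounds or factors in more than raw wattage.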
I checked out Bitcoin to see what the break-even is for 265 watts, and that requires 536 MH/s at this time. The processing demand is ever increasing, since it's an upside-down pyramid of ever-growing complexity to get to the next coin.
I was going to test Ethereum but ran into a problem that isn't obvious until you hit the wall: by design it requires ever-increasing amounts of video RAM. It currently needs 4 GB, and because my video card is only a 2 GB card, the program crashes out from not having enough video RAM. On their website they should state up front to be sure you have 4 GB of dedicated video RAM before attempting this. Maybe if I had two cards and paired them I could get 4 GB of combined RAM, but I don't have another card to pair with this one, so I didn't test that.
However, knowing that this system's power consumption at full tilt with the video card at 100% is 265 watts, I was able to figure out that the break-even for Ethereum is a processing power equal to or greater than 7 MH/s at this time, and it will require more as time goes on.
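Out of curiosity, those two break-even points imply how much each coin was paying per MH/s at the time. Same simplified model as the calculator sketch above, using only the numbers in this post; real payouts move with price and difficulty:

```python
# Daily electricity cost for the 265 W rig at $0.15/kWh.
daily_cost = 0.265 * 24 * 0.15           # about $0.954 per day

# Implied revenue per MH/s per day, back-solved from each break-even point.
litecoin_usd_per_mhs = daily_cost / 27   # Litecoin break-even: 27 MH/s
ethereum_usd_per_mhs = daily_cost / 7    # Ethereum break-even: 7 MH/s

print(f"Litecoin: ${litecoin_usd_per_mhs:.3f} per MH/s per day")  # → $0.035
print(f"Ethereum: ${ethereum_usd_per_mhs:.3f} per MH/s per day")  # → $0.136
```

At these numbers, an MH/s of Ethereum mining was worth roughly four times an MH/s of Litecoin mining in dollar terms — the two hash rates aren't comparable in hardware terms, only in what they pay.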
The article that got me interested in this said that some MIT student was using his dorm room's electricity and systems teamed with high-end GTX cards to turn a profit at the college's expense. In the article, one other guy had to open his dorm window to keep the room cool, since he was pulling 2,500 watts of power, which produced the heat of almost two space heaters in a small dorm room. He was mining Ethereum and then transferring it into Bitcoin, which he stated was a safer currency. Interestingly, the article said MIT wasn't available for comment, but he has to be either an idiot to brag about this, or he has graduated and moved on, slamming the door on others there who would pull crazy amounts of electricity at the college's expense. I will try to find the article and link it here when I get home from work. Time to get back on the clock after lunch.
I will also test this on my 8-core 4 GHz FX-8350 system with the GTX 780 Ti and see what kH/s or MH/s value it shows for a single card, which is the heaviest video card (and CPU/GPU combination) I have for processing power. I'm guessing the GTX 780 Ti is likely behind the times, in that its electric bill exceeds what its computational power can earn, but I won't know until I test it.