Computer Hope

Hardware => Hardware => Topic started by: Wefro_froyas on December 31, 2010, 10:27:40 PM

Title: 166 degrees GPU after game
Post by: Wefro_froyas on December 31, 2010, 10:27:40 PM
So my Nvidia 8800 GT was running at 166 Fahrenheit. In fear of it possibly overheating, I shut it down. Is it safe for my GPU to run at this temperature?


Hold on, so my GPU idles at around 146 degrees Fahrenheit when I'm not doing anything.

I was playing Dawn of War 2 and it got up to 166 degrees Fahrenheit. Is this too hot for it?
Title: Re: 166 degrees GPU after game
Post by: honvetops on December 31, 2010, 10:55:04 PM
It's obvious you have a heat issue, though it's not uncommon. Is your tower vented? Could you place a fan close by to help ventilate the heat? I have holes/vents all over mine; it helps quite a bit.

Plus, there are fans you can purchase to place underneath your GPU, if you have the room.
Title: Re: 166 degrees GPU after game
Post by: Wefro_froyas on December 31, 2010, 11:03:36 PM
It's obvious you have a heat issue, though it's not uncommon. Is your tower vented? Could you place a fan close by to help ventilate the heat? I have holes/vents all over mine; it helps quite a bit.

Plus, there are fans you can purchase to place underneath your GPU, if you have the room.

I looked around on the internet, and I read that people with my GPU are getting around this temp all the time just at idle.
Title: Re: 166 degrees GPU after game
Post by: BC_Programmer on December 31, 2010, 11:12:21 PM
74.4 Celsius is pretty low. My GPU idled at 79 until I repasted the innards; now it idles at about 69, but it usually goes to 98 degrees (Celsius) under load, which often means 2 or 3 hours at about that temp.
Title: Re: 166 degrees GPU after game
Post by: Wefro_froyas on December 31, 2010, 11:15:50 PM
So do you think I'm fine, BC? Also, do you think I'm due for an upgrade, or is the 8800 GT still good?
Title: Re: 166 degrees GPU after game
Post by: deargodpleasehelp on January 01, 2011, 12:31:07 AM
So do you think I'm fine, BC? Also, do you think I'm due for an upgrade, or is the 8800 GT still good?

Put it this way: it's better than the 118 °C that my old ATi Radeon 4800 used to reach under load.

The Dell Mini 10v has NO internal fans, whether running OS X (beta-testing) or Windows... but it runs at perfectly normal temps. It does get quite hot at times, sometimes even when idling, but mostly it stays quite cool. ^_^ ;)
Title: Re: 166 degrees GPU after game
Post by: BC_Programmer on January 01, 2011, 08:55:42 AM
So do you think I'm fine, BC? Also, do you think I'm due for an upgrade, or is the 8800 GT still good?

Yeah, I'd say so. I can't say much on the graphics card front (I usually defer to Calum's expertise when I can), but from what I can gather the 8800 GT is pretty similar to the 9800 GT (the latter being my card).

Now, if you're having crashing/freezing issues, that changes the whole story. In my case, originally, playing some of the more GPU-intensive games (Crysis, for example) for about 30-40 minutes would cause the game to crash. I repasted the card and the temps dropped, but the problems stayed. I ran FurMark overnight at "full blast" with no problems, and got a lot of other stuff working stably, but Crysis refused to run for more than about 40 minutes before it crashed.

I reduced a few settings from Ultra High and the problems went away. I eventually managed to isolate the "crash" to having the shadows setting on Ultra High. I turned it down to High and haven't had a problem since; at least, not problems I think are related to the graphics card.

Anyway, to summarize: your temperatures look fine to me. Your load temp (~74 °C) is lower than my idle temperature, the 9800 GT and 8800 GT use (I believe) the same GPU with minor alterations, and I haven't had any problems.


In further unrelated anecdotes: I had a K6-2 that would often have its "high temperature alarm" trip; strangely, it always occurred on the exact same level of Duke3D :P.

I rebooted and checked the BIOS temp display, and the CPU would sometimes register as over 100 Celsius (over 212 °F)! More recently I decided to replace the heatsink/fan on that machine as well as upgrade the processor from 350 MHz to 450 MHz, and discovered the cause: the thermal pad had basically turned into a caked-on layer of dust. I used some of the thermal paste I had on hand along with the new (less dust-clogged) heatsink and CPU, and it ran nice and cool. I also had my Nvidia GeForce 5500FX overheat (in fact, I started a thread here on CH on that very issue back in the day, probably before I had ~100 posts). I originally "solved" the problem by underclocking the card (which was factory overclocked) to the stock 5500FX settings. I later discovered the real reason: the fan was busted. I managed to pop out the fan component and put in the similar component from a 6200FX I had bought but never used (in this case the 5500FX won out because it was AGP and the 6200FX was PCI).

So, yeah, there's that. Before that, temperature was never an issue, aside from the integrated graphics chipset I accidentally fried on a Pentium machine because I thought the heatsink on the graphics chip was mere decoration. Turns out it wasn't.
Title: Re: 166 degrees GPU after game
Post by: Wefro_froyas on January 01, 2011, 10:45:15 AM
Thanks. I had a dish heater running constantly in that room that night.
Do you think that the room temp, which was pretty warm, could have caused it to get to 166 F?

I've never had it get this hot playing Dawn of War 2, and I was thinking that this was the reason.
I turned on my PC this morning and it was pretty cold in my basement, and my card, after about 10 to 20 minutes, idled at 124 degrees Fahrenheit.

But after I turned on my heater and let the room warm up a bit more, I notice the room is back to 142 Fahrenheit.
Title: Re: 166 degrees GPU after game
Post by: deargodpleasehelp on January 01, 2011, 12:04:52 PM
Thanks. I had a dish heater running constantly in that room that night.
Do you think that the room temp, which was pretty warm, could have caused it to get to 166 F?

I've never had it get this hot playing Dawn of War 2, and I was thinking that this was the reason.
I turned on my PC this morning and it was pretty cold in my basement, and my card, after about 10 to 20 minutes, idled at 124 degrees Fahrenheit.

But after I turned on my heater and let the room warm up a bit more, I notice the room is back to 142 Fahrenheit.

What are you in, a heated room?!?!
Quote
But after I turned on my heater and let the room warm up a bit more, I notice the room is back to 142 Fahrenheit.
You said the room is back to 142 Fahrenheit. How can you stand such temperatures? 90 degrees F is the max that you should be in; why would you be gaming in Satan's dining room then??
Title: Re: 166 degrees GPU after game
Post by: Wefro_froyas on January 01, 2011, 01:00:43 PM
What are you in, a heated room?!?! You said the room is back to 142 Fahrenheit. How can you stand such temperatures? 90 degrees F is the max that you should be in; why would you be gaming in Satan's dining room then??

lollllllll

No, what I'm saying is: can the room temp affect the card temp at all?
Title: Re: 166 degrees GPU after game
Post by: Carbon Dudeoxide on January 01, 2011, 03:23:50 PM
The temperature of the environment will always affect the outcome: air cooling can only bring a component down toward room temperature, so a warmer room means a warmer card.
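
To put that concretely: an air-cooled card sheds heat into the surrounding air, so at a given load it settles some roughly fixed number of degrees above room temperature, and a warmer room shifts the whole curve up. A toy Python sketch with made-up numbers, purely to illustrate the relationship:

Code:
def card_temp_c(ambient_c, rise_c=50.0):
    """Steady-state card temperature as ambient plus a load-dependent rise.
    rise_c is hypothetical; real values depend on the cooler and the load."""
    return ambient_c + rise_c

print(card_temp_c(15))  # cold basement morning: ~65 C
print(card_temp_c(28))  # heated room:           ~78 C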
Title: Re: 166 degrees GPU after game
Post by: Salmon Trout on January 01, 2011, 05:22:28 PM
the room is back to 142 Fahrenheit.

142 degrees F is 51 degrees in modern (proper) temperature units. The European record in Seville (Spain) was around that. Why are you using these weird old Fahrenheit degrees? They just confuse everybody. In fact, I don't believe the room is that temperature. In fact I don't think I believe in the OP's ability to collect accurate temperature data either.

Title: Re: 166 degrees GPU after game
Post by: mroilfield on January 02, 2011, 03:00:46 AM
142 degrees F is 51 degrees in modern (proper) temperature units. The European record in Seville (Spain) was around that. Why are you using these weird old Fahrenheit degrees? They just confuse everybody. In fact, I don't believe the room is that temperature. In fact I don't think I believe in the OP's ability to collect accurate temperature data either.

Salmon,

The U.S. still uses Fahrenheit, so it isn't "weird" or "old", and it doesn't confuse people who live in the U.S.
Title: Re: 166 degrees GPU after game
Post by: Salmon Trout on January 02, 2011, 03:15:29 AM
Salmon,

The U.S. still uses Fahrenheit, so it isn't "weird" or "old", and it doesn't confuse people who live in the U.S.

And, of course, nobody else matters!  ;)

Title: Re: 166 degrees GPU after game
Post by: overthehill on January 02, 2011, 11:45:41 PM
142 degrees F is 51 degrees in modern (proper) temperature units. The European record in Seville (Spain) was around that. Why are you using these weird old Fahrenheit degrees? They just confuse everybody.
This is getting off topic, but firstly, 142 degrees Fahrenheit is not 51 degrees in modern (proper) temperature units. Secondly: weird, old, and confusing everybody? Wrong again! overthehill
Title: Re: 166 degrees GPU after game
Post by: BC_Programmer on January 03, 2011, 01:32:49 AM
This is getting off topic, but firstly, 142 degrees Fahrenheit is not 51 degrees in modern (proper) temperature units.
°C = (°F - 32) × 5/9
(142 - 32) × 5/9 ≈ 61

So yeah, he was ten degrees off. Oh no.
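
For anyone who wants to sanity-check these conversions themselves, here is a minimal Python sketch of the two standard formulas (my own illustration, not code from any post in this thread):

Code:
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

print(f_to_c(142))  # ~61.1 C, not 51, confirming the correction above
print(f_to_c(166))  # ~74.4 C, the OP's load temperature
print(c_to_f(100))  # 212.0 F, the boiling point of water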



Quote
Secondly: weird, old, and confusing everybody? Wrong again! overthehill

It's certainly weird. I mean, with Celsius, 0 degrees is the freezing point and 100 degrees is the boiling point of water. With Fahrenheit, you have 0 being the temperature of a brine of ice, water, and ammonium chloride, while 100 degrees was quite literally just the temperature the thermometer stabilized at when placed in a person's mouth or armpit. Sort of random decisions; the two reference points aren't even really related to each other.

As for confusing people: I always end up converting to Celsius to get an idea of what the temperature would "feel" like. More on-topic, though, it's entirely silly to measure computer temperatures in Fahrenheit. Since almost all tools for that purpose show temperatures in Celsius, there had to have been a conversion to give us the Fahrenheit numbers, which I believe is the basis for Salmon Trout noting that it caused confusion: not purely on the basis that the scale is "weird", but more that someone chose to hand us a Fahrenheit-converted value that was no doubt presented in Celsius by the temp tool.
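
To illustrate that last point, here is a hypothetical Python sketch of reading the GPU temperature the way the tools do and converting it only afterward. It assumes Nvidia's nvidia-smi utility is installed and on the PATH; the driver reports the value in Celsius, so any Fahrenheit figure is necessarily a conversion:

Code:
import subprocess

# Query the driver for the GPU core temperature; nvidia-smi reports Celsius.
out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"]
)
celsius = float(out.decode().strip().splitlines()[0])
fahrenheit = celsius * 9 / 5 + 32
print("GPU: %.0f C (%.0f F)" % (celsius, fahrenheit))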
Title: Re: 166 degrees GPU after game
Post by: Salmon Trout on January 03, 2011, 02:46:04 AM
142 degrees Fahrenheit is not 51 degrees in modern (proper) temperature units.

No, indeed it is not. That will teach me to get on my high horse! To convert from ancient F to modern C, subtract 32 and then multiply by 5/9. If I had done this (which I can do in my head) instead of just using Google's calculator and reading the answer wrongly, I would not be in the embarrassing position I am now in.

Title: Re: 166 degrees GPU after game
Post by: you878 on January 03, 2011, 05:34:29 PM
And, of course, nobody else matters!  ;)
I don't see any indication that he thinks nobody else matters, but you do realize this is a United States-based website, so many users will be from the United States.
Title: Re: 166 degrees GPU after game
Post by: BC_Programmer on January 03, 2011, 11:18:44 PM
I don't see any indication that he thinks nobody else matters, but you do realize this is a United States-based website, so many users will be from the United States.

Where a website is based is unrelated to the location of its visitors. I note, however, that you specifically say "many users", as if having some US users somehow means everybody should now use Fahrenheit and miles per hour. That's sort of silly. ST's main point was that all temp-measuring programs present the temperatures in Celsius, so in order to provide the F measure, someone would have had to convert; and given the rather unlivable temperature noted for the room, it should hardly be surprising to consider the possibility that mistakes were made during the conversion, a possibility that ST accidentally highlighted by making one himself :P. His point was quite clear: it confuses nearly everybody who doesn't use the Fahrenheit scale every day. (I personally have to convert to Celsius to get anything but a general idea of what the temperature is.) Basically your argument (to a post that ends with a winky face denoting a mild tone of non-seriousness anyway) is "well, there are sure to be Americans on this site (duh), so we should all use the U.S. scales for everything." That is just plain silly.

Lastly, if we look closely at his post, we may notice the winking emoticon.
Title: Re: 166 degrees GPU after game
Post by: navyfalcon on January 05, 2011, 06:39:35 PM
In electronics, the cooler the better; heat is the enemy. There is a rule of thumb that for every fixed temperature step (I think it is 10 degrees C), component life changes by a factor of two: for every 10 degrees of increase, life expectancy is halved, and for every 10 degrees of decrease, it is doubled. Cooler parts are also more reliable. Coolers and heat sinks are not that expensive and are cheap insurance.
Heat cycling also stresses solder joints and causes problems.
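
To make the rule of thumb concrete, here is a small Python sketch of that doubling-per-10-degrees model (an illustration of the rule as stated above, not something taken from the references below):

Code:
def life_factor(delta_c, step_c=10.0):
    """Relative life expectancy after a temperature change of delta_c degrees C,
    under the rule of thumb that every +10 C halves component life."""
    return 2.0 ** (-delta_c / step_c)

print(life_factor(10))   # 0.5   -> running 10 C hotter halves expected life
print(life_factor(-10))  # 2.0   -> running 10 C cooler doubles it
print(life_factor(25))   # ~0.18 -> 25 C hotter cuts it to under a fifth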

Reference:
http://servenger.com/Resources/Modeling_Temperature_Driven_Wearout_Rates_for_Electronic_Components_b.pdf

http://en.wikipedia.org/wiki/Highly_Accelerated_Life_Test
-
hope this helps
falcon
Title: Re: 166 degrees GPU after game
Post by: mroilfield on January 05, 2011, 11:15:00 PM
In electronics, the cooler the better; heat is the enemy. There is a rule of thumb that for every fixed temperature step (I think it is 10 degrees C), component life changes by a factor of two: for every 10 degrees of increase, life expectancy is halved, and for every 10 degrees of decrease, it is doubled. Cooler parts are also more reliable. Coolers and heat sinks are not that expensive and are cheap insurance.
Heat cycling also stresses solder joints and causes problems.

Reference:
http://servenger.com/Resources/Modeling_Temperature_Driven_Wearout_Rates_for_Electronic_Components_b.pdf

http://en.wikipedia.org/wiki/Highly_Accelerated_Life_Test
-
hope this helps
falcon

I know people who use their PCs in hot environments for work just about every day for years and never have a problem, yet some people who work in an air-conditioned building go through a new PC every year or so. Temps do have an effect, but unless you are talking about extreme temperature changes, you have to figure in more than just a 10-degree temperature change to predict the life expectancy of electronic components.
Title: Re: 166 degrees GPU after game
Post by: navyfalcon on January 06, 2011, 01:41:27 PM
Please stay on the problem.
Whether it is degrees C or F is just a way of measuring; it does not affect the problem.
The problem is "heat"
This can be caused by the heat transfer method getting old or not being applied correctly (heat transfer grease), a lack of proper circulation in the case, or more cooling being needed for the CPU.
-
Different computers respond differently to heat problems: the motherboard (how it is mounted), the CPU (cooling methods and mounting, plus heat transfer, which can include transfer grease and/or heat fins, shrouds, and fans), the case (adequate ventilation), etc.
Since CPUs are expensive and critical, and coolers for them are comparatively inexpensive, it is good insurance to increase CPU cooling, which can be important for older computers or for computers stressed by games.
-
hope this helps
falcon
Title: Re: 166 degrees GPU after game
Post by: BC_Programmer on January 06, 2011, 02:11:59 PM
Whether it is degrees C or F is just a way of measuring; it does not affect the problem.
mroilfield didn't say anything about C or F in his previous reply.

Quote
The problem is "heat"
No, it's not. There is no problem; the original question was whether 166 F under load is too hot for the card. It's not. Any semi-modern graphics card is going to hit that temperature or higher, pretty much regardless of what "heat transfer method" is being used.

Quote
This can be caused by the heat transfer method getting old or not being applied correctly (heat transfer grease), a lack of proper circulation in the case, or more cooling being needed for the CPU.
I find it funny that you would tell mroilfield to "please stay on the problem" while conveniently forgetting that there was no problem to begin with, only questions; and since both of those questions had to do with the graphics card (and not the CPU), I'm not really sure what you are saying.

Quote
Different computers respond differently to heat problems: the motherboard (how it is mounted), the CPU (cooling methods and mounting, plus heat transfer, which can include transfer grease and/or heat fins, shrouds, and fans), the case (adequate ventilation), etc.
Yes, and graphics cards generally respond the same way: they stop working or have problems. It's important to note, however, that not only are the temperatures given here relatively cool for a GPU under load, but the OP reported no problems whatsoever afterward and was merely concerned that the temperatures seemed high.

Quote
Since CPUs are expensive and critical, and coolers for them are comparatively inexpensive, it is good insurance to increase CPU cooling, which can be important for older computers or for computers stressed by games.
And now not only is this advice irrelevant, it's pretty much wrong. While I can't disagree that better cooling solutions can only help, it is a bit silly to assume that any old inexpensive heatsink/fan is going to be better than the stock one. The expensive coolers are pretty much the stock coolers with a few bits of glitter tacked on, and probably a few dangly bits that make them look more like they belong in an abstract art museum than installed in a computer. The cheap ones are pretty much just that: cheaper versions of the stock cooler that came free with the CPU. Some people have this strange tendency to think that whatever is bundled with a product for its use must suck; I imagine this tendency may have arisen from MP3 players including dirt-cheap headphones, but aside from that small niche of overpriced and overappreciated gadgets, that sort of thing is hardly seen.

A prime example of the subversion of this stereotype can be found with CPU coolers. First off, there is only one reason to include "cheap" accessories, required or not: to save money. Apple including headphones that make your Queen music sound like it's being sung by Vanilla Ice while he's being sat on by a large European walrus saves Apple loads of money. CPU vendors selling CPUs with warranties but with inadequate coolers would be like Mega Bloks being sold with a safety guarantee and giant spinning razor blades attached. It just doesn't make any business sense. If you're overclocking, you might need an "aftermarket" cooler, but sorting the good ones out from the abstract art and the chrome plastic takes ages. Most "good" heatsinks are in the middle: not overpriced, under-cooling pieces of abstract art, and not underpriced pieces of shaped tinfoil, but solid chunks of metal designed to radiate heat.

Title: Re: 166 degrees GPU after game
Post by: mroilfield on January 06, 2011, 10:35:42 PM
The expensive coolers are pretty much the stock coolers with a few bits of glitter tacked on, and probably a few dangly bits that make them look more like they belong in an abstract art museum than installed in a computer.

Apple including headphones that make your Queen music sound like it's being sung by Vanilla Ice while he's being sat on by a large European walrus saves Apple loads of money. CPU vendors selling CPUs with warranties but with inadequate coolers would be like Mega Bloks being sold with a safety guarantee and giant spinning razor blades attached.

Thanks BC, I needed a good laugh this morning.