
Is my CPU temperature too high?

Hello,
I built my PC yesterday and was checking it today. When I boot into the BIOS it shows 50 °C and quickly climbs to 59 °C. In Windows it idles at 39-40 °C, sits around 50 °C under basic load, and reaches 60-65 °C in Cinebench. The only cooling in my case besides the stock cooler is one small fan. The Wraith cooler doesn't seem that great, I guess. I expected a lot from it.

My PC Configuration
1. Mobo - ASUS PRIME B350M-K
2. CPU - AMD Ryzen 3 1300X
3. Graphics card - GTX 750 Ti
4. RAM - Corsair 8 GB DDR4 2400 MHz

Thanks

Do higher power graphics cards/CPUs run hotter on idle?

Having difficulty figuring this one out.

A mid-range card and CPU trying to run a graphically demanding game have to work hard, so they run hot.

If I were to use a powerful card and CPU instead, would they still run hot because of their higher power draw, or would they run cooler since the game doesn't work them as hard?

Are ATI graphics cards hotter than NVIDIA graphics cards?

It depends upon the specific model... generally, the higher-performance models run much hotter than less expensive cards.

In particular, ATI's Radeon 4850 and 4870 run *VERY* hot... the 4890 introduced a 3rd heatpipe which helped immensely, while the 4770 is a lower power consumption card that runs much cooler.

Lower-performance cards like the Radeon 4670, 4650 and Nvidia 9500GT don't generate anywhere near as much heat to begin with.

Why does Intel HD graphics exist on high end CPU?

Most high-end CPUs are actually designed for workloads such as servers, not so much for games. A server usually either has no graphics whatsoever (tends to be the case if running Linux or similar) or only very minimal graphics just to open some admin dialogs (tends to be the case for something like Windows Server). It's usually a waste to install a dedicated GPU for these machines, so a minimalist built-in GPU makes sense for the major use case of high-end to extreme CPUs.

Even when working in graphics-intensive programs, the GPU isn't always used as much as you'd think. Rendering is nearly always a CPU-intensive task while the GPU just sits idle. This is very often the case when rendering a scene in something like 3dStudio, where the only time the GPU gets a workout is when you're zooming or rotating the views to set up things such as cameras. It gets even more pronounced when rendering through the V-Ray engine or something like Backburner, where multiple computers render a single image (or a series of images): none of them use their GPU in the slightest for this purpose, but each still needs to run at least the OS itself (in most cases Windows, which requires some minimal GPU just to start).

Then there are graphics programs which don't use the GPU at all, or only in very limited ways. The one I use to make building models (Revit) uses the GPU only some of the time, in select types of views, while the CPU is always at its maximum. Internal graphics used to be just too poor for it, but around three years ago installing something like a Quadro card gave the same performance as running on the i7's in-chip graphics. The bottleneck in this program is single-core speed, not even multiple cores; it seldom uses more than one at a time (as in nearly never), never mind calculating graphics through a 1000-core GPU.

If I couldn't find a CPU with an in-chip GPU, I'd likely buy a second-hand or entry-level card for this; anything more would be a waste. Other workloads can also demand an enormous CPU but only minimal graphics. Take something like statistical analysis: combining multitudes of data through formulas and then displaying the combined results in a graph. You wouldn't need a GTX 1080 for the graph, but you may need an 8-core CPU running at 4+ GHz to do those calculations.

GPU is getting very hot, around 88 °C, while playing The Witcher 3. Is it safe or should I be worried? When idle the GPU is under 30 °C, but when playing heavy games it reaches up to 88 °C. What is the maximum safe temperature for gaming?

The maximum safe temperature for a GPU is about 90 °C. Yours is reaching 88 °C, which should be just fine, but I would try to bring it below the 80 °C line at least, to be safe.

Here's what you can do to lower the GPU temperature without impacting its expected performance:

1. Lower any overclocking settings, if present.
2. Most important and effective: get an air compressor or brush, remove the GPU from the PCI-E x16 slot, and thoroughly clean its fans.
3. If the card is more than a year old, consider replacing the thermal compound.
4. While you're at it, it's a good idea to clean the whole PC, replacing the processor's thermal compound and cleaning its fan as well.

The reason this helps is that when too much dust builds up on the GPU or CPU fans, they cool less efficiently because the dust gets in the way, and they spin harder than necessary. If it's a laptop, you have to open it up and clean it from the inside. There is a Dave2D video and a LinusTechTips video explaining the cleaning procedure in both cases.
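The thresholds in the answer above (roughly 90 °C as the maximum safe limit, with below 80 °C as the comfortable target) can be expressed as a small helper. This is an illustrative sketch only; the constants and function name are assumptions, and real limits vary by GPU model.

```python
# Illustrative thresholds taken from the answer above; actual safe
# limits differ between GPU models, so treat these as assumptions.
SAFE_LIMIT_C = 90
COMFORT_LINE_C = 80

def classify_gpu_temp(temp_c: float) -> str:
    """Return a rough verdict for a GPU core temperature in Celsius."""
    if temp_c >= SAFE_LIMIT_C:
        return "too hot: fix cooling now"
    if temp_c >= COMFORT_LINE_C:
        return "high: fine for now, but improve cooling"
    return "ok"

print(classify_gpu_temp(88))  # high: fine for now, but improve cooling
```

By these numbers, the questioner's 88 °C lands in the "high but tolerable" band, which matches the advice given.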

Why does Ubuntu 14.04 seem to run hotter than Windows 7 on the same hardware?

Ubuntu uses open-source GPU drivers by default, which may or may not be a good fit for a given machine.

Example (ATI): in a dual-GPU setup, the ATI card stays powered on and clocked at non-idle speeds even while the Intel GPU is being used (Radeon HD 4650 + Acer Timeline 4820 + Ubuntu 12.10). Solution: turn the Radeon GPU off by blacklisting the radeon module with modprobe, use AMD's proprietary drivers, or enable DPM for the radeon kernel driver via the GRUB boot entry.

Nvidia: use Bumblebee.

Use TLP to get fine-grained control over such settings: http://linrunner.de/en/tlp/docs/tlp-linux-advanced-power-management.html

tl;dr: bad/old driver + GPU = heat.

Do solid state drives run hotter than regular hard drives?

Very old post, but I'll add to it anyway. I have two SSDs, an Intel 120 GB and a SanDisk 120 GB. Both ALWAYS run warmer than the three mechanical drives I have installed: two WD Green 3 TBs and a 6 TB WD Blue, with the SanDisk SSD currently the boot drive. The first pic is at idle, and the second is while watching a movie (stored on one of the 3 TB drives). The difference is a bit greater (maybe 46°) while playing Fallout 4.

The difference isn't much here: the SSD is 39°, the 6 TB is 30° (game installed there), and one of the 3 TBs is 37° (movie playing from there). I have seen the SSD get as high as 46°. The Intel drive ran a bit hotter than this one, but not by much. It may not be a big difference, but I've seen 47° once with the SSD and never seen the mechanical drives get over 37°. Ambient is always between 78° and 80 °F. BTW, there are two 120 mm fans (35 CFM each) blowing on two drives each.

In the idle pic, if you can't see it, the SSD is 40° and the 3 TB the movie is read from is 38°. It gets hotter than this, but I didn't have time to capture it. Without the fans I could probably get the SSD up to 50°; the Intel is a bit worse. The SSD is installed at the very bottom, with two empty bays above and the fan an inch away.

Hope it helps.

Intel i7 cores running at different temperatures?

I have recently built a new computer using an Intel i7-2600K. Everything is going well so far; there is one thing that concerns me, though:

One of the 4 cores runs a lot hotter than the other 3. (Not overclocked)

Core 0: 30
Core 1: 32
Core 2: 40
Core 3: 28

These are average idle temps using the newest version of RealTemp.

I have reseated the CPU cooler (Freezer 7 Pro) a few times now, and although the overall temperatures change each time, core 2 always seems to be around 8-10 degrees hotter than the other three.

I know the temperature isn't that high; it's the difference between the cores that I'm puzzled about.

Is this something to worry about? Thanks
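A quick way to sanity-check readings like these is to compute the spread between cores from the logged values. A minimal sketch, using the idle numbers from the question (the function name is made up for illustration):

```python
def core_temp_spread(temps):
    """Return the index of the hottest core and its delta over the coolest."""
    hottest = max(range(len(temps)), key=temps.__getitem__)
    return hottest, temps[hottest] - min(temps)

# Idle readings from the question, core 0 through core 3 (degrees C)
idx, delta = core_temp_spread([30, 32, 40, 28])
print(f"core {idx} is {delta} C above the coolest core")  # core 2 is 12 C above the coolest core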

Is an overclocked CPU always hotter than normal; i.e. even if not gaming?

Yes. Overclocking dramatically increases power consumption, for two reasons:

1. Higher voltage (and remember, power scales with voltage squared)
2. Higher clocks

The effect is geometric, not linear. See: i7-3770K vs. i7-2600K: Temperature, Voltage, GHz and Power-Consumption Analysis.

You may be able to use the power-saving features on your CPU to throttle down the clock speed and voltage, which should lower power consumption. With throttling, my CPU drops below 1 GHz when idle and does not use more voltage than a non-overclocked CPU.
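As a rough illustration of why the effect is geometric: dynamic CPU power scales approximately as C·V²·f, so a voltage bump compounds with a clock bump. The numbers below are made up for the example, not measurements of any specific chip.

```python
def relative_power(v_ratio: float, f_ratio: float) -> float:
    """Dynamic power ~ C * V^2 * f, so relative change is V-ratio squared
    times the frequency ratio (capacitance C cancels out)."""
    return v_ratio ** 2 * f_ratio

# Example: +10% voltage and +20% clock gives about 1.1^2 * 1.2 = 1.45x
# the dynamic power -- a 45% increase for a 20% overclock.
print(round(relative_power(1.10, 1.20), 2))  # 1.45
```

This is also why throttling down voltage and frequency at idle, as described above, cuts power (and heat) so effectively.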
