The GPU is usually the most expensive part of a gaming PC, so people want to be able to hold 100% utilisation, since it means the system isn't limited by the CPU.
However, you may not want to hold 100% all the time, as it means a less stable frame time graph. Some people like to limit their fps to a rounded value like 120fps or 60fps. Others check their 1% lows and lock it there.
This can help with stutters and fps fluctuations, and in some cases greatly reduce fan noise, heat and power consumption.
Not always. That means your GPU is being pushed to its max limit. For example, if you have an older game, your GPU would be well below the 100% mark.
If the GPU is pushed to 100% it could also be a sign that you'll need a stronger GPU.
This is why I always turn RT off (unless the game has FORCED RT like Doom the Dark Ages will)
if you want higher fps, turn down visual settings
cpu/gpu are designed to run at 100%
as long as they are not throttling and have enough cooling, it's perfectly fine
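A minimal sketch of what "not throttling" can mean in practice. The helper name and the clock figures are my own for illustration, not from the thread; the rough idea is to compare the sustained clock under load against the card's rated boost clock.

```python
# Hypothetical sketch: judge throttling by how far the live SM clock sags
# below the rated boost clock while the GPU is under load.
import shutil
import subprocess

def looks_throttled(current_mhz: float, rated_mhz: float,
                    tolerance: float = 0.10) -> bool:
    """True if the current clock sits more than `tolerance` below rated."""
    return current_mhz < rated_mhz * (1 - tolerance)

# Example figures (made up): a card rated for an 1800 MHz boost clock.
print(looks_throttled(1750, 1800))  # False: within 10% of rated
print(looks_throttled(1400, 1800))  # True: clocks sagging under load

if shutil.which("nvidia-smi"):
    # On NVIDIA cards, the live SM clock and temperature can be read with:
    subprocess.run(["nvidia-smi",
                    "--query-gpu=clocks.sm,temperature.gpu,utilization.gpu",
                    "--format=csv"])
```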
100% peak usage is fine, but not at all times.
Games don't always need the same amount of GPU power. If your GPU is always maxed and running at exactly 60fps, that means fps will drop during more intensive scenes. But if your GPU is running at, say, 90%, it can compensate and prevent the fps drop.
I always configure settings so I get ~80% GPU usage.
But it's less of a problem at high refresh rates. Drops from 144 to 100 fps are not very noticeable.
And you also have CPU related stutters that don’t show in utilisation numbers.
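The headroom argument above can be put in frame-time numbers. A toy sketch, assuming a 60 fps cap and a scene that is 10% heavier to render (both figures are just for illustration):

```python
# Toy frame-time arithmetic behind the "leave some GPU headroom" advice.
FPS_CAP = 60
budget_ms = 1000 / FPS_CAP            # ~16.67 ms per frame at a 60 fps cap

def fps_after_spike(render_ms: float, spike: float) -> float:
    """fps when a scene gets `spike` (e.g. 0.10 = 10%) heavier to render."""
    heavier_ms = render_ms * (1 + spike)
    # The cap still applies: headroom can't push you above FPS_CAP.
    return min(FPS_CAP, 1000 / heavier_ms)

# GPU at ~90% usage: render time ~15 ms, ~1.67 ms of slack per frame.
print(fps_after_spike(0.90 * budget_ms, 0.10))  # holds 60 fps
# GPU pegged at 100%: render time ~16.67 ms, no slack at all.
print(fps_after_spike(budget_ms, 0.10))         # drops to ~54.5 fps
```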
Think before you type, kiddo.
PLUS, you can always safely push the GPU another 20% manually. Personally, I only run on Epic/Max settings.
I run 180fps on 1440p and not even hitting 75% CPU usage on AM4, I can't talk to you amateurs anymore.
That depends on the game engine; some engines push the GPU to 99-100% regardless of FPS performance and/or what is going on in the game. If you find this happening with older games (such as the original STALKER games, for example), load up MSI Afterburner, turn the Power Limit % down to around 80-90%, and then try the game again and see how it does. If you can find a power limit lower than 100% that doesn't really affect your game's FPS, you can leave it there for that game and everything should be fine, whilst not pushing the GPU to full 100% power for basically no benefit.
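Where nvidia-smi is available, the same idea can be sketched from the command line instead of Afterburner. Note that nvidia-smi takes a wattage rather than a percentage, so the sketch below converts; the 180 W default and 85% target are example figures of mine, and actually applying the limit needs admin rights.

```python
# Hedged sketch: compute a reduced power-limit target in watts for
# nvidia-smi, as a CLI alternative to Afterburner's Power Limit slider.
import shutil
import subprocess

def scaled_power_limit(default_watts: float, percent: int) -> int:
    """Wattage corresponding to `percent` of the card's default limit."""
    return round(default_watts * percent / 100)

# Example: a card with a 180 W default limit at an 85% power target.
print(scaled_power_limit(180.0, 85))  # 153 (watts)

if shutil.which("nvidia-smi"):
    try:
        # Query the card's real default limit (requires an NVIDIA driver).
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.default_limit",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        watts = scaled_power_limit(float(out.splitlines()[0]), 85)
        # Applying it needs elevated rights, so just print the command:
        print(f"run: sudo nvidia-smi -pl {watts}")
    except (subprocess.CalledProcessError, ValueError, IndexError):
        pass  # driver present but no usable reading; skip the suggestion
```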
Hmm, I always felt that because the Dead Space remake pushed my gtx 1080 very close to the 100 percent mark (even with v-sync on) it means the game pushed the card to its limit. And yes, when I play the game I always put the fan speed to about 60 percent, yet the GPU usage is usually 75 percent or greater, while other times it's at 100 percent.
There have been those who've said that the gtx 1080 is starting to show its age, and that it's time to consider getting something stronger.
Which is a shame, considering that aside from the 4060 (which is only about 18 percent faster), all the other cards' prices are too high.
Now the 1080 Ti, it has some time left, but keep in mind it's still behind where features are concerned. Not a huge deal if playing certain games, such as lower demanding games or older games. Dead Space Remake doesn't even have or use the latest graphical features.
IDK why anyone would look at an RTX 4060 or 4060 Ti; it's too low end. While it does OK on certain things, it won't hold up in the long term, just like the GTX 1060 and 1070 didn't compared to a 1080 or 1080 Ti.
Also keep in mind, older GPUs such as the RTX 20, 30 and 40 series are helped by DLSS 4, which you can enable in most games via the latest driver + NVIDIA app.
Yeah but I don't have a large monitor, so full HD is the best resolution I can get. That's why I don't bother looking at benchmarks above 1080p.
You can say the 4060 is too low end, but right now it's the only card that's at a reasonable price. The 4070 and higher are all over $1000.
And in any case, I already checked the benchmarks when preparing for Doom the Dark Ages. The Indiana Jones game uses a similar engine, and in that game, if you turn everything to ultra, the 4060 only gets 25 fps. However, if you tune shadows and textures to medium but keep everything else on ultra, it goes up to 85 fps.
I for one never cared for ultra settings, since the difference between ultra and high is usually too small anyway. That's why my standard is that if all settings can be set to high and I still get 55 fps or higher, that's enough for me.
As far as I know, the Indiana Jones game and Doom the Dark Ages are currently the ones with the highest requirements. So I'm willing to bet that the 4060 can still last until around Q4 2027 before it becomes obsolete.
Not getting any younger, your eyes only get worse over time.
But even at 1080p, for today's games a GTX 1080 is not enough.