
Author Topic: GPU bandwidth: 3440x1440p (100Hz) vs 2560x1440p (165Hz), which is heavier on the GPU


NoGoodNames

  • Guest
I've been looking at monitors for my gaming PC build. I've pretty much already decided on an ultrawide, but the answer should help me for future reference and for deciding whether I might need to upgrade from a GTX 1080 to a 1080 Ti later on. I'm curious which resolution and refresh-rate combination puts the greater load on a GPU for gaming. I did the math; I just want confirmation that this is indeed how GPU bandwidth works, both in theory and in practice. The reason I used 165Hz rather than 144Hz is that I narrowed the field down to the best 1440p 144Hz+ IPS G-Sync monitors, and they also happen to be overclockable.

2560 x 1440 x 165
= 608,256,000 pixels per second
912.4MHz bandwidth required.

3440 x 1440 x 100
= 495,360,000 pixels per second
743MHz bandwidth required.

Assuming you were to utilize the full refresh rate potential (getting 165 or 100 FPS respectively), standard 16:9 1440p requires about 22.8% more pixels and bandwidth, or roughly 113 million more pixels per second. In practice, should this work out to about a 22% difference in frame rate?
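
If it helps, here's the same arithmetic as a quick Python sketch (the helper name is just something made up for illustration):

Code:
def pixel_rate(width, height, refresh_hz):
    """Raw pixels pushed per second at a given resolution and refresh rate."""
    return width * height * refresh_hz

qhd_165 = pixel_rate(2560, 1440, 165)    # 608,256,000 pixels/s
uwqhd_100 = pixel_rate(3440, 1440, 100)  # 495,360,000 pixels/s

extra = qhd_165 - uwqhd_100                # ~112.9 million more pixels/s
percent = (qhd_165 / uwqhd_100 - 1) * 100  # ~22.8% more

print(f"2560x1440@165: {qhd_165:,} px/s")
print(f"3440x1440@100: {uwqhd_100:,} px/s")
print(f"difference: {extra:,} px/s ({percent:.1f}% more)")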

BC_Programmer


    Mastermind
  • Typing is no substitute for thinking.
  • Thanked: 1140
    • BC-Programming.com
  • Certifications: List
  • Computer: Specs
  • Experience: Beginner
  • OS: Windows 11
Bandwidth is measured as throughput, not frequency: it is the frequency multiplied by the bus width multiplied by the data rate. So 2560x1440 at 165Hz would require 2560 x 1440 x 3 x 165 bytes per second, or 1,824,768,000, which is around 1.7GB per second (the 3 comes from 24 bits per pixel). 3440x1440 at 100Hz works out to about 1.38GB per second.
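
Just to make that concrete, here's a rough Python sketch of that raw, framebuffer-style arithmetic (illustrative only; as explained below, this is not how a GPU is actually fed):

Code:
# 3 bytes per pixel = 24-bit colour; GiB = 2^30 bytes
def raw_throughput_bytes(width, height, refresh_hz, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * refresh_hz

for w, h, hz in [(2560, 1440, 165), (3440, 1440, 100)]:
    b = raw_throughput_bytes(w, h, hz)
    print(f"{w}x{h}@{hz}Hz: {b:,} bytes/s = {b / 2**30:.2f} GiB/s")
# 2560x1440@165Hz -> 1,824,768,000 bytes/s, about 1.70 GiB/s
# 3440x1440@100Hz -> 1,486,080,000 bytes/s, about 1.38 GiB/s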

However, none of these calculations make sense unless the graphics card is being used purely as a framebuffer, which hasn't been the case, certainly not for games, for a few decades. The system doesn't send full, rendered images to the GPU to be displayed; it gives the graphics card instructions and data covering geometry, shaders, textures, and so on. The GPU's bandwidth affects how quickly it can receive that information from the system. The speed at which it can take that data and create rendered output depends on the speed of the video memory and the GPU itself, which gets into far more complicated territory than directly comparing frequencies.

It will, of course, take more GPU capability to run 2560x1440 at 165fps than to run 3440x1440 at 100fps, but that load falls mainly on the GPU's processing power, not on the bandwidth of either system memory or video memory.

Note that I haven't mentioned refresh rate. That's a separate consideration, as it largely governs when the graphics card can send another display frame. How it affects FPS depends on whether vsync is being used. If it isn't, the GPU will throw frames out as fast as it can either way. If it is on, the frame rate will typically lock to the refresh rate divided by a whole number. E.g. on the 165Hz panel, if the card can only sustain 100fps, your actual frame rate will drop to 82.5fps (165/2); if 60fps is the best it can muster, it will run at 55fps (165/3). Vsync means that even if a frame is "ready", the graphics card has to wait until the next VBlank interval before it sends the new frame. Some cards can start rendering the next display frame while they wait, which helps prevent stuttering. Again, this is a very complicated topic, and you cannot simply do a few back-of-the-envelope calculations to determine what will be sufficient.
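
As a toy model only (classic double-buffered vsync, ignoring triple buffering and other driver tricks), that divisor locking works out roughly like this:

Code:
import math

def vsync_locked_fps(sustainable_fps, refresh_hz):
    # With vsync on, a frame that misses a refresh waits for the next VBlank,
    # so the effective rate locks to the refresh rate divided by a whole number.
    if sustainable_fps >= refresh_hz:
        return refresh_hz
    divisor = math.ceil(refresh_hz / sustainable_fps)
    return refresh_hz / divisor

print(vsync_locked_fps(100, 165))  # 82.5 (165 / 2)
print(vsync_locked_fps(60, 165))   # 55.0 (165 / 3)
print(vsync_locked_fps(120, 100))  # 100  (GPU faster than the panel)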

I was trying to dereference Null Pointers before it was cool.

NoGoodNames

  • Guest
Thanks for your response. That's a lot of info, a lot more than I can understand. I just wanted to make sure that 3440x1440p 100fps would indeed be easier on the GPU than 2560x1440p 165fps.

I'll be using a GTX 1080 and the monitor will be G-Sync. Most likely either the AOC Agon AG352UCG, ASUS ROG Swift PG348Q, or Acer Predator X34.

Does G-Sync lower frame rates in the same way V-Sync does?

Quantos



    Guru
  • Veni, Vidi, Vici
  • Thanked: 170
  • Computer: Specs
  • Experience: Guru
  • OS: Linux variant
Evil is an exact science.

NoGoodNames

  • Guest
Nothing I didn't already know in that article.

patio

  • Moderator


  • Genius
  • Maud' Dib
  • Thanked: 1769
  • Experience: Beginner
  • OS: Windows 7
Nice reply for volunteers willing to help...just an observation.

You already spent time bashing chat... so it kinda makes sense. Just so you know, chat runs separately from the forums.
" Anyone who goes to a psychiatrist should have his head examined. "

BC_Programmer


    Mastermind
  • Typing is no substitute for thinking.
  • Thanked: 1140
    • BC-Programming.com
  • Certifications: List
  • Computer: Specs
  • Experience: Beginner
  • OS: Windows 11
As the article describes, G-Sync is pretty much designed to avoid lowering frame rates the way V-Sync does, while providing the same benefit of eliminating screen tearing. So to answer your question: no, it wouldn't cause lower frame rates in the same way, since it's designed specifically not to.
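
As a very rough illustration (my own toy model, not something from the article), contrast that with the plain vsync behaviour above:

Code:
# Within the panel's variable-refresh range, a G-Sync display scans out each frame
# as soon as it is ready, so the effective rate simply tracks what the GPU can
# sustain, capped at the panel's maximum refresh.
def gsync_effective_fps(sustainable_fps, max_refresh_hz):
    return min(sustainable_fps, max_refresh_hz)

print(gsync_effective_fps(100, 165))  # 100 - no drop to 82.5 as plain vsync would cause
print(gsync_effective_fps(200, 165))  # 165 - capped at the panel's maximum refresh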

Of course, you already knew this, so I'm not sure why you asked...  :)

I was trying to dereference Null Pointers before it was cool.