
Disappointing Frame Rates (FPS)

fps performance load order enb

80 replies to this topic

#76 Yakuza

Yakuza

    Prisoner

  • Members
  • 14 posts

Posted 06 November 2015 - 03:53 PM

There should be an option for that R9 380 in Catalyst to limit FPS to 55-60. Do that. Don't go above 60, and preferably stay below 60 if you have trouble maintaining 60 in the first place or are running questionable or brand-new drivers. AMD markets this (Frame Rate Target Control) as a way to cut wasted power consumption, but limiting FPS is about more than saving money; it's a very sensible way to stabilize things in games.
 
http://www.amd.com/e...ies-gaming/frtc
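If you would rather handle the cap through ENB itself instead of the driver, recent ENBSeries binaries also expose a limiter in enblocal.ini. This is a sketch from memory, so verify the key names against your own enblocal.ini:

[LIMITER]
EnableFPSLimit=true
FPSLimit=58.0

Either route is fine; the point is just to keep the frame rate from swinging around above your target.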
 
Disable V-sync unless ENB actually depends on it for something (I can't see why it would, but I don't know ENB inside and out), or you'll get extra input lag from old frames being displayed late when they miss a V-sync cycle. The only time this matters less is when you have a high-refresh monitor and stable FPS near or above your refresh rate without FPS limiters. That's just a recommendation, but it may give you a lot more stability.
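For what it's worth, if you prefer to kill V-sync at the game level rather than in the driver, the usual toggle is iPresentInterval under [Display] in Skyrim.ini (I'm going from memory here, so double-check it against your own INIs):

[Display]
iPresentInterval=0

With V-sync fully off you'll definitely want that FPS cap in place, since Skyrim's physics start misbehaving well past 60 FPS.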
 

 
 

[Display]
iShadowMapResolution=4096
iShadowBiasScale=0.400

 
Despite what the STEP wiki says, I am pretty sure these values are very wrong. The STEP wiki treats each shadow setting as serving its own purpose in isolation, whereas in my testing, and in the way these settings normally work in 3D rendering, they work in tandem and cannot/should not be pushed far out of proportion with one another. Ideally they shouldn't be odd values, and these are odd values. I'll be making a separate post on this soon.

If these are config settings that ENB depends on for functionality, you can ignore this for now, but I highly recommend tweaking them for testing's sake.
 
Try leaving iShadowBiasScale at 0.15 and continue with the other settings. Also try reducing your iBlurDeferredShadows setting in [Display] to 1 instead of the 3 you have currently; you should be able to tweak the blur without it negatively affecting any of your other shadow settings. I find its intensity is dependent on resolution though, so it may have a significant impact on performance once you start rendering shadows further out.
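To make that concrete, here's the shadow block as I'd run it for testing, keeping your 4096 resolution as-is and only changing the two values above. Treat it as a starting point, not gospel:

[Display]
iShadowMapResolution=4096
iShadowBiasScale=0.150
iBlurDeferredShadows=1

Change one value at a time and recheck your frame rate so you know which setting is actually doing the work.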
 

[Display]
fSplitDistanceMult=4.0000


 
I believe this is why you are lagging and having stuttering issues. Reduce this to realistic/recommended levels, somewhere in the 0.8 to 1.2 range. The Ultra preset for this is 1.5, and you do not have an Ultra-level CPU for Skyrim (read my other post). At the very least, lower it to 1.2-1.5 and then raise it incrementally if you find performance improves.


Basically, that setting controls how far out your config renders what is labeled "distant objects/terrain". I believe this increases CPU load, because it produces full meshes and textures for objects further and further away from you as you raise the split multiplier. In short, you'd need a huge, high-resolution screen and the eyes of a hawk for this to matter, and even then you probably wouldn't have a PC capable of driving it properly in Skyrim.


Reducing it in my game is actually a boon to performance, just like in my shadow testing. In my case I am CPU-bound (similar to you, but worse); my graphics card has far more headroom than my CPU, and I hit my CPU's limits in Skyrim well before my GPU's. As far as I can tell, it is not an entirely GPU-bound setting. Read my first post a bit further up for more on how Skyrim handles CPU power.

Beyond that, given the rest of your config settings, this particular line should never need to be pushed to an extreme value like 4.0000. That is an insane number for any PC in existence.
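So, concretely, something like this instead of 4.0000, picking from the range above and nudging from there:

[Display]
fSplitDistanceMult=1.2000

If 1.2 runs well, you can creep it up toward 1.5; if you're still stuttering, drop it toward 0.8.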



I actually only made an account to report these two findings: fSplitDistanceMult= tweaking and the relationship between the three primary shadow config settings (bias, distance and resolution). Kind of a funny coincidence, but I still haven't had others test it to prove that I'm right. If I'm wrong and this doesn't help, I apologize in advance. But I am pretty sure it will.

Like 99.997% disinfectant-sure.

Edited by Yakuza, 06 November 2015 - 04:03 PM.


#77 z929669

z929669

    Ixian Inventor

  • Administrators
  • 9,262 posts

Posted 06 November 2015 - 04:09 PM

/snip ...
I actually only made an account to report these two findings: fSplitDistanceMult= tweaking and the relationship between the three primary shadow config settings (bias, distance and resolution). Kind of a funny coincidence, but I still haven't had others test it to prove that I'm right. If I'm wrong and this doesn't help, I apologize in advance. But I am pretty sure it will.

Like 99.997% disinfectant-sure.

Please post your disputes on these INI settings over in the INI tweaking forum or in the relevant Skyrim/SkyrimPrefs INI forums.

 

Thanks for the input!



#78 Yakuza

Yakuza

    Prisoner

  • Members
  • 14 posts

Posted 06 November 2015 - 04:13 PM

You guys do realize FPS and monitor refresh rate are directly connected.

Motion blur is a technique that masks tearing and the like, which may be why you perceive ENB as being "good", or the experience with it as better than it actually is. That said, I don't know whether most people who use ENB get some form of blurring. The difference between the two, assuming ENB is blurring, is that motion blur can complement higher FPS if you're into it, yet at high FPS it's 100% unnecessary because the frames are already moving fluidly in front of your eyes and no tearing should be happening.

If you have a high refresh rate and stable, high FPS, you will know this is true. If you have adaptive refresh on your monitor, you will know this ~24-30 FPS crap is the dumbest thing ever echoed around gaming culture.


Exactly 30 FPS on an older CRT monitor will look better than on a 60 Hz LCD because of how a CRT refreshes and displays pixels/images. Exactly 60 FPS on 60 Hz will probably look better than on 120 Hz, depending on driver settings, unless you have adaptive frame rate control like FreeSync/G-Sync. 61-120 FPS will only be displayed at 61-120 Hz and up, because the monitor only refreshes that many times per second; it cannot physically display more frames than its refresh rate. Unstable 120 FPS on a 120 Hz adaptive (FreeSync/G-Sync) display looks better than unstable 120 FPS on 120 Hz anything else, and so on. The cut-off is 144 Hz/144 FPS, because 240 Hz+ monitors use technology that flashes the same exact image twice or more per GPU frame time, which only cures part of the tearing and does nothing in terms of displaying new information to you. It's not showing you more than ~120 unique images per second even if you have 240 FPS locked.

And, seeing as all of that assumes a theoretical locked, stable 30/60/whatever FPS unless adaptive refresh is in play, motion blur plus low FPS can seem really good. It's not.

Edited by Yakuza, 06 November 2015 - 04:21 PM.


#79 Yakuza

Yakuza

    Prisoner

  • Members
  • 14 posts

Posted 06 November 2015 - 04:23 PM

It gets even stupider when you realize that eventually the difference between one frame and the next is potentially only something like a 30% change in the pixels your monitor displays, hardly noticeable in most scenes. Past a certain refresh rate/FPS, you are completely limited by how different the images the game is actually feeding you are.


Obviously if you're spinning in circles in Skyrim with ultra-high mouse sensitivity, this totally doesn't matter because you will notice that sweet, buttery-smoothness of high FPS and high refresh.

Edited by Yakuza, 06 November 2015 - 04:24 PM.


#80 TechAngel85

TechAngel85

    Akatosh

  • Administrators
  • 12,281 posts

Posted 06 November 2015 - 05:48 PM



You guys do realize FPS and monitor refresh rate are directly connected.

If you have a high refresh rate and stable, high FPS, you will know this is true. If you have adaptive refresh on your monitor, you will know this ~24-30 FPS crap is the dumbest thing ever echoed around gaming culture.


Exactly 30 FPS on an older CRT monitor will look better than on a 60 Hz LCD because of how a CRT refreshes and displays pixels/images. Exactly 60 FPS on 60 Hz will probably look better than on 120 Hz, depending on driver settings, unless you have adaptive frame rate control like FreeSync/G-Sync. 61-120 FPS will only be displayed at 61-120 Hz and up, because the monitor only refreshes that many times per second; it cannot physically display more frames than its refresh rate. Unstable 120 FPS on a 120 Hz adaptive (FreeSync/G-Sync) display looks better than unstable 120 FPS on 120 Hz anything else, and so on. The cut-off is 144 Hz/144 FPS, because 240 Hz+ monitors use technology that flashes the same exact image twice or more per GPU frame time, which only cures part of the tearing and does nothing in terms of displaying new information to you. It's not showing you more than ~120 unique images per second even if you have 240 FPS locked.

And, seeing as all of that assumes a theoretical locked, stable 30/60/whatever FPS unless adaptive refresh is in play, motion blur plus low FPS can seem really good. It's not.


I don't think they realize that. I've tried to explain it several times, but have given up; it's not worth my time. There are several aspects to take into account: FPS, refresh rate, the monitor's display lag, the display technology... It's never been "30 FPS is good enough for gaming". It's good enough for movies and TV, not gaming... depending on your monitor. :)

Hence why I left this discussion a while back.

#81 z929669

z929669

    Ixian Inventor

  • Administrators
  • 9,262 posts

Posted 13 November 2015 - 05:41 PM

Who are "you guys" and who is "they"?




