[–] B3bomber 0 points (+1|-1) ago (edited ago)

vsync makes the GPU wait for the monitor's vertical refresh before swapping in the next frame, so the screen never shows parts of two different frames at once (screen tearing). That's what makes the game feel smooth. It is entirely driver+hardware dependent and unless the driver is fucked for linux, it should work fine. Note: the wait can delay finished frames, so it adds a bit of input latency.
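A rough way to see why double-buffered vsync locks your framerate to steps: the buffer swap can only happen on a refresh boundary, so miss one boundary and you wait for the next. A toy model in Python (the 60 Hz panel and fixed per-frame render times are assumptions for illustration, not anything from a real driver):

```python
import math

REFRESH_HZ = 60.0                       # assumed monitor refresh rate
REFRESH_INTERVAL = 1000.0 / REFRESH_HZ  # ms between refreshes (~16.67 ms)

def vsynced_fps(render_ms: float) -> float:
    """Effective framerate under double-buffered vsync.

    A frame that takes even slightly longer than one refresh interval
    has to wait for the NEXT refresh boundary, so fps quantizes to
    refresh/1, refresh/2, refresh/3, ...
    """
    refreshes_per_frame = max(1, math.ceil(render_ms / REFRESH_INTERVAL))
    return REFRESH_HZ / refreshes_per_frame

print(vsynced_fps(10.0))  # renders inside one interval -> 60.0 fps
print(vsynced_fps(17.0))  # just misses a refresh -> 30.0 fps
print(vsynced_fps(40.0))  # spans three intervals -> 20.0 fps
```

Note how a frame that takes 17 ms instead of 16 ms halves your framerate; that cliff is what triple buffering and adaptive sync exist to soften.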

gsync and freesync are essentially an upgraded version of the same idea: instead of the GPU waiting on the monitor, the monitor refreshes whenever the GPU has a frame ready (within its supported range), so you get tear-free output without the fixed-step framerate.

Known fact: DX games typically get slightly lower framerates than OpenGL ones (especially through Linux translation layers), so don't use DX unless you have no choice. Vulkan typically does even better than OpenGL as well.

Things to note on graphics settings: properly done games do not need much, or any, AA. You can set the minimum in your GPU's driver panel in most cases. Some devs are anal about killing your card's performance and force high AA levels regardless of your screen/driver settings. This applies to any game that ships with texture sets fixed at a 1024 resolution maximum. Modern games, when done correctly, don't depend on texture packs for their detail (though some games have the lowest poly count possible for some shitty reason; in the words of a modder on the subject: this girl's new model has more polys in her asshole than the original had in her entire body).

AA was designed for very old games, before the current way of authoring and scaling assets. It smoothed the jagged edges of things designed for 640x480 when people were running 1024x768, and it was done at the driver level (this is why it was hardware intensive: the assets themselves didn't carry this data, period).
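The core trick behind driver-level AA (supersampling, the brute-force ancestor of MSAA) is just taking several sample points per pixel and averaging. A minimal Python sketch of that idea, using a made-up diagonal edge (the half-plane y < x) as the shape being drawn:

```python
def coverage(px: float, py: float, samples_per_axis: int) -> float:
    """Fraction of sample points inside the half-plane y < x.

    1 sample per pixel  -> a hard 0/1 step across the edge (jaggies).
    N x N samples       -> fractional coverage, i.e. a smoothed edge.
    """
    n = samples_per_axis
    inside = 0
    for i in range(n):
        for j in range(n):
            # sample at the center of each sub-cell of the pixel
            sx = px + (i + 0.5) / n
            sy = py + (j + 0.5) / n
            if sy < sx:
                inside += 1
    return inside / (n * n)

# a pixel the diagonal edge cuts through:
print(coverage(0.0, 0.0, 1))  # single sample at (0.5, 0.5) -> 0.0, all-or-nothing
print(coverage(0.0, 0.0, 4))  # 4x4 supersampling -> 0.375, a partial edge value
```

This is why the driver could do it without any help from the game's assets: it only needs to shade extra samples, at the cost of extra GPU work per pixel.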

I use AMD so I tend to use these settings:

AA: app preference

AA samples: app preference

Filter: standard (the only other one I use is Edge-Detect)

AA method: Multisampling

Morphological Filtering: OFF

Anisotropic Filtering Mode: app settings

Anisotropic Filtering Level: app settings

Texture Filtering Quality: Performance

Surface Format Optimization: ON

Wait for Vertical Refresh (this is vsync): OFF, unless app specifies (set it on in game)

OpenGL Triple Buffering: OFF (this is part of framerate control, unless your game stutters, don't use it).

Tessellation Mode: AMD Optimized (AMD makes a lot of profiles to deal with this, if it's bad, then the game dev team fucked up and should be told to contact AMD)

Max Tessellation Level: AMD Optimized

Own personal Crossfire Settings: ON, Frame Pacing: ON

All of these settings should work very well for most games. If you want to force specific settings at the driver level, which will override in-game settings (IMPORTANT TO REMEMBER), make an app-specific profile. I've done it for 2 games, though I lost those settings and only rebuilt 1. That one is an older game and it HATES Crossfire, so the only special setting for that game is Crossfire OFF.

Fucking list formatting REFUSED to apply. Shift+Enter FTW.

Edit: to see this level of settings detail, you need the last beta version of CCC (Catalyst Control Center); I think that is 15.8. Otherwise you are using that Crimson piece of shit.

Edit 2: a word.


[–] Captain_Faggot [S] -1 points (+0|-1) ago

Vsync makes games look smoother, but the input delay makes you feel like you're moving through soup, so it's fucking useless for FPS games.

Freesync has your monitor wait until the frame is ready and then draw it from your GPU, meaning input lag is as low as your monitor's response time, and if your GPU is drawing frames at 93 fps, the monitor behaves like a 93 Hz monitor for that second. My monitor has a 35-144 Hz window where freesync kicks in.
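That behavior can be sketched as a toy model in Python (the 35-144 Hz window is the one stated above; the clamping behavior outside the window is a simplification, since real panels use tricks like low framerate compensation below the minimum):

```python
FS_MIN_HZ, FS_MAX_HZ = 35.0, 144.0  # the freesync window of this monitor

def display_hz(render_fps: float) -> float:
    """Effective refresh rate under adaptive sync (freesync/gsync).

    Inside the window the panel refreshes exactly when a frame arrives,
    so display rate == render rate and no extra wait is added.
    Outside the window it falls back toward a fixed refresh.
    """
    if render_fps > FS_MAX_HZ:
        return FS_MAX_HZ   # GPU outruns the panel: capped at max refresh
    if render_fps < FS_MIN_HZ:
        return FS_MIN_HZ   # below the window: panel refreshes on its own
    return render_fps      # in the window: monitor tracks the GPU

print(display_hz(93.0))   # -> 93.0, "a 93 Hz monitor for that second"
print(display_hz(200.0))  # -> 144.0
```

Compare this to vsync: here the render rate passes straight through inside the window instead of being quantized to fixed divisors of the refresh rate.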

My question is: why does vsync on linux/opengl have the same properties as freesync, when in DX games vsync will hard-lock my fps to steps and make it feel sluggish?