.ini tweaks and performance guide

stormseekaz

Ok, let me start off by saying I spent literally 20+ hours turning .ini settings on and off and adjusting and testing Nvidia Inspector options. Every time I made a change I had to restart the client.

TERA is not the only game I am playing right now that uses the Unreal 3 Engine, so learning the ins and outs of the .ini file was worthwhile.

There are two files you can edit that will alter your graphics options.

S1Option.ini and S1Engine.ini

They are located in your main TERA directory: TERA>Client>S1Game>Config>

S1Option.ini is just a plain text file of your in-game chosen options. The only reason to edit S1Option.ini would be if you wanted to change an in-game graphics/interface option outside of the client. Open it up and take a look at the settings and you'll see what I mean.

So, then there is the S1Engine.ini file. This has many sections that are universal to many Unreal 3 Engine games. I will list all of the settings I've edited myself and what I experienced from doing so.

Keep in mind I will be leaving out the majority of the code in the S1Engine.ini file, because there's just tons of stuff that I haven't learned or been able to see the effects of yet. If I leave a line out, it's because I don't know what it does, or changing it made no difference to my client's game display.
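For orientation, here's a minimal sketch of how these sections look inside S1Engine.ini (section names are from the file; the values shown are illustrative placeholders, not recommendations):

```ini
; S1Engine.ini -- only the sections this guide touches
[Engine.Engine]
bForceStaticTerrain=False

[Engine.GameEngine]
bSmoothFrameRate=True
MinSmoothedFrameRate=22
MaxSmoothedFrameRate=62

[SystemSettings]
DynamicLights=True
DynamicShadows=True
```

Make a backup copy of the file before editing, since the client can rewrite it when you change options in-game.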

--------------------------------------------------------------------

[Engine.Engine]
bForceStaticTerrain= This setting tells the game whether to make all ground polygons stay the same size, shape, and angle no matter how far you are from them. If you set this to false, the game will turn a very round hill of many polygons into a jagged hill of fewer polygons the further away you get. I'm not sure if this applies to TERA's client, but that's what this option does in Unreal 3 Engine games.

[Engine.GameEngine]
bSmoothFrameRate= This turns on the game's built-in framerate smoothing. I'm not entirely sure how it works, but it won't ever INCREASE performance; it will only keep your FPS the same or lower it. From what I've seen and read on other forums, all it really does is cut down on sudden framerate changes, or "volatility," but in order to do this, it may at times display the game at a lower framerate than it's capable of, all for the sake of making framerate changes "smooth."

MinSmoothedFrameRate= This sets the minimum framerate at which you want smoothing to be enabled.

MaxSmoothedFrameRate= This not only sets the max framerate for smoothing, it will also cap your game's framerate at this value.
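As a sketch, the two common ways people use these three lines (the values are illustrative; I haven't verified TERA honors all of them):

```ini
[Engine.GameEngine]
; Option A: turn smoothing off entirely so the engine never throttles FPS
bSmoothFrameRate=False

; Option B: leave smoothing on and use the max as a soft ~60 FPS cap
;bSmoothFrameRate=True
;MinSmoothedFrameRate=22
;MaxSmoothedFrameRate=60
```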

[SystemSettings]
StaticDecals= Turns on permanent decoration-type transparent textures that are laid on top of objects or surfaces. If there is a permanent blood stain in a cave or something, this will enable it.

DynamicDecals= Turns on temporary transparent textures that pop up and fade away eventually such as blood stains on the ground from combat.

DecalCullDistanceScale=0.25 This setting changes how far into the distance decals are drawn before they disappear. I am not sure if TERA utilizes this line, but I copied it from my other Unreal 3 game.

DynamicLights= This turns on the majority of the lighting in the game. Without this on, every zone/area you go to will look abnormally dark. This setting is also a prerequisite for a lot of the lighting/shadowing in the engine.

DynamicShadows= This turns on any shadow that is created and regularly updated as the object casting it moves. Dynamic shadows are the very CPU-intensive kind. You can turn this option off and still retain blob shadows, tree canopies, etc.

LightEnvironmentShadows= This command is supposed to create shadows in areas that are blocked from the main light source, like the sun or moon. So if you're in a dark cave or under a large tree, and you go deep enough into the shadow, a secondary shadow will be created, possibly projected in a different direction than the sun/moon. This setting did not seem to make any change in TERA, but I could be mistaken, because I never left the noob island and got into dark areas where a second shadow might appear.

CompositeDynamicLights= This setting simplifies dynamic lights, and by doing so supposedly increases performance. I saw no visual difference between this being on or off, which is a good thing, so I recommend turning it on. Higher FPS with no visual difference is a win.
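Pulling the decal and lighting lines together, a performance-leaning [SystemSettings] fragment might look like this (a sketch based on my notes above; the DecalCullDistanceScale line may not even be read by TERA):

```ini
[SystemSettings]
StaticDecals=True           ; permanent decals are cheap, keep them
DynamicDecals=False         ; skip temporary combat blood stains etc.
DecalCullDistanceScale=0.25 ; copied from another UE3 game, possibly ignored
DynamicLights=True          ; required, or every zone looks abnormally dark
DynamicShadows=False        ; drops the CPU-heavy moving shadows
CompositeDynamicLights=True ; simplified dynamic lights, no visual change seen
```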

DirectionalLightmaps= This setting gives a "bump mapping / shiny" effect to textures, making them look more like a real surface than just a 2D texture. Turning this off doesn't work in TERA, because it makes the tree leaves/canopies turn red/green etc. So leave this on for sure.

MotionBlur= I wasn't able to notice any motion blur in TERA, but setting this to true in my other game makes turning your camera view quickly blur the screen a bit. It's just a post-processing effect that uses your GPU only, and it can make a choppy framerate look a bit better. (Motion blur is what films use to make a low framerate look smooth to the eye.)

MotionBlurSkinning= Not sure if this setting works in TERA or not, but in the Unreal 3 Engine it blurs objects and textures that are moving quickly past you, which is a VERY cool and visually pleasing effect. Set it to 1 to enable or 0 to disable. It's more of a first-person-shooter / high-speed feature, so it could work in TERA, but you'd never see it unless you or a mob were moving very fast.

DepthOfField= This is a post-processing feature on your GPU that adds a layer of blur to objects/textures the further they are from the camera. It can eliminate some aliasing on distant objects, which is good, and can also add to the visuals, but some people don't like it.

AmbientOcclusion= This is a post-processing setting that can be applied to most 3D applications. It adds darkness to areas that are slightly obstructed from the light source, or where two objects or surfaces intersect (such as the corners where walls meet the floor, or where plants meet the ground). In order to get this to work you will need Nvidia Inspector; see the section below for more info on how to do this.

Bloom= This is a post processing setting that adds a glare / bright white effect to clouds or neon textures, such as LED lights on a wall or armor etc.


UseHighQualityBloom= I tried changing this setting in TERA and I was unable to see a difference in the bloom in the sky/horizon. It should give a better bloom effect at the cost of performance, but since I saw no difference, I will leave it off.

Distortion= This is a post processing effect that occurs when you swing your weapon or look into a teleportal.

SpeedTreeLeaves= SpeedTree is a program/company that makes fast-rendering trees for 3D graphics, so if you turn off "speed tree" you're not just switching to "slow tree," you're turning off tree leaves completely, since all of the leaves are "SpeedTree leaves." At least that's what happened in my other Unreal 3 Engine game. If you want all the trees in the game to look dead like it's winter, then you can set this to false.

SpeedTreeFronds= Same as SpeedTreeLeaves, except for fronds, which are like giant leaves, like on a palm tree.

OnlyStreamInTextures= Texture streaming is an Unreal 3 Engine aspect I still haven't fully wrapped my head around. I understand what it does, but I'm not sure how to change [TextureStreaming] settings to improve performance. Texture streaming is supposedly used for console games that need to keep popping new textures up on the fly without loading screens. What this option does is keep textures streaming, but make all other streamable things (like animations and models?) load up beforehand instead of streaming in. So turning this option on should theoretically increase zone/map load time, but how it affects TERA, I'm not sure. Just keep it at the default.

LensFlares= Post processing effect that makes your screen glare if you look directly at a light source such as the sun. Not sure if it works in TERA, because I never left the noob island lol.

FogVolumes= This is a purely graphical option that allows a fog layer to be added to an area. The fog is not lots of particle effects; it's just a transparent colored volume that can change density. So you can turn this setting off and still see the mist of the waterfalls in TERA. I haven't encountered any fog volumes yet, so I am not sure how GPU-taxing they are, but if you're struggling for FPS and your GPU is your bottleneck, you can turn this off.

FloatingPointRenderTargets= Changing this from its default setting just causes all the models and graphics to become fubar, so don't change it.

OneFrameThreadLag= This setting is supposed to alter the synchronization between your CPU and GPU. In my Unreal 3 Engine shooter, turning it on has no effect, but I'm usually at 40-70 FPS in that game. In TERA, I get about a 5-10 FPS boost from having this set to on. My CPU is my bottleneck, so it increased FPS for me. Maybe if your CPU is really good and your GPU is weak, try changing it to off and see if there is any improvement.

OneFrameGPULag= Not entirely sure if this line works with TERA, but it's kind of like the opposite of the OneFrameThreadLag line; it might improve performance in some situations.

UseVsync= This is just the game client's way of telling your graphics card whether to use Vsync. Your Nvidia Control Panel's setting will always override the software's setting. But if a game has a Vsync setting like this, I think it's always good practice to enable it here and set your graphics card to "application-controlled," rather than turning this off and forcing it with the GPU.
Vertical sync synchronizes your framerate with the refresh rate of your monitor to get rid of tearing. Tearing mostly occurs when your framerate is above your monitor's refresh rate and you turn your view quickly. Sometimes looking at white light next to a dark surface and turning can make tearing very pronounced. Tearing is just a visual issue; your game client's performance is not affected by it. Vsync can also be used to cap a game's max framerate, which can help keep your graphics card cool in places where you get super high FPS, like loading screens or character logins. Remember, if you turn this on you should also enable triple buffering through your graphics card menu. Vsync may slightly lower your FPS, but the hit should be minimal if you use triple buffering. However, enabling these two adds lag to your mouse input, so for first-person shooters, where you need instant mouse response, Vsync and triple buffering are no-nos. TERA is pretty action based, so if you're an archer or caster, you might want to turn Vsync off.

Fullscreen= Windowed mode vs Fullscreen

AllowD3D10= Allows the engine to use DirectX 10 or not. Keep this setting off.
AllowD3D11= Allows the engine to use DirectX 11 or not. Keep this setting off.

SkeletalMeshLODBias= This setting goes from -1 to 4; 4 being the most jagged models and -1 being the most round, high-polygon models. Not sure if it works for TERA, since TERA has a built-in character model polygon option in the S1Option.ini file.

ParticleLODBias= Same as SkeletalMeshLODBias, except it applies to spell/particle effects. Setting this higher may actually make some parts of spell effects disappear.

DetailMode= This can be set to 0, 1, or 2; higher means more flashy stuff. In my other game, the higher the detail level, the more background decorations there are, such as extra decals. So if you're squeezing for performance, set it to 0. If you want to see the game in all its glory, set it to 2.

ShadowFilterQualityBias= Changes how blurry the outlines of shadows appear. This can smooth out very blocky-looking shadows if you have MaxShadowResolution set very low. It can be set to -1, 0, 1, 2, 3, or 4 and up. I don't believe there is any impact on performance; it's just a preference thing.

MaxAnisotropy= The game client's way of changing your graphics card's texture filtering: 0x, 2x, 4x, 8x, or 16x anisotropy. When textures are viewed from an angle, they become blurred; texture filtering cleans the blur up and makes them look natural, as they should. If you're new to this setting, Wikipedia or Google it and you'll get a better explanation.


MaxMultisamples= This is the Unreal 3 Engine's built-in anti-aliasing capability. TERA has it at 1 by default, and changing it makes no difference in TERA. Whereas in my other Unreal 3 game, I can increase the multisample rate and aliasing goes away, but the FPS hit is huge. Basically, the Unreal 3 Engine scales very poorly with multisampling, so this option might be locked in the client or something. In the next section I will talk about how to get good, fast anti-aliasing.

Shadow Resolutions:
Shadow resolution affects how blocky your shadows appear. Shadows are created by shooting lots of square beams from a point down onto the game world, and wherever beams are intercepted, black squares show up on the surface. The higher the shadow resolution, the larger the X-by-X grid of beams shot down. So with a shadow resolution of 8, you will shoot 8x8=64 beams down onto the world; at 512 resolution, you get 512x512 beams. Since the beam count grows with the square of the resolution, every increase in shadow resolution makes the GPU work grow quadratically. In other words, a resolution of 128 will look twice as pixelated as 256, but 128x128=16,384 beams is only a quarter of 256x256=65,536 beams, so it takes roughly 25% of the computing power. So if shadows are killing your FPS, turn these resolutions down a bit and see if it helps.

MaxShadowResolution= This controls the resolution shadows will appear at when you get right up next to them, in their maximum state of detail.

MinShadowResolution= This controls the detail shadows will appear in when they are at max distance from you. This setting also controls the rate at which shadows drop in detail level as you get further away. The best way to describe how shadows change between max and min is with examples:

Example1: MaxRes=500, MinRes=400
Shadows at 0 yards: 500
Shadows at 25 yards: 475
Shadows at 50 yards: 450
Shadows at 75 yards: 425
Shadows at 100 yards: 400
Shadows at 125 yards: 400
Shadows at 150 yards: 400

Example2: MaxRes=500, MinRes=200
Shadows at 0 yards: 500
Shadows at 25 yards: 425
Shadows at 50 yards: 350
Shadows at 75 yards: 275
Shadows at 100 yards: 200
Shadows at 125 yards: 200
Shadows at 150 yards: 200

The thing about these Unreal Engine 3 games is that shadows very far away will probably not render at all. So you want to set MinShadowResolution low enough that your shadows degrade uniformly as you get further from them, to the point where you won't notice the difference.

ShadowFadeResolution= This triggers when the fade process begins. "Fade" in this context means going from 100% darkness to 0% darkness (invisible). This is just a mechanic used if people want their shadows to slowly fade from very dark to absent. Using Example2 from above, if ShadowFadeResolution is set to 275, the shadow will begin to fade away when you get 75 yards from it, since in Example2, 75 yards = 275 shadow resolution. If you set it to 190, the fading process will never occur, because the shadows will never reach 190. If it is set to 500, the shadows will begin fading as soon as you start walking away from them at 0 yards.

ShadowFadeExponent= This alters how much distance the camera must travel for a shadow to go from the ShadowFadeResolution to completely gone. You cannot set the resolution at which the fade process ends and the shadow reaches 0% darkness; you can only change this number.
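Written out as an .ini fragment, the Max/Min/Fade interplay from Example2 would look like this (the fade exponent value is a hypothetical placeholder, not a tested default):

```ini
[SystemSettings]
MaxShadowResolution=500  ; full detail right next to the shadow
MinShadowResolution=200  ; floor reached at about 100 yards (Example2)
ShadowFadeResolution=275 ; fading starts once detail degrades to 275 (~75 yards)
ShadowFadeExponent=0.25  ; how quickly the fade completes once it starts
```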

ShadowTexelsPerPixel= I am not entirely sure of the exact definition of this setting, but I'll tell you how I've experienced it. Let's say there is a huge mob and a small mob. If the huge mob is standing next to the small mob, and you're looking at them both, the huge mob will have a very large shadow compared to the small mob. But if the huge mob is far away, it's possible that its shadow will be equally big (in pixel count) as the small mob's shadow. So what this option changes is not how far you see shadows, or how many shadows you see, but how many pixels on your screen a shadow must take up to be rendered. If you set this sufficiently high, you will see almost every shadow from every creature; if it isn't high enough, the very smallest shadows (by pixel count) will no longer render. In other words, this line could be called "amount of screen space taken up by rendered shadows," with larger shadows taking precedence.

FoliageDrawRadiusMultiplier= Tried changing this around and it had no effect whatsoever; this option is probably null since there is a foliage slider in S1Option.ini.

bEnableVSMShadows= No idea what this does; nothing I've googled has been able to explain it.

bEnableBranchingPCFShadows= No idea what this does; nothing I've googled has been able to explain it.

bAllowBetterModulatedShadows= This setting is supposed to help older graphics cards render shadows more easily, but for newer cards, from about the 8800 GT and up, you're supposed to leave it as false. So just keep it at the default.

AllowSubsurfaceScattering= This setting allows membranes that can conduct light (such as human skin) to conduct and then expel it. I think it may also allow light rays to refract when passing through water, but I haven't seen either in any of my games yet. So just keep this option at default, or false for performance.

bEnableForegroundShadowsOnWorld= No idea what this setting does, or if it works, I've tried changing it and I've not seen any difference in shadows, dynamic or static.

bEnableForegroundSelfShadowing= No idea what this setting does, or if it works, I've tried changing it and I've not seen any difference in shadows, dynamic or static.

ShadowFilterRadius= This setting takes your dynamic shadows, splits them into two parts, and offsets them from each other. The default is 2.0, which blurs the edges of the shadows. If you set it to 0.0 there will be no offset / no blur, and your shadows will look cleaner, but the pixelation will be more visible. I set it to 10.0 and the two shadow halves split so far apart that I could see two of my sword hilt's shadows on the ground. So keep this setting somewhere between 0.0 and 2.0. There should be no impact on performance; it's just a visual preference: clean, blocky shadow outlines, or blurred, smooth ones.

ShadowDepthBias= This setting defaults to 0.012 in both of my Unreal 3 games, and there is no reason to alter it; I don't believe there is any performance gain from doing so. I can't explain exactly how it works, but to summarize: the higher the value, the less of a shadow's volume will appear unless you move your camera closer to it. For example, if you change it to 0.03 you will see only some chunks of the model's shadow, but if you zoom in closer, you will see more of it. Basically, just leave it at 0.012; I didn't see any performance gain by changing it to 0.2 / 0.3.

FXAA= Turns on the FXAA injector, a type of anti-aliasing, but it doesn't seem to change anything in TERA whether set to true or false. It might be overridden by the "Light Enrichment 1 > 2" setting; Light Enrichment 2 looks like FXAA.

SpeedTreeBranches=
SpeedTreeBillboards= Haven't played with either of these options; they weren't in my other game, but I assume they do the same thing as the SpeedTreeLeaves command, just for branches etc.

EnableHighPolyChars= I found this command on a forum; not sure if it works in TERA. But if you're trying to get better performance, you might want to add this line to your [SystemSettings] section and set it to false.

bAllowWholeSceneDominantShadows= Dynamic shadows are produced by mobs/players moving around, but they can also be produced by buildings/terrain/trees. In my other game, setting this to false gave me a good FPS increase with no change in visuals, because it changes the terrain/tree shadows from dynamic shadows to the shadowmap version (permanent static shadows on the ground). I keep this setting at false; it doesn't get rid of any shadows, but it improved FPS.

UpscaleScreenPercentage=True
ScreenPercentage=100.000000

These two lines function together. There are two aspects of your screen display in the game: the 3D models and world, and the 2D user interface (UI) laid on top of it. Have you ever wanted to decrease your game resolution, but not your interface? That's what this does. I have a 1400x900 monitor; if I set the game to 1200x700 or whatever, my game will stretch that resolution out. With these lines, you can get that same effect but keep your user interface at your native resolution. Resolution has a very large impact on performance, so if you're desperate for some extra FPS and your GPU is your bottleneck, you could set ScreenPercentage to 50.0 or 75.0 and keep UpscaleScreenPercentage at true, and you would get the performance of that lower resolution while keeping your UI's native resolution intact. You can also use this setting to deal with the shimmery/pixelated look of transparent textures like flowers and grass: if you set ScreenPercentage to around 95.0 or 97.0, it will just barely stretch out your game and get rid of the shimmery pixel effect. It basically does roughly what the "Lighting Enrichment 2" option does (blur everything, including nameplates), but not your UI.
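As a concrete sketch of the two lines working together (75.0 is just an example value):

```ini
[SystemSettings]
UpscaleScreenPercentage=True ; stretch the rendered frame back to native size
ScreenPercentage=75.000000   ; render the 3D world at 75%; the UI stays native
```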


TEXTUREGROUP_World=(MinLODSize=1,MaxLODSize=4096,LODBias=0)
There is a large group of TEXTUREGROUP settings like the one above, but changing them does nothing. I think they are overridden by the S1Option.ini texture slider option, so don't bother changing these.

bAllowLightShafts= This setting is not in TERA's file by default, but it works in my other Unreal 3 Engine game. It turns on godrays / light rays, the ones you see when looking toward the sun from behind a tree or building. I don't know if it works for TERA, but you might as well throw it in the .ini file and set it to true in case it does, because it's very, very pretty.
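If you want to try it, the line goes under [SystemSettings] (unverified in TERA, as noted above):

```ini
[SystemSettings]
bAllowLightShafts=True ; godrays; not in TERA's file by default, may be ignored
```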

---------------------------------------------------------

ANTI ALIASING:

In order to get the highest quality and fastest anti-aliasing, you need to download Nvidia Inspector (sorry ATI users, I don't know if there is an equivalent). Open it up and click on the little wrench-and-poker icon on the middle right side of it. This will show you a more robust and advanced form of the Nvidia Control Panel. At the top left, look for the TERA profile, and once you select it, change the "Antialiasing Compatibility" line to:

0x000100C5 (Brother in Arms: Hell's Highway, Gears of War, etc...)

What I did was go to the Wikipedia page and find all of the games that use Unreal Engine 3 that are listed in the AA compatibility section. I ran my TERA client with each compatibility setting, and some of them gave much better framerates than others. Here is a rundown of how each compares, if En Masse is interested:

(FPS was taken from character login screen with 4x AA, and 4x (supersample) transparent AA)
0x00000041 = 51 FPS
0x00000045 = 52 FPS
0x00000145 = 26 FPS
0x000100C5 = 65 FPS
0x00020041 = 51 FPS
0x080100C5 = 62 FPS + artifacts/visual glitch

I read a thread about forcing AA and someone said they used the 0x00000145 option, which is what I originally did, but it gave me a massive FPS hit compared to no AA. Once I tested them all and found the 0x000100C5 option, it runs smooth as butter. It's astonishing to me that changing one little option can double my FPS with forced AA.

So some of you might ask, "Why force AA when you can just change the 'Lighting Enrichment' option to 2 and get it?" Well, that is not real AA; it's just blurring everything. Go up to a player and look at their nameplate: set Enrichment to 1, then change it to 2. Their nameplate will get blurry and difficult to read unless you're right next to them. There is no FPS loss to my knowledge from turning Enrichment 2 on, but all it is doing is blurring your game, minus the interface. That is not a good solution to aliasing.

So, if you want any real form of AA and you are an Nvidia user, you have to set AA compatibility to 0x000100C5 in Nvidia Inspector. En Masse, if you're reading this, I'd recommend you give the players a real AA option and use this hex code. The lighting enrichment crap won't cut it.

----------------------------------------------------------------------------------------

Multi-GPU / SLi:

I was astonished to find nearly NO search results when typing in "TERA" and "SLi". I think there was only one thread I found on a random website, and it strayed off topic before giving much info. So I had to fiddle with the compatibility setting to see if Nvidia had truly selected the correct SLi profile.

Nvidia has a profile for TERA in their latest driver, 295.73. They might have made a profile before this driver, I'm not sure. But in Nvidia's TERA profile, SLi compatibility is set to:

0x02400005 (Dragon Age: Origins, X2 the Threat, Dead Space, etc)

Now, the reason this is odd is that my other Unreal 3 Engine game, which is in open beta too, runs SLi perfectly fine on my dual GTX 460's, and its SLi compatibility hex code is 0x00000000, which means the game by itself runs SLi fine by default. But TERA requires a compatibility code, and it's the same game engine. So I again looked up all of the SLi compatibility hex codes that included an Unreal 3 Engine game and wrote them down. I tested each compatibility type inside the game, standing on a cliff looking down onto a big area with mobs, but with no moving objects on my screen, so I'd have a very steady framerate. Here are the FPS results:

0x02400005 (default) = 54 FPS
0x0240000D = 58 FPS
0x02400045 = 57 FPS
0x02402005 = 56 FPS
0x02402045 = 55 FPS
0x02402185 = 58 FPS
0x02402205 = 47 FPS
0x02402405 = 55 FPS
0x02406405 = 56 FPS
0x02C00045 = 56 FPS
0x03402405 = 55 FPS
0x42402005 = 55 FPS

After all this testing, I concluded that no other compatibility setting exceeded the default by enough to warrant changing it. So I left it at the default and gave Nvidia the benefit of the doubt.

The really odd thing about the 0x02402205 setting is that it got a much lower FPS in the actual game, looking down from the cliff, at 47 FPS, but at the character login screen it got 131-136 FPS, whereas all the other settings got 94-96 FPS there. Which is very, very weird. So I guess 0x02402205 is really good at SLi when there is a very small amount of 3D on your screen, but when it gets bogged down with whole scenes, it handles them worse than all the others. This is why I threw out those login screen numbers: the login screen isn't as real of a test as the actual game.

So, in summary, there is no need to change the SLi compatibility setting as of now. There could be a better profile, but I haven't found it, and it's not an Unreal 3 Engine game compatibility setting, so good luck finding it.

-----------------------------------------------------------------------

Ambient Occlusion:

In order to get ambient occlusion to work, you must go into Nvidia Inspector, go to the TERA profile, and in the very top box, "Ambient Occlusion compatibility," set it to:

0x00000025 (Tom Clancy's Endwar) for normal amount of darkness

or

0x00000026 (Warmonger) for very heavy/thick darkness and slightly lower FPS

Then go down to the "Common" section and change "Ambient Occlusion usage" to "Enabled - 0x00000001"

And you can set "Ambient Occlusion setting" to performance, quality, or high quality. I'd recommend "performance" because the FPS loss is the least, and you won't really be able to notice the difference between the settings.

You can also go into your S1Engine.ini file and set AmbientOcclusion=True. I am not sure if this is a requirement, but it's a good thing to do anyway.
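For completeness, that .ini line (the Nvidia Inspector flags above do the real work, so this may be redundant):

```ini
[SystemSettings]
AmbientOcclusion=True ; possibly redundant with the driver-forced AO
```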

Here are the FPS numbers for all the Unreal 3 game ambient occlusion codes I tried. The FPS was taken at the race selection screen with my two cards in Single GPU + SLi AA mode. After the FPS are notes on what the AO would do in the actual game.

0x00000000 (default): failed
0x00000002 (Mirror's Edge): failed
0x00000010 (Unreal Tournament III): 58 FPS - AO has rapid flickering when looking straight down (90°) onto grass patches
0x00000020 (Mass Effect): 68 FPS - AO has very slow flickering on/off all the time
0x00000021 (Gears of War): 69 FPS - AO has rapid flickering when looking at grass patches from a 45° angle
0x00000024 (X-Men Origins: Wolverine): 58 FPS - AO causes the grass patches to look very aliased/grainy/pixelated
0x00000025 (Tom Clancy's EndWar): 68 FPS - AO has rapid flickering when looking straight down (90°) onto grass patches
0x00000026 (Warmonger): 64 FPS - no flickering at any view angle, but AO was abnormally thick/dense

If you're going for the best performance and fewest issues, go with 0x00000025 (Tom Clancy's EndWar). The flickering only occurs if you're looking straight down (90°) onto grass patches, which is a very rare scenario; your camera will usually be level with the horizon (0°) or at a 45° angle.
-------------------------------------------------------------------------

GPU's vs CPU's:

My rig is dual GTX 460's, 4GB of (I think) old DDR2 RAM, Windows Vista 32-bit, and a Q6600 Core 2 Quad processor overclocked from 2.4 GHz to 3.0 GHz.

Unfortunately, my PC hit its cap on upgradability about a year ago, because almost every game I've played recently has my CPU as the bottleneck. Which really bothers me, because if there is one thing that has stayed relatively constant in MMORPGs over the last 15 years, it's how much game-code processing needs to occur.

I blame it on poor coding. Yes, textures are higher resolution and models have higher polygon counts than they did 15 years ago. But I fail to see how my old Pentium 4 3.06 GHz single-core processor could run Dark Age of Camelot with 50+ people on the screen just fine, with my GPU as the bottleneck, while my newer, faster quad-core CPU can't handle more than 15 mobs/players on screen at once without the FPS dropping significantly.

Out of all the testing I did, tweaking settings over and over, do you know what the #1 biggest FPS killer was for me? "PC View Distance." When this option is set to 1, you can barely see mobs right in front of you; their nameplates show up before their models do. If you set it to 6 (max), you can see mobs about 3-4x as far. Going from setting 1 to setting 6 gives me a 15 FPS DROP, and this is a CPU issue, not a GPU issue. For some reason, viewing 30 more yards of mobs kills my FPS. What exactly is changing in that code that makes my CPU drop my FPS by 30% or so? Is it the animations? I am not sure. But there is no reason that viewing additional mobs should kill a game's FPS that much.

Now, wouldn't you say TERA has a very weak player/mob max view distance? Some of the foliage at max setting shows up before players/mobs do. Why would any game do that? I believe it's because of this FPS-killing cost that additional mobs create.

Ok, my rant is over. The CPU being a bottleneck in MMORPGs has been bothering me a lot, because it's the one component I can't upgrade without completely redoing my comp. One final note: in the other Unreal 3 Engine game I'm currently playing, I can have 16-32 people flying around my screen shooting guns/missiles, and I maintain my 40-80 FPS. But in TERA, if I engage a group of mobs with max settings, my FPS dips down to about 17 or so. That is just terrible game code: same engine, same number of mobs/players in the vicinity.

So, in terms of CPU vs. GPU, TERA is a very, very CPU-intensive game. Your bottleneck will most likely be your CPU.

I ran my game with both GTX 460s in SLI, with the default AFR2 (alternate frame rendering 2) mode. AFR2 has been the optimal setting for almost every game I've played, and I've played about 12+ GPU-intensive games since I got SLI without ever experiencing any black flickering or problems. But in TERA I get this super-annoying black flicker once every minute or so, as if a single random frame is rendering as a black screen.

So, as a temporary fix, I run GTX 460 #1 as a single GPU and use the other card as the SLI anti-aliasing unit. Nvidia lets you render a game with one GPU and use the other for the post-process anti-aliasing phase, which can get you some monster-high AA sample rates that would normally kill your FPS on a single card.

With SLI in default AFR2 mode, with 4x AA and 4x (supersample) transparency AA running, I get about 40-60 FPS running around the noob island and 25-40 FPS in combat. My MSI Afterburner overclocking program showed the GPUs at about 25-45% usage most of the time.

But if I run SLI with a single GPU as the frame renderer and the other GPU as the AA post-processor, I get about 35-55 FPS running around and 15-40 FPS in combat. The on-screen GPU stats put usage at about 45-60%, I think. That's higher than AFR2 mode, but I'd say neither GPU ever broke the 70% usage mark.

So in summary, a single GTX 460 should be able to play this game maxed out at 40-60 FPS, no problem. As long as your CPU isn't your bottleneck.

So for those of you thinking of upgrading or getting another GPU for TERA: DON'T. It's your CPU.

I put the game in windowed mode and watched CPU usage for a while, and it always sat at 50% (each core counts as 25%, so the game was maxing out two cores). So the game only uses two cores, and uses them ineffectively. The CPU was absolutely the culprit in my FPS woes.

P.S. My RAM wasn't the issue; TERA never broke 1.3 GB of RAM usage on the noob island. And on 32-bit Vista, a program gets roughly 2 GB of address space by default, or around 2.9 GB if you use a UserVA tweak.

---------------------------------------------------------------------------------------

Game Engine FPS lowering Flaws / Bugs:

Try an experiment next time you play TERA. Go to any area in the game and press Alt. Make sure you're using Fraps or another program that shows your FPS. Let go of your mouse, give your FPS 10-20 seconds to settle, and take note of what it is.

Then grab your mouse and move it in circles. Your FPS should drop significantly (if your CPU is the bottleneck). For some reason, moving the mouse, or giving the game client any kind of user input, is a huge framerate killer. I've played games in the past, such as Ultima Online, that had a "Run mouse in a separate thread" setting; if you checked it, your mouse cursor would go from lagging hardcore to very nice and smooth. So my assumption is that moving your mouse around, opening a menu, or any other user input is very heavy on the game client's main thread.

The unfortunate part is that this doesn't just occur in the Alt free-cursor mode. When you're running around in first-person crosshair mode, you're moving your mouse all the time too, and it causes the same FPS drop. For an action game that requires first-person-shooter mouse movement, this really sucks (for CPU-bottlenecked players like me, at least).

Another way to demonstrate this: put auto-run on, or hold down W, and open an interface panel (inventory, etc.) over and over. Your FPS will drop, and drop, and drop. The faster you spam it, the lower your FPS goes; the correlation was nearly 1:1.
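If input handling really is stalling the game thread, one S1Engine.ini value that may be worth experimenting with is the render-thread lag option. This is a stock Unreal 3 Engine setting; whether TERA's client actually honors it is an assumption on my part, so treat it as something to test, not a confirmed fix:

```ini
; In S1Engine.ini (TERA > Client > S1Game > Config).
; OneFrameThreadLag is a standard Unreal 3 Engine setting that lets the
; render thread run one frame behind the game thread. True usually gives
; better throughput; False reduces input latency but can cost FPS.
; Whether TERA's client respects this value is unconfirmed.
[SystemSettings]
OneFrameThreadLag=True
```

If you test it, toggle only this one value between client restarts so you can tell whether any FPS change actually came from it.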

I've heard rumors that a game client's code/optimization improves a lot right before launch, but considering this game has been out in Korea for a while, I fear not a lot of client process-management optimization will be happening. I've got my fingers crossed, though. If anyone has any ideas on why drawing additional mobs creates such massive CPU usage, please enlighten me, because that one setting is the worst for me, and the most important one as well (for PvP, at least).

---------------------------------------------------------------

Texture LOD issues:

I noticed an error when playing the game at Texture Resolution setting 0 or 1 (2 being the max). I'm pretty sure any player can replicate this issue by turning the texture resolution down from max.

So put the setting on 0 or 1, and zoom your camera out to about halfway between max camera distance and right over your shoulder. Now walk up to a bush, some tree leaves, or a mountainside, and hold still until the highest mipmap/LOD level of the texture loads in; you'll see the texture at its highest resolution. Next, zoom your camera in, so the camera moves closer to the texture. The texture will go from high detail to LOWER detail as the camera gets closer.

This is most likely a LOD bias / coding bug, because textures should ALWAYS get higher resolution as the camera gets closer to them. The highest LOD level shouldn't start 20 feet from the object; it should cover the entire 0-20 foot range. The reason this bug is really annoying is that if your camera is zoomed out decently far and you walk past a tree, some of that tree's canopy will be right up against your camera, and you'll be seeing it at the second-best resolution level. But if you zoom your camera out further and leave your character stationary, the leaves pop into the highest-resolution threshold.

To summarize how this normally works: textures get mipmap / level-of-detail levels, and the further the camera is from an object, the lower the resolution used. Ideally you never notice the difference, because you're sufficiently far from the texture.

Let's say there are 5 texture mipmaps/detail levels: A, B, C, D, and E, with "A" the highest resolution and "E" the lowest. Ideally, each level would be used when the camera is the following distance from the texture:

A: 0-20 feet,
B: 20-40 feet,
C: 40-80 feet,
D: 80-160 feet,
E: 160-320 feet

What appears to be happening in TERA, is the LOD list actually goes like this:

B: 0-20 feet,
A: 20-40 feet,
B: 40-80 feet,
C: 80-160 feet,
D: 160-320 feet,
E: 320-640 feet.

So when you're in that 20-40 foot zone you get max resolution, but if you zoom the camera closer to the texture, it eventually "pops" back into B resolution. I'm sure that's unintentional.

Changing your texture setting to max (2) resolves the issue, so I think it's a bug in the texture LOD code.
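For what it's worth, texture LOD behavior in Unreal 3 Engine games is normally driven by the TEXTUREGROUP lines in the [SystemSettings] section of the engine .ini, so that's where I'd start digging. I haven't confirmed which groups TERA's S1Engine.ini actually defines, so the group names and numbers below are generic UE3 examples, not TERA's real values:

```ini
; [SystemSettings] in the engine .ini, standard Unreal 3 Engine format.
; MinLODSize/MaxLODSize clamp the texture resolution a group may use, and
; LODBias shifts which mip level gets picked (negative = sharper,
; positive = blurrier). Group names and sizes here are illustrative only;
; check your own S1Engine.ini for the groups TERA actually lists.
[SystemSettings]
TEXTUREGROUP_World=(MinLODSize=256,MaxLODSize=4096,LODBias=0)
TEXTUREGROUP_Character=(MinLODSize=512,MaxLODSize=4096,LODBias=0)
```

If the bug really is a biased distance table like my A/B list above, forcing LODBias lower for the affected groups might mask it, at the cost of texture memory.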

----------------------------------------------------------

Well, that's pretty much everything I learned/found from the CB1 and CB2 weekends. I got very little progress done in terms of leveling/questing. If you think I'm crazy for spending 20 hours tweaking .ini files to get higher FPS when I could just buy a new PC, you're right. I really should just buy a new mobo/CPU, but I get some kind of satisfaction out of tweaking game settings. You'd be amazed at how long I spend in character creation menus :)

Other than those, I didn't see any bugs or issues, and the game looks fantastic. I've pre-ordered the game already, so I may end up building a new PC to fix my CPU issues... we will see.
Edited by: stormseekaz over 2 years ago
Lalaru Profile Options #2

0

Wow, nice guide, TY for the info! I know Sliph also figured out how to set up the multi-GPU setting so it utilizes both cards on his 590.
Abeni Profile Options #3

0

Yah, I can't wait for the 2nd half. Right now my machine can handle max slider settings and then some. Problem is, I'm on a 21-inch square-resolution CRT monitor, and some of these tweaks will help me out a lot.

You know, the monitor I'm using was one of the nicer ones for its time (2002). It's still kickin', but unfortunately most games set up their user interface and graphics features to work best in a widescreen format. I think normal people can tuck UI elements off to the side on their extra pixels of width... for me, I have to rearrange everything or the UI will be all up in my business.
stormseekaz Profile Options #4

0

@Abeni, I didn't spend a whole lot of time playing with the "ScreenPercentage" and "UpscaleScreenPercentage" options, but I think if you set the upscale to false and then play around with your resolution, you might be able to simulate a widescreen setup.

That was one of the things I loved about Ultima Online's user interface: the game graphics only took up a fraction of the screen, and you could move the user interface components outside of the graphics area.

I tried fiddling around to get something like that, but Upscale=false with ScreenPercentage=80.00 just covered the outer rim of my game with black and wouldn't let the UI components show up there. It was just putting a black frame over my display, so if you took the screen percentage lower, I guess eventually you'd just see your character and some area on their flanks.
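For anyone who wants to repeat the experiment, these are the two [SystemSettings] values I was referring to. They're standard Unreal 3 Engine settings; the exact values below are just the ones from my test, and the black-border behavior is what I saw in TERA's client, not necessarily what you'll see:

```ini
; [SystemSettings] in S1Engine.ini.
; ScreenPercentage renders the 3D scene at a fraction of your resolution;
; UpscaleScreenPercentage controls whether that smaller image is stretched
; back up to fill the window. With upscaling off, TERA appeared to leave
; the un-rendered border black rather than letterboxing the UI into it.
[SystemSettings]
ScreenPercentage=80.000000
UpscaleScreenPercentage=False
```

Setting ScreenPercentage back to 100 (or UpscaleScreenPercentage back to True) restores the normal full-screen rendering.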
DarMMody Profile Options #5

0

Very helpful guide!

I play with 2x GTX 460s in SLI and an i5 2500k. Very good framerate everywhere maxed, except in cities, where I get 40-60 FPS.

I wanted to force AA using Nvidia Inspector, but then my textures started to stutter and flash in some places, and sometimes the game lost the AA for a few seconds, leaving it without any AA at all until the effect came back (which was pretty annoying too). So I ended up playing with default Nvidia settings and using the in-game blur.

duece95 Profile Options #6

0

Thanks a lot for taking the time to post this. Very useful.
Fawney Profile Options #7

0

Stormseekaz... you're slightly crazy but, I'm glad someone is curious enough to do this sort of thing so the rest of us can be lazy so thanks! :P
Squat Profile Options #8

0

You're using Vista 32bit with 4 gigs of ram.

I am disappoint.
Edited by: Squat over 2 years ago
DarMMody Profile Options #9

0

^

Instant FPS increase just by installing W7, trust me.
Imsyu Profile Options #10

0

I was wondering about this. I have my GTX 550 SLI cards on just "high performance" in general, and was wondering if I should be turning on AA, unless high performance already includes AA. I get high FPS pretty much everywhere: if there are like 100 people it drops down to 30ish, but in town it's 40-70, out leveling / in combat in 5-mans it's around 50-60, and in group PvP / 10v10 death matches it's about 40-50.