Nvidia _Program Settings_ for X3:TC
-
- Posts: 59
- Joined: Sun, 25. Jan 09, 06:01
Nvidia _Program Settings_ for X3:TC
Hello. I don't see X3:TC in my nVidia Program Settings in the nVidia Control Panel. I found that my 2nd vid card wasn't heating up, so I had to go in and create a profile for X3, then change 'SLI Performance Mode' to 'Force ** 2' to get SLI to work. I'm wondering if there are any other settings that will optimize the game for me. I'm not worried about performance; I'm water cooled with dual 750m 800GTXs and a 4GHz extreme 6850. I wonder if someone from Egosoft could post 'best performance' and 'best quality' examples of the settings in the nVidia Control Panel, as well as a 'do not set these options' footnote.
Thanks! Great community here.
-
- Posts: 59
- Joined: Sun, 25. Jan 09, 06:01
Before my OP I had all my settings on "global" (except that SLI had been turned on) in the nVidia Control Panel Program Settings for X3:TC (and the rolling demo). I ran the rolling demo as a benchmark and received an overall average framerate of 71.
I did some research and changed the program settings to what I thought would be best. I now have an overall average framerate of 76.6 AND the game looks much more cinematic! Egosoft, please take a look at these settings and tell me if I'm on the right track:
(showing only items I changed, rest are default)
Antialiasing - Transparency: Multisampling
Force mipmaps: Trilinear
Multi-Display/mixed-GPU acceleration: Single Display performance mode
SLI Performance mode: Force alternate frame rendering 2
Texture filtering - Anisotropic sample optimization: on
Threaded optimization: on
Triple Buffering: on
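For anyone who would rather script this than click through the Control Panel, below is a rough sketch of how the same per-game override could be applied through nVidia's public NVAPI driver-settings (DRS) interface. This is only my own illustration of the idea, not anything official from Egosoft or nVidia: the profile name, the executable name (x3tc.exe) and the SLI_RENDERING_MODE_* constant names are assumptions on my part; the real IDs live in NvApiDriverSettings.h, so check them against your SDK version before trusting this.
[code]
/* Hedged sketch: create a driver profile for X3:TC and force AFR2 via NVAPI's
 * DRS interface on Windows.  The SLI_RENDERING_MODE_* names are assumed from
 * NvApiDriverSettings.h and should be verified against your SDK version. */
#include <string.h>
#include <wchar.h>
#include "nvapi.h"
#include "NvApiDriverSettings.h"

int main(void)
{
    NvDRSSessionHandle session;
    NvDRSProfileHandle profile;
    NVDRS_PROFILE      profileInfo;
    NVDRS_APPLICATION  app;
    NVDRS_SETTING      sliMode;

    if (NvAPI_Initialize() != NVAPI_OK)                return 1;
    if (NvAPI_DRS_CreateSession(&session) != NVAPI_OK) return 1;
    NvAPI_DRS_LoadSettings(session);                   /* read the current driver settings store */

    /* New profile, equivalent to clicking "Add" under Program Settings. */
    memset(&profileInfo, 0, sizeof(profileInfo));
    profileInfo.version = NVDRS_PROFILE_VER;
    wcscpy((wchar_t *)profileInfo.profileName, L"X3 Terran Conflict");
    NvAPI_DRS_CreateProfile(session, &profileInfo, &profile);

    /* Attach the game's executable to the profile (file name is an assumption). */
    memset(&app, 0, sizeof(app));
    app.version = NVDRS_APPLICATION_VER;
    wcscpy((wchar_t *)app.appName, L"x3tc.exe");
    NvAPI_DRS_CreateApplication(session, profile, &app);

    /* SLI performance mode -> "Force alternate frame rendering 2". */
    memset(&sliMode, 0, sizeof(sliMode));
    sliMode.version         = NVDRS_SETTING_VER;
    sliMode.settingId       = SLI_RENDERING_MODE_ID;         /* assumed constant name */
    sliMode.settingType     = NVDRS_DWORD_TYPE;
    sliMode.u32CurrentValue = SLI_RENDERING_MODE_FORCE_AFR2; /* assumed constant name */
    NvAPI_DRS_SetSetting(session, profile, &sliMode);

    NvAPI_DRS_SaveSettings(session);                   /* persist, like clicking Apply */
    NvAPI_DRS_DestroySession(session);
    return 0;
}
[/code]
The other items in my list (transparency AA, threaded optimization, triple buffering and so on) have their own setting IDs in the same header, so they could be pushed the same way once the right constants are looked up.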
-
- Posts: 1399
- Joined: Wed, 6. Nov 02, 20:31
I'm so thankful for this post!
Having checked my settings in the same program, I found that SLi mode wasn't even turned on! Soon corrected.
I adopted most of the settings you have suggested, except for forcing SLi mode, due to the warning that the application has settings to control this. I'll wait for recommendations before fixing it.
Going to jump in game now to see the difference. (Can't believe I've not been utilising my SLi config!)
-
- Posts: 1399
- Joined: Wed, 6. Nov 02, 20:31
Well, using your suggested settings the game crashed while I was in a sector that contained a few of my own complexes and I opened up the comms menu with another station.
So I reverted to the recommended Nvidia settings for the application for my set-up, kept SLi on, and everything ran fine. Not much of a difference from the original set-up, but some slight improvement on the performance hit in busy sectors where I have lots of property. To be honest I couldn't really say, as I cleared the sector of roids since I last visited.
It would be interesting to know more about this forced SLi performance mode, especially if it is required to have SLi function at all.
@OP: Of course your "rig" sounds as if it is much better suited to more complex graphics and performance settings than mine, so your chosen settings may well work out without issue.
-
- Posts: 59
- Joined: Sun, 25. Jan 09, 06:01
I've not yet had a crash with these settings, but then again I'm a n00b and have no stations up yet.
Pelador - the one setting that I think you will need to make sure that you're running to utilize SLI is:
SLI Performance mode: Force alternate frame rendering 2
Now, there is some discussion as to whether 'Force alternate frame rendering 2' or 'Force alternate frame rendering 1' is best for X3:TC, but it has to be one of those (or maybe 'force split-frame rendering') to make both of your video cards work together. The warning is really just saying 'hey, nvidia chooses the best settings for this already and your game should be in the list', but the game isn't in the list of programs in the nvidia console. If you have 'Select an SLI Configuration: Enable SLI' selected in 'Set SLI and PhysX configuration', then take a look at a game in the list on the 'Manage 3D settings' page of the nvidia console. You'll notice that game titles that natively request SLI mode say the following under SLI performance mode: 'NVIDIA recommended (SLI)'.
I'm just really unsure. I searched the internet extensively and found no example configs for nvidia/x3:tc. I'm very anxious to see what Egosoft has to say.
*edit* - btw.. I tried running the demo/benchmark with Select an SLI Configuration set to the nvidia default and only got 52fps instead of 76fps. I watched my second video card temps with Everest Pro and they were steady (2nd card wasn't doing anything).
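To put numbers on how much the second card was (or wasn't) contributing, here's the trivial arithmetic on the averages I've reported above; the only assumption is that the benchmark runs are comparable.
[code]
/* Quick back-of-the-envelope on the rolling-demo averages posted in this thread. */
#include <stdio.h>

int main(void)
{
    const double global_defaults = 71.0;  /* all settings on "global", SLI enabled         */
    const double tuned_settings  = 76.6;  /* per-program settings above, AFR2 forced        */
    const double sli_default     = 52.0;  /* "Select an SLI Configuration" left on default  */
    const double sli_forced      = 76.0;  /* same benchmark with AFR2 forced                */

    printf("tuned vs global defaults: %+.1f%%\n",
           100.0 * (tuned_settings / global_defaults - 1.0));   /* roughly +8%  */
    printf("forced AFR2 vs default SLI config: %+.1f%%\n",
           100.0 * (sli_forced / sli_default - 1.0));           /* roughly +46% */
    return 0;
}
[/code]
In other words, leaving the SLI configuration on the nvidia default cost me roughly a third of my framerate in this benchmark.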
-
- EGOSOFT
- Posts: 54273
- Joined: Tue, 29. Apr 03, 00:56
-
- Posts: 59
- Joined: Sun, 25. Jan 09, 06:01
CBJ wrote: Respond to what? What settings people use is up to them, and is based on their specific hardware and their own personal preferences and priorities. As far as I'm aware nobody at Egosoft has an SLI setup themselves, so giving advice on that wouldn't be easy anyway.
Respond to the best quality/performance settings for modern dual nVidia cards, please. There are some settings that are not good for certain games based on their graphics engines. I assumed that the creators of the software would better understand the dynamics behind the graphics engine and how it is affected by different settings within the nVidia Control Panel. A best performance and a best quality example would be very helpful and would allow end users to decide on settings based on their personal preferences. If there isn't anyone available to do this work I'd understand, you're doing amazing things with X3, but I'm confused as to how a modern company wouldn't have tested such an advanced simulator on a modern system. I believe that the high-end software market is using high-end systems. Could someone there collaborate with an nVidia representative to provide the examples? Perhaps that would be the first step in creating an nVidia profile for X3:TC?
Thanks for your time. I know how busy you must be.
-
- Posts: 295
- Joined: Mon, 19. Jan 09, 17:31
Shawn,
What's "best" depend on what you prefer. some people MUST have FSAA, others prefer oodles of frames. Some prefer VSYNC on, others must have it off. Not even the game's developer can know what settings you prefer to use in your SLI rig, nor is it up to developers to adapt their games to work better with one brand of multi-GPU computing (that's at least superficially why we have APIs).
Best performance: Turn all settings to low or off (but keep SLI to a setting that works).
Best quality: Turn all settings to max or on.
It's not that hard, and it really, really is down to personal preference.
The Nvidia profiles come from Nvidia and the most (only?) relevant part of those is what type of SLI rendering works best with each application. As you know, you can always add a profile if there isn't one created by Nvidia already.
What's "best" depend on what you prefer. some people MUST have FSAA, others prefer oodles of frames. Some prefer VSYNC on, others must have it off. Not even the game's developer can know what settings you prefer to use in your SLI rig, nor is it up to developers to adapt their games to work better with one brand of multi-GPU computing (that's at least superficially why we have APIs).
Best performance: Turn all settings to low or off (but keep SLI to a setting that works).
Best quality: Turn all settings to max or on.
It's not that hard, and it really, really is down to personal preference.
The Nvidia profiles come from Nvidia and the most (only?) relevant part of those is what type of SLI rendering works best with each application. As you know, you can always add a profile if there isn't one created by Nvidia already.
/ Per
(Core i7 920 2.67GHz, 6GB 3chan DDR3 1066MHz, HD4870 512MB, Vista Enterprise 64)
-
- EGOSOFT
- Posts: 54273
- Joined: Tue, 29. Apr 03, 00:56
The previous poster has hit the nail on the head on pretty much all counts.
The only thing I'd like to add is that developers generally try not to work with top of the range equipment in-house. Quite apart from the cost of keeping up with such equipment, doing so tends to give false expectations of how the game will perform for the vast majority of players who are much more likely to have mid-range systems.
-
- Posts: 295
- Joined: Mon, 19. Jan 09, 17:31
The problem with that is that you're developing a game that won't be on the streets until 2 years from now and the high-range machines you bought at the start of development are mid-range machines today and they'll be low-end when you're done. 

/ Per
(Core i7 920 2.67GHz, 6GB 3chan DDR3 1066MHz, HD4870 512MB, Vista Enterprise 64)
-
- EGOSOFT
- Posts: 54273
- Joined: Tue, 29. Apr 03, 00:56
-
- Posts: 59
- Joined: Sun, 25. Jan 09, 06:01
-Egosoft
Let me reiterate.. there are some settings that should be turned off for some games, and others that have more effect based on the engine. There is a combination of settings that creates best-performance and best-quality scenarios depending on how the software was created. Example configs. 'Best' meaning numbers and statistics, not opinions.
I would have preferred no answer rather than 'it doesn't matter'. I'm sure that's not the company policy. At least, I've been in the industry for a very long time and have never seen a company with that policy. In the meantime I'm going to forward this thread to someone at nVidia so they can contact Egosoft and offer some assistance.
CBJ wrote: Respond to what? What settings people use is up to them, and is based on their specific hardware and their own personal preferences and priorities. As far as I'm aware nobody at Egosoft has an SLI setup themselves, so giving advice on that wouldn't be easy anyway.
-Misinformed
perpiotrredman wrote: Best performance: Turn all settings to low or off (but keep SLI to a setting that works). Best quality: Turn all settings to max or on. It's not that hard, and it really, really is down to personal preference.
OK.. so the answer is.. I'll get no answer. I wonder what percentage of nVidia/X3:TC users are using default nVidia settings and in turn not playing the game at its full potential? One user has already posted here that he had no idea how to manually configure his graphics for the game. 'End Users'... remember?
-
- Posts: 59
- Joined: Sun, 25. Jan 09, 06:01
Quoting myself: "In the meantime I'm going to forward this thread to someone at nVidia so they can contact Egosoft and offer some assistance."
Reply from an nVidia executive came almost immediately:
"Shawn,
Thanks for sending this. I will forward it to our Dev. Rel. team so they can look into this.
Regards,
(name omitted)"
I hope this helps.
-
- Posts: 295
- Joined: Mon, 19. Jan 09, 17:31
CBJ wrote: You're making the (false) assumption that there are no PC upgrades during the course of those 2 years.
Hey, you're the one who said not to work on high range machines. In reality, there are SOME engineering samples sent to most developers, unless that's changed drastically in recent years.
The QA department and preferably the engine programmer/s should have some beefier stuff to try out experimental functions on, right?
/ Per
(Core i7 920 2.67GHz, 6GB 3chan DDR3 1066MHz, HD4870 512MB, Vista Enterprise 64)
-
- Posts: 13647
- Joined: Thu, 15. Jul 04, 04:41
-
- Posts: 295
- Joined: Mon, 19. Jan 09, 17:31
Shawn,
Just because you don't like or read the answers you're given that doesn't mean the answers are wrong or that they don't exist.
There are settings that you prefer to enable at the cost of performance and there are settings you shy away from because you don't see the point. If the difference is only in image quality - what numerical metrics would you use to measure it, when it's all down to personal preference?
Who do you believe is responsible for manually configuring your computer the way you want it? And why do you think there are user-configurable and graphical settings and profiles storage in all video drivers if there was one, "best" setting for all games and all customers?
I'm starting to think you may not know what you're talking about, but when it comes to the SLI rasterization mode, you're right that some settings will affect performance immensely. There are also some games that just won't use FSAA even if it is enabled, and others where you MUST force the anisotropic filtering level in the drivers or you'll be stuck with trilinear texture filtering. But if you had any other settings in mind, you're welcome to elucidate your point.
I wonder which industry it is that you have been in.
http://www.mobygames.com/search/quick?q ... on&x=0&y=0
http://www.mobygames.com/developer/shee ... rId,23880/
Good thing you reminded Nvidia to add a preset profile for Terran Conflict. It's just too bad that neither SLI nor Crossfire are compatible / stable enough to use default-on for all applications, but at least every SLI and Crossfire video card and motherboard come with big, colored quickstart guides that show you how to enable it.
/ Per
(Core i7 920 2.67GHz, 6GB 3chan DDR3 1066MHz, HD4870 512MB, Vista Enterprise 64)
-
- Posts: 1399
- Joined: Wed, 6. Nov 02, 20:31
At this stage I'm personally content to run with the recommended settings. But the issue of having to force an SLi performance mode to get actual use of an SLi configuration would be nice to clear up for Nvidia cards.
I doubt there is a mode that may "hurt" my PC or the game (mainly thinking about potential heat variance with settings, but I'd imagine those variables are considered and safe, else there wouldn't be an option; it's not like overclocking), so I'm willing to try various settings to see if there is a benefit.
-
- Posts: 295
- Joined: Mon, 19. Jan 09, 17:31
pelador wrote: But the issue of having to force an SLi performance mode to get actual use of an SLi configuration would be nice to clear up for Nvidia cards.
It is a bother, but it's because some games will not run at all with some SLI settings, so the default setting for games not defined in any profile is that SLI is not used. This way more games will work right off the bat, while Nvidia constantly adds more profiles (in driver updates) for new games verified to work with one or several SLI modes.
pelador wrote: I doubt there is a mode that may "hurt" my PC or the game (mainly thinking about potential heat variance with settings...)
You can only do that through overclocking or setting the video card fan too low.
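If anyone wants to check whether a given exe is actually covered by one of those profiles (rather than guessing from card temperatures), the NVAPI driver-settings (DRS) interface mentioned earlier in the thread can be queried for it. A rough sketch, with the executable name as an assumption and no claim that this is how Nvidia's own tools do it:
[code]
/* Sketch: ask the driver-settings store whether an executable has a profile.
 * If it doesn't, the driver falls back to the global defaults - which is the
 * "SLI stays off for unknown games" behaviour described above. */
#include <stdio.h>
#include <string.h>
#include <wchar.h>
#include "nvapi.h"

int main(void)
{
    NvDRSSessionHandle  session;
    NvDRSProfileHandle  profile;
    NVDRS_APPLICATION   app;
    NVDRS_PROFILE       info;
    NvAPI_UnicodeString exeName;

    memset(&app,  0, sizeof(app));   app.version  = NVDRS_APPLICATION_VER;
    memset(&info, 0, sizeof(info));  info.version = NVDRS_PROFILE_VER;
    memset(exeName, 0, sizeof(exeName));
    wcscpy((wchar_t *)exeName, L"x3tc.exe");     /* assumed executable name */

    if (NvAPI_Initialize() != NVAPI_OK || NvAPI_DRS_CreateSession(&session) != NVAPI_OK)
        return 1;
    NvAPI_DRS_LoadSettings(session);

    if (NvAPI_DRS_FindApplicationByName(session, exeName, &profile, &app) == NVAPI_OK) {
        NvAPI_DRS_GetProfileInfo(session, profile, &info);
        printf("Covered by profile \"%ls\" - driver overrides apply.\n",
               (wchar_t *)info.profileName);
    } else {
        printf("No profile - global defaults apply (SLI mode left on auto).\n");
    }

    NvAPI_DRS_DestroySession(session);
    return 0;
}
[/code]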
/ Per
(Core i7 920 2.67GHz, 6GB 3chan DDR3 1066MHz, HD4870 512MB, Vista Enterprise 64)
-
- Posts: 2928
- Joined: Sat, 6. Mar 04, 16:44
CBJ wrote: You're making the (false) assumption that there are no PC upgrades during the course of those 2 years.
perpiotrredman wrote: The QA department and preferably the engine programmer/s should have some beefier stuff to try out experimental functions on, right?
In a word, no. It would be helpful, and in an ideal world that would be good, but more mileage is gained from testing on minimum-spec machines than on higher-spec machines. Having both not only eats into the QA budget, but also requires more QA people to use it.