3900x to 5900x for X4

This forum is the ideal place for all discussion relating to X4. You will also find additional information from developers here.

Moderator: Moderators for English X Forum

Imperial Good
Moderator (English)
Posts: 4750
Joined: Fri, 21. Dec 18, 18:23
x4

Re: 3900x to 5900x for X4

Post by Imperial Good » Tue, 14. Jun 22, 02:20

Apparently leaks are starting to surface for the GPUs so they do exist behind NDA walls. That said I guess they can join Intel 10nm as far as being cutting edge and delivering on time goes...

Scoob
Posts: 9921
Joined: Thu, 27. Feb 03, 22:28
x4

Re: 3900x to 5900x for X4

Post by Scoob » Tue, 14. Jun 22, 22:17

I was buying some other bits the other day, so I looked again at the 5900X (as a replacement for the 3900X); it's still £360, which is a good price. Once again I went through comparisons done by various YouTubers and tech sites. While the gains - given all other system components are up to the job - were solidly around 15%, that really isn't going to cut it. I mean, say I'm in a CPU-bound 40fps situation: playable if watching / coordinating things, but not so great if dogfighting, for example. I'd gain approximately an additional 6 fps. Not enough to make a difference.

I guess my issue, as explained previously in this thread, was that I'd wanted to wait for Zen 3 as it improved on a number of weaknesses with Zen 2. However, due to my existing (at the time) PC throwing a wobbly, my hand was forced - if I wanted to keep playing games. That's why I check back on Zen 3 stuff from time to time.

I've scratched my upgrade itch recently - first time since 2019 - first with a new 1440p Monitor (I'd been on 1200p for over ten years!) then with a replacement GPU (1070 to 3070). Things are generally running really well, and X4 continues to be the only game even remotely CPU-limited. So, while my FPS isn't any better when things get busy, it's markedly prettier :)

Regarding CPU upgrades, I suspect I'll see what Zen 4 brings and look at doing a full system upgrade then... or not, if I'm happy how this PC is performing still.

Scoob
Posts: 9921
Joined: Thu, 27. Feb 03, 22:28
x4

Re: 3900x to 5900x for X4

Post by Scoob » Wed, 22. Jun 22, 16:26

Here's something weird I've noticed that I certainly did not expect...

With the 1070 - which is still a great GPU btw - I was clearly CPU bound in certain scenes. Zero surprise there. Heavy fleet combat was the main offender here. I will say though, where busy Wharfs - lots of ships landing and taking off, especially Combat Drones - used to hurt fps, that doesn't really happen any more, at least nowhere near to the same degree.

Anyway, since getting the 3070, I knew I'd be able to turn up a few graphical settings - basically all of them lol - but I'd still have issues in the usual CPU-limited (large battles) scenarios, as well as other busy locations. However, and this is the surprising bit, I appear to be getting such issues far, far less than I once was. Sure, I still get the odd random "wonder why fps has tanked; save/reload and all is well" moment. Plus I also still see a marked drop in fps (25%) when there are NPCs wandering around a docking platform vs. not - indeed there's a marked judder and overall (25%) fps drop when they spawn in seconds after teleporting to one of my stations. However, when in a spaceship, doing spaceship stuff, my fps is a LOT more stable.

So, are there some features that the 30-series GPUs can do, where a 10-series GPU might resort to some sort of CPU fallback because it cannot do that particular thing? We've seen this over the years in other titles, of course, where a GPU might not be able to do a certain effect / lacks support for a particular DX feature, so the CPU does it instead - but at a CPU cost.

Just wondering, as it surprised me that previously obviously CPU-limited scenes appear to run smoother now.

Oh, just popped a water block on the 3070, be gone fan noise!

Imperial Good
Moderator (English)
Posts: 4750
Joined: Fri, 21. Dec 18, 18:23
x4

Re: 3900x to 5900x for X4

Post by Imperial Good » Thu, 23. Jun 22, 12:07

Scoob wrote:
Wed, 22. Jun 22, 16:26
So, are there some features that the 30-series GPUs can do, where a 10-series GPU might resort to some sort of CPU fallback because it cannot do that particular thing? We've seen this over the years in other titles, of course, where a GPU might not be able to do a certain effect / lacks support for a particular DX feature, so the CPU does it instead - but at a CPU cost.
Likely driver overhead. Nvidia had notoriously bad DX12 and Vulkan support until the RTX 2000 series, with the GTX 1000 series barely competitive with AMD. I would not be surprised if the RTX 3000 series has further streamlined drivers and hardware design to reduce the amount of CPU driver overhead. Less CPU driver overhead means more CPU time and resources to queue render commands, so more FPS.

If using a PCIe 4.0 compatible CPU then it can also be from the increased PCIe speed. Although not really saving CPU cycles it does cut down on potential bottlenecks that could stall frame rendering such as moving data between RAM and video memory.

There are also a number of other technologies that can help with GPU performance. For example, GPU scheduling offload to further reduce CPU driver overhead, resizable BAR support to further reduce driver overhead and improve memory copy performance, etc.

However, without corroboration from other people you should take your own results with a pinch of salt. It could easily be that changing the GPU also reset some driver settings to defaults which perform better than what you had your previous card set to. It could even be that your previous card was slightly unstable and so suffering a large performance penalty you were not aware of.

Scoob
Posts: 9921
Joined: Thu, 27. Feb 03, 22:28
x4

Re: 3900x to 5900x for X4

Post by Scoob » Thu, 23. Jun 22, 17:08

I didn't realise they'd improved things significantly, so didn't think of driver overhead reduction. Could well be that. As I already had an NV card, I didn't even change the drivers, so all settings were preserved. I've since started tweaking things though. I guess I should just accept it's running better and be content with that, though I was surprised.

PC is so quiet now that both the CPU and GPU are water-cooled.

geckoproductions
Posts: 22
Joined: Tue, 31. Aug 10, 09:48

Re: 3900x to 5900x for X4

Post by geckoproductions » Fri, 24. Jun 22, 01:13

Hey, so I have a 5900X coupled with a 6900 XT AMD card, and I have to be honest, it never goes above 30% CPU usage. I don't know if that is an issue with optimisation or if my capacity is just more than enough, but I drop to 28fps or something just using the map, in 4K. Even in SETA mode it only drops down to 30fps or so, and things are pretty smooth. So my 2 cents: I definitely have more than X4 needs/can use.

jlehtone
Posts: 21801
Joined: Sat, 23. Apr 05, 21:42
x4

Re: 3900x to 5900x for X4

Post by jlehtone » Fri, 24. Jun 22, 11:12

The 5900X has 12c/24t? Windows shows 100% when all cores are in full use? Then "30%" would be a bit over 3 cores' worth, which is what the game is known to be able to use. (Linux shows usage per core, so your CPU in full use -- all 24 "logical cores" -- would read 2400% where Windows shows just 100%.)

X4 has one main thread that easily consumes 100% of one core and that is where the limit lies.
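The percentage arithmetic above can be put in a short sketch. The core/thread counts are for the 5900X as discussed in the thread; the 30% figure is the one reported earlier in the post above.

```python
# Rough sketch of the utilisation arithmetic discussed above.
# Windows Task Manager reports one usage figure averaged over ALL
# logical processors, so "30%" on a 12c/24t 5900X works out to:
LOGICAL_CPUS = 24              # 12 cores x 2 SMT threads (5900X)

windows_usage = 0.30           # the "30%" figure from the post
busy_logical = windows_usage * LOGICAL_CPUS
print(busy_logical)            # about 7.2 logical processors' worth of work

# Linux tools such as top count 100% per logical processor, so the
# same machine fully loaded would read:
linux_full_load = LOGICAL_CPUS * 100
print(linux_full_load)         # 2400 (percent)
```

The same conversion explains why a single saturated main thread barely registers in Windows: 100% of one logical processor is only about 4% of the total on this CPU, even though it is the frame-rate limit.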
Goner Pancake Protector X
Insanity included at no extra charge.
There is no Box. I am the sand.

Imperial Good
Moderator (English)
Posts: 4750
Joined: Fri, 21. Dec 18, 18:23
x4

Re: 3900x to 5900x for X4

Post by Imperial Good » Fri, 24. Jun 22, 13:58

jlehtone wrote:
Fri, 24. Jun 22, 11:12
The 5900x has 12c/24t?
That is correct.
jlehtone wrote:
Fri, 24. Jun 22, 11:12
The Windows shows 100% when all cores are in full use?
Windows shows 100% CPU usage when all threads are in use. Due to how SMT works, this means that at 50% CPU usage it is possible that close to 100% of core resources are already being used, with the other 50% of usage mostly made up by stealing half the cycles from the threads already working. Technically SMT allows the use of cycles or core resources that a single thread cannot fully use, but in most gaming workloads this extra computation results in ~5% more performance at best, and often a slight performance regression due to the halved per-thread performance.
jlehtone wrote:
Fri, 24. Jun 22, 11:12
Then "30%" would be bit over 3 cores
Using the above logic, this means that ~7 logical processors are being kept almost fully busy. To be honest this is unusually high for X4, which usually only loads 2-3 heavily.
jlehtone wrote:
Fri, 24. Jun 22, 11:12
Linux shows "per core", so your CPU in full use -- all 24 "logical cores" -- would be 2400%
Linux shows usage per thread/logical processor. For the 5900X this means that most CPU resources are already in use at just 1,200%, since anything beyond that comes from the SMT threads, which give minimal additional throughput in most workloads.
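The SMT point above can also be put in rough numbers. This is only a sketch of the reasoning in the post, using the ~5% best-case uplift figure quoted there (an assumption, not a measured value):

```python
# Rough model of why "50% usage" on an SMT CPU can mean the cores
# are already nearly saturated. Numbers follow the post above:
# a 5900X has 12 physical cores / 24 threads, and SMT adds ~5%
# throughput at best in typical gaming workloads.
PHYSICAL_CORES = 12
SMT_UPLIFT = 0.05          # assumed best case, from the post above

# Relative throughput with one thread per core (reads as "50%" usage):
one_thread_per_core = PHYSICAL_CORES * 1.0

# Relative throughput with both SMT threads loaded ("100%" usage):
both_threads = PHYSICAL_CORES * (1.0 + SMT_UPLIFT)

extra = both_threads / one_thread_per_core - 1.0
print(f"{extra:.0%}")      # the second "50%" of usage adds only ~5%
```

In other words, usage percentages on an SMT CPU are not proportional to throughput: the first half of the scale is worth far more than the second.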
