3900x to 5900x for X4

This forum is the ideal place for all discussion relating to X4. You will also find additional information from developers here.

Moderator: Moderators for English X Forum

Imperial Good
Moderator (English)
Posts: 4764
Joined: Fri, 21. Dec 18, 18:23
x4

Re: 3900x to 5900x for X4

Post by Imperial Good » Wed, 20. Apr 22, 03:48

Diroc wrote:
Wed, 20. Apr 22, 01:25
I can run a little cooler with my FX-8150 only using 4 of the 8 supposed cores.
The FX-8150 is not really comparable to the processors of today, in a bad way. Although it is "8 core" it has core pairs which share a floating point unit so for any application that involves floating point units, such as most games, it effectively only has 4 cores. This means that if you disable 4 cores that make up 2/4 pairs you are likely forcing X4 to run on a dual core processor which will impact performance significantly. If you disable 1 core from each pair then performance should not really change at all, but neither should power consumption unless those cores could not idle efficiently. There was a reason these processors nearly bankrupted AMD...

For more modern AMD processors I would generally recommend adjusting the power limit if you are concerned about thermals. This would give applications access to full boost speed and all cores as they want while letting the processor governor manage balancing the power draw. Idle cores use very little power and in power limited situations, such as all core workloads, the cores will boost to a lower, more energy efficient, frequency. The only time disabling the cores could give better performance is if the application has very bad scaling with core count to the point that the gains from all the extra parallelism it is using are less than the gains from slightly higher core clocks.

With modern AMD and Intel CPUs you should not even need to worry about thermal throttling since they are designed to deal with it for extended periods. As long as the cooling solution is reasonable (good contact, removes some fraction of the heat, fans not obstructed by a solid sheet of glass) they should just work and automatically drop their power usage to match what the cooling solution can remove. How good this is for processor longevity is another matter, one I do not know a definite answer for, which is why some people might prefer to lower the power limit.

Scoob
Posts: 10072
Joined: Thu, 27. Feb 03, 22:28
x4

Re: 3900x to 5900x for X4

Post by Scoob » Wed, 20. Apr 22, 16:37

Looks like the 5800X3D is £410 in the uk - that's £50 more than the 5900X is currently sold for. Not as good a price as I'd hoped, but not really a surprise. Will have a ponder about what to do. Really though, in titles other than X4, I'm more GPU-limited. I can turn a few settings down to get my desired minimum FPS (60), I'm fine with that. However, the 1070 does seem to be working quite hard these days.

Action_Parsnip
Posts: 12
Joined: Wed, 19. Sep 12, 17:26

Re: 3900x to 5900x for X4

Post by Action_Parsnip » Sat, 23. Apr 22, 04:31

I am extremely interested in seeing what a 5800X3D can do in this game! The UK pricing is more than a 5900X, but if you're mainly gaming then I don't see a better choice?

Also re: new GPUs, the pricing situation seems very fluid right now, Ethereum pricing is solidly on the low end and GPU prices are dropping week by week. I'd say give it a month and things might be dropping through the floor.

Scoob
Posts: 10072
Joined: Thu, 27. Feb 03, 22:28
x4

Re: 3900x to 5900x for X4

Post by Scoob » Sat, 23. Apr 22, 13:42

Same, I'd love to see what results people get. The 5800X3D is currently pre-order only at Scan; they sold through the initial stock quite quickly it seems. It does irk that the excellent 5900X is £50 cheaper though. If the 5800X3D had been the same price as the 5900X (£360) I'd have been fine with that. It's just that it's ONLY X4 that needs the extra CPU (especially single-threaded), as other taxing games I have multi-thread really well. X4 does not.

I was tempted by the 3080Ti Founder's Edition at £1,049 - lower than any other 3080Ti bar the KFA - but it sold out quickly. With summer almost here - had some stunning weather already in the UK - I know I'll be using the PC less, so it makes sense to wait if I can as the 40-series (and AMD's products) are due 2H / Q3/4 and look to be quite a jump.

BigBANGtheory
Posts: 3168
Joined: Sun, 23. Oct 05, 12:13
x4

Re: 3900x to 5900x for X4

Post by BigBANGtheory » Sun, 24. Apr 22, 11:03

Apparently if you have an MSI MEG X570 series motherboard they've enabled overclocking for the new 5800X3D :)

This is very useful because the 5800X3D's clock speeds were reduced compared to the 5800X, so overclocking will potentially help its performance in more scenarios - assuming, of course, you can squeeze more speed out of it.

I like AMD products, but I'm saddened by their pricing decisions and of course those of their competitors. I guess the moral of the story is that when demand is high there is no real competition, just alternatives, and then all parties have a vested interest in keeping their margins high. :|

Socratatus
Posts: 1502
Joined: Tue, 11. May 04, 15:34
x4

Re: 3900x to 5900x for X4

Post by Socratatus » Sun, 24. Apr 22, 15:01

I'm pretty pleased with my 5900X, runs X4 flawlessly so far.
"If you`re looking for that one person who can change your life, take a look in the mirror."
"No problem can withstand the assault of sustained thinking."
"Don`t raise your voice. improve your argument."
"Some men are morally opposed to violence. They are protected by men who are not."

Valhalla_Awaits
Posts: 42
Joined: Sun, 24. Apr 22, 04:07
x4

Re: 3900x to 5900x for X4

Post by Valhalla_Awaits » Mon, 25. Apr 22, 05:04

Socratatus wrote:
Sun, 24. Apr 22, 15:01
I'm pretty pleased with my 5900X, runs X4 flawlessly so far.
Vanilla X4, sure - even a much older CPU can. The issue is when you start using mods that increase station counts from vanilla's capped limit, or fleet sizes, or allow for "actual" wars to happen. Then even the best CPUs beg for the sweet release of the blue screen of death. :P

I am also very curious how the new 5800X3D performs in X4. I'll likely wait for the 6000 series to add 3D V-Cache, but that won't be till mid/late 2023 it seems. But I'd like to know how much the new 3D V-Cache helps with the poorly threaded X4.

Animaga
Posts: 6
Joined: Tue, 9. Mar 21, 06:54
x4

Re: 3900x to 5900x for X4

Post by Animaga » Mon, 25. Apr 22, 06:51

Valhalla_Awaits wrote:
Mon, 25. Apr 22, 05:04
Socratatus wrote:
Sun, 24. Apr 22, 15:01
I'm pretty pleased with my 5900X, runs X4 flawlessly so far.
Vanilla X4, sure - even a much older CPU can. The issue is when you start using mods that increase station counts from vanilla's capped limit, or fleet sizes, or allow for "actual" wars to happen. Then even the best CPUs beg for the sweet release of the blue screen of death. :P

I am also very curious how the new 5800X3D performs in X4. I'll likely wait for the 6000 series to add 3D V-Cache, but that won't be till mid/late 2023 it seems. But I'd like to know how much the new 3D V-Cache helps with the poorly threaded X4.
Just made a video and posted in the FPS thread about the 5800X3D I just got.

PromX
Posts: 122
Joined: Thu, 21. Feb 08, 23:39
x4

Re: 3900x to 5900x for X4

Post by PromX » Fri, 29. Apr 22, 18:08

I wonder how big the difference would be if I upgraded from my current R5 3600 to, for example, the 5700X.

BigBANGtheory
Posts: 3168
Joined: Sun, 23. Oct 05, 12:13
x4

Re: 3900x to 5900x for X4

Post by BigBANGtheory » Mon, 13. Jun 22, 23:51

Imperial Good wrote:
Tue, 19. Apr 22, 00:26
BigBANGtheory wrote:
Sun, 17. Apr 22, 21:45
In 6-8months time Intel Arc will be competing against Navi 33 with perf levels around a 6900XT way above 3070 lvls....
And probably with matching price tag way above the 3070... Even if Intel Arc is not high end, all it has to do is provide a modern feature set with good value proposition.
or nothing at all it seems :? Arc is later than late as predicted...

Imperial Good
Moderator (English)
Posts: 4764
Joined: Fri, 21. Dec 18, 18:23
x4

Re: 3900x to 5900x for X4

Post by Imperial Good » Tue, 14. Jun 22, 02:20

Apparently leaks are starting to surface for the GPUs so they do exist behind NDA walls. That said I guess they can join Intel 10nm as far as being cutting edge and delivering on time goes...

Scoob
Posts: 10072
Joined: Thu, 27. Feb 03, 22:28
x4

Re: 3900x to 5900x for X4

Post by Scoob » Tue, 14. Jun 22, 22:17

I was buying some other bits the other day, so I looked again at the 5900X (as a replacement for the 3900X); it's still £360, which is a good price. Once again I went through various comparisons done by various YouTubers and tech sites. While the gains - given all other system components are up to the job - were solidly around 15%, that really isn't going to cut it. I mean, say I'm in a CPU-bound 40 fps situation: playable if watching / coordinating things, but not so great if dogfighting, for example. I'd gain approximately an additional 6 fps. Not enough to make a difference.

I guess my issue, as explained previously in this thread, was that I'd wanted to wait for Zen 3 as it improved on a number of weaknesses with Zen 2. However, due to my existing (at the time) PC throwing a wobbly, my hand was forced - if I wanted to keep playing games. That's why I check back on Zen 3 stuff from time to time.

I've scratched my upgrade itch recently - first time since 2019 - first with a new 1440p Monitor (I'd been on 1200p for over ten years!) then with a replacement GPU (1070 to 3070). Things are generally running really well, and X4 continues to be the only game even remotely CPU-limited. So, while my FPS isn't any better when things get busy, it's markedly prettier :)

Regarding CPU upgrades, I suspect I'll see what Zen 4 brings and look at doing a full system upgrade then... or not, if I'm happy how this PC is performing still.

Scoob
Posts: 10072
Joined: Thu, 27. Feb 03, 22:28
x4

Re: 3900x to 5900x for X4

Post by Scoob » Wed, 22. Jun 22, 16:26

Here's something weird I've noticed that I certainly did not expect...

With the 1070 - which is still a great GPU btw - I was clearly CPU bound in certain scenes. Zero surprise there. Heavy fleet combat was the main offender. I will say though, where busy wharfs - lots of ships landing and taking off, especially Combat Drones - used to hurt fps, that doesn't really happen any more, or at least nowhere near to the same degree.

Anyway, since getting the 3070, I knew I'd be able to turn up a few graphical settings - basically all of them lol - but I'd still have issues in the usual CPU-limited (large battles) scenarios, as well as other busy locations. However, and this is the surprising bit, I appear to be getting such issues far, far less than I once was. Sure, I still get the odd random "wonder why fps has tanked; save / reload, all is well" moment. Plus I also still see a marked drop in fps (25%) when there are NPCs wandering around a docking platform vs. not - indeed there's a marked judder and overall (25%) fps drop when they spawn in seconds after teleporting to one of my stations. However, when in a spaceship, doing spaceship stuff, my fps is a LOT more stable.

So, are there some features the 30-series GPUs can do that a 10-series GPU cannot, causing it to resort to some sort of CPU fallback? We've seen this over the years in other titles, of course, where a GPU might not be able to do a certain effect / lacks support for a particular DX feature, so the CPU does it instead - but at a CPU cost.

Just wondering, as it surprised me that previously obviously CPU-limited scenes appear to run smoother now.

Oh, just popped a water block on the 3070, be gone fan noise!

Imperial Good
Moderator (English)
Posts: 4764
Joined: Fri, 21. Dec 18, 18:23
x4

Re: 3900x to 5900x for X4

Post by Imperial Good » Thu, 23. Jun 22, 12:07

Scoob wrote:
Wed, 22. Jun 22, 16:26
So, are there some features the 30-series GPUs can do that a 10-series GPU cannot, causing it to resort to some sort of CPU fallback? We've seen this over the years in other titles, of course, where a GPU might not be able to do a certain effect / lacks support for a particular DX feature, so the CPU does it instead - but at a CPU cost.
Likely driver overhead. Nvidia had notoriously bad DX12 and Vulkan support until the RTX 2000 series, with the GTX1000 series barely getting things competitive with AMD. I would not be surprised if the RTX3000 series has further streamlined drivers and hardware design to reduce the amount of CPU driver overhead. Less CPU driver overhead means more CPU time and resources to queue render commands so more FPS.

If using a PCIe 4.0 compatible CPU then it can also be from the increased PCIe speed. Although not really saving CPU cycles it does cut down on potential bottlenecks that could stall frame rendering such as moving data between RAM and video memory.

There are also a number of other technologies that can help with GPU performance. For example, GPU scheduling offload to further reduce CPU driver overhead, resizable BAR support to further reduce driver overhead and improve memory copy performance, etc.

However, without corroboration from other people, you should take your own results with a pinch of salt. It could easily be that changing the GPU also reset some driver settings back to defaults which perform better than what your previous card was set to. It could even be that your previous card was slightly unstable and so suffered a large performance penalty you were not aware of.

Scoob
Posts: 10072
Joined: Thu, 27. Feb 03, 22:28
x4

Re: 3900x to 5900x for X4

Post by Scoob » Thu, 23. Jun 22, 17:08

I didn't realise they'd improved things significantly, so didn't think of driver overhead reduction. Could well be that. As I already had an NV card, I didn't even change the drivers, so all settings were preserved. I've since started tweaking things though. I guess I should just accept it's running better and be content with that, though I was surprised.

PC is so quiet now both CPU and GPU are water-cooled.

geckoproductions
Posts: 22
Joined: Tue, 31. Aug 10, 09:48

Re: 3900x to 5900x for X4

Post by geckoproductions » Fri, 24. Jun 22, 01:13

Hey, so I have a 5900X coupled with a 6900 XT AMD card, and I have to be honest, it never goes above 30% CPU usage. I don't know if that is an issue with optimisation or if my capacity is just more than enough, but I drop to 28 fps or something just using the map, in 4K. Even in SETA mode it only drops down to 30 fps or so, and things are pretty smooth. So my 2 cents: I definitely have more than X4 needs / can use.

jlehtone
Posts: 21810
Joined: Sat, 23. Apr 05, 21:42
x4

Re: 3900x to 5900x for X4

Post by jlehtone » Fri, 24. Jun 22, 11:12

The 5900X has 12c/24t? Windows shows 100% when all cores are in full use? Then "30%" would be a bit over 3 cores, which is what the game is known to be able to use. (Linux shows "per core", so your CPU in full use -- all 24 "logical cores" -- would be 2400%, when Windows shows just 100%.)

X4 has one main thread that easily consumes 100% of one core and that is where the limit lies.
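The arithmetic above can be sketched quickly. A rough illustration (the 12c/24t figures are the 5900X's specs, the 30% reading comes from the post above; the helper names are made up for this sketch):

```python
# Convert an aggregate Windows-style CPU percentage into busy logical
# processors, and back into Linux top-style per-core percentages.
# Figures: 5900X = 12 physical cores, 24 logical processors (SMT).

LOGICAL_CPUS = 24

def busy_logical_cpus(windows_percent: float) -> float:
    """Windows Task Manager averages usage over ALL logical processors."""
    return windows_percent / 100 * LOGICAL_CPUS

def linux_style_percent(busy_threads: float) -> float:
    """Linux tools such as top count 100% per fully busy logical CPU."""
    return busy_threads * 100

print(round(busy_logical_cpus(30), 1))   # ~7.2 logical processors busy
print(linux_style_percent(24))           # 2400.0 (%) = whole CPU saturated
```

So a "30%" Task Manager reading corresponds to roughly 7 busy logical processors, or a bit over 3 physical cores once SMT sibling threads are paired up.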
Goner Pancake Protector X
Insanity included at no extra charge.
There is no Box. I am the sand.

Imperial Good
Moderator (English)
Posts: 4764
Joined: Fri, 21. Dec 18, 18:23
x4

Re: 3900x to 5900x for X4

Post by Imperial Good » Fri, 24. Jun 22, 13:58

jlehtone wrote:
Fri, 24. Jun 22, 11:12
The 5900x has 12c/24t?
That is correct.
jlehtone wrote:
Fri, 24. Jun 22, 11:12
The Windows shows 100% when all cores are in full use?
Windows shows 100% CPU usage when all threads are in use. Due to how SMT works this means that at 50% CPU usage it is possible that as good as 100% of core resources are already being used with the other 50% usage being made up by mostly stealing half the cycles from the 50% that is already working. Technically SMT allows the use of cycles or core resources that a single thread cannot fully use, but in most gaming workloads this extra computation results in ~5% more performance at best and often a slight performance regression due to the halved per thread performance.
jlehtone wrote:
Fri, 24. Jun 22, 11:12
Then "30%" would be bit over 3 cores
Using the above logic this means that ~7 CPU cores are being almost fully used. To be honest this is unusually high for X4, with it usually only loading 2-3 heavily.
jlehtone wrote:
Fri, 24. Jun 22, 11:12
Linux shows "per core", so your CPU in full use -- all 24 "logical cores" -- would be 2400%
Linux shows per thread/logical processor. For the 5900X this would mean that most CPU resources are being used at just 1,200% since beyond that is all SMT threads which give minimal additional throughput in most workloads.
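The SMT argument above can be put into a toy model. This is only a sketch under simplifying assumptions (the ~5% SMT gain figure and the "fill one thread per physical core first" scheduling order are taken as given, not measured):

```python
# Toy model: on a 12c/24t CPU, assume the scheduler fills one thread per
# physical core first; the second SMT thread on each core then adds only
# ~5% extra throughput. effective_cores() estimates how many "full cores"
# of real work an aggregate usage percentage represents.

CORES = 12
THREADS = 24
SMT_GAIN = 0.05  # assumed throughput gain from a core's second SMT thread

def effective_cores(usage_percent: float) -> float:
    busy_threads = usage_percent / 100 * THREADS
    primary = min(busy_threads, CORES)        # first thread on each core
    secondary = max(busy_threads - CORES, 0)  # SMT sibling threads
    return primary + secondary * SMT_GAIN

print(effective_cores(50))    # 12.0 -> physical cores already saturated
print(effective_cores(100))   # ~12.6 -> SMT adds only ~5% on top
```

In other words, under these assumptions 50% reported usage can already mean the physical cores are fully loaded, and going from 50% to 100% reported usage adds very little real throughput.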
