nVidia 40 and 50 series GPU owners, how's Frame Generation?


Scoob
Posts: 11800
Joined: Thu, 27. Feb 03, 22:28
x4

nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Scoob »

Hey all,

Frame generation. I know there's a lot of talk about it not helping with input lag. A 30fps title running at a frame-generated 120fps would still feel like 30fps - or less. In many titles, this simply would not be a nice experience, as others have shared. However, for me at least, X4 is a bit of a special case. It's only larger battles that ever cause my FPS to drop significantly below 60.

Now, such battles usually involve me as a pure spectator, so any input lag is irrelevant, it's all about the visuals. So, if frame generation could reliably turn a 30fps battle into an (at least) 60fps one, I'd be very happy.

Currently I have an RTX 3070, which is still running almost everything I play GREAT at 1440p. The one exception being X4 during the aforementioned larger battles. As, during these battles, I'm NOT seeing either CPU or GPU pushed hard, I did wonder if Frame Generation would work as well as I'd hope on a 50 series card.

At this moment, I have an RTX 5070Ti in my basket, along with an appropriate water block. I had planned to wait for the 50 Super series, or perhaps even 60 series (or AMD equivalent) before I upgraded next. However, the whole memory situation, expected price increases, doubt if 50 series Super will even be a thing AND the fact that said 5070Ti is at an all-time low at the moment got me thinking.

To be clear, I'm perfectly happy with the performance I get in every other title I play currently. X4 is the exception, but low frames are not down to the GPU, or CPU, being maxed out. Of course, a better GPU would allow me to turn up a few settings in some titles no doubt, while maintaining a good fps, but it's not going to be night and day in that regard as I'm fine with what I have in those other titles.

I'm hoping those who've played X4 both before and after frame generation can share their experiences.

Side note: Typing here as a large 30fps battle plays out between my fleet (Destroyers, Carriers and LOADS of fighters) and a large Xenon incursion fleet. Epic, but far from smooth.
Blaze1st
Posts: 574
Joined: Thu, 13. Feb 25, 13:42

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Blaze1st »

I run on a 4090 and frame generation makes no difference so far as I can tell.

None of the graphics settings have much impact to be honest so I just leave it at 4k@60Hz with everything on ultra. Still drops in busy scenes from 58fps (driver capped) to mid 40s.
Falcrack
Posts: 5980
Joined: Wed, 29. Jul 09, 00:46
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Falcrack »

I have a 5070, upgraded from a 3060 Ti. I have an i5-12600K CPU with 32 GB RAM. I tried frame generation a couple of times, and it sucked compared to having it off: there were weird screen artifacts, and fps often seemed lower than with it off. The 5070 is powerful enough for me that no frame gen is necessary to get great frame rates in X4. It was definitely a great upgrade from my 3060 Ti. Despite X4 being described as CPU limited, the graphics card upgrade was a noticeably better experience.

But maybe I just set frame gen up wrong. I never found a situation where frame gen was needed to get me the frames I wanted, since I've been satisfied with the frames when it's off.
Scoob
Posts: 11800
Joined: Thu, 27. Feb 03, 22:28
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Scoob »

Blaze1st wrote: Fri, 28. Nov 25, 02:40 I run on a 4090 and frame generation makes no difference so far as I can tell.

None of the graphics settings have much impact to be honest so I just leave it at 4k@60Hz with everything on ultra. Still drops in busy scenes from 58fps (driver capped) to mid 40s.
No difference? That's a shame.
Falcrack wrote: Fri, 28. Nov 25, 05:36 I have a 5070, upgraded from a 3060 Ti. I have an i5-12600K CPU with 32 GB RAM. I tried frame generation a couple of times, and it sucked compared to having it off: there were weird screen artifacts, and fps often seemed lower than with it off. The 5070 is powerful enough for me that no frame gen is necessary to get great frame rates in X4. It was definitely a great upgrade from my 3060 Ti. Despite X4 being described as CPU limited, the graphics card upgrade was a noticeably better experience.

But maybe I just set frame gen up wrong. I never found a situation where frame gen was needed to get me the frames I wanted, since I've been satisfied with the frames when it's off.
Hmm, strike two for frame generation it seems.

For me (5800X3D + 3070 + 32GB DDR4 3600 + fast M.2s in RAID 0), fps is generally nothing short of excellent. The only situation in which I'm GPU-bound (GPU at 99/100%) is in certain sectors with cloud effects; Heretic's End is a perfect example. Turning fog down a bit helps, but I like the full fog effect in other sectors that don't suffer nearly as much. There's an impact on GPU load, sure, but not enough to drop things below the 60Hz cap I set, so I keep it on.

The CPU is generally having a pretty easy time of it too; it rarely ramps up the clocks, and I'll often see several cores not going above 2GHz. They're working, but not that hard. When things start to get busier (more ships engaging in battle, for example) I see the frequency on one core push just over 4GHz, with the GPU generally not taxed. However, it's when fps does start to dip (I'm vsyncing at 60, remember, so that's the hard cap), going below 50 and into the 40s, that things get weird.

When fps drops in larger battles, what I see is always the same:

- GPU load DROPS; it's not being pushed.
- CPU load DROPS across all threads. The previously "busiest" thread, pushing 4GHz or a little over, drops to under 3 or even under 2GHz.
- Alongside this, both CPU and GPU power use drop, and temps take a marked drop too.
- Basically, the PC hits a point where it stops working harder as things get busier, and hugely reduces resource usage.

This is the consistent pattern for me. Things start to get busier, the PC works harder, then things hit some sort of threshold and fps suddenly drops markedly (into the 40s or even 30s) while both CPU and GPU are working less. Temps and power use reflect this. It's weird. The drop is very sudden too; it's not like it loses a frame each time a new ship shows up. Rather, it's running great (~60fps), then it suddenly tanks to the 40s or even 30s. It's the sudden nature of the drop that's most jarring, I think. It's like something is suddenly becoming overloaded, perhaps by all those projectiles, ship manoeuvres, calculated hits etc. But the drop is sudden.

My hope was that, as there are certainly (in theory) CPU and GPU resources to spare, even on a 3070, a 40 or 50 series GPU with frame generation could happily fill in the gaps. Input responsiveness isn't really an issue, as I'm generally just watching / directing these battles, but seeing these epic battles play out a bit smoother would be lovely. Sadly though, it seems that it just doesn't work with X4. That's a real shame.

To be totally clear, I'm generally more than happy with X4's performance. It's only the fog effect, particularly in Heretic's End, that really taxes my GPU for some reason, even if I'm NOT looking at said clouds; proximity is enough for fps to tank and GPU load to skyrocket. This can be addressed by turning down a few game settings, however. It's just those large, epic battles that bring the game to a crawl.

My prior 3900X exhibited the same issue when things got busy, but it's much more marked on the 5800X3D, perhaps because the 5800X3D quite literally DOUBLED the fps I saw previously in many instances, so when it tanks it seems harsher. Note that when testing I lifted that 60fps limit, of course, though I generally game at 60 or 120 depending on the title (G-Sync monitor). A friend with a 12900K & 3090 (now actually a 5090) sees the same thing. His overall fps is generally slightly lower than mine though, outside of fog, as X4 really likes X3D it seems.

I suppose my dream for X4 is ever better performance. However, that's specifically in larger battles. General performance flying around busy sectors, being near one of my huge stations etc. is perfectly fine. Great even. It's just those battles and the, quite frankly, just plain weird resource usage drop of my PC (and two other PCs) when things get "too" busy.

So, that 5070Ti and block are still in my basket. I almost bought, but held off. I gamed for a bit last night, several different games, and they all ran great. For my specific set of titles, the 3070 is still doing a great job. The only exception is X4, and low frames there come with low GPU and CPU usage too, so I'm not convinced a 5070Ti will help there without working frame generation.
xavii
Posts: 68
Joined: Sat, 4. Feb 17, 00:06
x3ap

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by xavii »

I had a 9800X3D with a 4070 and I was not happy with the performance playing on a 32-inch 4K monitor, but it was not bad. I upgraded to a 5080 and now I have a very smooth framerate with FG on. I set the limit to 120 frames. Input lag is no issue, even with multi-FG on.
I play with maxed-out graphics settings (except volumetric fog, which is set to low) and with Nvidia DLAA, which provides great sharpness with good anti-aliasing. I don't think 120 frames are necessary in this game, but I like the extra smoothness. Yes, there are some artefacts when using FG, but you really have to look for them, and they are easy to ignore.
In big battles or around very large stations the framerate still goes down (is the GPU or the CPU the bottleneck in this case? I don't know), but I don't spend most of my time in big battles.

I can absolutely recommend FG. If your system is potent enough, I would use DLAA instead of DLSS; it makes the picture look so much clearer and has an almost 3D-like effect. For me, an X game has never looked so good or run so smoothly.

My only wish at the moment is that they add native HDR support. I like the increased contrast with HDR (the Windows HDR option), even though it causes significant colour banding when activated; it still looks better than SDR most of the time.
Scoob
Posts: 11800
Joined: Thu, 27. Feb 03, 22:28
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Scoob »

xavii wrote: Fri, 28. Nov 25, 17:56 I had a 9800X3D with a 4070 and I was not happy with the performance playing on a 32-inch 4K monitor, but it was not bad. I upgraded to a 5080 and now I have a very smooth framerate with FG on. I set the limit to 120 frames. Input lag is no issue, even with multi-FG on.
I play with maxed-out graphics settings (except volumetric fog, which is set to low) and with Nvidia DLAA, which provides great sharpness with good anti-aliasing. I don't think 120 frames are necessary in this game, but I like the extra smoothness. Yes, there are some artefacts when using FG, but you really have to look for them, and they are easy to ignore.
In big battles or around very large stations the framerate still goes down (is the GPU or the CPU the bottleneck in this case? I don't know), but I don't spend most of my time in big battles.

I can absolutely recommend FG. If your system is potent enough, I would use DLAA instead of DLSS; it makes the picture look so much clearer and has an almost 3D-like effect. For me, an X game has never looked so good or run so smoothly.

My only wish at the moment is that they add native HDR support. I like the increased contrast with HDR (the Windows HDR option), even though it causes significant colour banding when activated; it still looks better than SDR most of the time.
Thanks for this. Your experience seems the polar opposite of others.

For me, at 1440p with the 3070, I was limiting myself to 120fps (the limit of my G-Sync monitor with HDR enabled) but I'd still get GPU-limited dips, especially in foggy areas. So, I turned things back down to my prior 60fps setting. I can both see and feel the difference between 60 and 120fps. The former looks pretty smooth, but I do feel the input delay a bit more. However, when watching rather than flying myself, 60 works just fine.

I was using DLAA until recently. It looks pretty decent, and is great at making distant objects look distant. However, I felt it was sometimes a little too out of focus for closer stuff. It's good, but I'm a little torn between liking it or not. I'm X2 sagt Bussi auf Bauch like that lol.

Regarding HDR: when I got my first HDR monitor (the one I'm using currently) I was shocked that an 8-bit colour title could look so very much better. All the games I play look so much better on this monitor IF they run in HDR mode. Then I moved from Windows 10 to Windows 11 and visuals improved even more. I really didn't expect that. I don't think I've actually got any titles that are natively 10-bit, but whatever Windows is doing in regards to HDR, things do look a lot better.

For clarity: even with Frame Gen enabled, you're saying that large battles still see a significant frame rate drop? I LOVE watching large battles, so that's where I'd like to see improvements. General performance is great; battles, though, are where it drops.
Scoob
Posts: 11800
Joined: Thu, 27. Feb 03, 22:28
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Scoob »

After much procrastinating, I ordered that 5070Ti. While I'm still pretty happy with the 3070, various factors have come into play such as "It's a good offer" (as low as it's ever been), "Prices are likely going up", "stock can be hit or miss", "I want it". That sort of thing. Also, considering my current PC, the 5070Ti looks like a really nice match for the 5800X3d - the last upgrade for this platform methinks.

I've read a number of reports of people saying things are just smoother / nicer with the 50 series over older GPUs. Not a particularly scientific metric, but having experienced similar jumps over the years when upgrading, I do get it.

I will find out for myself how well things work in X4, but it's not the be-all and end-all of the upgrade. Obviously other titles can benefit too, and perhaps things will just "feel better" on the newer GPU. Plus I can play with settings that were off-limits (too heavy an fps hit) or simply weren't available with the 30 series. I remember going from a GTX 1070 (which is actually in the PC I'm typing on now) to the 3070 in the gaming rig, and being very impressed by the jump in performance and general perception of smoothness. The 3070, of course, had no more VRAM than the 1070, which did give me pause at the time. No regrets though, considering the market at the time and the price I got it for. Might see if anyone wants my water-blocked 3070, else I'll pop it into this system (3900X), replacing the 1070. I'll need to buy water-cooling bits for this PC though, something I have considered.

I'm hoping I've jumped in at the right time, I'll feel vindicated if prices go up, 50 Super is delayed / Cancelled and 60 series comes along later than originally expected lol. Or, of course, if I'm simply happy with the 5070Ti.

One thing that did annoy me however... All the way through the buying process, it showed next day (Saturday 29th) delivery. Perfect. I go through the check-out, enter my card details and...now it's suddenly Tuesday 2nd, no option for Saturday. I'd hoped to have something to play with over the weekend. Ah well. Water block arrives Tuesday (different company) but I will be running the 5070Ti on air initially to test it.

Anyway, once up and running, I'll be sure to share my own experiences. On paper, the 5070Ti looks like a good deal on the nVidia side of things. The 5080, while undoubtedly faster, is typically only about 15% faster based on the many (many!) reviews I've been reading and watching. If the price difference were about the same, fine; however, the cheapest 5080 I found is quite literally 50% more expensive than the 5070Ti I ordered. As I plan to water cool the 5070Ti (GPU + block still cheaper than a basic 5080!) I hope it'll live its best life, coming at least close to a stock 5080. I'm also quite comfortable with overclocking, and undervolting seems to be the thing with the 50 series, so I'll have a play with that. My 5800X3D is undervolted and it's certainly up there with the best of them. The 5800X3D loves being water cooled, as does the 3070. Hopefully the 5070Ti is the same.

Apologies for sort of going off my own topic, but I posted because I was considering an upgrade. I'm trying to be sensible with my purchase, though a friend was attempting to bully me into getting a 5090! Totally wasted on me... he does VR, I don't.

No buyer's remorse... yet. This is good lol.
alt3rn1ty
Posts: 3925
Joined: Thu, 26. Jan 06, 19:45
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by alt3rn1ty »

I can't really offer any comparison advice that you can relate to, as I am always on laptops.
But I think I may have been one of those non-empirical-evidence promoters of a 5070Ti; it's far better than my previous mobile 3060 GPU.

So all I can say is: try max fps 180, DLSS Balanced, Frame Gen x3. My laptop screen supports G-Sync so that's on, and I also have Low Latency Mode set to On in the NVidia Control Panel (not Ultra, just On; Ultra was doing strange things).

It probably works differently than I imagine, but 60 real frames is the minimum I would always like the game to be at: 60-120 is room for x2 Frame Gen, 120-180 is room for x3 Frame Gen, so 60+60+60 = a max fps setting (game and NVidia Control Panel) of 180.
It's unusual now to see the game's own fps counter go below 100; 90% of the time it's hitting the cap of 180.
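That headroom arithmetic can be sketched as a toy model (a deliberate simplification: it assumes generated frames simply multiply the rendered rate up to the cap, which is rougher than how the driver actually paces frames):

```python
def fg_output_fps(base_fps: float, fg_multiplier: int, fps_cap: float) -> float:
    """Rough model: frame generation multiplies the rendered (base) frame
    rate by the FG factor, then the fps cap clamps the result."""
    return min(base_fps * fg_multiplier, fps_cap)

def base_fps_needed(target_fps: float, fg_multiplier: int) -> float:
    """Minimum real (rendered) frame rate needed to fill a target cap with FG."""
    return target_fps / fg_multiplier

# With a 180 fps cap and x3 FG, 60 real frames fills the cap:
print(fg_output_fps(60.0, 3, 180))  # 180.0
print(base_fps_needed(180, 3))      # 60.0
```

On this model, a dip to 40 real frames with x3 FG still shows 120 fps on the counter, which matches the "rarely below 100" observation above.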

But I do tend to keep away from big battles. I only build a few necessary stations in different sectors (that's a few overall, not per sector), fleets are minimal for my needs, and miners/traders are well balanced for the few stations' needs without having massive numbers of them.

Specs are in the signature below. I love the Intel Ultra 9 CPU (only allowing the Performance cores in the BIOS) and NVidia 5070Ti combination; I have very little to complain about these days with the game's performance.
Spec's@2025-05-17 - Laptop - Acer Predator Helios Neo 16 AI - Win 11
CPU - Intel Core Ultra 9 275HX 2.7-5.4ghz, RAM - 32gb DDR5 6400(OC),
Discrete GPU - NVidia Geforce RTX 5070 Ti, VRAM 12gb GDDR7,
SSD - M.2 PCIe NVME 1Tb
, OLED WQXGA 2560x1600.
:goner: Seeker of Sohnen. Long live Queen Polypheides. :boron:
>> Click me for X4 Forum Avatars <<
Blaze1st
Posts: 574
Joined: Thu, 13. Feb 25, 13:42

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Blaze1st »

Personally I would avoid the two lower end entries and go for either a 5080 or 5090 if you can afford the latter.

I run DLSS for games with Lumen or ray tracing, and DLAA if it can maintain a steady 58fps, but for anything without post-processed lighting at 4K I don't see the point in wasting resources on AA (which is forced with any kind of DLSS).

I'm also very aware of tearing and artefacts (generally use very high look sensitivity) so v-sync/g-sync needs to be on.

As for where the bottleneck is in busy scenes, I dunno. I'm running a 7950X3D with 64GB and an MP600 Pro SSD; there's not much more I can throw at the game for a stable render.
Falcrack
Posts: 5980
Joined: Wed, 29. Jul 09, 00:46
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Falcrack »

Blaze1st wrote: Fri, 28. Nov 25, 21:06 Personally I would avoid the two lower end entries and go for either a 5080 or 5090 if you can afford the latter.
The 5070 Ti is likely to be much better value for performance than those higher-end models. I got the 5070 because it offered the best performance gains for the price. I would only order the extreme top end if money were not an issue, and for me, money was an issue.
Scoob
Posts: 11800
Joined: Thu, 27. Feb 03, 22:28
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Scoob »

alt3rn1ty wrote: Fri, 28. Nov 25, 20:50 Specs are in the signature below. I love the Intel Ultra 9 CPU (only allowing the Performance cores in the BIOS) and NVidia 5070Ti combination; I have very little to complain about these days with the game's performance.
Your specs show a 5070Ti 12GB, is that a typo or do you have a 5070?
Blaze1st wrote: Fri, 28. Nov 25, 21:06 Personally I would avoid the two lower end entries and go for either a 5080 or 5090 if you can afford the latter.

I run DLSS for games with Lumen or ray tracing, and DLAA if it can maintain a steady 58fps, but for anything without post-processed lighting at 4K I don't see the point in wasting resources on AA (which is forced with any kind of DLSS).

I'm also very aware of tearing and artefacts (generally use very high look sensitivity) so v-sync/g-sync needs to be on.

As for where the bottleneck is in busy scenes, I dunno. I'm running a 7950X3D with 64GB and an MP600 Pro SSD; there's not much more I can throw at the game for a stable render.
Here's the thing, I could readily buy a 5090 & water block if that's what I really wanted. Of course, there are always other things to spend money on lol. However, it doesn't present good value for me without upgrading everything else too. The cheapest 5080 is literally 50% more expensive than the 5070Ti for around 15% better performance at 1440p. That's just not enough for the price.
Falcrack wrote: Fri, 28. Nov 25, 21:18 The 5070 Ti is likely to be much better value for performance than those higher-end models. I got the 5070 because it offered the best performance gains for the price. I would only order the extreme top end if money were not an issue, and for me, money was an issue.
That's certainly been my verdict. If pricing here were different, perhaps I'd go with the 5080. The thing is, a 5080 is about half a 5090 and about half the price; that tracks. The 5070Ti is a bit of an anomaly. If, for the price, it were also a 12GB card like the 5070, then I'd likely have gone 5080. However, it has 16GB, and is on a 256-bit bus too. It does, however, use slightly slower 28 Gbps VRAM vs the 5080's 32 Gbps. The 5080 is the more performant card vs. the 5070Ti, but the pricing is just wrong for me.

It's funny, I've been gaming this eve and the 3070 hasn't missed a beat: a solid 60fps (the limit I set) in the several titles I've played. Not done any X4 though. I'm well aware that FOMO has likely played a more significant part in this purchase than usual for me.
Blaze1st
Posts: 574
Joined: Thu, 13. Feb 25, 13:42

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Blaze1st »

You can get a 5080 for an extra 200 bucks when you're already paying 750 for a 5070 Ti... whereas the 5090 at 2600 bucks is purely for when the money is burning a hole in your pocket.
Scoob
Posts: 11800
Joined: Thu, 27. Feb 03, 22:28
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Scoob »

Blaze1st wrote: Sat, 29. Nov 25, 10:15 You can get a 5080 for an extra 200 bucks when you're already paying 750 for a 5070 Ti... whereas the 5090 at 2600 bucks is purely for when the money is burning a hole in your pocket.
Here, at the time of purchase, it was £669 for the cheapest* 5070Ti and £1,059 for the 5080 equivalent (same manufacturer and model). The absolute cheapest 5080 was £969, but listed as pre-order only. If a 5080 were just $200 (£150) more, I'd certainly have considered it. However, at least £400 ($530) more? Nope! Note that I've just converted between GBP and USD; I'm aware there are tax differences, but I don't know what they are these days. As a total aside, I just remembered this: I actually had to learn the tax code for ALL US states once upon a time, while developing some invoicing software for a very large company... then I had to do the international billing stuff too. Headache! All long forgotten now, which is likely good for my sanity lol.

* A decent card, reportedly with a somewhat weaker (noisy when working hard) cooler. As I'm replacing the cooler, that was irrelevant. In the past, I'd often get the "reference" version of a GPU with the basic blower-style cooler that was common back then: horrible, noisy, cheap coolers, which I immediately took off, so I wasn't wasting money removing a decent cooler.

So, yeah, if a 5080 were 15% (maybe even 20%) more expensive for a 15% average uplift in performance, I'd have gotten one. However, as it was actually almost 60% more (like-for-like GPU) for that 15% performance bump... that's not good value. I've read and watched lots of reviews and comparisons, so I think this is the right call. The 5070Ti + block + coolant is still less than a basic 5080. The 5080 will likely still "win", but hopefully the 5070Ti will be living its best life water cooled.
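The value comparison above can be put as a quick back-of-envelope calculation (the £669 / £1,059 prices and the ~15% uplift are the figures quoted in this thread, not independent benchmarks):

```python
def price_per_perf(price: float, relative_perf: float) -> float:
    """Price divided by relative performance; higher means worse value."""
    return price / relative_perf

ti_price, ti_perf = 669.0, 1.00      # 5070 Ti as the baseline (thread figures)
s80_price, s80_perf = 1059.0, 1.15   # 5080: ~15% faster, same-model pricing

# Price uplift works out to roughly 58%, i.e. "almost 60% for 15% more performance":
price_uplift = (s80_price - ti_price) / ti_price
print(f"{price_uplift:.0%} more money for {s80_perf - ti_perf:.0%} more performance")
print(price_per_perf(ti_price, ti_perf), price_per_perf(s80_price, s80_perf))
```

By this metric the 5080 costs noticeably more per unit of performance, which is the anomaly being described.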

I've had good luck with 70-series cards over the years. I had proper high-end PCs back in the day: best gaming CPU, SLI GPUs (680s!), all custom water cooled. Then SLI basically stopped working, where it used to be fairly trivial to force it by creating custom profiles; some fundamental change in the drivers / Windows. I was waiting for the expected 1080Ti, but needed something right away, so got a 1070 as a temporary card. That thing performed so much better than expected for what I play that I ended up keeping it a while. I didn't even water cool it; it had the best air cooler I'd ever had on a GPU, yet it was the cheapest one available at the time (Palit GameRock). The 3070 was a similar sort of thing: it was by far the best fps per £ at the time of purchase, during the crypto boom. I got the 3070 thinking I'd perhaps replace it with a 3080Ti / 3090 once prices settled. However, once again, the 3070 performed beyond my expectations in everything I play. If I had to keep using the 3070, I'd not be too unhappy, but considering where things are going currently, I consider the 5070Ti a decent deal. Plus, while the 3070 runs everything I want it to fine at the moment, that will likely change.

I do try to think about these things :)
Blaze1st
Posts: 574
Joined: Thu, 13. Feb 25, 13:42

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Blaze1st »

I went from 2 x 1080 Ti in SLI straight to 4090. Skipped the 2000/3000 series entirely and probably will the 5000s also.

Prices seem generally quite inflated right now, so hopefully I've future-proofed this rig sufficiently for a few more years yet.

If the NVidia prices keep going the way they are going, AMD might be back in consideration for my next upgrade...
Alan Phipps
Moderator (English)
Posts: 32552
Joined: Fri, 16. Apr 04, 19:21
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Alan Phipps »

As we have totally stopped discussing X4 and are instead talking hardware and prices, this can go to OT now.
A dog has a master; a cat has domestic staff.
Scoob
Posts: 11800
Joined: Thu, 27. Feb 03, 22:28
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Scoob »

Alan Phipps wrote: Sat, 29. Nov 25, 11:50 As we have totally stopped discussing X4 and are instead talking hardware and prices, this can go to OT now.
Boo hiss! Lol, fair enough, my bad. :)

As we're now officially off-topic:

I've just removed the 3070 from the loop, so for the moment it's just the CPU that's water cooled. Just doing some leak testing now, plus I took the opportunity to re-TIM the CPU with some better stuff. Just waiting for the arrival of the 5070Ti now (due between 14:58 and 15:58 apparently), which I should be able to plug right in. The 3070 already needed two PCIe power connectors (2 x 8-pin) and the 5070Ti comes with a 2 x 8-pin to 12-pin adaptor. It's a 300W card, so that's 150W from each 8-pin and 75W from the PCIe slot. Plenty.
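That power arithmetic can be sanity-checked in a couple of lines (using the commonly cited PCIe ratings of 150 W per 8-pin auxiliary connector and up to 75 W through the slot, as in the post above):

```python
def gpu_power_budget(eight_pin_connectors: int, slot_watts: int = 75) -> int:
    """Maximum deliverable watts: 150 W per 8-pin PCIe connector,
    plus up to 75 W through the PCIe slot itself."""
    return eight_pin_connectors * 150 + slot_watts

budget = gpu_power_budget(2)       # 2 x 150 + 75 = 375 W available
card_tdp = 300                     # 5070 Ti board power quoted in the post
print(budget, budget >= card_tdp)  # 375 True
```

375 W available against a 300 W card leaves comfortable headroom, hence "plenty".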

As an aside, checking the 3070 block now it's out of the case - it's transparent Acetal - I see no gunk build-up or discolouration at all, fins look nice and clean. That's good. I plumbed it in a little over three years ago and haven't touched it since. I've used exclusively EK Cryofuel Clear, ordered two more litres of that, arriving Tuesday.
alt3rn1ty
Posts: 3925
Joined: Thu, 26. Jan 06, 19:45
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by alt3rn1ty »

Scoob wrote: Sat, 29. Nov 25, 02:57
alt3rn1ty wrote: Fri, 28. Nov 25, 20:50 Specs are in the signature below. I love the Intel Ultra 9 CPU (only allowing the Performance cores in the BIOS) and NVidia 5070Ti combination; I have very little to complain about these days with the game's performance.
Your specs show a 5070Ti 12GB, is that a typo or do you have a 5070?
No typo; I spent a lot on this laptop. It's a 5070 Ti with 12 GB VRAM, GDDR7.
https://www.acer.com/gb-en/support/prod ... 1807957676

For a laptop, it's a bit of a beast :D
Scoob
Posts: 11800
Joined: Thu, 27. Feb 03, 22:28
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Scoob »

alt3rn1ty wrote: Sun, 30. Nov 25, 00:41 No typo; I spent a lot on this laptop. It's a 5070 Ti with 12 GB VRAM, GDDR7.
https://www.acer.com/gb-en/support/prod ... 1807957676

For a laptop, it's a bit of a beast :D
Ah, that makes sense, I didn't realise the mobile version had less VRAM. Glad you're pleased with it :)

Unfortunately, my 5070Ti didn't arrive yesterday. It went from "driver running a bit late" (like two hours) to not delivered. I'd just joked to someone that "the driver will likely go home at 6pm", then, moments after 6pm, the tracking updated to tomorrow (i.e. today now). However, I still don't have a confirmed delivery time. It did say I'd get one by 12:00, but nope. So I chased the seller (Scan.co.uk) and they chased it in turn. It seems, from their side, that while my package is "out for delivery", it's not been scanned. Basically, I don't know what's happening.

It's been 24hrs since I last played a game - send help! Lol. The PC is stripped down ready for the new GPU - I had to break the custom loop, so just the CPU is water cooled for now. More time leak testing, I suppose, which isn't bad, but I decided not to go out this weekend just for this. Ah well. Hopefully it will arrive; if not, Monday I suppose. Luckily I will be in...
Scoob
Posts: 11800
Joined: Thu, 27. Feb 03, 22:28
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Scoob »

Well, my Saturday delivery arrived today! Had to stay home for it. It's in the PC now, but I'm just leak testing the CPU block first - I try to do this each time I move the PC, just in case.

Just hoping to get the dispatch notification for the GPU block today, as it's due tomorrow.
Scoob
Posts: 11800
Joined: Thu, 27. Feb 03, 22:28
x4

Re: nVidia 40 and 50 series GPU owners, how's Frame Generation?

Post by Scoob »

Might create a new thread now I'm going back on-topic.

I want to share some first impressions of Frame gen, now the 5070Ti is installed in my PC.

I set NVIDIA DLSS to On, Frame Generation to "Auto (x2)" (the default once DLSS is enabled) and Super Resolution to Off - default was on.

Regarding the Dynamic Frame Generation setting: I set this to "On"; however, my first impression is that this BREAKS the feature entirely!

I have my frame limit set to 60. I was getting a solid 60fps initially, but it'd tank on the map screen. Not normal: a slight dip, perhaps, but not that much. Then a stream of Xenon came through the gate and combat started. The frame rate tanked, far worse than normal, into the 20s. I then turned Dynamic Frame Generation off, and the game instantly went to 60fps during a moderately large battle where I'd expect an fps dip. This is good.

The description of [url=http://images2.wikia.nocookie.net/starwars/images/9/9a/Vaderchoke.jpg]DFG[/url] mode is thus: "activates frame generation only when it boosts performance beyond the native frame rate of the game". That doesn't really make sense. The "native" frame rate of the game during a larger battle SUCKS: 60 (limited) prior to the skirmish breaking out, the 30s during the battle. Enable [url=http://images2.wikia.nocookie.net/starwars/images/9/9a/Vaderchoke.jpg]DFG[/url] and we're looking at the 20s. This "native frame rate" seems to mean whatever the game normally drops to, minus a few more lol.

So, in my very limited testing, I'm seeing a solid improvement in fps (the game holding at 60 where it'd normally dip), but ONLY if I turn off Dynamic Frame Generation. I'm going to test some more; hopefully the current skirmish turns into a battle. If that still holds at 60fps, I'm going to be bloody over the moon!!

In summary: my initial impression is that FG can improve fps in combat scenarios, which is typically (for me at least) where most frame drops occur. I will be testing further a bit later; got real-world stuff to get on with, sadly.

Oh, GPU block dispatched apparently, that'll be tomorrow's project if it arrives on time.

Edit: Point of interest: Frame Gen doesn't seem to help at all on the map screen, perhaps making things slightly worse. I need to continue testing as things (battle!) get busier.
