AMD Ryzen and XR

Anything not relating to the X-Universe games (general tech talk, other games...) belongs here. Please read the rules before posting.

Moderator: Moderators for English X Forum

User avatar
mrbadger
Posts: 14226
Joined: Fri, 28. Oct 05, 17:27
x3tc

Post by mrbadger » Sun, 19. Mar 17, 21:01

Has anyone here actually bought one?

I realize such a major purchase isn't all that common, but I'm curious.
If an injury has to be done to a man it should be so severe that his vengeance need not be feared. ... Niccolò Machiavelli

User avatar
BugMeister
Posts: 13647
Joined: Thu, 15. Jul 04, 04:41
x4

Post by BugMeister » Mon, 20. Mar 17, 12:50

you'd be crazy to run it on a Windows 7 system..
MS have said they can't support the upgrade - same goes for Kaby Lake..
- the whole universe is running in BETA mode - we're working on it.. beep..!! :D :thumb_up:

User avatar
Tamina
Moderator (Deutsch)
Moderator (Deutsch)
Posts: 4550
Joined: Sun, 26. Jan 14, 09:56

Post by Tamina » Mon, 20. Mar 17, 20:28

BugMeister wrote:you'd be crazy to run it on a Windows 7 system..
MS have said they can't support the upgrade - same goes for Kaby Lake..
Their first press release was a little vague and harsh.
It sounded a bit like: if you try to run the newest processor generations on Windows 7, it will be blocked.
Did they specify?

Not that I care, I am using Win 10.

And if a forum villain says something naughty, I'll fetch my cactus, and it pricks, pricks, pricks.
  /l、 
゙(゚、 。 7 
 l、゙ ~ヽ   / 
 じしf_, )ノ 

User avatar
Tracker001
Posts: 5948
Joined: Sat, 14. May 05, 17:24
x3tc

Post by Tracker001 » Tue, 21. Mar 17, 04:18

@ linolafett

You may be interested in JayzTwoCents' "Video Editing on the Ryzen 1800X - 30 Day Ryzen Challenge"

https://www.youtube.com/watch?v=UIIb5uZfukU

He does a head-to-head comparison between Ryzen and Intel there as a "getting started".

User avatar
Tamina
Moderator (Deutsch)
Moderator (Deutsch)
Posts: 4550
Joined: Sun, 26. Jan 14, 09:56

Post by Tamina » Tue, 11. Apr 17, 18:30

- Today AMD released its Ryzen 5 processors for the mainstream market.
And it looks like they are beating Intel this time in gaming as well (against similarly priced processors).
- Game developers have also updated some games for Ryzen, increasing performance by up to 32 %.
Additionally, a Windows 10 power profile for Ryzen was published, squeezing out a few additional percent.

Hopefully Egosoft optimizes their upcoming games for Ryzen as well.

I am happy, though. I need a new budget PC and/or laptop, mainly for work and sometimes a bit of Egosoft. Right on cue :D
When prices drop in the next few weeks I am going to benchmark XR :D

And if a forum villain says something naughty, I'll fetch my cactus, and it pricks, pricks, pricks.
  /l、 
゙(゚、 。 7 
 l、゙ ~ヽ   / 
 じしf_, )ノ 

User avatar
Tracker001
Posts: 5948
Joined: Sat, 14. May 05, 17:24
x3tc

Post by Tracker001 » Wed, 12. Apr 17, 20:38

The 30 days are up, and:

Jayz Two Cents

I used ONLY Ryzen for 30 days... So how did it treat me?

https://www.youtube.com/watch?v=Vf_pUECRmAo

User avatar
Terre
Moderator (English)
Moderator (English)
Posts: 10490
Joined: Mon, 19. Dec 05, 21:23
x4

Post by Terre » Fri, 14. Apr 17, 19:45

It's starting to look like the R7 1700, with its overclocking headroom, and the R5 1500X are the picks of the bunch.
Open Rights Group - Is your site being blocked
Electronic Frontier Foundation - Online Censorship
The Linux Foundation - Let’s Encrypt
Check if your Email account has been pwned

CulunTse
Posts: 130
Joined: Mon, 15. Jun 15, 08:10
x4

Post by CulunTse » Sun, 23. Apr 17, 13:15

I've posted my brief experiences with XR on a R7 1700X here.

Questions welcome!

aquatica
Posts: 186
Joined: Wed, 6. Nov 02, 20:31
x4

Post by aquatica » Mon, 24. Apr 17, 20:18

I'm interested in a comparison between the Core i5-2500K (4.2 GHz turbo) and the Ryzen 7 1700.

I know that clock-for-clock the difference really isn't meaningful. However, I only see people comparing against the 7700K or other Kaby Lakes, not against us older brothers.

I have a Core i5-2500K with an Asus DUAL GTX 1060 OC 3GB and 16 GB of HyperX RAM. Games are installed on an HDD, not an SSD, for the time being - only the OS is on an SSD - but I've already ordered an R7 1700 with a Gigabyte B350 board and 16 GB of 2400 MHz DDR4. I'm wondering whether I'd see any difference in XR (or the older titles, for that matter) from this CPU upgrade?

CulunTse
Posts: 130
Joined: Mon, 15. Jun 15, 08:10
x4

Post by CulunTse » Mon, 24. Apr 17, 21:09

I'll be very curious after your direct comparison!
Please let us know your experiences.

I'm guessing you won't see the dramatic improvement I did; the 2500K was, and is, a great CPU, and I don't think XR would have been bottlenecked by it.
But you are the only person who can give us real numbers there!


P.s. welcome to Team Ryzen :-)

aquatica
Posts: 186
Joined: Wed, 6. Nov 02, 20:31
x4

Post by aquatica » Mon, 24. Apr 17, 21:13

Thank you CulunTse!

This CPU is great, really. And my turbo is at 4.2 GHz (actually it might be 4.3 at the moment; not sure, can't remember, heh).

Now - is there any PROPER way to benchmark my system with XR? I mean a benchmark that would give somewhat stable numbers (i.e. out of 6 runs, drop the best and worst, then manually average the rest) to work from...
Or some other test that would correlate with XR? I'm listening :)

I think I could run the rolling demos from X3, but those are a bit too lightweight for my taste...
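Since XR has no built-in benchmark mode, the drop-best-and-worst scheme described above can be scripted. A minimal sketch - it assumes you have already collected one average-FPS number per run from an external logger (Fraps or similar), and the sample figures below are invented for illustration:

```python
def trimmed_mean(fps_runs):
    """Drop the single best and worst run, then average the remainder."""
    if len(fps_runs) < 3:
        raise ValueError("need at least 3 runs to trim both extremes")
    ordered = sorted(fps_runs)
    kept = ordered[1:-1]          # discard the minimum and the maximum
    return sum(kept) / len(kept)

# Six average-FPS readings from identical save-game flybys (made-up data)
runs = [38.2, 41.5, 40.9, 39.7, 55.0, 40.3]   # 55.0 is an outlier run
print(round(trimmed_mean(runs), 2))           # prints 40.6
```

Trimming before averaging keeps one lucky or unlucky run (background update, shader-cache warm-up) from skewing the comparison between two CPUs.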

CulunTse
Posts: 130
Joined: Mon, 15. Jun 15, 08:10
x4

Post by CulunTse » Mon, 24. Apr 17, 21:16

I haven't found any benchmark mode in XR myself, or I would have provided my own numbers.
I'd gladly take tips too!

aquatica
Posts: 186
Joined: Wed, 6. Nov 02, 20:31
x4

Post by aquatica » Thu, 27. Apr 17, 05:12

Well, I've been testing all around a lot.

Basically, what I've found out is:
The GTX 1060 3GB is indeed bottlenecked by the Core i5-2500K! I thought that was impossible, but no.

In almost *every* benchmark out there, this setup falls well short on GPU score in comparisons. So yesterday I fired up 3DMark and the like and noticed that my CPU utilization was easily peaking at 99 %+, while GPU load peaked at around 80 %. And there was a clear pattern: GPU load dropped drastically exactly when CPU load peaked - which indicates either A) a hardware or driver issue, or B) the GPU's capabilities being limited by the CPU.

When I ramp the resolution up to ridiculous levels (I only have a 1080p display, but of course you can render at a higher resolution if you wish), CPU load drops relative to the GPU and I can put more load on the GPU.

The same holds in XR: just flying around at 1080p on medium settings, FPS is lacking whenever the CPU peaks, but when there are resources to spare I have no issues and GPU load rises relative to the CPU. This would need a ton of research and I don't know if I have the time, but if I really get into it, I will run the same tests on Windows 7 to see how much of a difference Win 10 Pro 64-bit makes against Win 7. And yes, I do have the Creators Update on Win 10.

Anyway, I'm still waiting for my hardware to arrive - apparently Gigabyte and G.Skill both have some minor delivery issues and at the moment the shipping estimate is next week, so we'll see. If there are any major differences, I will definitely post the results here for you all to see :)

Oh, sorry about the long post, but I did notice that XR really can max out all four cores of this i5-2500K. Does anyone know how many cores it can truly utilize? The load was observed via Task Manager, so its accuracy is whatever at best.
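The pattern aquatica describes - GPU load dipping exactly when CPU load peaks - can be checked numerically instead of by eyeballing graphs. A small sketch, assuming you have exported matched CPU/GPU utilization samples from a hardware monitor; the sample values below are invented:

```python
# If CPU and GPU utilization samples are strongly *negatively* correlated,
# the GPU is likely stalling while it waits on the CPU (a CPU bottleneck).

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, standard library only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Utilization samples taken at the same instants, in percent (made-up data)
cpu = [99, 97, 60, 99, 55, 98, 99, 58]
gpu = [62, 65, 95, 60, 97, 63, 61, 96]

r = pearson(cpu, gpu)
print(f"correlation: {r:.2f}")
if r < -0.5:
    print("GPU load drops when CPU peaks -> likely CPU-bound")
```

A correlation near -1 matches the "GPU drops whenever CPU peaks" symptom; values near 0 would point at drivers or thermals instead.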

User avatar
Terre
Moderator (English)
Moderator (English)
Posts: 10490
Joined: Mon, 19. Dec 05, 21:23
x4

Post by Terre » Thu, 27. Apr 17, 09:56

aquatica wrote:Anyway, I'm still waiting for my hardware to arrive - apparently Gigabyte and G.Skill both have some minor delivery issues and at the moment
If your motherboard is a Gigabyte, keep an eye on the BIOS updates. They do, I'm told, have a bit of a history of not keeping all of the books on the shelf, so always keep a back-up of the previous version.
Open Rights Group - Is your site being blocked
Electronic Frontier Foundation - Online Censorship
The Linux Foundation - Let’s Encrypt
Check if your Email account has been pwned

Oldman
Posts: 1661
Joined: Thu, 5. Dec 02, 10:37
x3tc

Post by Oldman » Thu, 27. Apr 17, 11:12

Hi Folks :)

Well... I've just been reading through this topic/thread, and being a bit out of date with the 'new' AMD CPUs, I've been trying to catch up on the 'next best thing' in AMD stuff! :roll:
Question/s

I take it that the new AMD Ryzen CPUs will only be compatible with the new, specific AMD motherboards (X350-X370, I think)?
edit... I'm also a bit confused by the motherboard prefixes 'B' & 'X'.
So the CPUs won't be backwards compatible with previous-generation motherboards, like they were to some extent in the past?

I've just been browsing the 'Scan' website, trying to see what sort of motherboard/CPU combinations there are. A bit confusing at the moment! :D

Oldman :) :oops:

aquatica
Posts: 186
Joined: Wed, 6. Nov 02, 20:31
x4

Post by aquatica » Thu, 27. Apr 17, 15:26

Terre wrote:If your motherboard is Gigabyte, keep an eye on the BIOS updates, they do, I'm told, have a bit of a history of not keeping all of the books on the shelf, so always keep a back-up of the previous.
My current one is an Asus. And I don't want another for some time - simply because they fail to follow industry standards (IOAPIC) and have to use some kludged software to fix their own mistakes.

And as for Gigabyte, well, so far they are the ones that have delivered consistently. I mean, in most of the tests they do very nicely, they update their BIOSes very actively, etc.

@Oldman

I suggest you go and read Tom's Hardware or a similar tech site about AMD's newest. In short: it is a new architecture, so of course there is no backwards compatibility.

CulunTse
Posts: 130
Joined: Mon, 15. Jun 15, 08:10
x4

Post by CulunTse » Thu, 27. Apr 17, 17:36

Re: Compatibility:

Ryzen needs a new socket because AMD is switching from DDR3 to DDR4.
DDR4 has different/more pins, so this really is a hard requirement.
AMD has gone on record saying that it will support this socket until "the market" switches to DDR5, and at least until somewhere in 2021.

(Still, the AM3 socket surviving this long is admirable. How many sockets has Intel had since then? 9? 900?)

AMD will use AM4 for practically all future processors, from extreme budget to very high-end, for both CPUs and APUs. This is an improvement on the current/previous situation, which has AM3+ (for CPUs) and FM2 (for APUs).
Only servers and multi-chip workstation parts will have different sockets.


The chipsets go like this:
X370 = top of the line, ALL the features + overclocking
X300 = not yet available, planned mini-ITX chipset, minimum features + overclocking
B350 = mid-range, most features + overclocking
A320 = budget, no overclocking
A300 = budget mini-itx, not yet available


@aquatica: "of course there is no backward compatibility"
The question isn't all that dumb. AMD kept AM3 and AM3+ compatible across multiple architectures, from early K8 Athlons to the late Excavator FXes. Pretty awesome if you think about it!

aquatica
Posts: 186
Joined: Wed, 6. Nov 02, 20:31
x4

Post by aquatica » Thu, 27. Apr 17, 17:40

Well, okay, CulunTse. AMD did keep compatibility for a long time - but they lacked any kind of power for that same time, too...

There are very few differences between X370 and B350; mostly, X370 supports dual-GPU setups while B350 doesn't (except for AMD CrossFireX, but I know nothing about it, so I can't really say if and how it differs from SLI).

There is a very nice list of the differences between the chipsets. Don't forget that the CPU provides some of the funky stuff too, and there's no saying a motherboard manufacturer won't add extras - for example, someone added an extra Serial ATA chip.

http://www.gamersnexus.net/guides/2763- ... -b350-a320

CulunTse
Posts: 130
Joined: Mon, 15. Jun 15, 08:10
x4

Post by CulunTse » Thu, 27. Apr 17, 18:10

Nice link, especially the summary table all the way at the end.

I personally really like how much is integrated into the Ryzen CPU. Intel motherboards need far more extra chips for the same functionality.

I was intentionally very brief in my chipset list. As you said, there are other great sources for the details. I just wanted to give a starting point on the relative positioning, which will hopefully provide context when reading more.

aquatica
Posts: 186
Joined: Wed, 6. Nov 02, 20:31
x4

Post by aquatica » Tue, 16. May 17, 22:18

Well, shortly:

Core i5-2500K @ 4.7 GHz, 16 GB of DDR3 HyperX Genesis 1600 MHz + GeForce GTX 1060 3GB OC = more than one core at 100 % load.

AMD Ryzen R7 1700 @ 3.0 GHz, 16 GB of DDR4 G.Skill Flare X "for AMD" 2400 MHz + GeForce GTX 1060 3GB OC = one core at ~95 % utilization, the rest WAY lower.

So far, the FPS gain has been impressive - to the point that I set my graphics to "HIGH" and FXAA to full. Even in somewhat more populated areas I get a decent 40+ FPS; with the Core i5? No way. Not even in my wildest dreams. It really seems that even the highly sought-after Core i5-2500K *is* a bottleneck in XR, which sounds pretty far-fetched to me. Really.

The OS is the same: Windows 10 Pro Insider Preview with the latest possible drivers.

Anyway, if someone is up for the Ryzen challenge, I would like to point out a few things to save you from some... hassle.

1) If you buy Gigabyte, DO NOT USE THEIR OVERCLOCKING SOFTWARE! It *somehow* changes the "default values" in the BIOS/chipset in a way that you cannot restore your normal operating mode. In effect, if you OC with their AutoTune, it sticks. Even after changing all the values back to their defaults you get higher temperatures - I didn't believe it (though I had read about it earlier). Well, now I'm stuck with about 5 °C higher temps at idle and 10 °C at full load, with everything 100 % the same as before...

Instead you should only use the Ryzen Master software suite. It is a bit flimsy and, to me, much too hard to read - yet designed to be super easy to use... Perhaps someone more accustomed to Apple-like user interfaces might get on with it better.

2) Do not use the motherboard manufacturer's drivers, at least for the B350 chipset and at least for the time being. They are obsolete - even at only 5 weeks old! This will probably improve after a while, but right now there have been huge updates to improve stability, and AMD releases new chipset drivers quickly.

All in all, even without overclocking, this stock R7 1700 is so much better than I thought it might be that it's mind-boggling. Note, however, that if you have a more powerful Core i-series CPU (a higher-end i7, for example) or a Haswell-E in general, you will gain very little, IMHO. This 17 % increase in actual computing power over the i5-2500K translates in XR from a sluggish, bumpy-FPS, difficult experience into a smooth and fun one, so for me it is a definite plus. I also haven't been in any extremely CPU-intensive situations yet, so there *might* be something still to see - or not.

As a side note, several other games that I should have been able to run at high settings at 1080p, and couldn't before, now work flawlessly.
Imagine: an overclocked i5-2500K bottlenecking something like a GTX 1060 - which it shouldn't, even less at PCIe 3.0 x16, which the card ran at. Same games, new CPU, memory and motherboard = far better FPS; and if the game engine does physics calculations, the difference is even bigger. I wonder why? I tried the i5 at stock clocks and there was no noticeable difference, so it shouldn't be the OC either.

Well, sorry for the long post. If you mostly play games, I might suggest the R5 1600X that's coming later - it should be better than the R7 1700 for that purpose, but who knows :)

Post Reply

Return to “Off Topic English”