Tech Gaming Hardware discussion (& Hardware Sales) thread

That's what everyone expected. Hell, that's what I was expecting for its matchup against Raptor Lake, not Alder Lake. And that's not just the 7600X. The 7950X is losing to the Alder Lake i7 in gaming. It's even losing with PBO Max and overclocked RAM.
[TechPowerUp chart: relative performance in games, 1280x720]


[reaction gif: "I mean, what the hell is going on? What's going on here?"]


Frankly, I'm worried that Intel's corrupt benchmark-fixing machine has gotten to Techpowerup. I was already worried about this with the release of the 5800X3D. Nobody else had the 5800X3D losing to the 12700K in roundups. I was able to write it off when that happened because it was only a few percent, and benchmark setups/suites vary. But this... this is beyond explanation. This is looking like a UserBenchmark moment for what has long been one of my favorite resources for tech reviews.

Head over to Anandtech:
https://www.anandtech.com/show/1758...ryzen-5-7600x-review-retaking-the-high-end/17
[Anandtech gaming benchmark charts]

Techspot (aka Hardware Unboxed):
https://www.techspot.com/review/2534-amd-ryzen-7600x/
[Techspot average gaming performance chart]



Tom's Hardware:
AMD Ryzen 9 7950X and Ryzen 5 7600X Review: A Return to Gaming Dominance
[Tom's Hardware gaming benchmark chart]


Tweaktown:
https://www.tweaktown.com/reviews/1...4-cpu/index.html#Gaming-and-Power-Consumption

Eurogamer aka Digital Foundry:
https://www.eurogamer.net/digitalfoundry-2022-amd-ryzen-9-7900x-ryzen-5-7600x-review?page=2

Gamers Nexus:



RIP Techpowerup.

So according to GN, it sounds like in terms of gaming there's not much real-world difference in any of these GPUs still. I wonder if that will change with the 4090 at 4K.
 
Frankly, I'm worried that Intel's corrupt benchmark-fixing machine has gotten to Techpowerup

RIP Techpowerup.
It is an interesting debate.

Supposedly, the claim is that last month TechPowerUp went through a brutal process of re-testing every single CPU and updating everything (every single game got the latest patches, new Windows updates, new BIOS/drivers), and it saw a gigantic increase in performance for Alder Lake. The 12400, for example, was originally tested as barely slower than the 5600X in gaming, but on the most recent setup it actually almost matches a 5900X.

So it begs the question: did Intel collaborate with TechPowerUp to intentionally make 12th gen look better, or are all of the other reviewers using outdated test setups that wouldn't actually produce those exact numbers on a truly updated system?

One thing I do like about TechPowerUp (regardless of whether they're corrupt or not) is that they actually use realistic, real-world gaming benchmarks. You can scale the games by resolution, but each game is running max settings (realistic for a 3080-tier card).

One of the big gaming website reviews I casually saw for the 7700X, for example, showed "oh, it's almost 15% better in gaming than the 12700K," but the basis of almost all of that 15% increase was two games they test at freaking medium settings on an RTX 3090 Ti.

I understand the whole "CPU-bound vs. GPU-bound" balance you have to strike, but that's kinda BS. If you threw those two outliers out (every other game in the test suite was at highest settings), then the 7700X and 12700K were basically identical. Someone in the real world who actually buys a 3080/3090-tier card will think they're getting a 15% increase because some YouTube guy is pushing a clickbait headline, but in actual real-world use, no, it's misleading.
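To put rough numbers on that point, here's a quick sketch (the per-game figures are made up purely for illustration, not taken from any review) of how just a couple of CPU-bound outliers can inflate a headline average:

```python
# Hypothetical per-game uplift figures (7700X vs. 12700K); NOT real review data.
# Eight GPU-bound games tested at max settings, plus two CPU-bound outliers
# tested at medium settings.
uplift_percent = [1, 0, 2, -1, 1, 0, 2, 1, 45, 50]

def mean(values):
    return sum(values) / len(values)

print(f"Average with the outliers:    {mean(uplift_percent):.1f}%")       # ~10.1%
print(f"Average without the outliers: {mean(uplift_percent[:-2]):.1f}%")  # ~0.8%
```

Two games out of ten carry almost the entire headline figure, which is exactly the kind of thing a reader skimming the summary chart never sees.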
 
So according to GN, it sounds like in terms of gaming there's not much real-world difference in any of these GPUs still. I wonder if that will change with the 4090 at 4K.
I don't think I follow. GPU review wasn't the focus of that video. Steve merely observed that the 7950X's performance is so high that it's difficult to benchmark in games because even the best GPUs are bottlenecking the CPU.

The big news today is the massive paradigm shift in how CPUs are designed. The central focus of that video was to highlight that AMD is designing their new CPUs to scale with cooling. The better your cooler, the better the CPU will perform. They will run as far as your cooler will take them, and it's unlikely that anything but the best coolers will max out the CPU's performance with headroom to spare unless we're talking about lower-end CPUs. So moving forward, there is going to be a more intense focus on CPU cooling than ever before.

The price of 280mm and 360mm liquid coolers has already gone way up in the past year, but expect them to jump considerably. The same goes for PSUs, because it's not just those NVIDIA GPUs that are going to be drawing crazy amounts of power.
 
It is an interesting debate.

Supposedly, the claim is that last month TechPowerUp went through a brutal process of re-testing every single CPU and updating everything (every single game got the latest patches, new Windows updates, new BIOS/drivers), and it saw a gigantic increase in performance for Alder Lake. The 12400, for example, was originally tested as barely slower than the 5600X in gaming, but on the most recent setup it actually almost matches a 5900X.

So it begs the question: did Intel collaborate with TechPowerUp to intentionally make 12th gen look better, or are all of the other reviewers using outdated test setups that wouldn't actually produce those exact numbers on a truly updated system?

One thing I do like about TechPowerUp (regardless of whether they're corrupt or not) is that they actually use realistic, real-world gaming benchmarks. You can scale the games by resolution, but each game is running max settings (realistic for a 3080-tier card).

One of the big gaming website reviews I casually saw for the 7700X, for example, showed "oh, it's almost 15% better in gaming than the 12700K," but the basis of almost all of that 15% increase was two games they test at freaking medium settings on an RTX 3090 Ti.

I understand the whole "CPU-bound vs. GPU-bound" balance you have to strike, but that's kinda BS. If you threw those two outliers out (every other game in the test suite was at highest settings), then the 7700X and 12700K were basically identical. Someone in the real world who actually buys a 3080/3090-tier card will think they're getting a 15% increase because some YouTube guy is pushing a clickbait headline, but in actual real-world use, no, it's misleading.
I can't imagine that reviewers as anal-retentive as Gamers Nexus or Anandtech would be using outdated BIOS versions, game versions, or CPU firmware.
 
I can't imagine that reviewers as anal-retentive as Gamers Nexus or Anandtech would be using outdated BIOS versions, game versions, or CPU firmware.
It's not that they're intentionally being negligent or that they're not anal-retentive; it's just a huge undertaking. For the TechPowerUp review (leading into the 7000 series launch), they had to get 26 different CPUs all re-tested on 12 games while updating everything (BIOS, drivers, game patches, Windows 11 updates).

The guy who did the 7000 series review claimed this process was such a big undertaking that it began in mid-June in order to finish by late September (yeah, I don't think that's actually the norm, which would explain why this review was such an outlier, especially since they even said "hey, this is why the results for these old CPUs are so different than before" in the conclusion).

EDIT: Yeah, I checked some other reviews, and of the ones I saw, literally none of them are running updated test setups (they're basically just copying and pasting how the CPU performed a year or two ago). Tom's Hardware, for example, still has all of the Alder Lake CPUs running DDR5-4800 (which is actually slower in gaming than DDR4-3200), whereas TechPowerUp has both Alder Lake and Zen 4 running DDR5-6000 to keep that variable controlled.
 
I don't think I follow. GPU review wasn't the focus of that video. Steve merely observed that the 7950X's performance is so high that it's difficult to benchmark in games because even the best GPUs are bottlenecking the CPU.
No, at 4K the bottleneck is on the GPU. That's why all of the modern CPUs are showing nearly identical numbers. I'm saying that if we add the new 4090 in, I wonder if we will start seeing a separation in performance based on the CPU.

As it stands, if you're gaming at 4K (or even 1440p) and not concerned with anything else, then there is no reason to buy a top-tier CPU. A 5600X is virtually identical to a 7950X on the 4K charts, even with a 3090 Ti. I'm curious to see if the 4090 changes that.
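A minimal sketch of that bottleneck logic, with purely illustrative FPS caps (none of these numbers come from actual benchmarks):

```python
def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Whichever component is slower sets the final frame rate."""
    return min(cpu_fps_cap, gpu_fps_cap)

# Hypothetical caps at 4K with a 3090 Ti-class GPU: the GPU cap sits well
# below either CPU, so a 5600X-class and a 7950X-class chip look identical.
print(effective_fps(cpu_fps_cap=200, gpu_fps_cap=90))   # -> 90
print(effective_fps(cpu_fps_cap=300, gpu_fps_cap=90))   # -> 90

# If a faster GPU (say, a 4090) raises the GPU cap past the slower CPU's cap,
# a separation between CPUs finally shows up on the 4K charts.
print(effective_fps(cpu_fps_cap=200, gpu_fps_cap=250))  # -> 200
print(effective_fps(cpu_fps_cap=300, gpu_fps_cap=250))  # -> 250
```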
 
I wish CPUs came out every two years like back in the day instead of annually.
 
No, at 4K the bottleneck is on the GPU. That's why all of the modern CPUs are showing nearly identical numbers. I'm saying that if we add the new 4090 in, I wonder if we will start seeing a separation in performance based on the CPU.

As it stands, if you're gaming at 4K (or even 1440p) and not concerned with anything else, then there is no reason to buy a top-tier CPU. A 5600X is virtually identical to a 7950X on the 4K charts, even with a 3090 Ti. I'm curious to see if the 4090 changes that.
Okay, yes, I'm aware; I was confused about what you were trying to say. What I was pointing out was that Steve was indicating the 7950X is so powerful that, even at the lower resolutions that are our typical focus for CPU reviews, even the most powerful GPUs were apparently sometimes limiting the CPU. That's pretty nuts.

I can't say I care too much about the 4090's implications for studying higher resolutions in CPU reviews. I mean, yeah, it's always nice when you can see the GPU isn't bottlenecking CPU comparisons at a higher resolution, specifically when it's the one at which you game, because then the benchmark comparison has more of a "real-world" nature. But practically speaking, I don't care, because it almost never matters. There's little variance.
 
Sorry, I was beat; I'd been up 18 hours, so fighting with people sucks, and this place has turned into who can suck whose balls harder. I tell people to wait because the price of GPUs is going to crash, and people make fun of the F'ing idea. I tell people to stay away from the used GPU market, and people make fun of that one. Ethereum mining rigs and GPUs are being dumped in record numbers. If people want to buy that used GPU for only a little less than a new one, good for them.
 
Sorry, I was beat; I'd been up 18 hours, so fighting with people sucks, and this place has turned into who can suck whose balls harder. I tell people to wait because the price of GPUs is going to crash, and people make fun of the F'ing idea. I tell people to stay away from the used GPU market, and people make fun of that one. Ethereum mining rigs and GPUs are being dumped in record numbers. If people want to buy that used GPU for only a little less than a new one, good for them.

All of my reaction gifs were meant to be a look of shock and disbelief at the idiot pressure washing GPUs…
 
So far there's nothing really drawing me to any of the new CPUs. I play in an ultrawide format (3840 x 1600), and it is bottlenecked by my GPU (3080 10GB) rather than my CPU (5600X). The GPUs seem absurdly priced as well, so unless something changes or there are some new AAA must-have titles with ridiculous requirements, I think I'm out for this generation.

I do wonder if these prices are going to signal a change in what games are made - or, more specifically, a stalling in graphics advancement for games. How many people can they reasonably expect to be on these newer-generation systems when the prices are so utterly absurd? I can't imagine what it looks like for someone who is 16 years old and wants to build their own PC.
 
@Madmick I need your honest opinion

You seem to know what you're talking about, so I'll just go with you. I am in dire need of replacing my DDR3 and 10-year-old CPU; my budget will be up to $2000 for the CPU.

Which one would you recommend for gaming and sort of future-proofing?

I am debating between the 7950X and the 13900K, but I'm also thinking of the 32-core 3970X. Or perhaps you would recommend something different? If there isn't sufficient data yet, would you be able to provide advice when you know?

 
@Madmick I need your honest opinion

You seem to know what you're talking about, so I'll just go with you. I am in dire need of replacing my DDR3 and 10-year-old CPU; my budget will be up to $2000 for the CPU.

Which one would you recommend for gaming and sort of future-proofing?

I am debating between the 7950X and the 13900K, but I'm also thinking of the 32-core 3970X. Or perhaps you would recommend something different? If there isn't sufficient data yet, would you be able to provide advice when you know?

I don't know yet. I'm still waiting on the reviews of the Raptor Lake CPUs.

Why do you need so many cores? Do you do something other than gaming? Just as with past generations, there's virtually no benefit to the R9 series (e.g., R9-7950X, R9-7900X) over the R7 series (e.g., R7-7700X) for gaming. Expect the same to be true for Intel. Remember, the 13900K will have the same number of "power" cores (eight) as the 13700K.
 
I don't know yet. I'm still waiting on the reviews of the Raptor Lake CPUs.

Why do you need so many cores? Do you do something other than gaming? Just as with past generations, there's virtually no benefit to the R9 series (e.g., R9-7950X, R9-7900X) over the R7 series (e.g., R7-7700X) for gaming. Expect the same to be true for Intel. Remember, the 13900K will have the same number of "power" cores (eight) as the 13700K.

Why do I need so many cores? For gaming future-proofing. I like to spend money once and have it last a long time.
 
So far there's nothing really drawing me to any of the new CPUs. I play in an ultrawide format (3840 x 1600), and it is bottlenecked by my GPU (3080 10GB) rather than my CPU (5600X). The GPUs seem absurdly priced as well, so unless something changes or there are some new AAA must-have titles with ridiculous requirements, I think I'm out for this generation.

I do wonder if these prices are going to signal a change in what games are made - or, more specifically, a stalling in graphics advancement for games. How many people can they reasonably expect to be on these newer-generation systems when the prices are so utterly absurd? I can't imagine what it looks like for someone who is 16 years old and wants to build their own PC.

If you're the type of person who never really upgrades (pretty much, if you're currently running anything older than Kaby Lake), you're better off buying the 5800X3D. By the time you'd be ready for an upgrade, the next fancy new socket will be out. The extra money for the mobo and RAM isn't worth it for the 7700X. If you take the money you'd save with the 5800X3D and apply it to the GPU, you could step up a tier in graphics card, or be close to it.
If you're the type of person who upgrades every 2 years, that's a different story.
 