Tech Gaming Hardware discussion (& Hardware Sales) thread

Intel's 10th-gen 18-core beats Core i9-9980XE by 11%, Ryzen remains faster


The new 18-core isn't even beating the (still unreleased) 3950X in the leaked Geekbench engineering sample scores.

Intel is so fucked.
 
Apple may have boarded the crazy train if this rumor is true.
"
Apple's management doesn't want Nvidia support in macOS, and that's a bad sign for the Mac..."

Apple must have been smoking some crazy stuff if they do not want to continue supporting Nvidia. I know this is a build-and-buy PC thread and Sherdoggers don't care for Macs; I just thought this was wild. Here is the problem with Apple's dying computer business: they just don't understand the market. Instead of giving people a slightly more expensive but competitive computer with superior security and great tools for graphics and animation, they throw out a computer with no flexibility, no native Nvidia support, and massive price points. Yet Apple's bosses don't understand why, outside of app developers, people are moving away from Apple products.

https://forums.appleinsider.com/discussion/209144
 
It looks like Intel is stepping up their game again lately, and ASUS is announcing crazy new boards. See, competition is good for everyone.
The 10-core Intel Core i9-10000X is dominating both the i9-9900K and the Threadripper 2950X in overall performance. It also supports an impressive 48 PCIe lanes. Looks like a monster of a chip.

https://www.notebookcheck.net/10-co...9-9900K-in-multi-core-workloads.431124.0.html
That would be really promising for Intel...if the TR-2950X was AMD's best processor. It isn't even from their current gen.
https://browser.geekbench.com/processor-benchmarks/

Right now the 16-core i9-9960X reigns supreme thanks to its +100 MHz advantage over the 18-core 9980XE, but Ryzen's 12-core R9-3900X beats anything with 14 cores or fewer. Everyone reasonably expects, and the engineering sample leaks above have so far affirmed this expectation, that the R9-3950X will be the top processor in the world when it launches. That is the CPU Intel has to beat.

Also, Asus is finally spooked. They've been as dominant as Intel, and for about as long, at the highest end of the motherboard space, but lately MSI has been outperforming them, and some argue the latter owns the crown now. Cool, but I don't think $500+ motherboards are really relevant to most people. As far as bang-for-your-buck, which matters most to any gamer who isn't working with an unlimited budget, speaking generally:

Best Value Motherboard Manufacturer
Intel
Z390: Gigabyte
Z370: Gigabyte, MSI
H370/B360/B365: ASRock
AMD
X570: Asus
X470: MSI
B450: MSI, ASRock
 
Apple may have boarded the crazy train if this rumor is true.
"
Apple's management doesn't want Nvidia support in macOS, and that's a bad sign for the Mac..."

Apple must have been smoking some crazy stuff if they do not want to continue supporting Nvidia. I know this is a build-and-buy PC thread and Sherdoggers don't care for Macs; I just thought this was wild. Here is the problem with Apple's dying computer business: they just don't understand the market. Instead of giving people a slightly more expensive but competitive computer with superior security and great tools for graphics and animation, they throw out a computer with no flexibility, no native Nvidia support, and massive price points. Yet Apple's bosses don't understand why, outside of app developers, people are moving away from Apple products.

https://forums.appleinsider.com/discussion/209144
Why would they need Nvidia support if they put AMD graphics in their computers?
 
Why would they need Nvidia support if they put AMD graphics in their computers?
CUDA support. A very large portion of their discrete GPU relevance is built around it.
 
I mean, they only have AMD cards in the new Mac Pros? Those can't do CUDA, right?
No, AMD GPUs don't have CUDA cores. CUDA is one of the primary things that attracts photo editors and other graphic artists, who work with software that benefits from CUDA acceleration, and obviously the Mac is more geared towards those users than it is towards gamers.
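To make that concrete, here's a toy CUDA kernel of the kind of data-parallel work those apps offload; a hypothetical standalone sketch (not code from any real photo editor), with one GPU thread per pixel:

Code:
// Brighten one 1080p grayscale frame on the GPU, one thread per pixel.
// Hypothetical demo -- not from any real application.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void brighten(unsigned char* px, int n, int amount) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // global pixel index
    if (i < n) {
        int v = px[i] + amount;
        px[i] = (unsigned char)(v > 255 ? 255 : v);  // clamp to 8-bit range
    }
}

int main() {
    const int n = 1920 * 1080;                 // one 1080p frame
    unsigned char* img = nullptr;
    cudaMallocManaged(&img, n);                // memory visible to CPU and GPU
    for (int i = 0; i < n; ++i) img[i] = 100;  // fill with mid-gray

    brighten<<<(n + 255) / 256, 256>>>(img, n, 40);  // 8100 blocks x 256 threads
    cudaDeviceSynchronize();                   // wait for the GPU to finish

    printf("pixel 0: %d\n", img[0]);           // expect 140
    cudaFree(img);
    return 0;
}

Apps like Premiere or Resolve obviously don't write it this simply, but the dependency is the same: no NVIDIA hardware and driver, no CUDA path.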
 
No, AMD GPUs don't have CUDA cores. CUDA is one of the primary things that attracts photo editors and other graphic artists, who work with software that benefits from CUDA acceleration, and obviously the Mac is more geared towards those users than it is towards gamers.
Damn it, Mick, I know what CUDA is for.
What I mean is, Apple doesn't seem to put CUDA-capable hardware in their computers anymore; both the Mac Pro and the MacBook Pro have AMD graphics, so why would they support Nvidia?
 
Damn it, Mick, I know what CUDA is for.
What I mean is, Apple doesn't seem to put CUDA-capable hardware in their computers anymore; both the Mac Pro and the MacBook Pro have AMD graphics, so why would they support Nvidia?
Just because they didn't put those in the latest iteration doesn't mean the competition wouldn't be good for the future. They renew their laptop lines much more frequently, and they also have the iMac line.

Nevertheless, I'm not reading these tea leaves as an indication they are abandoning NVIDIA. I'm reading this as yet another sign of what I have been predicting for a long time, and that's that Apple is preparing to eventually dissolve MacOS. All of the ARM and mobile stuff operates on Metal or OpenGL. I think they're still trying to figure out how to merge MacOS into their mobile operating systems without alienating the professionals who have been longtime users of the desktop OS. Apple will never put the NVIDIA chipsets into their computers, and NVIDIA has no presence in the onboard GPU space among mobile chipsets. Apple makes their own SoCs.

One of the first places to trim the fat is to rid themselves of the need to support NVIDIA. Then those software engineers can be refocused on fully integrating MacOS into iOS. Apple has always been about streamlining, and MacOS, quite simply, is archaic to the upcoming generation of professionals. They grew up on iPhones. Everything else is either iOS or a fork of iOS (like tvOS and watchOS). They're going to retain what is useful to the business world from MacOS, and discard what is left of it.
 
One of the first places to trim the fat is to rid themselves of the need to support NVIDIA. Then those software engineers can be refocused on fully integrating MacOS into iOS. Apple has always been about streamlining, and MacOS, quite simply, is archaic to the upcoming generation of professionals. They grew up on iPhones. Everything else is either iOS or a fork of iOS (like tvOS and watchOS). They're going to retain what is useful to the business world from MacOS, and discard what is left of it.
Yes. Exactly. They don't put Nvidia products in their new stuff, and don't want to waste resources supporting it. However, getting rid of MacOS is quite a task. We'll see how they proceed with that. I think they wanted to get rid of x86 on the desktop as well?
 
Yes. Exactly. They don't put Nvidia products in their new stuff, and don't want to waste resources supporting it. However, getting rid of MacOS is quite a task. We'll see how they proceed with that. I think they wanted to get rid of x86 on the desktop as well?
Eventually, yeah, I think that's where they want to go. ARM is more power efficient, and right now the rails are pointed towards greater parallel processing (i.e., more GPUs) if one doesn't require mobility. This is one of the reasons some are pausing at the drop in support for NVIDIA, I believe. As the article mentions, even setting CUDA aside, they had maintained OpenGL support for NVIDIA (since it also operates on that standard) for external GPUs, such as when a professional wants to connect to an external GPU rack over the Thunderbolt port, and with little effort, since NVIDIA does so much on its side to support the OpenGL standard. This ends that.

Now you can no longer use a MacBook Pro as a hub to control and execute rendering jobs using one of these powerful external GPU racks. This may affect educational and research sectors, not just graphics design.
 
AMD Ryzen 7 3800X vs. 3700X: What's the Difference?

WCCFTech said:
Silicon Lottery recently released some Ryzen 3000 binning data and this suggests the better quality silicon has been reserved for the 3800X. The top 20% of all 3800X processors tested passed their 4.3 GHz AVX2 stress test, whereas the top 21% of all 3700X processors were only stable at 4.15 GHz. Also, all 3800X processors passed the test at 4.2 GHz, while 3700X processors were only good at 4.05 GHz, meaning the 3800X has about 150 MHz more headroom when it comes to overclocking.

Silicon Lottery AMD Ryzen 3000 Binning Results


In other words, the average 3800X should overclock better than the best 3700X processors, but it's still a minor 6% frequency difference we’re talking about between the absolute worst 3700X and the absolute best 3800X per their testing. For more casual overclockers like us the difference will likely be even smaller. Our 3700X appears stable in our in-house stress test and to date hasn’t crashed once at 4.3 GHz. This is the same frequency limit for the retail 3800X we got. As for the TDP, that’s confusing to say the least, but we'll go over some performance numbers first and then we’ll discuss what we think is going on.
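A quick sanity check of those quoted figures (my own arithmetic, using Silicon Lottery's clocks):

Code:
// Reproducing the headroom gaps from the binning data quoted above.
#include <cstdio>

int main() {
    double best_3800x = 4.30, all_3800x = 4.20;  // GHz, AVX2-stable
    double best_3700x = 4.15, all_3700x = 4.05;

    printf("top-bin gap: %.0f MHz\n", (best_3800x - best_3700x) * 1000);  // 150
    printf("floor gap:   %.0f MHz\n", (all_3800x - all_3700x) * 1000);    // 150
    // the "6%" claim: absolute best 3800X vs absolute worst 3700X
    printf("max spread:  %.1f%%\n", (best_3800x / all_3700x - 1) * 100);  // ~6.2
    return 0;
}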
Silicon Lottery finally released actual binning data on these chips (it has already disappeared from their website, btw, but WCCFTech recorded it above). Practically speaking, with a 4.3 GHz ceiling observed even among the best fifth of 3800X chips, even if you're lucky enough to get one, you still won't eclipse the PBO+Max performance of the same chip in games, just like all the other Matisse chips, and one wouldn't even expect the best-overclocking 3800X chips to beat the 3700X in most games (due to the 4.3-4.4 GHz turbo spikes on the 3700X's top cores).

In fact, that was even observed in stress tests:
Enabling PBO still saw the 3800X run around 2-3 degrees hotter but now it’s only clocking 25-50 MHz higher than the 3700X, so if you want to turn your 3700X into a 3800X, just enable PBO.
Benchmark images (3800X gain over 3700X):
  • ACO.png: +0.98%
  • BFV.png: +0.64%
  • Division.png: +0.63%
  • TR.png: +1.96%
It consumed 8.5% more power to achieve this ~1% improvement in game performance.
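Put that in perf-per-watt terms (again my arithmetic, from the numbers above):

Code:
// Perf-per-watt of the 3800X's gain: ~1% more fps for 8.5% more power.
#include <cstdio>

int main() {
    double fps_gain   = 1.0105;  // average of the four game deltas above
    double power_cost = 1.085;   // +8.5% package power
    printf("perf/watt vs 3700X: %+.1f%%\n", (fps_gain / power_cost - 1) * 100);
    // prints about -6.9%: you trade efficiency for a rounding-error fps bump
    return 0;
}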

CLIFFS: if you're very, very stupid, you'll buy or recommend this CPU over the R7-3700X.
 
One case I'd like to highlight from my reading today. It's the update/successor to one of the coolest concept lines that has ever existed: Lian Li's briefcase design TU series (PC-TU100, PC-TU200, PC-TU300):
Bit-Tech: Lian Li TU150 Review
eTeknix: Lian Li TU150 Mini-ITX Case Review
TechPowerUp: Lian Li TU150 Review

Tweaktown: Lian Li TU150 Mini-ITX Chassis Review


Lian-Li PC-TU150
$109


TU150-WX / TU150-WA Specifications
  • Dimensions: (W) 203mm x (H) 312mm x (D) 375mm
  • Color: Black, Silver
  • Material: 1.5mm aluminum exterior (top/front/right side panel) / 1.0mm SPCC interior; 3.0mm tempered glass left side panel
  • Weight: 3.6kg
  • Expansion slot: 3
  • Drives: 2 x 2.5” SSD or 1 x 2.5” SSD + 1 x 3.5” HDD
  • I/O ports: 2 x USB 3.0, 1 x USB 3.1 Type-C, Audio ports (located on the top panel)
  • MB type: Mini-ITX / Mini-DTX
  • Radiator: 1 x 120mm (rear)
  • GPU length: 320mm
  • CPU clearance: 165mm
  • System Fan: 1 x 120mm (front) + 1 x 120mm (rear) + 2 x 120mm (base)
  • PSU: SFX or SFX-L

 
So you hot rodders spending like 6 grand on a gaming rig, is it really any better than a $300 Xbox?
 
So you hot rodders spending like 6 grand on a gaming rig, is it really any better than a $300 Xbox?
Yes AND no.

I have both. To me, if I've been at work on my computer all day, I don't want to "relax" with a game while sitting at my damn computer in my office.

I'll plop on the couch, turn on the Xbox, and play.

So I do want both. You can do SO MUCH more with a high-end PC, but as far as a "pure gaming" experience goes, if I could only have one, I'd keep the Xbox and my work laptop and ditch the PC.
 
So you hot rodders spending like 6 grand on a gaming rig, is it really any better than a $300 Xbox?
At this point I think most would tell you the greatest advantage has nothing to do with graphics or hardware, but with the ability to play pretty much any game ever made, including specifically the most loved ones, outside of the few dozen locked behind Sony's PS4 and Nintendo's Switch/Wii U exclusivity walls.

Apart from the fact that Windows hosts the best open-source emulators for past systems nearly across the board, notice how all the gaming services scattered across platforms have piled up on PC, because it has always attempted to remain a more neutral platform:
  • Microsoft Xbox Game Pass including Play Anywhere
    132 game depot (73 games compatible w/Play Anywhere)
  • Sony Playstation Now
    814 game depot
  • EA Origin Access
    230 game depot + DLC/Expansions/Bundles
  • NVIDIA GeForce Now
    542 games compatible with this free service (for NVIDIA GPU owners)
  • Discord Nitro
    97 game depot
  • Twitch Prime
    4-5 gifted games per month & 15+ gifted during Amazon Prime season
  • Humble Bundle Monthly
    Usually 8-12 games delivered via code monthly; $1,200+ in value per year
  • Utomik
    1000 game depot
  • Vortex
    44 game depot; 93 total compatible with service [cloud-streaming only]
  • Jump
    118 game depot [cloud-streaming only]
  • Ubisoft UPlay+ (upcoming)
    116 game depot

Volume is telling. Counting only Windows 7 and later, Steam has 32,132 games and 21,179 DLCs. Before January 1st, 2019, when they still listed Windows XP and Windows Vista games, there were already over 63,000 total games at that point. Those can still be run in compatibility mode if you acquire them elsewhere.

In addition to that, the PC has the most competitive marketplace, between Steam, Epic, EA, Uplay, GoG, and Green Man Gaming. The sales crush console sales. It's also far more favorable to pirates (e.g., you don't have to buy a second machine to protect your accounts).
 
