The W-34 series Xeons have a ton of PCIe lanes and memory channels. That’s what you’re paying for.
The thing about codec support is that you essentially have to add dedicated circuits used purely for encoding and decoding video in that one codec. Each addition takes up transistors and increases the complexity of the chip.
XMX cores are mostly used for XeSS and other AI inferencing tasks, as far as I understand. While it might be feasible to create an AI model that encodes video to very small file sizes, it would likely consume a lot of power in the process. For video encoding at relatively high bitrates, a dedicated ASIC will most likely consume far less power.
XeSS is already a worthy competitor/answer to DLSS (in contrast to AMD’s FSR2), so adding XMX cores to accelerate XeSS alone can be worth it. I also suspect Intel GPUs use the XMX cores for raytracing denoising.
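For a sense of what “AI inferencing on XMX” looks like in practice, here’s a minimal sketch using PyTorch’s XPU backend (this assumes a PyTorch build with Intel XPU support; the tensor shapes and dtype are purely illustrative). A half-precision GEMM like this is the kind of op that can get dispatched onto the XMX units on an Arc GPU:

```python
import torch

# Minimal sketch, not a benchmark. "xpu" is PyTorch's device name for Intel GPUs;
# it requires a build with XPU support (e.g. PyTorch 2.4+ or intel-extension-for-pytorch).
# Without that, this just falls back to the CPU.
device = "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"
dtype = torch.float16 if device == "xpu" else torch.float32  # half precision is what XMX is built for

a = torch.randn(1024, 1024, dtype=dtype, device=device)
b = torch.randn(1024, 1024, dtype=dtype, device=device)
c = a @ b  # on an Arc GPU this matrix multiply can be lowered onto the XMX engines
print(c.shape, "on", device)
```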
There are already encoders and decoders for H.264/H.265/VP9/AV1 on Intel GPUs, and these are codec-specific. The article this post links to points to Intel increasing the GPU’s capabilities, which usually comes with an increase in encoding/decoding performance and efficiency.
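To make the “codec-specific fixed-function blocks” point concrete, this is roughly how you hand an AV1 encode to that hardware via Quick Sync from a script. The filenames and bitrate are placeholders, and the av1_qsv encoder only exists if your ffmpeg build has QSV support and the GPU actually has an AV1 encode block (Arc / recent Xe):

```python
import subprocess

# Sketch only: "input.mp4" / "output.mkv" are placeholder names.
cmd = [
    "ffmpeg", "-y",
    "-i", "input.mp4",     # source clip
    "-c:v", "av1_qsv",     # hand the encode to the hardware AV1 block via Quick Sync
    "-b:v", "6M",          # target bitrate, pick to taste
    "output.mkv",
]
subprocess.run(cmd, check=True)
```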
Does anyone really buy a 13700k and then game on its iGPU?
The 13700K actually has a considerably smaller iGPU than a 1360P, so no.
Let’s assume you get a 14900K then, overclock it to 6 GHz on the P-cores using direct die cooling, and throw on 8000 MT/s DDR5 with a Z790 Apex Encore for good measure. You’re now somewhere around 30% faster in games than your previous setup, and GPU utilization will occasionally hit 90% instead of 70%.
All for the neat sum of roughly 2000 USD.
If you’re purely gaming, there’s no reason to even consider the 14700K unless it’s bundled for a lower price than a 7800X3D setup.
If you actually do so much multithreaded work that you really notice the benefits of a 14700K over a 7800X3D, you might as well go to a 14900K, because every minute saved helps.
The lowest thermal throttling temp you can set on a 14900K is 62C, so you will have to settle for that.
AMD allows you to set the boost temperature target to 60C, if I remember correctly, and since the V/F curve is a lot flatter on AMD CPUs in general, the performance deficit from doing so is significantly smaller.