If you’re talking about gaming, I can’t imagine a case where one would bottleneck the other. Games known to clobber a GPU don’t even seem to care what CPU you have (Alan Wake 2 eats my 3090 for breakfast and barely even knows I have a 14900k in there).
Likewise for games that are CPU-intensive. I don’t know many offhand, but I’ve heard Baldur’s Gate 3 and several 4X games eat CPUs. The GPU requirements on those games are generally low, though.
tldr: I don’t think bottlenecking is a really big deal anymore in gaming. Games tend to eat one and ignore the other.
Anecdotally, same for professional data analysis applications (when we’re not using a VM, anyway). Most applications I’m used to just devour my GPU and leave my CPU alone.
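If you want to actually check which side your workload leans on instead of guessing, here’s a rough sketch (my addition, not gospel): sample CPU and GPU utilization side by side while the game or app runs. Assumes an NVIDIA card with `nvidia-smi` on your PATH and `pip install psutil`.

```python
# Sample CPU and GPU utilization side by side to see which one is saturated.
# Assumes an NVIDIA GPU with nvidia-smi on PATH, and psutil installed.
import subprocess
import psutil

def sample_utilization():
    # Average CPU load across all cores over a 1-second window.
    cpu = psutil.cpu_percent(interval=1.0)
    # Query current GPU utilization as a bare number (e.g. "97").
    gpu = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    ).strip()
    return cpu, int(gpu)

if __name__ == "__main__":
    for _ in range(10):
        cpu, gpu = sample_utilization()
        print(f"CPU: {cpu:5.1f}%   GPU: {gpu:3d}%")
```

One caveat: the CPU number is averaged over all cores, so a game pegging a single thread can look "idle" here while still being CPU-bound. Per-core numbers (`psutil.cpu_percent(percpu=True)`) tell a more honest story for games.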