It is certainly provocative to say that CPUs will dominate any part of visualization, but the data supports saying it with confidence. The primary drivers are (1) growing data sizes, (2) the need to minimize data movement, and (3) the ability to switch to O(n log n) algorithms. Couple these with the ultra-hot topic of "Software Defined Visualization", which makes all three possible, and you have a lot to consider about how the world is changing.
Of course, what is "high end" today often becomes commonplace over time... so this trend may affect us all eventually. It's at least worth understanding the elements at play.
This week (June 19-21) at ISC'17 in Germany, Intel is demoing (and selling) its vision of a "dream machine" for software defined visualization, with a special eye toward in situ visualization development. Jim Jeffers and his Intel colleagues are demonstrating it at ISC'17, and they will be at SIGGRAPH'17 as well. The "dream machine" can support visualization of data sets up to 1.5TB in size, and it was designed to address the needs of the scientific visualization and professional rendering markets.
Photo credit (above): Asteroid Deep Water Impact Analysis; Data Courtesy: John Patchett, Galen Glisner per Los Alamos National Laboratory tech report LA-UR-17-21595. Visualization: Carson Brownlee, Intel.
With Jim's help, I wrote an article with more information about how CPUs now offer higher performance at lower cost than competing GPU-based solutions for the largest visualization tasks. The full article is posted at the TechEnablement site.
In the full article, aside from my writing about the trend, I provide links to technical papers that show this move toward CPUs as the preferred solution for visualizing truly large data, as well as links to conferences and links about the "visualization dream machine" (my description, not what Intel calls it officially).