Intel’s software visualization efforts have delivered exceptional performance using Embree and Intel CPUs over the last few years. Pressing its case further on the role of CPUs in exascale data visualization, Intel is introducing a Software Defined Visualization (SDVis) appliance that it claims offers superior performance and lower cost than GPU-based solutions. Intel will be demoing the machine on the floor at ISC and SIGGRAPH.
How did we get here?
A few years ago, some huge visualization solutions started emerging that claimed better results on CPUs than on GPUs. In 2015, HPCwire published a data-backed opinion piece on this trend by Jim Jeffers of Intel, titled “Contrary View: CPUs Sometimes Best for Big Data Visualization.”
In a talk at the Intel Developer Forum, Jeffers claimed that a single Intel Xeon processor E7 v3 workstation was able to render a 12-billion-particle, 450 GB cosmology dataset at seven frames per second. For that input data set, he said, it would take more than 75 GPUs to perform the same visualization task. And benchmarks presented at the recent IXPUG conference continued the theme, demonstrating better results on CPUs than on GPUs in a number of different examples.
In the past couple of years, the adoption of Software Defined Visualization has expanded – a notable accomplishment being that the “Best Visualization and Data Analytics Showcase award” at SC’16 was won by the Los Alamos Data Science at Scale Team. You can see the LANL team’s award-winning asteroid impact visualization online (very cool).
In situ visualization, the ability to visualize data where it is created, has been cited as an imperative for exascale problems, including HPC and HPDA/ML/AI workloads, because the data sizes at such scale are simply too large to be casually moved to a separate machine for visualization.
In situ visualization and Software Defined Visualization are hot topics. There was a “Software-Defined Visualization Workshop” held recently at TACC, home of the Stampede supercomputers, and there will be a workshop at ISC titled “In Situ Visualization: Introduction & Applications.” If 2016 was any indication, it seems a sure bet there will be Software Defined Visualization and In Situ Visualization discussions at SC17 this fall, including the In Situ Infrastructures for Enabling Extreme-scale Analysis and Visualization Workshop on Sunday, November 12, in Denver (call for papers: submissions due August 1).
Disruptors? OSPRay and OpenSWR
Intel is behind two major software efforts aimed at enabling CPUs to excel at the visualization of exascale data:
- Highly-optimized CPU-based rendering software: the open-source OSPRay ray tracing library
- High-performance OpenSWR raster library in Mesa3d, which has been integrated into popular visualization tools like Kitware’s ParaView and VTK, as well as the community tool VisIt
Intel cites the transition from OpenGL-targeted hardware rasterization to CPU-based rendering as enabling algorithm designers to exploit large-memory visualization nodes and create logarithmic-runtime algorithms. The assertion is that this is a significant advantage for CPU-based solutions.
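The logarithmic-runtime claim rests on a well-known property of ray tracing: with an acceleration structure such as a bounding volume hierarchy (BVH), each ray is resolved in roughly O(log N) traversal steps in the number of primitives, whereas brute-force approaches touch every primitive. The following is a toy sketch of that scaling – a hypothetical illustration only, not OSPRay or Embree code – using a balanced BVH over 1-D intervals:

```python
# Toy illustration of logarithmic query cost with a bounding volume
# hierarchy (BVH), versus linear cost for a brute-force scan.
# Hypothetical sketch; not actual OSPRay/Embree code.

def build_bvh(intervals):
    """Recursively build a balanced BVH over sorted (lo, hi) intervals."""
    if len(intervals) == 1:
        return {"bounds": intervals[0], "leaf": intervals[0]}
    mid = len(intervals) // 2
    left = build_bvh(intervals[:mid])
    right = build_bvh(intervals[mid:])
    bounds = (min(left["bounds"][0], right["bounds"][0]),
              max(left["bounds"][1], right["bounds"][1]))
    return {"bounds": bounds, "left": left, "right": right}

def query(node, x, visited):
    """Find an interval containing x, counting BVH nodes visited."""
    visited[0] += 1
    lo, hi = node["bounds"]
    if x < lo or x > hi:
        return None                      # prune this whole subtree
    if "leaf" in node:
        return node["leaf"]
    return query(node["left"], x, visited) or query(node["right"], x, visited)

n = 1 << 16                              # 65,536 disjoint primitives
intervals = [(i, i + 0.5) for i in range(n)]
root = build_bvh(intervals)

visited = [0]
hit = query(root, 12345.25, visited)     # lands in interval (12345, 12345.5)
# A brute-force scan would examine all 65,536 primitives; the BVH
# query visits only on the order of 2*log2(n) nodes.
```

With 65,536 primitives the query touches a few dozen nodes rather than all of them; that gap widens as data grows, which is the scaling advantage the argument attributes to ray tracing on large-memory CPU nodes.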
Intel’s SDVis Appliance – their bid for a “dream machine” for visualization work
Intel is offering their vision of a “dream machine” for doing software defined visualization with a special eye towards in situ visualization development. Intel says it can support visualization of data sets up to 1.5TB in size, is designed to address the needs of the scientific visualization and professional rendering markets, and offers higher performance and a lower cost than competing GPU-based solutions.
Gautam Shah, President of Colfax International, says that all the pieces are ready to go now. Colfax builds to order, including installing the software and doing a test burn-in of the machine – a process that takes about four weeks order-to-ship. Not bad for a 14U “dream machine.” Pricing starts at about $79,000 per ready-to-use machine.
Demos at ISC and SIGGRAPH
Intel is demoing the machine at ISC featuring live animation of the LANL/TACC “Asteroid/Earth Collision” data set (see a couple of photos below). Jim Jeffers of Intel hopes people will drop by at ISC this coming week, or at SIGGRAPH later this year, to see the machine’s capabilities firsthand. Jim said, “It’s exciting to talk about our machine and SDVis, but seeing is really believing. We are able to support big data use on HPC clusters without the memory limits and cost of GPU-based solutions.”
Summary
We all know that moving data around is costly and worth avoiding, and this applies to visualization work as well. The movements toward “in situ visualization” and “software-defined visualization” combine in interesting ways, and Intel is offering its take on how to solve the problem in a very flexible manner, complete with the software/hardware combination to make it real.
For More Information
I recommend the following sites for more detailed information:
- SDVis Appliance website – includes detailed data sheet and information on how to order
- Two articles previously published in HPCwire:
- Workshop at ISC titled “In Situ Visualization: Introduction & Applications” – website: http://www.woiv.org/speakers.html
- ISAV 2017: In Situ Infrastructures for Enabling Extreme-scale Analysis and Visualization
- Embree is a collection of high-performance ray tracing kernels.
- Intel’s Software Defined Visualization website
- Presentations and videos (still being populated) from the recent “Software-Defined Visualization Workshop @ TACC (May 22 – 25, 2017)”
- Intel paper “High Fidelity and High Performance Visualization”