At Siggraph 2013, we met with several members of the SPEC graphics performance committee. (SPEC is short for "Standard Performance Evaluation Corporation.")
The SPEC organization produces a number of benchmarks; the latest is SPECviewperf 12. It is a benchmark used by hardware vendors, and by end users purchasing and upgrading workstations.
The process of creating viewsets is very involved, because they work at the graphics-driver level. SPEC went in a new direction with version 12: a timing framework that loads executable programs. For example, in volume rendering, the CPU calculates texture coordinates for every frame displayed on the screen. In addition, SPEC is no longer tied to OpenGL, and so can run DirectX applications. Overall, the benchmark is more general, they tell us.
Application Benchmarks for CAD
Among CAD systems, SPECviewperf supports Catia V6, Creo 2.0, SolidWorks, and NX. Users don't need the actual CAD software; instead, the benchmark captures the graphics calls made by the program, because SPEC wants the benchmarks to be independent of licensing, yet representative of the workload. This way, providers of graphics systems can optimize their drivers for each app. For instance, Nvidia uses this to ensure future GPUs don't have holes in them that slow down performance.
The benchmark mostly measures graphics performance, showing how much of the load is on the CPU and how much is on the GPU. Feedback is given to application programmers when too much of the graphics load falls on the CPU.
Released today at Siggraph was the updated benchmark for NX 8.5. (The previous one was for NX 6, so there was some catching up to do.) The benchmark committee is working to produce benchmarks more quickly, what with software vendors and their annual upgrades.
Because SPEC is independent, large CAD vendors have come to rely on it to find where their graphics code could be optimized. By using SPEC, they don't have to write their own benchmarks, and so end up using one that is more representative of real-world performance.
"Why only MCAD?" we ask. Other types of CAD software are not left out due to a lack of desire, but for practicality: the SPEC members haven't gotten around to AEC, etc due to a lack of time. The new version of the spec, however, might make it possible to measure CAE performance. Perhaps the solution is use sub committees, who look at areas such as energy, oil and gas, and architecture.
Workstation Benchmarks in Beta
WPC (short for "workstation performance characterization") is scheduled to be released this fall. It handles CPU computation workloads, I/O, graphics, and combinations thereof. (I/O is short for "input and output," and includes memory and disk.)
"What about the Windows Experience Index?" we ask. It is actually pretty good, the guys tell us, because it does low-level analysis of the different systems -- graphics, disk I/O, and so on. The problem, however, is that Microsoft has not updated it in quite a while. As a result, all workstations tend to get the top score of 7.9 in Windows 7, and 9.9 in Windows 8. (Later, we found that Microsoft has removed Windows Experience Index from Windows v8.1.)
http://www.spec.org