
Measuring Up: SPECapc Releases New NX 9 and 10 Benchmarks

 
July 7th, 2016 by Jeff Rowe

Like them or not, PC benchmarks let you evaluate performance, identify potential bottlenecks, and choose effective upgrades to both hardware and software. Too many users still assume that system performance is simply a matter of CPU frequency or memory capacity, and that dropping in a faster CPU or more memory will automatically yield significant performance improvements. Unfortunately, this is not always the case.

While CPU and memory upgrades can help in some instances, it often makes more sense to upgrade the storage subsystem or the graphics board if you’re looking for perceptible improvement in system responsiveness or performance. If you run a series of benchmarks and identify the components holding your system back, you’ll be able to choose the most effective upgrade for your current system – or at least determine which components make the most sense in a new system suited to your particular needs.

Today, several different types of benchmarks are available for evaluating a system’s performance. Some use synthetic tests that don’t necessarily reflect real-world usage, while others employ scripted tests that rely on actual applications and simulated real-world workloads. Some benchmarks assess the performance of a single component, while others measure total system performance. To best gauge the overall performance of a PC, consider running some combination of all of these different types of benchmarks, based on your usage patterns.

[Image: Radial engine designed with Siemens NX]

For measuring performance in the world of MCAD hardware and software, there are two essential benchmarks – SPECapc and SPECviewperf from SPEC/GWPG (Graphics & Workstation Performance Group), a non-profit organization that sponsors the development of standardized, application-based benchmarks for the vendor and user communities. The two benchmark suites have different purposes and different types of users.

SPEC/GWPG’s Application Performance Characterization Group (SPECapc) was formed in 1997. The group addresses graphics and workstation performance evaluation based on actual software applications, and provides a broad-ranging set of standardized benchmarks for professional-level graphics and workstation applications.

Current SPECapc members include AMD, Dell, Fujitsu, HP, Intel, Lenovo, NVIDIA and VMware. SPECapc is part of the Standard Performance Evaluation Corp. (SPEC), a non-profit corporation formed to establish, maintain and endorse a standardized set of relevant benchmarks that can be applied to the newest generation of computers.

The SPECapc project provides methods for evaluating and comparing the performance of computers across software vendor platforms and configurations. SPECapc’s benchmarks are application-based, representative of end-user needs, and measure total system performance.

SPECapc benchmarks are designed to measure, as much as possible, total performance for graphics and workstation applications. They typically include tests for graphics, I/O and CPU performance, and they require that the user has a license for the application on which they are based. SPECapc benchmarks are based on large models and complex interactions, and tend to take a long time to run.

SPECapc benchmarks also require the correct version and build of the application software in order to run. They are continuously updated, and primarily represent CAD/CAM, digital media, and visualization applications.

The group’s SPECviewperf benchmark is the most popular standardized software worldwide for evaluating performance based on professional-level CAD/CAM, digital content creation, and visualization applications.

Viewsets, the benchmarks that run on SPECviewperf, exercise only the graphics functionality of the application. Because it strips away application overhead, SPECviewperf allows direct performance comparisons of graphics hardware. SPECviewperf does not require users to have licenses of the applications on which its viewsets are based. This makes it more accessible to a wider range of users. SPECviewperf is also easier to use and faster to run than SPECapc benchmarks.

This week, the SPECapc project group released new performance evaluation software for systems running Siemens NX 9.0 and NX 10.0 CAD/CAM applications. The new benchmark can be run with Siemens NX 9.0 or NX 10.0, although results from the two different application versions are not comparable. Users need an NX 9.0 or NX 10.0 license that includes the NX Shape Studio or equivalent bundle for Advanced Studio rendering. The benchmark is designed to run on Microsoft Windows 7 64-bit platforms.

SPECapc for Siemens NX 9/10 is a collaborative effort between Siemens and SPECapc.

[Video: Siemens NX Fluorescent Bulb Tutorial]

Seven models representing common use cases are included in SPECapc for NX 9/10. The benchmark executes graphics tests that include rotation, pan, zoom and clipping for each model. CPU tests within the benchmark measure performance for field of view (FOV) and feature regeneration operations. Anti-aliasing can be enabled or disabled to allow users to assess performance differences between the two modes.
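The test pattern described above can be made concrete with a short harness sketch. This is a hypothetical illustration in Python, not SPEC's actual code: the function names are invented, and the view operations are no-op stand-ins, since scripting NX itself is outside the scope of this sketch.

```python
# A minimal, hypothetical harness for the graphics-test pattern
# described above: time rotate/pan/zoom/clip operations per model,
# with anti-aliasing on or off. The operations are no-op stand-ins;
# a real harness would script the NX application itself.
import time

def run_view_tests(models, operations, antialias):
    """Return {(model, op_name, antialias): elapsed_seconds}."""
    results = {}
    for model in models:
        for name, op in operations.items():
            start = time.perf_counter()
            op(model, antialias)  # stand-in for driving the CAD viewport
            results[(model, name, antialias)] = time.perf_counter() - start
    return results

# No-op stand-ins for the four view operations named in the article.
ops = {name: (lambda model, aa: None)
       for name in ("rotate", "pan", "zoom", "clip")}
timings = run_view_tests(["radial_engine"], ops, antialias=True)
```

Running the same set with `antialias=False` and comparing the two timing dictionaries is the kind of on/off comparison the benchmark enables.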

The benchmark produces composite GPU and CPU scores, as well as sub-scores for each of the display modes tested during a run. Display modes include shaded, shaded with edges, wireframe, wireframe with hidden edges, and wireframe with dim edges, as well as the application-specific Advanced Studio, True Studio and Face Analysis modes. Scoring is weighted based on how NX 9.0 and NX 10.0 users exercise functionality in their work.
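The idea of weighting sub-scores by real-world usage can be sketched in a few lines. This is a hypothetical illustration, not SPEC's published formula: the sub-score values and weights below are invented, and the weighted geometric mean is only an assumed combining rule.

```python
# Hypothetical composite scoring: combine per-display-mode sub-scores
# using a weighted geometric mean. Mode names come from the article;
# the weights and numbers are invented for illustration.
import math

def composite_score(sub_scores, weights):
    """Weighted geometric mean of per-display-mode sub-scores."""
    total_weight = sum(weights[m] for m in sub_scores)
    log_sum = sum(weights[m] * math.log(s) for m, s in sub_scores.items())
    return math.exp(log_sum / total_weight)

# Example sub-scores (performance normalized to a reference machine).
gpu_sub_scores = {
    "shaded": 2.10,
    "shaded_with_edges": 1.85,
    "wireframe": 2.40,
    "advanced_studio": 1.20,
}
# Assumed weights reflecting how heavily each mode is exercised.
gpu_weights = {
    "shaded": 0.40,
    "shaded_with_edges": 0.25,
    "wireframe": 0.15,
    "advanced_studio": 0.20,
}

score = composite_score(gpu_sub_scores, gpu_weights)
# The composite always lands between the lowest and highest sub-score,
# pulled toward the modes with the largest weights.
```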

“We consider this the most comprehensive performance evaluation software available for systems running Siemens NX 9 or 10,” says Trey Morton, SPECapc chair. “It represents all the graphics display modes used in the application and closely simulates user interaction when running NX 9 or 10 in the real world.”

SPECapc for NX 9/10 debuts a new user interface and results reporting format that will be the basis for all future SPECapc benchmarks. The new GUI automatically records system configuration information, allows the user to select specific parameters for the benchmark run, and displays results when testing is completed.

SPECapc for NX 9/10 is available for immediate download on the SPEC website under a two-tiered pricing structure: free for non-commercial users and $2,500 for commercial entities. Commercial entities are defined as organizations using the benchmark for the purpose of marketing, developing, testing, consulting for and/or selling computers, computer services, graphics devices, drivers or other systems.

While its legacy is benchmarking software and physical computing hardware, SPEC’s first benchmark suite to measure cloud performance is SPEC Cloud_IaaS 2016, targeted at cloud providers, cloud consumers, hardware vendors, virtualization software vendors, and application software vendors. The benchmark addresses the performance of infrastructure-as-a-service (IaaS) public or private cloud platforms. It is designed to stress provisioning, as well as runtime aspects of a cloud, using I/O- and CPU-intensive cloud computing workloads. SPEC selected social media NoSQL database transactions and K-Means clustering using map/reduce as two significant and representative workload types within cloud computing.

The key benchmark metrics are:

  • Scalability, which measures the total amount of work performed by application instances running in a cloud.
  • Elasticity, which measures whether the work performed by application instances scales linearly in a cloud compared to the performance of application instances during the baseline phase. Elasticity is expressed as a percentage.
  • Mean instance provisioning time, which measures the interval between the instance provisioning request and connectivity to port 22 on the instance, averaged across all valid application instances.
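The last two metrics are simple enough to sketch. The following is a hypothetical illustration, not SPEC's reference code: the function names, polling interval, and timeout are assumptions made for this sketch.

```python
# Hypothetical sketches of two of the metrics listed above. The
# measurement details (polling interval, timeout) and function names
# are assumptions for illustration, not SPEC's reference code.
import socket
import time

def wait_for_port(host, port=22, timeout=300.0, interval=2.0):
    """Return seconds elapsed until host:port accepts a TCP connection.

    Models "mean instance provisioning time": the clock starts at the
    provisioning request and stops when port 22 (SSH) is reachable.
    """
    start = time.monotonic()
    deadline = start + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return time.monotonic() - start
        except OSError:
            time.sleep(interval)
    raise TimeoutError(f"{host}:{port} not reachable within {timeout}s")

def mean_provisioning_time(samples):
    """Average provisioning time across valid application instances."""
    return sum(samples) / len(samples)

def elasticity_percent(baseline_rate, n_instances, measured_rate):
    """Measured aggregate work as a percentage of ideal linear scaling
    from the single-instance baseline phase."""
    return measured_rate / (baseline_rate * n_instances) * 100.0
```

For example, if one instance does 100 units of work per second in the baseline phase and four instances together measure 360, elasticity is 90 percent of ideal linear scaling.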

While some in the MCAD community dismiss the relevance of benchmarks in today’s diverse computing environments, the ability to objectively measure and compare results against pre-defined parameters still matters wherever performance must be quantified, and especially in highly demanding computing environments.

