 The Interoperability Advisor

Posts Tagged ‘3D MBD’

Processes & Metrics – Missing Pieces of the ESI Puzzle (Part 2 of 5)

Wednesday, December 22nd, 2010

“The Case for Automated Remastering” is a five-part series exploring a changing paradigm within Engineering Systems Interoperability (ESI). This latest post explores how processes and metrics can be applied to ESI.

Part 2: Processes & Metrics – Missing Pieces of the ESI Puzzle

When it comes to architecting processes for PLM solutions, many manufacturers are drowning in a sea of technical and administrative due diligence.  However, organizations often neglect to apply this same type of due diligence to architecting and implementing Engineering Systems Interoperability (ESI) solutions. 

The situation offers tremendous upside for managers who oversee process and/or quality controls.  In some cases, the internal rate of return on an ESI project can be two to ten times the investment.  The problem is that some manufacturers treat ESI as a single problem point within the value chain when, in essence, these problems can (and will) propagate at multiple points within the value chain. 

With the model-based enterprise (MBE), the more data that is populated in the upstream CAD model, the more that can go wrong in the downstream systems.  In cases where these models are constantly being manipulated and reused (but not in a consistent way), process controls, operational metrics, and technical oversight are the primary means of enforcing MBD policies.  Automated remastering can help solve these problems by providing processes, metrics, and activity tracking.

The Automated Remastering Process – A Holistic View

Figure 2.1 illustrates an end-to-end holistic view of automated remastering to help engineering IT managers create the overall project vision. 

Using an automated remastering process offers three distinct tactical advantages: 1) it scales to accommodate multiple programs, the enterprise, and/or the supply chain, 2) it is portable to other programs and products, and 3) it includes API “hooks” into PLM software applications and workflows.

The process also dovetails into three financial advantages:  1) it incorporates and integrates your team’s existing ESI technologies, 2) it uses a hybrid model that balances automation and manual intervention to offset labor costs, and 3) it leverages Lean building blocks to justify, measure, and oversee the process.

Figure 2.1 – Automated Remastering Top-Level Process


Each point within the process above represents a separate workflow that can be modified to fit within your PLM environment and/or IT infrastructure. Figure 2.2 (below) represents a simplified, project-specific automated remastering workflow for a 5,000-part dataset.

Figure 2.2 – Automated Remastering Sample Workflow


Forming the Basis for an Internal Rate of Return

As with any ESI process, regardless of domain (CAD, CAE, CAM, PLM, MRO, etc.), a manager’s ability to measure the internal rate of return, and subsequently realize those results, is critical.  In many purchasing situations, finance recognizes Lean methodologies as a way to help justify capital expenditures. 

In the example below, we illustrate how process efficiencies and labor waste may be captured in a visual management model depicting value-added and non-value-added time.  For this particular project, labor hours were assigned to each bar on the chart; the subsequent report included detailed workflows.

Figure 2.3 – Depicting ESI Value-Added (VA) & Non-Value Added (NVA) Time

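As a rough illustration of the bookkeeping behind a chart like Figure 2.3, the sketch below tallies value-added (VA) and non-value-added (NVA) labor hours per process step. The step names and hour figures are hypothetical, not drawn from the project above:

```python
# Sketch: tally value-added (VA) vs. non-value-added (NVA) labor hours
# for a remastering project. Step names and hours are hypothetical.
steps = [
    ("Qualify dataset",      "VA",  40),
    ("Automated translate",  "VA",  12),
    ("Rework failed models", "NVA", 65),
    ("Manual verification",  "NVA", 30),
]

def va_nva_summary(steps):
    """Sum hours by category and compute the NVA share of total time."""
    totals = {"VA": 0, "NVA": 0}
    for name, kind, hours in steps:
        totals[kind] += hours
    total = totals["VA"] + totals["NVA"]
    totals["NVA_pct"] = round(100 * totals["NVA"] / total, 1)
    return totals

print(va_nva_summary(steps))
```

The NVA percentage is the figure Lean reviews tend to focus on, since it quantifies the labor waste that automation is meant to reclaim.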

Collecting ESI Metrics for Cost-Benefit Analyses

The process of automated remastering includes a qualification workflow to help managers analyze datasets, predict potential trouble spots, mitigate risk, and prioritize spending.  The technology within this workflow provides both top-level and in-depth analyses for MBD data moving between systems, including 3D models and assemblies, 2D drawings, and Product Manufacturing Information (PMI).

Figure 2.4 – Sample Metrics for Cost-Benefit Analyses

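A qualification roll-up of this kind can be sketched in a few lines. The model records, field names, and risk thresholds below are hypothetical stand-ins for real qualification output, not the actual metrics behind Figure 2.4:

```python
# Sketch: roll up qualification metrics for a dataset to flag likely
# trouble spots before remastering. Fields and thresholds are hypothetical.
from collections import Counter

models = [
    {"name": "bracket_001", "pmi_count": 42,  "drawing": True,  "translate_errors": 0},
    {"name": "housing_014", "pmi_count": 310, "drawing": True,  "translate_errors": 7},
    {"name": "shim_207",    "pmi_count": 3,   "drawing": False, "translate_errors": 0},
]

def risk_level(m):
    """Classify a model: translation errors or heavy PMI suggest manual work."""
    if m["translate_errors"] > 0 or m["pmi_count"] > 200:
        return "high"   # likely needs manual intervention
    return "low"        # candidate for fully automated remastering

def qualify(models):
    return Counter(risk_level(m) for m in models)

print(qualify(models))  # e.g. Counter({'low': 2, 'high': 1})
```

Counts like these feed directly into a cost-benefit analysis: the "low" bucket estimates what automation can absorb, while the "high" bucket sizes the manual-intervention budget.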

ESI Projects Should Leverage Process-Centric Methods & Measures

The reason automated remastering is an acceptable option for any ESI initiative is that the solution offers a scalable, repeatable, and portable process that captures metrics and measures.  The technology powering the process gives managers the capability of overseeing the details of what is actually happening to their intellectual property.  In the next article, we will explore the technologies that managers can use to deploy a cost-effective hybrid solution consisting of automation and manual intervention. 

Learn How to Achieve Lean Goals through Automated Remastering

To learn more about using a process-centric approach to Engineering Systems Interoperability, please register for our 30-minute webinar, Achieving Lean Goals through Automated Remastering, facilitated by my colleague, Program Manager & Senior Consultant, Tony Provencal. To register, click here.

CAD Model Translation – A Commodity in Question (Part 1 of 5)

Friday, December 10th, 2010

“The Case for Automated Remastering” is a five-part article series that explores a changing paradigm within Engineering Systems Interoperability (ESI). Part 1, “CAD Model Translation – A Commodity in Question,” explores how the maturation of CAD is changing the industry’s approach to ESI.

Part 1: CAD Model Translation – A Commodity in Question
Architecting and implementing ESI solutions is becoming increasingly difficult – you can’t throw a translator at the problem and expect it to be a cure-all. There is still a need for automation tools that manipulate geometry, but the interoperability market is rapidly expanding beyond geometry-centric translation point solutions.

Manufacturers need consultative ESI solution providers that know how to architect processes, pair them with automation tools, and integrate these solutions into their PLM environments.

The Maturation of CAD and the Effects of Model-Based Definition (MBD)
Eight-to-ten years ago, interoperability software applications were often limited to desktop-based, 2D and 3D-BREP software translators (see Figure 1.1). Clients evaluated these solutions like any other point solution – by comparing features, functionality, price, and performance. In most cases, buyers hoped to achieve 100% success, and were mostly satisfied when it was technically possible to get close (90-98%).

Figure 1.1 – Legacy CAD Translation Scenario
Today, 3D models have essentially become holding containers for intellectual property (see Figure 1.2). With the proliferation of MBD, and the advances in 3D modeling technologies, more entities are now being introduced into the CAD model. It is increasingly difficult to maintain 100% of these entities, despite the fact that they are managed and manipulated throughout the product development value chain.
Figure 1.2 – Today’s MBD Translation Scenario


As CAD systems continue to mature, and PLM systems become more complex, the ability to achieve 100% translation success in an MBD-centric environment is diminishing rapidly. In most cases, some form of manual intervention is required. Even with feature-based translation, model completion is often required to ensure design intent and model quality, and preserve drawings and/or manufacturing information.

Move Away from the “Translator” Mindset and Execute Strategically

MBD has created an explosion of new data, and this proliferation of intellectual property requires a different approach to CAD data management. Manufacturers are becoming strict about how they deliver data to partners and suppliers. Technical environments are changing; the size of models is increasing, as is the need for improved hardware and robust infrastructures.

MBD strategists are looking for solutions that offer better performance, robustness, and investment returns. This is where automated remastering can play a strategic role in moving MBD data between containers (CAD systems). The automated remastering process is scalable and repeatable, and in many cases serves as a confidence-building first step in constructing an ESI strategy.

In the next article, we will examine the process of automated remastering, and how this process can factor into your Lean Manufacturing initiatives.

Acknowledgments: My thanks to colleagues Tony Provencal and Peter Heath for their time and contributions to this article series. For more information about the solutions their teams provide, visit their respective sites.

Manufacturing Risks Resulting from CAD Version Upgrades

Thursday, July 29th, 2010

The increasing number of manufacturers pursuing MBD strategies has resulted in demands for new features and functionality to be added to direct modelers. However, changes to the modeler sometimes result in changes to the data entities (i.e. geometry, attributes, product structure, PMI, and graphical representation) because the new software version interprets the model in a different way.
Without a process or tool for confirming the integrity of your CAD data, the data itself begins to pose risks to a number of downstream processes. These risks become greater when automated software updates are invoked by either the PLM system or a user because changes made to the CAD models propagate throughout the entire product/program faster.

Rolling CAD versions not only cause perpetual data instability issues for designers; they also impact simulation, tooling and assembly, and in many cases, production rates. With the right improvements, the infusion of a data analysis process after each CAD version roll can help mitigate the weeks or months of troubleshooting that is likely to follow.
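One way such a post-roll data analysis could work is sketched below: fingerprint each model entity before and after the version upgrade, then diff the fingerprints to report what changed. The entity records and hashing scheme are hypothetical stand-ins for a real CAD API, not any particular vendor's tool:

```python
# Sketch: fingerprint model entities before/after a CAD version roll,
# then diff the fingerprints. Entity data is a hypothetical stand-in
# for output from a real CAD API.
import hashlib
import json

def fingerprint(entities):
    """Map each entity id to a stable hash of its serialized data."""
    return {eid: hashlib.sha256(
                json.dumps(data, sort_keys=True).encode()).hexdigest()
            for eid, data in entities.items()}

def diff_models(before, after):
    a, b = fingerprint(before), fingerprint(after)
    return {
        "changed": sorted(e for e in a.keys() & b.keys() if a[e] != b[e]),
        "added":   sorted(b.keys() - a.keys()),
        "removed": sorted(a.keys() - b.keys()),
    }

# A hole diameter silently shifts after the version roll:
v1 = {"hole_01": {"dia": 6.35},      "note_01": {"text": "DEBURR"}}
v2 = {"hole_01": {"dia": 6.3500001}, "note_01": {"text": "DEBURR"}}
print(diff_models(v1, v2))  # flags hole_01 as changed
```

Running a check like this after each version roll, and gating propagation on an empty diff, is the essence of the early warning system described below.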

This article offers a snapshot of key risk areas and some examples of how an early warning system can be used to discover, illustrate, and document model changes before they derail a master model initiative.

Risks to Product Manufacturing Information (i.e. GD&T)

CAD version updates pose the highest risks to your product manufacturing information because this CAD modeling functionality is new and rapidly evolving. Any change to the PMI changes the manufacturing definition, which can cause simulation, machining, and product assembly failures, and incur labor waste associated with troubleshooting and diagnostics.

In this example, our company’s diagnostic tool found PMI changes to the pilot hole annotations in this CAD model:


Risks to Product Shape Definition

Automated CAD version updates force the system to re-interpret the model. Changing an attribute, adding a feature, and then saving the model can introduce unintentional changes that may not be detected until the model is modified or used by a downstream process. In short, the model is just fine until you save it in the new version.

Risks to Graphical Representation

When you open a CAD model, you are viewing a graphical representation of the geometry, structure, attributes and PMI. Graphical representations may change as the CAD revisions change, which may cause users to make changes to the data because the on-screen representation is inaccurate.

Also, platform changes (i.e. switching from 32-bit to 64-bit platforms) can affect floating point scales, which can also affect how the data is represented on-screen.
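The floating-point effect is easy to demonstrate. The sketch below (a simplified illustration, not any particular CAD pipeline) round-trips a double-precision coordinate through 32-bit storage, the kind of precision change a platform migration can introduce:

```python
# Sketch: re-encoding a double-precision coordinate as 32-bit single
# precision (as a legacy pipeline or file format might) shifts the value.
import struct

x64 = 1234.000000123  # a coordinate stored in double precision

# Round-trip through IEEE 754 single precision ("f" = 32-bit float):
x32 = struct.unpack("f", struct.pack("f", x64))[0]

print(x64, x32, abs(x64 - x32))  # the coordinate no longer matches exactly
```

The shift is tiny per coordinate, but across millions of vertices it can change tessellation and on-screen representation enough to mislead a user into "fixing" data that was never wrong.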

Risk Mitigation for CAD Data Stability

One way to mitigate these risks is to automatically detect changes in your product shape, PMI, or graphics before you propagate the CAD version roll, and to determine the impact on downstream applications. By using an early detection system, you can remediate these changes and avoid downstream failures, labor waste, and ultimately, production delays.

If your organization would like to learn more about our best practices for detecting, diagnosing, documenting, and remediating data stability issues, register for our forty-minute online workshop, “Data Stability for Manufacturing,” or contact me by email.

Risk Mitigation for CAD Validation Deployments

Monday, June 7th, 2010


An increasing number of engineering enterprises have built successful business cases for 3D CAD validation; this demand for automated solutions has propelled the release of several validation software products within the last year.  With a plethora of new CAD validation offerings now available, the industry’s attempts at commoditizing CAD validation pose substantial risks to the engineering and IT decision-makers who evaluate, procure, and oversee CAD validation initiatives. 


As more applications enter the market, industry veterans of 3D CAD validation have experienced a substantial increase in the number of remediation engagements associated with failed projects that stem from untested and immature software applications.  Validation software helps organizations avoid scrapped parts, labor waste, and product recalls, but sub-standard deployments will wreak havoc in many downstream processes.  This article explores three ways managers can limit their exposure:



1.  Know the Common Denominators of Failed Projects


There are three common denominators associated with failed CAD validation projects:  1) engineering managers were unfamiliar with the implementation requirements, process changes, and the downstream impacts associated with CAD validation and lacked the knowledge to mitigate the risks,  2) engineering influencers and technology champions assumed that all CAD validation software solutions were mature, and 3) IT managers applied the same decision-making processes and criteria to validation solutions as they do to a commodity purchase (i.e. hardware).



2.  Use Specific Investigative Criteria During Your Discovery Phase


The CAD validation market is on the cusp of stabilization, but do not assume that the market has matured to the point of commoditization.  Because of the impact validation poses to downstream applications and processes, decision-makers should rely on a consultative pre-acquisition strategy that requires potential validation suppliers to provide more than just data analysis results.  Require your suppliers to provide pre-sales consultative input on process improvements, deployment architectures, diagnostic prioritization, usability, risk mitigation, standardization, statistics and reporting, and measurements for success.



3.  Consider Possible Reuse Scenarios


Validation software is sold as a point solution or integrated into quality-centric software product suites; most are licensed for either desktop or server use, and a few can be integrated into PLM environments.  The wrong solution architecture or deployment strategy will negatively impact uptime and scalability, and skew validation results, particularly if demand for the technology increases.  Consult with your validation provider to determine all possible reuse scenarios for all points in the value chain (i.e. design, analysis, manufacturing, and sustainment).  Doing so will ensure a successful deployment strategy and promote consistency in your analysis results, software availability, scalability, reporting, and performance.




Jamie Flerlage is a Senior Consultant for ITI TranscenData, an interoperability consulting firm specializing in strategic services and software products for Fortune 500 manufacturers.  Since 1994, ITI TranscenData has assisted global enterprises with the acquisition, implementation, integration, and customization of CAD validation solutions.  For more information, contact Jamie by email.

