The Interoperability Advisor

Processes & Metrics - Missing Pieces of the ESI Puzzle (Part 2 of 5)


The Case for Automated Remastering Series

“The Case for Automated Remastering” is a five-part article series that explores a changing paradigm within Engineering Systems Interoperability (ESI).  Part 2, “Processes & Metrics - Missing Pieces of the ESI Puzzle,” examines ways in which processes and metrics can be applied to ESI.


Part 2:  Processes & Metrics - Missing Pieces of the ESI Puzzle

When it comes to architecting processes for PLM solutions, many manufacturers are drowning in a sea of technical and administrative due diligence.  However, organizations often neglect to apply this same type of due diligence to architecting and implementing Engineering Systems Interoperability (ESI) solutions. 

The situation offers tremendous upside for managers who oversee process and/or quality controls.  In some cases, the return on an ESI project can be two to ten times the investment.  The problem is that some manufacturers treat ESI as a single problem point within the value chain, when in reality these problems can (and will) propagate at multiple points along it.

In a model-based enterprise (MBE), the more data that is populated in the upstream CAD model, the more that can go wrong in downstream systems.  When these models are constantly manipulated and reused, but not in a consistent way, process controls, operational metrics, and technical oversight become the primary means of enforcing MBD policies.  Automated remastering can help address these problems by providing processes, metrics, and activity tracking.

 

The Automated Remastering Process – A Holistic View

 Figure 2.1 illustrates an end-to-end holistic view of automated remastering to help engineering IT managers create the overall project vision. 

An automated remastering process offers three distinct tactical advantages: 1) it scales to accommodate multiple programs, the enterprise, and/or the supply chain; 2) it is portable to other programs and products; and 3) it includes API “hooks” into PLM software applications and workflows.

The process also delivers three financial advantages:  1) it incorporates and integrates your team’s existing ESI technologies, 2) it uses a hybrid model that balances automation and manual intervention to offset labor costs, and 3) it leverages Lean building blocks to justify, measure, and oversee the process.

Figure 2.1 – Automated Remastering Top-Level Process


Each point within the process above represents a separate workflow that can be modified to fit within your PLM environment and/or IT infrastructure. Figure 2.2 (below) shows a simplified, project-specific automated remastering workflow for a 5,000-part dataset.

Figure 2.2 – Automated Remastering Sample Workflow

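For illustration, the sketch below shows, in Python, how a batch workflow like the one in Figure 2.2 might be orchestrated at the simplest level. The step names, functions, and routing logic are hypothetical placeholders, not the actual project workflow or any vendor's API.

```python
# Illustrative sketch only: step names and functions are hypothetical,
# not taken from the article or from any vendor's software.
from dataclasses import dataclass, field

@dataclass
class ModelResult:
    part_id: str
    qualified: bool = False
    remastered: bool = False
    validated: bool = False
    notes: list = field(default_factory=list)

def qualify(part_id: str) -> bool:
    """Placeholder qualification check (e.g., geometry/PMI completeness)."""
    return True  # assume the part passes for this sketch

def remaster(part_id: str) -> bool:
    """Placeholder automated remastering step."""
    return True

def validate(part_id: str) -> bool:
    """Placeholder downstream validation step."""
    return True

def run_workflow(part_ids):
    results = []
    for pid in part_ids:
        r = ModelResult(part_id=pid)
        r.qualified = qualify(pid)
        if not r.qualified:
            # Parts that fail qualification are routed to manual intervention,
            # reflecting the hybrid automation/manual model described above.
            r.notes.append("routed to manual intervention")
            results.append(r)
            continue
        r.remastered = remaster(pid)
        r.validated = validate(pid) if r.remastered else False
        results.append(r)
    return results

if __name__ == "__main__":
    batch = [f"PART-{i:05d}" for i in range(1, 6)]  # stand-in for a 5,000-part dataset
    for result in run_workflow(batch):
        print(result)
```

In practice, each placeholder function would wrap a separate, configurable workflow within your PLM environment, which is the point of the "hooks" mentioned earlier.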


Forming the Basis for an Internal Rate of Return

As with any ESI process, regardless of domain (CAD, CAE, CAM, PLM, MRO, etc.), a manager’s ability to measure the internal return on investment, and subsequently realize those results, is critical.  In many purchasing situations, finance recognizes Lean methodologies as a means of justifying capital expenditures.

In the example below, we illustrate how process efficiencies and labor waste may be captured in a visual management model that depicts value-added and non-value-added time.  For this particular project, labor hours were assigned to each bar on the chart; the subsequent report included detailed workflows.

Figure 2.3 – Depicting ESI Value-Added (VA) & Non-Value-Added (NVA) Time

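To show the kind of arithmetic behind a chart like Figure 2.3, here is a minimal Python sketch. The bar labels, hours, and labor rate are hypothetical assumptions; the project's actual figures are not published in this article.

```python
# Hypothetical figures for illustration only; the article does not publish
# this project's actual labor numbers.
labor_rate = 85.0  # assumed fully burdened $/hour

# Hours assigned to each bar of a VA/NVA chart (assumed values)
bars = [
    ("Engineering changes that add product value", 100, "VA"),
    ("Manual geometry re-creation",                320, "NVA"),
    ("Re-applying PMI and attributes by hand",     180, "NVA"),
    ("Reviewing automated remastering exceptions",  60, "NVA"),
]

va_hours = sum(h for _, h, kind in bars if kind == "VA")
nva_hours = sum(h for _, h, kind in bars if kind == "NVA")

print(f"Value-added time:     {va_hours} h  (${va_hours * labor_rate:,.0f})")
print(f"Non-value-added time: {nva_hours} h  (${nva_hours * labor_rate:,.0f})")
print(f"NVA share of total:   {nva_hours / (va_hours + nva_hours):.0%}")
```

Expressing each bar in hours and dollars is what allows finance to weigh the ESI investment against the labor waste it removes.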

 

Collecting ESI Metrics for Cost-Benefit Analyses

The process of automated remastering includes a qualification workflow to help managers analyze data sets, predict potential trouble spots, mitigate risk, and prioritize spending.  The technology within this workflow provides both top-level and in-depth analyses of MBD data moving between systems, including 3D models and assemblies, 2D drawings, and Product Manufacturing Information (PMI).

Figure 2.4 – Sample of Metrics for Cost-Benefit Analyses

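As a rough illustration of how qualification metrics can feed a cost-benefit analysis, the sketch below compares a fully manual remastering baseline against a hybrid approach for several hypothetical datasets. All dataset names, automation rates, effort figures, and labor rates are assumptions, not numbers from Figure 2.4.

```python
# All field names and cost figures below are hypothetical, for illustration.
datasets = [
    {"name": "Wing assembly",  "models": 1200, "pct_auto": 0.90},
    {"name": "Landing gear",   "models": 800,  "pct_auto": 0.75},
    {"name": "Avionics trays", "models": 300,  "pct_auto": 0.60},
]

hours_manual_per_model = 6.0   # assumed manual remastering effort
hours_review_per_model = 0.5   # assumed review effort when automation succeeds
labor_rate = 85.0              # assumed fully burdened $/hour

for d in datasets:
    auto_models = d["models"] * d["pct_auto"]
    manual_models = d["models"] - auto_models
    baseline_cost = d["models"] * hours_manual_per_model * labor_rate
    hybrid_cost = (auto_models * hours_review_per_model +
                   manual_models * hours_manual_per_model) * labor_rate
    savings = baseline_cost - hybrid_cost
    print(f"{d['name']:<16} baseline ${baseline_cost:>10,.0f}  "
          f"hybrid ${hybrid_cost:>10,.0f}  savings ${savings:>10,.0f}")
```

Ranking datasets by projected savings is one simple way to prioritize spending and flag the trouble spots the qualification workflow identifies.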

 

ESI Projects Should Leverage Process-Centric Methods & Measures

Automated remastering is a sound option for any ESI initiative because it offers a scalable, repeatable, and portable process that captures metrics and measures.  The technology powering the process gives managers the ability to oversee the details of what is actually happening to their intellectual property.  In the next article, we will explore the technologies that managers can use to deploy a cost-effective hybrid solution consisting of automation and manual intervention.

 

Learn How to Achieve Lean Goals through Automated Remastering

To learn more about using a process-centric approach to Engineering Systems Interoperability, please register for our 30-minute webinar, Achieving Lean Goals through Automated Remastering, facilitated by my colleague, Program Manager and Senior Consultant Tony Provencal.

CAD Model Translation - A Commodity in Question (Part 1 of 5)


The Case for Automated Remastering Series

“The Case for Automated Remastering” is a five-part article series that explores a changing paradigm within Engineering Systems Interoperability (ESI).  Part 1, “CAD Model Translation – A Commodity in Question,” examines how the maturation of CAD is changing the industry’s approach to ESI.


Part 1:  CAD Model Translation – A Commodity in Question

Architecting and implementing ESI solutions is becoming increasingly difficult – you can’t throw a translator at the problem and expect it to be a cure-all.  There is still a need for automation tools that manipulate geometry, but the interoperability market is rapidly expanding beyond geometry-centric translation point solutions. 

Manufacturers need consultative ESI solution providers that know how to architect processes, pair them with automation tools, and then integrate these solutions into their PLM-centric environments. 

 

The Maturation of CAD and the Effects of Model-Based Definition (MBD)

Eight to ten years ago, interoperability software applications were often limited to desktop-based 2D and 3D-BREP software translators (see Figure 1.1).  Clients evaluated these solutions like any other point solution – by comparing features, functionality, price, and performance.  In most cases, buyers hoped to achieve 100% success, and were mostly satisfied when it was technically possible to get close (90-98%).

Figure 1.1 - Legacy CAD Translation Scenario


Today, 3D models have essentially become holding containers for intellectual property (see Figure 1.2).  With the proliferation of MBD and the advances in 3D modeling technologies, more entities are now being introduced into the CAD model than ever before.  It is increasingly difficult to maintain 100% of these entities as they are managed and manipulated throughout the product development value chain.

Figure 1.2 – Today’s MBD Translation Scenario


As CAD systems continue to mature, and PLM systems become more complex, the ability to achieve 100% translation success in an MBD-centric environment is diminishing rapidly.  In most cases, some form of manual intervention is required.  Even with feature-based translation, model completion is often required to ensure design intent and model quality, and preserve drawings and/or manufacturing information.

 

Move Away from the “Translator” Mindset and Execute Strategically

MBD has created an explosion of new data, and this proliferation of intellectual property requires a different approach to CAD data management.  Manufacturers are becoming strict about how they deliver data to partners and suppliers. Technical environments are changing; the size of models is increasing, as is the need for improved hardware and robust infrastructures. 

MBD strategists are looking for solutions that offer better performance, robustness, and investment returns. This is where automated remastering can play a strategic role in moving MBD data between “containers.”  The automated remastering process is scalable and repeatable, and can serve as a first step in constructing an ESI strategy.

 

In the next article, we will examine the process of automated remastering, and how this process can factor into your Lean Manufacturing initiatives.

 

Acknowledgements:  My thanks to colleagues Tony Provencal and Peter Heath for their time and contributions to this article series.  For more information about the solutions their teams provide, visit http://www.transcendata.com/solutions/proficiency/index.htm and http://www.transcendata.com/solutions/plm/index.htm, respectively.

Manufacturing Risks Resulting from CAD Version Upgrades


The increasing number of manufacturers pursuing MBD strategies has resulted in demands for new features and functionality to be added to direct modelers. However, changes to the modeler sometimes result in changes to the data entities (i.e. geometry, attributes, product structure, PMI, and graphical representation) because the new software version interprets the model differently.

Without a process or tool for confirming the integrity of your CAD data, the data itself begins to pose risks to a number of downstream processes. These risks become greater when automated software updates are invoked by either the PLM system or a user because changes made to the CAD models propagate throughout the entire product/program faster.  

Rolling CAD versions not only causes perpetual data instability issues for designers; it also impacts simulation, tooling and assembly, and, in many cases, production rates.  With the right improvements, inserting a data analysis process after each CAD version roll can help avoid the weeks or months of troubleshooting that are likely to follow.

This article offers a snapshot of key risk areas and some examples of how an early warning system can be used to discover, illustrate, and document model changes before they derail a master model initiative.

Risks to Product Manufacturing Information (i.e. GD&T)

CAD version updates pose the highest risks to your product manufacturing information because this CAD modeling functionality is new and rapidly evolving. Any change to the PMI changes the manufacturing definition, which can cause simulation, machining, and product assembly failures, and incur the labor waste associated with troubleshooting and diagnostics.

In this example, our company’s diagnostic tool found PMI changes to the pilot hole annotations in this CAD model:

Figure – CAD Version Update Example


Risks to Product Shape Definition

Automated CAD version updates force the system to re-interpret the model. Changing an attribute, adding a feature, and then saving the model can introduce unintentional changes that may not be detected until the model is modified or used by a downstream process. In short, the model is just fine until you save it in the new version.

Risks to Graphical Representation

When you open a CAD model, you are viewing a graphical representation of the geometry, structure, attributes and PMI. Graphical representations may change as the CAD revisions change, which may cause users to make changes to the data because the on-screen representation is inaccurate.

Also, platform changes (i.e. switching from a 32-bit to a 64-bit platform) can affect floating-point behavior, which can also affect how the data is represented on-screen.

Risk Mitigation for CAD Data Stability

One way to mitigate these risks is to automatically detect changes in your product shape, PMI, or graphics before you propagate the CAD version roll, and to determine the impact on downstream applications. By using an early detection system, you can remediate these changes and avoid downstream failures, labor waste, and ultimately, production delays.
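To make this concrete, here is a minimal Python sketch of what such early detection might look like, assuming each CAD version can export a simple per-model summary (volume, face count, PMI count) into a report. The part numbers, fields, values, and tolerance are hypothetical and are not taken from any specific CAD system or diagnostic tool.

```python
# Hypothetical sketch: assumes each CAD version can export a simple
# per-model summary (e.g., via a report or vendor API) into a dict like this.
old_version = {
    "PART-00017": {"volume": 1520.334, "faces": 412, "pmi_count": 38},
    "PART-00042": {"volume": 88.120,   "faces": 96,  "pmi_count": 12},
}
new_version = {
    "PART-00017": {"volume": 1520.334, "faces": 412, "pmi_count": 36},
    "PART-00042": {"volume": 88.121,   "faces": 96,  "pmi_count": 12},
}

VOLUME_TOLERANCE = 1e-4  # assumed acceptable relative drift

def detect_changes(old, new):
    issues = []
    for part, before in old.items():
        after = new.get(part)
        if after is None:
            issues.append((part, "missing after version roll"))
            continue
        if abs(after["volume"] - before["volume"]) > VOLUME_TOLERANCE * before["volume"]:
            issues.append((part, "shape definition drift (volume)"))
        if after["faces"] != before["faces"]:
            issues.append((part, "topology change (face count)"))
        if after["pmi_count"] != before["pmi_count"]:
            issues.append((part, "PMI added or lost"))
    return issues

for part, issue in detect_changes(old_version, new_version):
    print(f"{part}: {issue}")
```

The value of a check like this is not the comparison itself but where it sits in the process: before the version roll propagates to simulation, tooling, and production.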

If your organization would like to learn more about our best practices for detecting, diagnosing, documenting and remediating data stability issues, register for our forty-minute online workshop, “Data Stability for Manufacturing,” by visiting www.transcendata.com or emailing me at jjf@transcendata.com.

Best Practices for CAD to CAE Interoperability Projects


CAE tool providers attempt to close the data exchange gap between CAD and CAE with built-in pre-processors. Unfortunately, the overall success rates are low, ranging from 20% to 50%.  These failures are associated with conversion, repair, or simplification processes and add a significant amount of non-value-added labor.

Third-party solutions push success rates to between 75% and 95%.  However, in global organizations, the proliferation of multiple tools, processes, and methodologies eventually erodes productivity gains. This article explores best practices to help you increase your internal rate of return on CAD-to-CAE interoperability investments.

1.  Use Value Stream Maps to Illustrate Labor Waste

Many CAE teams struggle to find investment capital for interoperability initiatives.  Value stream maps illustrate non-value-added time (NVAT) within the context of the process and quantify it in terms of hours and labor dollars.  If you wish to see an example, email me.
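As a stand-in for a full value stream map, here is a minimal, hypothetical Python sketch of how NVAT can be pulled out of a mapped process and expressed in hours and labor dollars. The step names, hours, VA/NVA classifications, and labor rate are illustrative assumptions only.

```python
# Hypothetical value stream for one CAD-to-CAE data preparation cycle;
# step names, hours, classifications, and the labor rate are assumptions.
value_stream = [
    # (step, hours, value_added?)
    ("Receive and triage CAD data",      2.0, False),
    ("Convert / repair geometry",        6.0, False),
    ("Simplify model for analysis",      4.0, True),
    ("Mesh and set up analysis",         8.0, True),
    ("Rework after conversion failures", 5.0, False),
]
labor_rate = 90.0  # assumed $/hour

nvat_hours = sum(h for _, h, value_added in value_stream if not value_added)
total_hours = sum(h for _, h, _ in value_stream)

print(f"Non-value-added time: {nvat_hours} h of {total_hours} h "
      f"({nvat_hours / total_hours:.0%}), about ${nvat_hours * labor_rate:,.0f} per cycle")
```

Multiplying that per-cycle waste by the number of models handled each year is usually what turns an interoperability request into a fundable business case.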

2.  Minimize Downstream Risks by Classifying Data Formats

Modeling kernels all pose different risks to downstream applications.  If your CAE teams consume data from multiple sources, document all import and export formats, and include any requirements that involve custom CAD packages and proprietary analysis tools.

3.  Standardize Your CAE Interoperability Platform to Reduce Costs

In cases where multiple CAD and/or CAE packages are used, acquire technology that will enable the standardization and consolidation of your CAE interoperability needs under a single platform.  For instances where proprietary CAD and/or CAE packages are used, use a third-party provider to develop interfaces that will integrate with your chosen platform.

4.  Stabilize the Data Before Engaging in CAE Pre-Processing

Unknown, destabilizing conditions within complex parts or assemblies can cause serious delays for teams performing specialized analysis scenarios.  Third-party data stability analysis tools can be used to predict and/or troubleshoot downstream usability failures in models used for simulation before they happen.

5.  Centralize CAE Interoperability Technologies

Establish a technical center of excellence that will centralize your CAD-to-CAE interoperability processes and technologies.  For small-to-medium companies, a single application implemented as either a workstation or server solution can fill the gap.

For global enterprises with a myriad of CAD and CAE applications, a CAE Interoperability Center of Excellence can be implemented and scaled to include support for native and/or neutral formats and proprietary systems, and integrated within PLM workflows.  The figure below illustrates an in-production use case:

Figure – CAE Interoperability Automation

Risk Mitigation for CAD Validation Deployments


An increasing number of engineering enterprises have built successful business cases for 3D CAD validation; this demand for automated solutions has propelled the release of several validation software products within the last year.  With a plethora of new CAD validation offerings now available, the industry’s attempts at commoditizing CAD validation pose substantial risks to the engineering and IT decision-makers who evaluate, procure, and oversee CAD validation initiatives.

As more applications enter the market, industry veterans of 3D CAD validation have experienced a substantial increase in the number of remediation engagements associated with failed projects that stem from untested and immature software applications.  Validation software helps organizations avoid scrapped parts, labor waste, and product recalls, but sub-standard deployments will wreak havoc in many downstream processes.  This article explores three ways managers can limit their exposure:

1.  Know the Common Denominators of Failed Projects

There are three common denominators associated with failed CAD validation projects:  1) engineering managers were unfamiliar with the implementation requirements, process changes, and downstream impacts associated with CAD validation, and lacked the knowledge to mitigate the risks; 2) engineering influencers and technology champions assumed that all CAD validation software solutions were mature; and 3) IT managers applied the same decision-making processes and criteria to validation solutions as they would to a commodity purchase (i.e. hardware).

2.  Use Specific Investigative Criteria During Your Discovery Phase

The CAD validation market is on the cusp of stabilization, but do not assume that the market has matured to the point of commoditization.  Because of the impact validation has on downstream applications and processes, decision-makers should rely on a consultative pre-acquisition strategy that requires potential validation suppliers to provide more than just data analysis results.  Require your suppliers to provide pre-sales consultative input on process improvements, deployment architectures, diagnostic prioritization, usability, risk mitigation, standardization, statistics and reporting, and measurements for success.

3.  Consider Possible Reuse Scenarios

Validation software is sold as a point solution or integrated into quality-centric software product suites; most offerings are licensed for either desktop or server use, and a few can be integrated into PLM environments.  The wrong solution architecture or deployment strategy will negatively impact uptime and scalability, and skew validation results, particularly if demand for the technology increases.  Consult with your validation provider to determine all possible reuse scenarios for all points in the value chain (i.e. design, analysis, manufacturing, and sustainment).  Doing so will ensure a successful deployment strategy and promote consistency in your analysis results, software availability, scalability, reporting, and performance.

Four Consolidation Strategies for Interoperability Technologies


Organizations that maintain multiple tools and technologies for interoperability should consider a software consolidation strategy. Consolidation reduces labor and hardware costs, as well as maintenance, support and training expenses. Here are four suggestions to help get you started:

1.  Eliminate Redundant Applications & Processes

 Conduct an inventory of your interoperability tools; classify them by use case, Engineering IT function (i.e. CAD, CAM, CAE and PLM), manufacturing function (i.e. design, manufacturing, analysis, data management), and license type (node-locked, floating, server). Determine which applications can be re-purposed and reused for the greatest number of use cases across the company. Establish technical evaluation criteria and performance metrics to aid in the elimination of redundant applications.
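As a starting point, the classification step can be as simple as a structured inventory grouped by capability to surface redundancy candidates. The Python sketch below uses hypothetical tool names, categories, and fields; it is an illustration, not a prescribed schema.

```python
# Tool names, categories, and license types below are hypothetical examples.
from collections import defaultdict

inventory = [
    {"tool": "Translator A", "function": "CAD", "use_case": "STEP export",             "license": "node-locked"},
    {"tool": "Translator B", "function": "CAD", "use_case": "STEP export",             "license": "floating"},
    {"tool": "Mesh Prep C",  "function": "CAE", "use_case": "geometry simplification", "license": "server"},
    {"tool": "Checker D",    "function": "PLM", "use_case": "data validation",         "license": "server"},
]

# Group by (function, use_case) to surface candidates for consolidation.
by_capability = defaultdict(list)
for entry in inventory:
    by_capability[(entry["function"], entry["use_case"])].append(entry["tool"])

for (function, use_case), tools in by_capability.items():
    if len(tools) > 1:
        print(f"Redundancy candidate [{function} / {use_case}]: {', '.join(tools)}")
```

Once redundancy candidates are visible, your technical evaluation criteria and performance metrics decide which tool survives the consolidation.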

2. Consolidate Applications Using Server-Based Technologies

Once you generate a list of approved technologies, use an interoperability integration platform to consolidate your applications. The solution should be vendor-neutral, SOA-compliant, and deployed as a back-office application to maximize ease of use and minimize technical complexities. In Example 1.1, an interoperability software integration platform (ISIP) is used for consolidating vendor, third-party, and customized software applications:

Example 1.1 – Using an ISIP

3. Ensure that the Technology Is Accessible

 Interoperability software is often procured as an “occasional-use, on-demand” asset. Consolidating applications into a server environment also means centralizing them, which allows you to maximize usage. In most instances, point solutions that offer batch processing capabilities may be consolidated under your server platform. If the use case requires you to distribute the interoperability software to workstations, serve up these licenses from a single location.

Once your interoperability applications are centralized under one platform, you can provide global access to your solutions and a simplified user experience by using web portals, reducing training and administrative costs. In the example below, users submit jobs via a simple, Java-based commercial-off-the-shelf application. Behind the scenes, the server-based solution has authenticated the user’s login, secured the transmission via SFTP, encrypted the intellectual property package, and ensured end-to-end ITAR compliance.

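For illustration only, the Python sketch below shows what the SFTP leg of such a job submission might look like using the widely used paramiko library. The host name, credentials, and paths are invented, and the portal described above is a Java-based commercial application, so treat this strictly as an analogy for the secure-transfer step, not as the actual solution.

```python
# Hypothetical sketch: host, user, key, and paths are invented examples.
# The actual portal described in the article is a Java-based COTS application;
# this only illustrates the SFTP (encrypted transfer) leg of a job submission.
import os
import paramiko

HOST = "interop.example.com"           # hypothetical consolidation server
JOB_PACKAGE = "wing_assembly_job.zip"  # hypothetical job package prepared by the portal

client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.RejectPolicy())  # refuse unknown hosts

# Key-based authentication; no password travels over the wire.
client.connect(HOST, username="jdoe",
               key_filename=os.path.expanduser("~/.ssh/id_rsa"))

sftp = client.open_sftp()
sftp.put(JOB_PACKAGE, f"/inbound/jobs/{JOB_PACKAGE}")  # upload over the encrypted channel
sftp.close()
client.close()
```

Keeping the transfer, authentication, and encryption on the server side is what lets the end user see nothing more than a simple job-submission form.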

4. Interoperability Software Consolidation as an Engineering IT Operations Strategy

The strategies used for interoperability software consolidation often lead to operational improvements. By centralizing interoperability technologies, companies not only maximize software usage; they can leverage their purchasing power for enterprise licensing and consulting, while minimizing their technical administration and support costs. Ancillary benefits of centralization include creating operating environments that are more stable, secure, and scalable. Finally, in eliminating redundant software applications, interoperability processes are consolidated and improved.


James Flerlage

Jamie Flerlage is a Senior Consultant for ITI TranscenData, an interoperability firm in Cincinnati, Ohio, specializing in consulting services; CAD, CAM, and CAE interoperability software systems; and PLM/ERP integrations for Fortune 500 manufacturers. He may be reached at jjf@transcendata.com.

Flerlage holds an MBA from Keller Graduate School of Management; his credentials also include multiple degrees, IT certifications, and sixteen years of experience in enterprise technology planning, security, and management. In addition to his professional credentials, Flerlage has spent fourteen years freelancing for national publishers on the topics of enterprise technology management and engineering IT.