Archive for January, 2012
Friday, January 27th, 2012
There are several types of CAE-related manufacturing applications for optimizing the use of materials, tools, shape and time, and machine layout by simulating and analyzing specific manufacturing processes. However, probably the most common method for getting CAE into a manufacturing environment is finite element analysis (FEA) for parts and tooling.
FEA is a numerical technique for calculating the strength and behavior of structures. It can be used to calculate deflection, stress, vibration, buckling, and other behaviors. Typical applications for FEA would include minimizing weight and/or maximizing the strength of a part or assembly.
In FEA, structures are divided into small, simple units called elements. While the behavior of an individual element can be described with a relatively simple set of equations, a large set of simultaneous equations is required to describe the behavior of a complex structure. When the equations are solved, the FEA tool displays the physical behavior of the structure based on the combined behavior of its individual elements.
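To make that idea concrete, here is a minimal sketch of the element-assembly-and-solve process for the simplest possible case: a 1D bar under axial load, modeled with two-node elements. All the material values, mesh size, and boundary conditions are illustrative assumptions, not taken from any particular FEA package.

```python
import numpy as np

# Illustrative 1D bar: 1.0 m long, fixed at the left end, with an axial
# force F pulling on the right end. E, A, and F are assumed values.
E, A, F = 200e9, 1e-4, 1000.0    # Young's modulus (Pa), area (m^2), load (N)
n_elem = 10                      # number of two-node bar elements
L = 1.0 / n_elem                 # length of each element

# Each simple element contributes a 2x2 stiffness block k*[[1,-1],[-1,1]];
# summing these blocks builds the large simultaneous-equation system K u = f.
k = E * A / L
K = np.zeros((n_elem + 1, n_elem + 1))
for e in range(n_elem):
    K[e:e+2, e:e+2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])

# Loads: a point force at the free (right-most) node.
f = np.zeros(n_elem + 1)
f[-1] = F

# Boundary condition: node 0 is fixed, so solve the reduced system.
u = np.zeros(n_elem + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

# For a uniform bar, the FEA result matches the closed form F*L_total/(E*A).
print(u[-1])  # tip deflection in meters
```

Real FEA tools do exactly this at vastly larger scale, with 2D and 3D elements, which is why large assemblies demand serious computing horsepower.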
FEA tools can be used for innovating or optimizing mechanical designs. Optimization is a process for improving a design that results in the best physical properties for minimum cost. However, optimization using FEA tools can prove difficult, because each design variation takes time to evaluate, making iterative optimization time consuming. On the other hand, FEA tools can really shine when seeking new and unique ways of designing things – the most crucial aspect of innovation.
Before committing to any CAE tool, however, be sure it is compatible with your existing CAD and CAM tools, the types of parts and assemblies you design, and your general workflow.
Keep in mind that there is no one tool that serves everyone’s needs. Some users will be interested in fluid flow, others in structural mechanical properties, and still others in thermal issues. Get input from as many groups within your organization as are likely to benefit from CAE tools. When evaluating CAE tools, make sure you evaluate them with your own models, not just models supplied by a vendor. That way, you’ll be able to objectively determine which CAE tools best suit your needs in your environment, and not be overly swayed by what a vendor wants you to see. For objectivity, it’s in your best interest to use the same parts or assemblies with each CAE vendor you evaluate.
Finally, a word of caution: don’t expect CAE tools to solve all your problems with all of your parts. Like CAD and CAM tools, they should be used in conjunction with experience and common sense to arrive at optimized and innovative designs. Calculating return on investment for CAE tools can be as complicated as performing analyses on complex assemblies. However, you can probably count on estimating ROI from time saved during the design process, lower material costs, reduced numbers of physical prototypes and ECOs, and possibly far fewer product liability lawsuits. CAE tools cannot perform miracles by themselves, because they still require a significant human element, but, employed wisely, they will likely improve your workflow and provide tangible benefits.
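As a back-of-the-envelope illustration of that kind of ROI estimate, here is a tiny sketch in Python. Every figure in it is a hypothetical assumption chosen for illustration, not industry data; plug in your own organization’s numbers.

```python
# Back-of-the-envelope CAE payback estimate.
# All dollar figures below are hypothetical assumptions, not industry data.
design_time_savings = 40_000.0   # value of engineering hours saved per year
material_savings = 15_000.0      # lower material costs from optimized parts
prototype_savings = 25_000.0     # fewer physical prototypes and ECOs per year

annual_savings = design_time_savings + material_savings + prototype_savings
initial_investment = 60_000.0    # licenses, training, and hardware (assumed)

payback_years = initial_investment / annual_savings
print(annual_savings, payback_years)  # 80000.0 0.75
```

Even a rough model like this helps frame the vendor conversation: if the assumed savings categories can’t be filled in with credible numbers from your own shop, the purchase case isn’t ready.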
Wednesday, January 25th, 2012
By now you’ve almost certainly got MCAD and CAM tools as a vital component of your business. With them you’ve hopefully seen how they have positively impacted the way you work, as well as the way you interact with your customers and vendors. Looking for a way to further increase your productivity, while continuing to optimize your processes?
If you haven’t already, it’s time you considered integrating tools into your workflow for simulation and analysis of virtually any aspect of the product development lifecycle. Although known in some circles as computer-aided engineering (CAE) tools, that acronym has largely been replaced by the phrase “simulation and analysis,” although the terms all mean roughly the same thing.
It wasn’t all that long ago that CAE was relegated to the latter stages of the design and manufacturing (product development) process — too many times as an afterthought. This is changing, though, on two fronts. First, realizing the potential payback in terms of reduced production time and getting it right the first time, many design and manufacturing organizations have moved CAE tools further forward in the development process. Some are even using them in the earliest stages of design, the conceptual phase. Second, software vendors are getting better at integrating CAE with their CAD and CAM tools.
A major roadblock to CAE’s wider acceptance has been the perception that only high-priced analysis specialists (math PhDs?) could understand and work with CAE tools. While specialists are required for some of the high-end tools for performing complex analyses, there are many CAE tools now on the market that require just some basic training and practice to become proficient in a relatively short time.
Admittedly, all CAE tools require a technical mindset, but you no longer need a doctorate in math to run many types of analysis and simulation. It really just requires familiarity with the interface of a CAE tool for creating and loading digital models, and then reviewing and interpreting the results. A really nice thing is that many CAE tools now work from within the familiar UI of your CAD or CAM tool. Finally, continually dropping computer prices have helped popularize CAE tools, because some of them require a lot of computing horsepower when working with large assemblies or very precise engineering constraints.
If this all sounds easy, it is to a point, but there are some caveats. That’s what we’ll discuss next time, as well as the most commonly used CAE tool — FEA.
Friday, January 20th, 2012
Like every other aspect of the product development process, simulation and test must justify their existence, making productivity an ever more pressing issue. Vendors say that in many cases, customers are demanding significant, tangible proof of ROI in months, not years.
A major obstacle to wider acceptance of virtual prototyping and manufacturing simulation is a persisting lack of interoperability between CAD, CAM, and digital prototyping in the bigger PLM scenario. In this context, working toward data interoperability is not regarded as a value-added activity. Overall, however, one of the primary goals of digital test and simulation is to make the overall engineering activity sequence more of a value center and less of a cost center. Another goal is the ability to simulate the entire product lifecycle – from concept through production through sustainment to retirement.
Integrating the analytical, virtual, and physical is disruptive and is an obstacle to acceptance because the integration forces people to work differently than they had done previously. This integration only works through evolutionary implementation, and not necessarily everything all at once.
Many of the digital prototyping tools are still too difficult to use, and vendors need to pay more attention to ease of learning/use. Ease of use is important because vendors, even Tier 1 automotive suppliers, with their low margins cannot afford to hire and employ Ph.D.s to run their digital prototyping software.
On the other hand and in their defense, though, these same vendors are not interested in simplifying (“dumbing-down”) their software so much that they can solve only relatively simple problems. This is a big issue, and one that is even bigger than CAD, where ease of learning/use have made great strides for most vendors the past couple of years. Conversely, many vendors feel that the legacy workforce is not well-suited or qualified for the digital prototyping tools available today.
One way to address the ease-of-use issue is to provide a scalable user interface on test/analysis applications to suit different user needs and skill levels at different times. This is tough to pull off because it requires flexibility and adaptability.
Finally, there is the trust factor that can be an obstacle. In the simulation/test arena, there is an adage that roughly goes, “Everyone trusts test results except test engineers, and everyone trusts analysis results except analysts.” Just about everyone agrees, however, that even with the best digital methods, physical testing will never go away.
The decision of whether to use physical versus digital prototyping is a delicate balance of tradeoffs. In fact, many companies employ virtual testing and simulation as a decision-making tool for conducting physical testing.
So how will digital prototyping ultimately succeed? It’s not hardware or software that makes or breaks digital prototyping, it’s people. While great people can overcome marginal or bad hardware and software, marginal people can cause the best hardware and software to fail. In this context, digital prototyping is no different than any other technical endeavor with regard to the absolute importance of the “people factor” for success.
Wednesday, January 18th, 2012
Marketing speak aside, and regardless of whether it’s called digital or virtual prototyping, prototyping for manufacturing processes basically comes down to simulating something in the physical world, whether it’s simulating the machining of a part, the placement of machines on a plant floor, or the optimization of workflow.
To set the record straight, digital prototyping of anything, including manufacturing processes, is not necessarily CAD or CAM, per se. In fact, it primarily involves digital simulation and test to verify and validate designs and processes, and is an intensely math-based method of viewing them. Some vendors define digital simulation and test as simply good, old-fashioned computer-aided engineering (CAE), although most don’t anymore.
Prototypes of any type, whether physical or digital, provide a basis for making predictions about behavior for making better design, manufacturing, and business decisions. Ideally, intelligent digital prototyping is not only computer based, but a synergy of simulation (virtual) and testing (physical) information based on experience.
Much like CAD/CAM, the main areas that digital prototyping for manufacturing processes aim to influence in a positive manner include:
- Accelerating time to market
- Reducing cost
- Increasing safety of the designed product
- Improving product quality, reliability, and performance.
Figures bandied about by various industry pundits and analyst organizations predict that integrated digital prototyping is resulting in cumulative savings for product design and manufacturing processes of billions of dollars, and that’s only the beginning.
One of the greatest benefits of employing math-based methods in digital prototyping is that you can actually see cause and effect and track things that can’t be physically measured. Math captures reality. Digital prototyping is changing the traditional product development cycle from design-build-test-fix to design-analyze-test-build. This newer paradigm reduces cycle times and is much less physical-facility intensive. However, for its value to be fully realized, analysis through digital prototyping should be regarded as being as important as the design of products and processes.
That all sounds good, right? Well, like just about anything that aims to change the status quo, there are obstacles to acceptance of virtual prototyping and manufacturing simulation. Overcoming these barriers will be the topic of the next MCADCafe Blog.
Friday, January 13th, 2012
Last time, based largely on vendors’ marketing language, PLM was defined as a comprehensive system and process that integrates, interfaces, and interacts with every other IT system within an organization, including CAD, ERP, CRM, etc. While this interaction occurs at a peer level, PLM oversees and, to a certain extent, controls all data exchanges.
I think, however, there is a better definition and model of what PLM actually should be. Unlike many vendors’ definitions, PLM is not a peer system to other systems, such as ERP, SCM, and CRM. Rather, PLM is the intellectual property backbone of an enterprise. While the other subsystems deliver indirect cost-reducing improvements, none of them have any measurable impact on delivering top-line, revenue enhancing results and only a minor impact on lowering direct costs. The only way to positively impact top-line revenues is to develop and build innovative, higher-quality products, and PLM is the only system of the four that addresses these issues.
In this context, PLM transforms ideas into profits, captures customer experiences, and generates ideas for new products. Along the way, the intellectual property undergoes several transformations (ideas to concepts, concepts to prototypes, prototypes to products, and so on) and interacts with the other systems. Ideally, a well-implemented PLM system provides a comprehensive framework that lets all the other systems and disparate groups of users easily interact with an enterprise’s intellectual property so anyone can add value to it.
I think the revised definition and vision finally get to the heart of what a PLM was always envisioned to be, but thus far, executed and implemented by only a few PLM vendors – an intellectual property asset manager that can be used universally within an organization.
Ultimately, the success of PLM is dependent on two things. First, it is imperative that vendors communicate comprehensively and truthfully what their PLM offerings can do and integrate with, as well as what their customers can reasonably expect in terms of gains and ROI. Second, customers must educate themselves to the true needs of their organizations and how they expect PLM to fit in with the rest of their existing and future IT infrastructures. Only then will customer expectations and vendor promises meet for improving processes and resulting products through intellectual property asset management.
Can vendors pull off what PLM was truly meant to fulfill? I think so, and more and more vendors will do so, increasingly with cloud-based services, which are just beginning to appear but should decrease implementation costs and increase productivity by being available to anyone, anywhere.
Wednesday, January 11th, 2012
Like many of the ingredients in a manufacturing organization’s computer technology alphabet soup, such as ERP, SCM, CRM, not to mention CAD, CAM, and CAE, product lifecycle management (PLM) for years has been touted as being the “next big thing” and the final frontier for integrating all manufacturing IT functions. Honestly, though, can it truly provide all that the various vendors are promising? I have asked myself that question for several years now — is PLM a great hope or just another great hype?
It seems that every vendor defines PLM in a manner that best suits their respective existing product lines and business practices, and not always necessarily the processes of the customers they are trying to serve. Therein lies a big part of the PLM problem. PLM should address processes and not just products – neither the vendors’ nor their customers’ – and too few vendors to this point have stressed the processes they are claiming to improve over the products and services they are trying to sell.
It also seems like everybody (yes, now including just about every CAD vendor, big and small) is at least trying to get into the PLM act, regardless of whether they should based on their development and integration capabilities or the needs of their customers. Even database giant Oracle says it wants to be a major PLM player, although the company has alluded that it doesn’t want to dirty its hands with traditional CAD/CAM stuff; it wants to look at the bigger picture, although it doesn’t elaborate on what that picture is.
Although they are quite different in requirements, approach, implementation, and task load, I continue to see PLM and PDM (product data management) regarded practically as equals in vendors’ conference presentations and promotional advertising. Using these acronyms interchangeably only adds to the confusion that already exists in the PLM marketplace. However, it does give more vendors more opportunities to say that they “do PLM.” By definition, PDM handles only data and is a subset of PLM; whereas PLM, to many people’s thinking, should interface and interact with every other IT system within an organization, including ERP, CRM, etc., at a similar level as a peer system.
So, is PLM fulfilling the prophecy that the vendors have promised? That’s the question we’ll tackle in the next MCADCafe Blog.
Friday, January 6th, 2012
Whenever the topic of outsourcing manufacturing to overseas companies or facilities, a fact of business life called offshoring, is brought up these days, you usually get one of two reactions: anger or fear. Sometimes you get a little of both. Is North American manufacturing headed down the road to oblivion with little ability to stem or reverse the descent? We need to view all of this from an historical perspective.
What is happening to North American manufacturing today in terms of gross numbers of employees is hardly unprecedented. Historically, probably the best analogy to what is taking place in manufacturing today, with regard to reduced numbers and overall effect, is farming. In the U.S., between 1890 and 1960, the percentage of the job market directly tied to the farming sector dropped from about 45 percent to less than two percent. Automation on the farm did not just make jobs flee to other countries; it made them disappear entirely. Even with much lower employment numbers, the farming sector thrived in terms of productivity. Automation helped make farming more productive than it ever was when it was strictly the province of human hands and manual labor, and today we enjoy surpluses that usually allow us to export huge amounts of farm goods. When jobs vanished on the farm, people turned to the emerging industrial sector for employment and a new way of life in cities.
Just as industrial manufacturing replaced farming, today in the world economy, services are replacing manufacturing.
There are major differences, however, between how offshoring has affected and will continue to affect both manufacturing and service jobs. Offshoring of manufacturing jobs affects primarily blue-collar jobs in certain industries, whereas offshoring of services affects primarily white-collar jobs across potentially all industries.
Luckily, not all manufacturing or service jobs can be outsourced or offshored because several criteria must be met, including:
- Little face-to-face customer contact
- Information is a major component of the product
- Work can be done via remote communications
- Low set-up barriers
- High wage differentials
So, while some consider offshoring a necessary evil for North American manufacturing and service workers, their employers are discovering that an opposing force, known as reshoring, is an essential component of helping their businesses not only thrive, but in many cases, survive.
Because it is such an important emerging movement, reshoring will be the topic of an MCADCafe blog post in the very near future.
Wednesday, January 4th, 2012
Last year we witnessed the launch of PTC’s Creo with great interest. At that time, PTC claimed Creo was a reinvention and rebranding of several of its venerable mechatronics design products that included Pro/ENGINEER and CoCreate. The launch, however, left a lot of unanswered questions. Since then, we have realized that Creo really is something evolutionary and new, and not just a repackaging of the monolithic Pro/ENGINEER, CoCreate, and ProductView lines. Functionality for Creo was pulled out of those former products as role-based apps that provide what PTC termed “any mode modeling.”
We wondered to what degree does Creo Parametric (formerly Pro/ENGINEER) possess direct modeling capabilities and to what degree does Creo Direct (formerly CoCreate) possess parametric capabilities? We discovered that there’s an extension for Creo Parametric, called the Creo Flexible Modeling Extension (FMX) that offers “direct modeling like” capabilities. This is suited for users of Creo Parametric who want to stay in that same environment and edit their model in ways similar to direct modeling. In other words, it enables users to directly edit parametric models, but with the simplicity and flexibility found in Creo Direct.
Creo Elements/Direct is exclusively designed for direct modeling. It serves as the core product development tool, supporting engineering teams in developing complete products from art-to-part using the direct modeling approach. There’s an extension called Advanced Design, that enables users to add relations and constraints to models.
Creo Parametric has what we consider flexible modeling built into it, for the more dedicated user who needs parametrics. On the other hand, Creo Direct, which contains no parametric capabilities, is targeted at a more casual type of user.
We also wondered whether, ultimately, Creo Parametric and Creo Direct would become one app. That gets back to the old monolithic PTC product philosophy, and having direct and parametric modeling capabilities in one package can be a good thing. However, there are no plans for Creo Parametric and Creo Direct to become one app. They will continue to be developed as separate apps, focused on different user roles and modeling approaches, while leveraging a common data model. In Creo 1.0, there are two 3D modes people can work in, direct modeling and parametric modeling, and Creo Parametric is the app for the latter.
As direct modeling addresses a number of different needs, it’s available in a number of ways: through the Creo Flexible Modeling Extension for Creo Parametric users who want direct-style edits without leaving their parametric environment, and through Creo Direct and Creo Elements/Direct for dedicated direct modelers.
Sometime in the near future, in MCADCafe Weekly, we hope to review and compare Creo Parametric and Direct, and their respective features and benefits.