 MCADCafe Editorial
Jeff Rowe
Jeffrey Rowe has over 40 years of experience in all aspects of industrial design, mechanical engineering, and manufacturing. On the publishing side, he has written over 1,000 articles for CAD, CAM, CAE, and other technical publications, as well as consulting in many capacities in the design …

Introducing CAE Into Your CAD/CAM Workflow: Getting Onboard

 
January 25th, 2012 by Jeff Rowe

By now you’ve almost certainly made MCAD and CAM tools a vital component of your business. With them you’ve hopefully seen how they have positively impacted the way you work, as well as the way you interact with your customers and vendors. Looking for a way to further increase your productivity while continuing to optimize your processes?

If you haven’t already, it’s time you considered integrating tools into your workflow for simulation and analysis of virtually any aspect of the product development lifecycle. Although known in some circles as computer-aided engineering (CAE) tools, that acronym has largely been replaced by the terms “simulation” and “analysis,” though they all mean roughly the same thing.

It wasn’t all that long ago that CAE was relegated to the latter stages of the design and manufacturing (product development) process — too many times as an afterthought. This is changing, though, on two fronts. First, realizing the potential payback in terms of reduced production time and getting it right the first time, many design and manufacturing organizations have moved CAE tools further forward in the development process. Some are even using them in the earliest stages of design, the conceptual phase. Second, software vendors are getting better at integrating CAE with their CAD and CAM tools.

A major roadblock to CAE’s wider acceptance has been the perception that only high-priced analysis specialists (math PhDs?) could understand and work with CAE tools. While specialists are required for some of the high-end tools that perform complex analyses, there are many CAE tools now on the market that require just some basic training and practice to become proficient in a relatively short time.

Admittedly, all CAE tools require a technical mindset, but you no longer necessarily need a doctorate in math to run many types of analysis and simulation. It really just requires familiarity with the interface of a CAE tool for creating and loading digital models, and then reviewing and interpreting the results. A really nice thing is that many CAE tools now work from within the familiar UI of your CAD or CAM tool. Finally, continually dropping computer prices have helped popularize CAE tools, because some of them require a lot of computing horsepower when working with large assemblies or very precise engineering constraints.
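For a taste of the math running under the hood of such tools, here is a minimal finite element analysis of a bar in axial tension, written in Python with NumPy. This is an illustrative toy, not any vendor's product; the material, geometry, and load values are all invented for the example.

```python
import numpy as np

# Toy 1D FEA: a steel bar fixed at one end, pulled with an axial
# force at the other, meshed into equal-length 2-node elements.

E = 200e9        # Young's modulus of steel, Pa
A = 1e-4         # cross-sectional area, m^2
L = 1.0          # bar length, m
F = 10e3         # axial tip load, N
n_elems = 4      # number of elements in the mesh

# Assemble the global stiffness matrix from identical elements.
k = E * A / (L / n_elems)          # stiffness of one element, N/m
n_nodes = n_elems + 1
K = np.zeros((n_nodes, n_nodes))
for e in range(n_elems):
    K[e:e + 2, e:e + 2] += k * np.array([[1, -1], [-1, 1]])

# Load vector: force applied at the free end only.
f = np.zeros(n_nodes)
f[-1] = F

# Fix node 0 (the wall) by solving the reduced system K u = f.
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

# For this simple case the FE result matches the closed-form
# solution u = F L / (E A) at the tip.
print(f"tip displacement: {u[-1]:.6e} m")
print(f"analytical:       {F * L / (E * A):.6e} m")
```

Commercial tools do the same kind of stiffness assembly and solve, just in 3D, with millions of unknowns, behind a graphical interface; the point is that the core workflow (model, load, solve, interpret) is approachable.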

If this all sounds easy, it is to a point, but there are some caveats. That’s what we’ll discuss next time, as well as the most commonly used CAE tool — FEA.

Virtual Prototyping and Manufacturing Simulation – Obstacles To Acceptance

 
January 20th, 2012 by Jeff Rowe

Like all aspects of the product development process, simulation and test must justify their existence, and their productivity is becoming an ever more pressing issue. Vendors say that in many cases, customers are demanding significant tangible proof of ROI in months, not years.

A major obstacle to wider acceptance of virtual prototyping and manufacturing simulation is a persistent lack of interoperability between CAD, CAM, and digital prototyping in the bigger PLM scenario. In this context, working toward data interoperability is not regarded as a value-added activity. Overall, however, one of the primary goals of digital test and simulation is to make the overall engineering activity sequence more of a value center and less of a cost center. Another goal is the ability to simulate the entire product lifecycle – from concept through production through sustainment to retirement.

Integrating the analytical, virtual, and physical is disruptive and is an obstacle to acceptance because the integration forces people to work differently than they had done previously. This integration only works through evolutionary implementation, and not necessarily everything all at once.

Many of the digital prototyping tools are still too difficult to use, and vendors need to pay more attention to ease of learning/use. Ease of use is important because customers, even Tier 1 automotive suppliers, with their low margins cannot afford to hire and employ Ph.D.s to run their digital prototyping software.

On the other hand and in their defense, these same vendors are not interested in simplifying (“dumbing down”) their software so much that it can solve only relatively simple problems. This is a big issue, and one that is even bigger than in CAD, where ease of learning/use has made great strides for most vendors over the past couple of years. Conversely, many vendors feel that the legacy workforce is not well-suited or qualified for the digital prototyping tools available today.

One way to address the ease-of-use issue is to provide a scalable user interface on test/analysis applications to suit different user needs and skill levels at different times. This is tough to address because it requires flexibility and adaptability.

Finally, there is the trust factor that can be an obstacle. In the simulation/test arena, there is an adage that roughly goes, “Everyone trusts test results except test engineers, and everyone trusts analysis results except analysts.” Just about everyone agrees, however, that even with the best digital methods, physical testing will never go away.

The decision of whether to use physical versus digital prototyping is a delicate balance of tradeoffs. In fact, many companies employ virtual testing and simulation as a decision-making tool for conducting physical testing.

So how will digital prototyping ultimately succeed? It’s not hardware or software that makes or breaks digital prototyping, it’s people. While great people can overcome marginal or bad hardware and software, marginal people can cause the best hardware and software to fail. In this context, digital prototyping is no different than any other technical endeavor with regard to the absolute importance of the “people factor” for success.

Virtual Prototyping and Manufacturing Simulation – From Products To Processes

 
January 18th, 2012 by Jeff Rowe

Market speak aside, and regardless of what it’s called, digital or virtual prototyping for manufacturing processes basically comes down to simulating something in the physical world, whether that’s the machining of a part, the placement of machines on a plant floor, or the optimization of workflow.

To set the record straight, digital prototyping of anything, including manufacturing processes, is not necessarily CAD or CAM, per se. In fact, it primarily involves digital simulation and test to verify and validate designs and processes, and is an intensely math-based method of viewing them. Some vendors define digital simulation and test as simply good, old-fashioned computer-aided engineering (CAE), although most don’t anymore.

Prototypes of any type, whether physical or digital, provide a basis for making predictions about behavior for making better design, manufacturing, and business decisions. Ideally, intelligent digital prototyping is not only computer based, but a synergy of simulation (virtual) and testing (physical) information based on experience.

Much like CAD/CAM, the main areas that digital prototyping for manufacturing processes aims to influence in a positive manner include:

  • Accelerating time to market
  • Reducing cost
  • Increasing safety of the designed product
  • Improving product quality, reliability, and performance.

Figures bandied about by various industry pundits and analyst organizations suggest that integrated digital prototyping is resulting in cumulative savings of billions of dollars for product design and manufacturing processes, and that’s only the beginning.

One of the greatest benefits of employing math-based methods in digital prototyping is that you can actually see cause and effect and track things that can’t be physically measured. Math captures reality. Digital prototyping is changing the traditional product development cycle from design-build-test-fix to design-analyze-test-build. This newer paradigm reduces cycle times and is much less dependent on physical facilities. However, for its value to be fully realized, analysis through digital prototyping should be regarded as being as important as the design of products and processes.
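To see what "tracking things that can't be physically measured" can look like in practice, consider a Monte Carlo tolerance stack-up, sketched below in Python. Physically you would have to build and measure thousands of assemblies to estimate an interference rate; digitally, a whole population is simulated in a fraction of a second. This is a generic illustration, not any vendor's tool, and all dimensions, tolerances, and the housing clearance are invented for the example.

```python
import numpy as np

# Virtual test: three parts stack end-to-end inside a housing.
# Each part's length varies around nominal with a given standard
# deviation; we simulate many assemblies and count interferences.

rng = np.random.default_rng(42)
n_assemblies = 100_000

# Nominal lengths (mm) and standard deviations (mm) of the parts.
nominals = np.array([25.0, 40.0, 15.0])
sigmas = np.array([0.05, 0.08, 0.03])

# Draw every part of every assembly, then total each stack height.
parts = rng.normal(nominals, sigmas, size=(n_assemblies, 3))
stack = parts.sum(axis=1)

housing = 80.2          # available space in the housing, mm
fail_rate = np.mean(stack > housing)

print(f"mean stack height: {stack.mean():.3f} mm")
print(f"predicted interference rate: {fail_rate:.2%}")
```

Here the "unmeasurable" quantity is the predicted failure rate of a production run that hasn't been built yet, exactly the kind of cause-and-effect insight the paragraph above describes.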

That all sounds good, right? Well, like just about anything that aims to change the status quo, there are obstacles to acceptance of virtual prototyping and manufacturing simulation. Overcoming these barriers will be the topic of the next MCADCafe Blog.

PLM 2012 Part II – Can Vendors Pull It Off?

 
January 13th, 2012 by Jeff Rowe

Last time, based largely on vendors’ marketing language, PLM was defined as a comprehensive system and process that integrates, interfaces, and interacts with every other IT system within an organization, including CAD, ERP, CRM, etc. While this occurs at a peer level, PLM oversees and, to a certain extent, controls all data exchanges.

I think, however, there is a better definition and model of what PLM actually should be. Unlike many vendors’ definitions,  PLM is not a peer system to other systems, such as ERP, SCM, and CRM. Rather, PLM is the intellectual property backbone of an enterprise. While the other subsystems deliver indirect cost-reducing improvements, none of them have any measurable impact on delivering top-line, revenue enhancing results and only a minor impact on lowering direct costs. The only way to positively impact top-line revenues is to develop and build innovative, higher-quality products, and PLM is the only system of the four that addresses these issues.

In this context, PLM transforms ideas into profits, captures customer experiences, and generates ideas for new products. Along the way, the intellectual property undergoes several transformations (such as ideas-to-concepts, concepts-to-prototypes, prototypes-to-products, and so on) and interacts with the other systems. Ideally, a well-implemented PLM system provides a comprehensive framework that lets all the other systems and disparate groups of users easily interact with an enterprise’s intellectual property so anyone can add value to it.

I think the revised definition and vision finally get to the heart of what PLM was always envisioned to be, but thus far executed and implemented by only a few PLM vendors – an intellectual property asset manager that can be used universally within an organization.

Ultimately, the success of PLM is dependent on two things. First, it is imperative that vendors communicate comprehensively and truthfully what their PLM offerings can do and integrate with, as well as what their customers can reasonably expect in terms of gains and ROI. Second, customers must educate themselves about the true needs of their organizations and how they expect PLM to fit in with the rest of their existing and future IT infrastructures. Only then will customer expectations and vendor promises meet, improving processes and the resulting products through intellectual property asset management.

Can vendors pull off what PLM was truly meant to fulfill? I think so, and more and more vendors will do so, increasingly with cloud-based services that, while just beginning to appear, should decrease implementation costs and increase productivity by being available to anyone, anywhere.

PLM 2012 Part I – Will Vendors Pull It Together to Fulfill the Prophecy?

 
January 11th, 2012 by Jeff Rowe

Like many of the ingredients in a manufacturing organization’s computer technology alphabet soup, such as ERP, SCM, CRM, not to mention CAD, CAM, and CAE, product lifecycle management (PLM) has for years been touted as the “next big thing” and the final frontier for integrating all manufacturing IT functions. Honestly, though, can it truly provide all that the various vendors are promising? I have asked myself that question for several years now: is PLM a great hope or just another great hype?

It seems that every vendor defines PLM in a manner that best suits their respective existing product lines and business practices, and not always necessarily the processes of the customers they are trying to serve. Therein lies a big part of the PLM problem. PLM should address processes and not just products – neither the vendors’ nor their customers’ – and too few vendors to this point have stressed the processes they are claiming to improve over the products and services they are trying to sell.

It also seems like everybody (yes, now including just about every CAD vendor big and small) is at least trying to get into the PLM act, regardless of whether they should or should not based on their development and integration capabilities or the needs of their customers. Even database giant Oracle says it wants to be a major PLM player, although the company has alluded that it doesn’t want to dirty its hands with traditional CAD/CAM stuff; it wants to look at the bigger picture, although it doesn’t elaborate on what that picture is.

Although they are quite different in requirements, approach, implementation, and task load, I continue to see PLM and PDM (product data management) regarded practically as equals in vendors’ conference presentations and promotional advertising. Using these acronyms interchangeably only adds to the confusion that already exists in the PLM marketplace. However, it does give more vendors more opportunities to say that they “do PLM.” By definition, PDM handles only data and is a subset of PLM, whereas PLM, to many people’s thinking, should interface and interact with every other IT system within an organization, including ERP, CRM, etc., at a similar level as a peer system.

So, is PLM fulfilling the prophecy that the vendors have promised? That’s the question we’ll tackle in the next MCADCafe Blog.


Offshoring – Anger or Fear?

 
January 6th, 2012 by Jeff Rowe

Whenever the topic of outsourcing manufacturing to overseas companies or facilities is brought up these days, a fact of business life called offshoring, you usually get one of two reactions: anger or fear. Sometimes you get a little of both. Is North American manufacturing headed down the road to oblivion with little ability to stem or reverse the descent? We need to view all of this from a historical perspective.

What is happening to North American manufacturing today in terms of gross numbers of employees is hardly unprecedented. Historically, probably the best analogy to what is taking place in manufacturing today with regard to reduced numbers and overall effect is agricultural farming. In the U.S., between 1890 and 1960, the percentage of the job market directly tied to the farming sector dropped from about 45 percent to less than two percent. Automation on the farm did not just make the jobs flee to other countries; it made them completely disappear. Even with much lower employment numbers, the farming sector thrived in terms of productivity. Automation helped make farming more productive than it ever was when it was strictly the province of human hands and manual labor, and today we enjoy surpluses that allow us to export huge amounts of farm goods. When jobs vanished on the farm, people turned to the emerging industrial sector for employment and a new way of life in cities.

Just as industrial manufacturing replaced farming, today in the world economy, services are replacing manufacturing.

There are major differences, however, between how offshoring has affected and will continue to affect both manufacturing and service jobs. Offshoring of manufacturing jobs affects primarily blue-collar jobs in certain industries, whereas offshoring of services affects primarily white-collar jobs across potentially all industries.

Luckily, not all manufacturing or service jobs can be outsourced or offshored because several criteria must be met, including:

  • Little face to face customer contact
  • Information is a major component of the product
  • Work can be done via remote communications
  • Low set-up barriers
  • High wage differentials

So, while some North American manufacturing and service workers consider offshoring a necessary evil, their employers are discovering that an opposite force, known as reshoring, is an essential component of helping their businesses not only thrive, but in many cases, survive.

Because it is such an important emerging movement, reshoring will be the topic of an MCADCafe blog post in the very near future.

PTC’s Creo 1.0 — An Update

 
January 4th, 2012 by Jeff Rowe

Last year we witnessed the launch of PTC’s Creo with great interest.  At that time, PTC claimed Creo was a reinvention and rebranding of several of its venerable mechatronics design products that included Pro/ENGINEER and CoCreate. The launch, however, left a lot of unanswered questions. Since then, we have realized that Creo really is something evolutionary and new, and not just a repackaging of the monolithic Pro/ENGINEER, CoCreate, and ProductView lines. Functionality for Creo was pulled out of those former products as role-based apps that provide what PTC termed “any mode modeling.”

We wondered to what degree does Creo Parametric (formerly Pro/ENGINEER) possess direct modeling capabilities and to what degree does Creo Direct (formerly CoCreate) possess parametric capabilities? We discovered that there’s an extension for Creo Parametric, called the Creo Flexible Modeling Extension (FMX) that offers “direct modeling like” capabilities. This is suited for users of Creo Parametric who want to stay in that same environment and edit their model in ways similar to direct modeling. In other words, it enables users to directly edit parametric models, but with the simplicity and flexibility found in Creo Direct.

Creo Elements/Direct is exclusively designed for direct modeling. It serves as the core product development tool, supporting engineering teams in developing complete products from art-to-part using the direct modeling approach. There’s an extension called Advanced Design that enables users to add relations and constraints to models.

Creo Parametric has what we consider flexible modeling inside of it, for a more dedicated user who needs parametrics. On the other hand, Creo Direct, which contains no parametric capabilities, is targeted at a more casual type of user.

We also wondered if, ultimately, Creo Parametric and Creo Direct would become one app. That gets back to the old monolithic PTC product philosophy, and having direct and parametric modeling capabilities in one package can be a good thing. However, there are no plans for Creo Parametric and Creo Direct to become one app. They will continue to be developed as separate apps, focused on different user roles and modeling approaches, leveraging a common data model. In Creo 1.0, there are two 3D modes people can work in: direct modeling and parametric modeling. For parametric modeling, Creo Parametric is the app.

As direct modeling addresses a number of different needs, it’s available in a number of ways. As mentioned earlier, there’s an extension for Creo Parametric, called Creo Flexible Modeling Extension (FMX). This is ideal for users of Creo Parametric who want to stay in that same environment and edit their model in ways similar to direct modeling. It enables users to directly edit parametric models, but with the simplicity and flexibility found in Creo Direct.

Sometime in the near future, in MCADCafe Weekly, we hope to review and compare Creo Parametric and Direct, and their respective features and benefits.

Streamlining Concept Design and Downstream Engineering – Choosing the Right Tools For Successful Workflows

 
December 23rd, 2011 by Jeff Rowe

A recent study entitled “Trends in Concept Design,” conducted by PTC, found that the majority of respondents are recreating concept designs once the concept design is released to downstream engineering stages. For example, drawings, sketches, and models that were generated during the concept phase are recreated by the engineering department for further development. This approach is known as throwing a design “over the transom,” not knowing how the final product will be realized as compared with the original design intent from the concept phase.

Clearly, this approach is not only inefficient, but also usually contributes to too many unexpected and undesirable results between the concept stage and the marketplace.

Fortunately, today there are tools and approaches to help manufacturers eliminate the need for data recreation, streamlining the concept design stage of product development and downstream engineering processes. Even at the concept stage, manufacturing companies are increasingly reusing existing design data instead of creating everything from scratch; reusing design data that is already known to work is a potentially big time saver.

At the concept stage, using a tool such as Creo Direct, you can create geometry for 3D design purposes. In Creo Direct, you can create and edit 3D designs through direct interaction with their geometry. You can make changes to the basic design elements at any point with little impact on the overall design process. The resulting geometry is compatible with all the Creo applications, including Creo Parametric, which is used for refining designs downstream in the product development process. In fact, even 2D sketches captured with Creo Sketch are compatible with Creo Parametric.

The Creo Sketch and Creo Direct user interfaces are similar to that of Creo Parametric, which supports and streamlines the design process.

Creo Parametric can share data seamlessly with other Creo apps, notably Creo Direct and Creo Simulate. This means that time is not wasted on data recreation or translation, which can result in costly errors. Users can seamlessly move between different modes of modeling, and 2D and 3D design data can easily move between apps while retaining the original design intent. This all provides a very high level of interoperability and productivity gains throughout many product development processes between design and engineering groups.

In the end, successful product development, from the concept stage to engineering to production, all comes down to interoperability between the various groups at various stages and the tools they use. Interoperability is vital for optimizing collaboration between groups and stages and for maximizing the potential for a product’s ultimate success.

Universal 3D Creativity: Reality or Myth?

 
December 20th, 2011 by Jeff Rowe

The past couple of years we’ve been hearing that anyone can be a creative genius by using a variety of tools ranging from CAD products to rapid prototyping machines. While most people have the capacity to be creative to some degree, this heady claim fails to take into consideration that not everyone might be truly creative when it comes to creating something of real value in 3D, for either themselves or others. In other words, even with the best tools in the world, there is no guarantee that everyone will be able to create things that anybody else would really want.

The current popularity of the DIY movement has helped the “creativity for everyone” movement, but I think back 20+ years ago when another “revolution” took place in desktop publishing. Prior to desktop publishing for all, printed documents for most home computer users consisted of Courier, Helvetica, and maybe Times Roman fonts on dot matrix printers. With the introduction of Encapsulated PostScript (EPS) and more capable printers, the font possibilities were endless. So endless, in fact, that many early EPS documents with their myriad fonts looked more like ransom notes than business documents. It took a while, but eventually most people realized that more fonts are not necessarily better for printed communication. I’m afraid the same thing is happening with 3D printing for visual and tactile communication, but it should become more realistic and sound with time, producing more than just silly toys.

As an industrial designer, over the years I have critiqued designs and reviewed portfolios of ID students at design schools and conferences. The most prominent trend I’ve seen over the past several years is that students have a ton of digital tools at their disposal, but they are concentrating too much attention on the tools themselves and presentation, and not enough on the design problem they are trying to solve. Closely related to this is the fact that while a lot of pretty product designs are being created, relatively few can be manufactured economically, if at all.

When I look way back to my own creative beginnings, as a child I loved to draw on paper and Etch-A-Sketch in 2D and build things in 3D with Legos, Erector Sets, Lincoln Logs, and wood with nails. Did I create anything of real value that appealed to anyone but myself? Honestly, no, but it did spark an interest in a formal education in design and engineering, as well as fostering a lifelong interest in and appreciation for good design.

With the advent of easier-to-use and affordable (often even free) 3D software and hardware, will creativity proliferate? Certainly it will to an extent, but let’s be realistic about the quality and value of the vast majority of the things produced. While there is some reality to the thought that everyone can be creative and produce things in 3D, there is also a good deal of myth with regard to what is actually being created. However, new creative hands-on skills are being learned and put into practice, which is a good thing.

I applaud the efforts that some of the 3D software and hardware vendors have put forth in getting their technologies into the hands of a new type of user. I would just caution the vendors from overselling the promise of creativity for everyone that will result in stunning designs. A small percentage of the designs might have value, but as with products coming out of the professional community, many probably won’t.

Don’t get me wrong: along with DIY, I think it’s a great movement that should be strongly encouraged; we just need to sort out myth from reality when it comes to creating things in 3D.

Since this is such an interesting, strong movement, I’ll come back to this topic many times in the future, especially as I check out a number of the 3D software and hardware products and services for myself. Let’s get creative!

Concept Design and Cost Management In Early Phase Product Development – A Vital First Step

 
December 16th, 2011 by Jeff Rowe

Virtually all new product development projects begin with a conceptual design phase. During this early stage, industrial designers and engineers rapidly explore and refine several ideas by engaging in free-flowing, collaborative brainstorming sessions. These sessions are intended to generate a wide range of potential design solutions in the form of hand-drawn sketches, 2D drawings and layouts, 3D models, and renderings. All of these concept design methods come with inherent advantages and disadvantages. Designs coming from the sessions are considered and evaluated until a final concept design is chosen and pursued for further development – usually determined by functional, marketing, and manufacturing requirements.

During the concept phase, ideas are generated using methods ranging from rough sketches on paper or whiteboards to using a 3D CAD tool. A recent study entitled “Trends in Concept Design,” conducted by PTC, discusses the different methods by which concept designs are initiated and captured. According to the survey, the largest percentage of participants indicated that concept designs were captured electronically in the form of 3D data; however, several participants indicated that concepts were still created and shared through hand-drawn paper sketches. Regardless of how concept designs are generated, manually or digitally, the vast majority of those involved with concept design have the ability to visualize and create designs in 3D. This is only natural, since we all live in a 3D world.

Another reason why concept design is such a critically important phase of successful new product design is that this is usually when the majority of the total development costs are committed to developing, manufacturing, and bringing a product to market. The PTC survey found that the majority of the manufacturing cost of a typical product is committed by the end of the conceptual phase. As a result, if poor decisions are made during this early phase of design, manufacturers stand to lose much of the money that was committed before production even starts. The bottom line is that a high-quality concept design model is essential for accurately determining and committing to product costs.

PTC’s Creo family of design apps is well-suited for both concept design and detailed design. Creo Sketch is a tool for capturing early concepts in the form of 2D sketches, while Creo Direct is suited for efficiently creating a high-quality 3D model that can be used for a multitude of purposes. In the Creo Direct environment, you can create and edit 3D designs through direct interaction with their geometry. You can make changes to the basic design elements at any point with little impact to the overall design process. In this design environment, the shape of a 3D model is how it appears from the outside. Additionally, the resulting geometry is compatible with all downstream Creo applications, like Creo Parametric or Creo Simulate.

So, while some manufacturers have downplayed conceptual design in the early phase of product development as an unnecessary cost, successful manufacturers have embraced concept design and have been rewarded with better overall designs and cost management up front – ultimately leading to more satisfied customers and higher profits.




© 2024 Internet Business Systems, Inc.