Archive for July, 2019
Wednesday, July 31st, 2019
PROSTEP has published a white paper covering a number of important aspects to consider when implementing a Digital Twin. These include what information should be included in these twins, how they can be structured, and what role cloud and platforms play for Digital Twin applications. The white paper underlines the importance of digital consistency.
For some years now, digitalization has been the central future topic for German industry. The manufacturing industry in particular is shaping the digital future of production with many initiatives, often inspired by the Industry 4.0 future project launched by the German government in 2012. Industry 4.0 is the manufacturing-focused variant of the digital transformation process that needs to be successfully shaped for society as a whole. The digital future of products that were previously dominated entirely by mechanics is, in particular, generating constant pressure to innovate.
It is not only in the automotive industry that future business success will depend to a large extent on new, innovative business models. Companies need to respond by adapting product development and production accordingly and by providing solid support for digital usage concepts. The Digital Twin is one of the central innovations enabling companies to successfully shape this digital transformation.
The forms of the Digital Twin are as varied as the products the manufacturing industry makes, and the information a Digital Twin must contain is as diverse as the business models on the market. For many Digital Twin concepts, it is very important to combine production information with information from the product's current use. In the best case, the service technician responsible for an elevator optimized by predictive maintenance mechanisms is already provided in the service center with the parts list of the components actually installed in that specific elevator instance.
But product development also wants to benefit from the Digital Twin. Unusual accumulations of faults in certain components should be eliminated quickly as the product is developed further. Skillful use of this feedback has the potential to directly improve product quality and to reduce service costs more quickly, something that is particularly relevant for manufacturers with “as a service” concepts.
In order to fully exploit the potential of the Digital Twin, a digital end-to-end process chain must be created that provides the right information in the Digital Twin reliably, quickly and automatically.
A particular challenge for the Digital Twin is that the required data must come from a wide variety of sources. During the usage phase, status data is ideally available via an IoT solution. However, production data relating to the specific product instance may also be required, and ideally one would also like to be able to access information from product development. These areas are characterized by a multitude of information systems and are themselves under strong pressure to change, driven by topics such as Systems Engineering, Industry 4.0 and the Industrial IoT.
With regard to the Digital Twin, decision-makers in companies face the question of how to make it a success within a highly complex infrastructure. Closely related to this are further questions: What effects will the emerging platform structures have on this process? What is the appropriate cloud strategy? Which skills are most important for designing a continuous process chain spanning several independent platforms, and which architecture concepts does this require?
Since its foundation 25 years ago, PROSTEP AG has been designing and implementing digital end-to-end processes in product development. On the basis of the experience gained, we have compiled a number of topics in a white paper that are intended to help companies move from their status quo to a sustainable design of their process and IT landscape and to master the diverse challenges of the Digital Twin. The white paper is available for download here.
No Comments »
Saturday, July 27th, 2019
The IT landscapes in the maritime industry are characterized by the fact that special shipbuilding applications are often used for initial, basic and detail design as well as for production preparation. While these applications enable the specialist departments to carry out their work particularly efficiently, from a business perspective they require powerful integrations so that the digital information can be used throughout the process. Consistent processes and information flows are the prerequisite for building a digital ship model that can accompany the entire ship life cycle.
The development of proprietary interfaces is not only time-consuming and cost-intensive, but also complicates the exchange of existing applications or the rapid integration of new ones. Based on the proven OpenPDM technology, which is used by many companies for the integration of different enterprise systems, data migration and cross-company collaboration, PROSTEP has therefore created an extension of this integration platform specially designed for the maritime industry. An essential component of this platform are standards-based connectors that simplify both the horizontal integration between different authoring systems and their vertical integration into the enterprise systems which manage the product structures (PDM, PLM, ERP, etc.).
OpenPDM SHIP enables data exchange between special shipbuilding applications such as NAPA, AVEVA Marine, CADMATIC or ShipConstructor and mechanical CAD systems such as CATIA or NX. The latter are often used for the construction of complex interiors, e.g. for public areas in cruise ships or large yachts. When transferring information from mechanical to shipbuilding specific CAD applications and vice versa, the integration platform maps the different data models to each other. This allows companies to use the CAD tool of their choice for any task in the ship development process without losing valuable information during conversion and data exchange.
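The mapping between the different data models can be pictured as a configurable translation table that carries unmapped information along rather than dropping it. The sketch below is purely illustrative: all field names are invented, and OpenPDM SHIP's actual mappings are configuration-driven and far more extensive.

```python
# Hypothetical mapping from shipbuilding-style metadata fields to
# mechanical-CAD-style fields. These names are invented for illustration.
FIELD_MAP = {
    "BlockName": "assembly_name",
    "PlateThickness": "sheet_thickness_mm",
    "SteelGrade": "material",
}

def map_part(source: dict) -> dict:
    """Translate one part record between schemas. Fields without a mapping
    are preserved under 'unmapped' so no information is silently lost."""
    target, unmapped = {}, {}
    for key, value in source.items():
        if key in FIELD_MAP:
            target[FIELD_MAP[key]] = value
        else:
            unmapped[key] = value
    if unmapped:
        target["unmapped"] = unmapped
    return target

part = {"BlockName": "B-204", "PlateThickness": 12, "WeldSpec": "WPS-7"}
mapped = map_part(part)
```

The key design point the sketch illustrates is lossless round-tripping: anything the target schema cannot express must still travel with the part so a later conversion back does not lose it.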
On the other hand, OpenPDM SHIP supports the creation of complex CAx process chains with arbitrary synchronization points from initial design in NAPA or NAPA Steel to basic and detail design in AVEVA Marine or CADMATIC to production preparation, for which some shipyards use the NESTIX software. The challenge with CAx integration is that the coherent ship geometry for the subsequent processes and systems must be broken down into manufacturable components and transferred with the production-relevant information. The integration platform supports this process and enables the consistent use of digital information in all phases of the ship development process.
OpenPDM SHIP also provides connectors to common PDM/PLM and ERP systems (3D Experience, ARAS Innovator, Teamcenter, SAP, Windchill, etc.) to merge CAx data from different source systems into a digital ship model and control this model through the ship life cycle. The vertical integration of the authoring systems into the data and process management environment is a prerequisite for comprehensible ship development processes and consistent management of all information generated. At the same time, the integration platform offers the possibility to link the digital ship model for Digital Twin applications with the real operating data.
OpenPDM SHIP is now available and will be continuously expanded with new integrations. In cooperation with SSI and SSI’s European sales partner NDAR (Nick Danese Advanced Research), PROSTEP is currently developing an OpenPDM SHIP connector for the SSI Enterprise platform. It is the basis for exchanging CAD models between NAPA Steel and the Autodesk-based ship development platform ShipConstructor and for importing the ShipConstructor data into common PDM/PLM systems.
No Comments »
Sunday, July 21st, 2019
Companies that do not want to set up and maintain their own data exchange solution can use OpenDXM GlobalX as a service without any installation or operating effort. PROSTEP provides the world’s leading Managed File Transfer (MFT) solution in DARZ’s highly secure cloud environment as a SaaS model (Software as a Service). You can use the solution practically from a standing start.
As a cloud service, OpenDXM GlobalX minimizes installation and operating costs. As part of the initial setup, the system is configured so that customers can use it directly and only need to set up their users and data exchange partners. If required, they can also outsource these administrative tasks to PROSTEP. Software updates that maintain the high security standards or make new, helpful functions available are included in the price of the cloud service.
Using the MFT solution as a private or hybrid cloud service offers customers the greatest possible flexibility. They can store the user data securely encrypted in their own infrastructure at different locations and initiate data exchange directly from their back-end systems via the existing PDM system integration. PROSTEP also offers customers the option of using OpenDXM GlobalX with their corporate design and their own web address, regardless of whether they want to use the software as a multi-tenant application, as a private or hybrid cloud service.
The DARZ in Darmstadt is a high-security data center certified by the BSI (Federal Office for Information Security) according to KRITIS. It is characterized by multi-level access controls, automatic fire protection systems with several fire protection zones, air conditioning technology, uninterruptible power supply, redundantly designed server systems and a direct connection to the Internet backbone around the clock, all of which guarantee the highest possible protection and maximum availability of the data. Regular security checks of the OpenDXM GlobalX software by external penetration tests ensure the best possible application security for the cloud service.
Protection and security of customer data in the cloud are therefore guaranteed at all times. PROSTEP’s and DARZ’s processes and methods are subject to the strict requirements imposed by ISO 9001 and ISO 27001 certification and the EU General Data Protection Regulation (GDPR). In addition, PROSTEP is a member of the German IT Security Association (Bundesverband IT-Sicherheit e.V., TeleTrusT) and has received the coveted “IT Security made in Germany” quality mark for OpenDXM GlobalX.
By Frank Timmermann
No Comments »
Monday, July 15th, 2019
On 23 October 2019, PROSTEP will be hosting the first ever 3D PDF Customer Experience Day at PERCUMA, an event location in Eppstein, Germany. The new event is all about comparing notes with leading industry representatives, who will be sharing their experiences with 3D PDF technology. They will subsequently be available to answer any questions participants might have. Places are still available.
Although networking and sharing experiences will play an important role at the 3D PDF Customer Experience Day, there will be no shortage of information. Customers and experts from PROSTEP will explain to participants how 3D PDF technology can be used in different areas of application – from the offer phase to the provision of technical data packages for manufacturing and assembly through to the creation of spare parts catalogs in the service department – and the savings that can be achieved by automating the preparation and provision of information.
Among other things, André Hieke from Siemens will report on how Siemens Large Drives Applications (LDA) is using 3D PDF to provide offer documents for customer-specific configurations of large electric motors and converters almost at the touch of a button. Christian Thomas from Atotech Deutschland GmbH, a manufacturer of electroplating systems, will explain to participants the advantages of the technology when it comes to the automatic preparation and provision of spare parts catalogs. And Patrick Stockden from MEYER WERFT will describe how the shipbuilder intends to optimize production processes by dispensing with paper drawings.
Leading 3D PDF experts such as Ulrich Isermeyer from Adobe Systems and Andreas Vogel from theorie3.De will provide participants with an overview of the advantages of 3D PDF technology, the wide range of possible application areas it offers and new functions, e.g. for publishing information on the Internet. Participants will also gain insight into the progress being made with regard to ISO standardization, which is making the PDF standard an even more versatile tool, e.g. for the long-term archiving of product data.
No Comments »
Friday, July 12th, 2019
Last year, Professor Dr. Jens C. Göbel took over from Professor Martin Eigner as head of the Institute for Virtual Product Engineering at the University of Kaiserslautern. He has been working on PLM-related topics for many years, both in research and in industrial applications. In an interview for the PROSTEP Newsletter, he explained what the future of PLM will look like.
Question: What are the technological trends that PLM users and vendors are going to have to get to grips with?
Göbel: In the future, smart product systems with integrated services that are very closely networked and form part of other systems (systems of systems) will play a much more central role in product lifecycle management. The end-to-end implementation of PLM concepts throughout the entire product lifecycle will, on the one hand, be made possible for the first time by this networking capability and, on the other, is also urgently required – for example in order to permit the integrated management of digital product twins. PLM is therefore a vital underpinning for the lifecycle of smart products. However, PLM will have to open up and develop in different directions if it is to achieve this aim. This applies, for example, to the integration of other specialist disciplines in the very early systems engineering phases, the inclusion of product instances in later product utilization phases, and the use of AI for the purposes of “engineering intelligence”.
Question: Where do you see the greatest obstacles to the implementation and optimization of PLM applications in industry?
Göbel: Many of the obstacles hindering PLM implementation continue to be of an organizational nature or relate to the field of project or acceptance management. At the same time, however, increasing functional scope and, most importantly, integration capabilities, are making the implementation of PLM solutions even more complex. We will need new skills that include the digitalization of the products themselves and combine this with the digitalization of the processes in the product lifecycle. However, the progress of PLM implementation is being helped by industry’s increasing awareness of the correlation between standard PLM tasks and the success of digital transformation in the engineering field, as well as throughout the entire product lifecycle.
Question: Do companies have the necessary technical PLM skills, for example in order to develop smart products and services?
Göbel: Far from it! The offering of smart products and services is directly linked to the enterprise-specific design of new business models and flexible collaboration models. At this level, it will be necessary to create PLM interfaces and extensions that reach through to the very early innovation and product planning processes. This will require a fundamental mind shift in individual specialist departments as well as in PLM and IT departments. Another current challenge, in particular in large companies, is the PLM-specific integration of and cooperation between somewhat traditional company departments and highly agile start-up divisions.
Question: What role do currently hyped-up topics such as the digital thread and the digital twin play with regard to the strategic importance of PLM in companies?
Göbel: Generally speaking, it can be seen that companies have recognized that PLM is key and forms a vital basis for these issues. That is why it is often becoming the focus of strategic enterprise initiatives to a much greater extent than before and is being considered from a number of very new perspectives. However, to avoid false expectations and unrealistic ideas of the potential benefits, we need to be clear about what these terms mean and consider them against the background of concrete application scenarios, that is to say not as an end in themselves. At the VPE, we have already shown that this is possible in a number of research projects, e.g. in the case of the use of digital product twins for service-based business models in the agricultural machinery and automotive industries.
Question: It is said that monolithic PLM systems are no longer in fashion. However, PLM vendors are constantly extending their solutions with new modules. Isn’t that a contradiction?
Göbel: PLM solutions must continue to develop radically at both the functional and technological levels if they are to keep up with the fast-moving dynamics that characterize products and business models. In this context, the integrative nature of PLM is of particular importance. It is becoming increasingly important that other discipline-specific IT systems and platforms be integrated quickly and easily throughout the entire product lifecycle, including for temporary applications. It is therefore vital for success that PLM vendors keep their promises regarding openness and press ahead even further in this direction, including in joint initiatives such as the prostep ivip Association’s Code of PLM Openness (CPO).
Question: You wrote your doctoral thesis on the harmonization of hybrid PLM environments. Has industry made progress in this area in recent years?
Göbel: Yes, it has been pleasing to see the visible progress and success achieved in the field of PLM harmonization in recent years. Businesses have learned from the past and pursued more methodical approaches that consider processes, methods, IT systems and company organization more holistically. However, this topic is still extremely relevant today. Here, too, further technological developments and additional PLM requirements are demanding new approaches, for example the use of semantic technologies for linking data; in this context, we refer to this as semantic PLM.
Question: Do we maybe need something like a meta-PLM system that links together the data from the different data silos?
Göbel: In principle, yes. But it mustn’t be rigid. Any such meta-PLM will have to keep pace with the ever faster dynamics of internal organizational structures and value-added networks. For example, in the AKKORD project, which was recently launched by the German Ministry of Education and Research (BMBF), we are working together with various industries to achieve intelligent data networking in the product lifecycle. In this project, we are attempting to flexibly integrate not only PLM but also ERP, CRM and other systems and perform AI-based product data analysis, for example in order to exclude possible sources of error and predict costs on the basis of quality data as early as the development phase.
Question: No other organization is working harder to further develop the PLM concept in the direction of SysLM than the VPE Institute. With which of these terms do you identify most?
Göbel: For me, the content that stands behind the terms is more important than the terms themselves, which in any case are not used uniformly. We will orient the development of PLM in the direction of the aspects we have discussed – and in some areas will have to completely rethink it. The concept of SysLM primarily reflects the idea of an interdisciplinary systems approach, which is increasingly taking the place of our traditional product concept. As such, this term represents an important element of these further developments. However, it is not the only direction of PLM development that we are working on.
Question: In the future, where will you place the emphasis in terms of research and teaching? Is MBSE still at the very top of the curriculum?
Göbel: MBSE continues to be our central topic, in particular as an “enabler” for smart engineering. For example, we are currently taking part in a large interdisciplinary research project involving the knowledge-based roadmapping and cross-value chain modeling of overall system architectures in very early innovation phases in the automotive industry. A few weeks ago, we became one of the first German university research institutes to be included in the OMG’s SysML v2 Submission Team (SST), where we will help design the next SysML generation. We are currently incorporating the results of our research activities in a new cross-discipline “Smart Systems Engineering” study module which we, together with colleagues from the IT and Electronic Engineering departments, will offer in the upcoming winter semester. We opened the e4lab (engineering 4.0 lab) demonstration and testing laboratory in Kaiserslautern last July with the aim of making the potential applications of our research results tangible and accessible for industrial enterprises. These are just a few examples – many more will follow. Be ready to be surprised!
Question: The digitized engineering conference SYSLM2019 will be held in Kaiserslautern in October. Why has the name of this event been changed?
Göbel: We at the VPE have been working on the digitalization of engineering ever since the institute was founded by Professor Dankwort in 1994, that is to say for 25 years. I think that the name very accurately expresses this overarching core idea and also emphasizes the key topics we are addressing today. In the future, we want to direct even greater attention forward towards visionary ideas and trends that are also of relevance to industry in order to provide a guide for participants and stimulate an inspirational dialog about the future of digital engineering. I am very much looking forward to this year’s program, which contains some top-quality contributions, including the participation of PROSTEP AG, and is therefore ideal for promoting just such a dialog.
Mr. Göbel, thank you very much for talking to me.
(This interview was conducted by Michael Wendenburg)
About Jens C. Göbel
Professor Dr. Jens C. Göbel has headed up the Institute for Virtual Product Engineering (VPE) at the University of Kaiserslautern since 2018. Göbel studied industrial engineering (mechanical engineering) at the University of Siegen. After graduating, he worked on PLM-related topics at Bosch Rexroth, Keiper Recaro and Schmitz Cargobull and also conducted research into the fundamental principles of the lifecycle management of integrated product-service systems at Bochum University. There, he wrote his doctoral thesis on the harmonization of hybrid PLM environments at the Department of Information Technology in Mechanical Engineering under Professor Dr.-Ing. Michael Abramovici. From 2010, he served as research coordinator and senior engineer and as head of the Lifecycle Management research group in the same department.
No Comments »
Monday, July 8th, 2019
Monolithic system architectures with static interfaces do not do justice to the dynamics in the development of smart networked products. PROSTEP has therefore developed a new integration approach that makes it possible to directly link the data of the various disciplines and domains instead of replicating them in a PLM cockpit. OpenCLM makes the status of interdisciplinary development projects easier to understand.
With OpenCLM, PROSTEP is responding to the needs of customers who, when developing smart products, face the challenge of integrating new tools and methods into their system landscapes in an agile way and replacing existing ones without losing transparency in the product development process. As the proportion of software and electronics grows, so does not only the pace of product changes but also the pace at which the technologies for their development and production evolve. At the same time, coordinating the disciplines and domains involved in product development requires a depth of integration that classic integration approaches can no longer achieve, or can achieve only with immense effort. This is all the more true given that the engineering landscapes in most companies today are highly heterogeneous and, as already mentioned, constantly changing.
The development of smart products in this heterogeneous system landscape creates a huge amount of interrelated data that changes continuously. Project managers and their teams are finding it increasingly difficult to maintain an overview of the interrelationships and to document project progress comprehensively throughout all phases of the product development process. Manually collecting project results also consumes a considerable amount of time on essentially repetitive work.
Snapshot of the digital master
OpenCLM automates this process by automatically generating the currently valid view, or configuration, of the information statuses and their relationships to other artifacts as previously defined coordination points (milestones) approach. This view documents which requirements triggered which activities (tasks) and which deliverables have been generated or are still to be generated. If a baseline is then drawn (a snapshot of the digital master), it can be derived automatically in 3D PDF format if required.
Combining all product-relevant information for baselining in one backbone system would entail enormous effort for the integration of the authoring or supporting management systems (such as TDM, ALM, SDM) and would probably still not ensure the required depth of information. Most companies, for example, do not record the requirements in their PLM systems in a granularity that would allow individual requirements to be linked to specific artifacts. As an alternative to data replication, data linking using OpenCLM is therefore recommended.
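The linking idea can be illustrated with a minimal data model: each link stores only references to the original artifacts, together with their status metadata, while the payload data stays in the source systems. Everything below (the system names, statuses and the `baseline` helper) is a hypothetical sketch, not OpenCLM's actual data model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ArtifactRef:
    system: str       # source system, e.g. "ALM" or "SDM" (illustrative)
    object_id: str    # identifier in that system; the data itself stays there
    version: int
    status: str

@dataclass(frozen=True)
class Link:
    source: ArtifactRef
    relation: str     # semantic meaning of the relationship
    target: ArtifactRef

def baseline(links):
    """A baseline is simply the frozen set of links valid at a milestone:
    a snapshot of the digital master without replicating any payload data."""
    return frozenset(links)

# Example: a requirement managed in an ALM system, verified by a test
# report managed in a simulation data management system.
req = ArtifactRef("ALM", "REQ-42", 3, "released")
report = ArtifactRef("SDM", "TR-7", 1, "in work")
snap = baseline([Link(req, "is verified by", report)])
```

Because the baseline holds references plus versions rather than copies, redrawing it at the next milestone captures the new state of affairs without any data migration.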
Link data instead of replicating it
With OpenCLM, PROSTEP has developed a lightweight and easy-to-configure web application for cross-discipline Configuration Lifecycle Management (CLM). It makes it possible to easily link data and documents from different source systems. Its basis is the integration platform OpenPDM, to which not only common TDM, PDM/PLM and ERP systems but also simulation data management systems, ALM (Application Lifecycle Management) systems and other enterprise applications can be connected via industry-proven connectors or standards such as OSLC. OpenCLM displays the linked information objects with metadata such as status, change date and owner in a clear and concise cockpit, where they can easily be compared with other data statuses.

While other linking concepts aim to automatically establish semantic references between the data at database level with the aid of intelligent search algorithms, PROSTEP focuses on targeted linking of the original data based on the specific process requirements, taking into account their status information in the respective management systems. Among other things, this has the advantage that OpenCLM can not only display the linked data but also enable write access. The project manager can then use the cockpit, for example, to add missing attributes or initiate a status change, provided he or she has the appropriate authorization in the process and in the source system and the write function is enabled in OpenCLM.
Project plan as starting point
The starting point for data linking is a concrete project plan oriented to the phases of the product development process. For each milestone, this plan specifies which artifacts from which source system have to be linked from one step to the next, what semantic meaning the relationships have, and what quality the results to be generated should have. Predefined link types with permitted start and end artifacts specify, for example, how a requirement is linked to a certain task, such as a simulation step, what the work result is (e.g. a test report) and where it can be found, even if it does not yet exist or exists only as a placeholder at the time the links are defined. This distinguishes OpenCLM from AI-based linking approaches, which can only calculate these correlations after the fact.
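The notion of predefined link types with permitted start and end artifacts can be pictured as a simple validation rule. The relation names and artifact kinds below are invented for illustration; a real configuration would come from the project plan templates.

```python
# Hypothetical link-type registry:
# relation -> (allowed source artifact kind, allowed target artifact kind)
LINK_TYPES = {
    "triggers": ("requirement", "task"),
    "produces": ("task", "deliverable"),
}

def validate_link(relation: str, source_kind: str, target_kind: str) -> bool:
    """Accept a link only if its relation is defined and both ends have the
    permitted kinds. A deliverable may still be a placeholder at linking
    time; only its kind is checked, not its existence."""
    allowed = LINK_TYPES.get(relation)
    return allowed is not None and allowed == (source_kind, target_kind)

ok = validate_link("triggers", "requirement", "task")
bad = validate_link("produces", "requirement", "deliverable")
```

Validating kinds up front is what lets links point at results that do not exist yet, which is exactly the property that after-the-fact AI correlation cannot offer.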
The effort involved in manually preparing the links is manageable, since in principle the references only have to be created once. OpenCLM offers the option of creating templates for the various project types with the specifications for the respective milestones or baselines, which the user then links to the concrete information objects in the current project. OpenCLM then always provides a current version of the linked information for each milestone. Project managers can create the templates themselves, e.g. on the basis of their existing project structure plans, or have them created by PROSTEP as part of customizing.
Tool for project control
OpenCLM is first and foremost a tool for project managers and participants with which they and their teams can structure their work and control its progress. In the cockpit, project team members see an overview of all tasks and artifacts that belong to an upcoming milestone, along with their status. They can navigate the structure and arrange it according to different criteria, e.g. to display all information belonging to a specific function across all disciplines. The software automatically identifies open points or artifacts that are still missing and can also automatically suggest their degree of maturity. In addition, project managers receive immediate visual feedback on problem points, e.g. when a particular simulation task has been performed on the basis of an outdated model.
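The check for simulations performed on an outdated model can be pictured as a simple version comparison against the currently valid model version in the source system. The data structures below are a hypothetical sketch, not OpenCLM's internals.

```python
def stale_simulations(tasks, current_versions):
    """Return the names of simulation tasks that were run against a model
    version older than the one currently valid in the source system."""
    return [
        task["name"]
        for task in tasks
        if task["model_version"] < current_versions[task["model"]]
    ]

# Illustrative project state: the body model has since moved to version 5,
# so the crash simulation based on version 4 should be flagged.
tasks = [
    {"name": "crash-sim", "model": "body", "model_version": 4},
    {"name": "vibration-sim", "model": "chassis", "model_version": 2},
]
current_versions = {"body": 5, "chassis": 2}
flagged = stale_simulations(tasks, current_versions)
```

Because the links carry version and status metadata from the management systems, this kind of consistency check needs no access to the model data itself.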
The new PROSTEP solution ensures a high level of transparency and traceability of the development objects across all disciplines and domains. Because the relationship information is managed transiently in OpenCLM, the data models of the source systems do not need to be touched. Among other things, this has the advantage that the departments can use a different tool or data source in the next project if required, without the process or software having to be constantly adapted. The baselines of OpenCLM define which work results they have to deliver and when. This ensures consistent, comprehensible and rule-compliant product documentation, as required by SPICE, CMMI or ISO 26262.
By Lutz Lämmer
No Comments »
Friday, July 5th, 2019
Theegarten-Pactec is the world’s leading manufacturer of continuous-motion packaging machines for confectionery, food and non-food products. In order to secure its market position, the Dresden-based company has decided to use PLM to speed up the development of new machines and its order-oriented design work. PROSTEP will be advising the company on the optimization of its PLM processes, requirements analysis and selection of the system.
Unlike clocked or discontinuous packing machines, which are also part of the company’s product portfolio, continuous-motion packaging means that hard candies, chocolates and other products are constantly in motion. In an end-to-end process, they are wrapped or folded in packaging material by a number of tools mounted on a rotating drum. The advantages of this technology lie in the higher throughput and gentler packaging process, as Dr. Egbert Röhm, managing director of Theegarten-Pactec GmbH & Co. KG explains: “Our continuous-motion machines package up to 2,300 products per minute; clocked machines manage less than half that.”
“The story of our success is not a short one,” says the company’s home page. It is also a very German reunification success story. In Dresden, the company VEB Verpackungsmaschinenbau Dresden developed the first continuous-motion packaging machines to market maturity as early as 1981 and successfully exported them worldwide, even in the days of the GDR. At that time, the state-owned company formed part of the NAGEMA combine, which, since 1946, had bundled the potential of a number of Dresden-based mechanical engineering companies whose owners had been expropriated after the Second World War. Following German reunification, the combine was first transformed into a stock corporation, which subsequently gave rise to a number of individual companies, including the packaging machine division.
As of 1991, the newly founded Verpackungsmaschinenbau GmbH, which comprised various parts of the original combine, traded as Pactec Dresden GmbH. It employed only a small, high-performance workforce that represented approximately 10 percent of the original company, which previously had 3,000 employees. In 1994, Pactec Dresden was acquired by the Cologne-based family-owned company Rose-Theegarten, which had been manufacturing packaging machines for confectionery products since as early as 1934. Because the site in Dresden offered advantages both in terms of the usable industrial area and available skilled workers, the company’s headquarters were moved to Dresden a few years later, as Röhm recounts: “That is certainly something that didn’t happen very often in the history of German reunification.”
Series machines packed with engineering
In Dresden, Theegarten-Pactec now manufactures both clocked and continuous-motion packaging machines and systems for all types of small products that have to be packaged quickly: hard candies, soft caramels, chocolate products as well as stock cubes and dishwasher tablets. “We service the global market with around 430 employees,” says Röhm. Nearly 100 of these work in development and design because the engineering effort is considerable. Even though, fundamentally, these are series machines that have become increasingly modular in design in recent years, the design engineers have to adapt them to the format of the product that is to be packaged, the desired fold type and other specific customer requirements. Each year, the mid-sized company ships between 100 and 120 machines and packaging systems.
The cross-departmental processes involved in handling customer orders are time-consuming and tie up significant capacities because the support provided by software applications is not optimal. The measure of all things is currently the ERP system. There is no independent PLM system, just a module integrated in the ERP system that manages the authoring systems’ CAD objects and data. “The ERP system also manages the order-neutral variant BOMs. This means that we have to check any number of dependencies every time there is a change,” explains Röhm. “Once we’ve got the BOM for the order, our processes work very well. However, the administrative effort required to create an order-specific BOM is huge because we force our designers to follow the production and assembly-oriented logic of the ERP processes. To speed up development, we are thinking about how we can decouple the development process more thoroughly from the downstream processes.” The aim is to speed up the offer and order process and to bring a new machine onto the market every year rather than every other year.
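The dependency checking Röhm describes can be pictured with a toy variant BOM: each line carries a validity condition over the order's options, and the order-specific BOM is the subset of lines whose conditions hold for the chosen configuration. All part names and rules below are invented for illustration; they do not reflect Theegarten-Pactec's actual product structure.

```python
# Toy variant BOM: each entry is (part, condition), where the condition
# is a predicate over the chosen order options. All data is invented.
variant_bom = [
    ("base_frame",  lambda opt: True),
    ("fold_unit_A", lambda opt: opt["fold_type"] == "double_twist"),
    ("fold_unit_B", lambda opt: opt["fold_type"] == "foil_wrap"),
    ("servo_drive", lambda opt: opt["speed"] >= 2000),
]

def resolve_order_bom(variant_bom, options):
    """Derive the order-specific BOM by evaluating each line's condition."""
    return [part for part, cond in variant_bom if cond(options)]

order = {"fold_type": "double_twist", "speed": 2300}
print(resolve_order_bom(variant_bom, order))
# ['base_frame', 'fold_unit_A', 'servo_drive']
```

The administrative burden the article mentions arises when these conditions are scattered across an ERP system rather than held in one resolvable structure: every change then requires re-checking each dependency by hand.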
Analyzing the information flow
In 2018, as a first step, Theegarten-Pactec called in external consultants to shed some light on the shortcomings in the process and IT landscape. The Institute for Industrial Management (FIR) performed a detailed examination of the ERP-related processes, while the PLM consulting and software company PROSTEP took a close look at the PLM processes. “We weren’t thinking in terms of lofty Industry 4.0 issues but about how to better organize the existing processes involving our ERP system, as well as about topics such as variant and BOM management, how to cut down the administrative overhead for our design engineers, and how to make the knowledge we have about our machines more readily available,” emphasizes Röhm.
The consultants came to the conclusion that the interaction between ERP and PLM needed to be reorganized and that it was essential to implement clear-cut PLM processes with system support. “The highest priority, however, is to construct a PLM landscape that will make the existing information and information flows more transparent. We have placed the focus on PLM and on integrating this approach throughout the company in order to create a shared data store for all our processes,” explains Dr. Dirk Hagen Winter, project manager for Change Management in corporate management.
Before starting to look for a suitable PLM solution, Theegarten-Pactec commissioned the PLM experts to undertake an initial subproject with representatives from all the affected departments in order to set up an end-to-end information model. To do this, PROSTEP used a standardized method that makes it possible to identify redundancies, bottlenecks and discontinuities in the information flows. This showed that the main problem facing the company lies in making the comprehensive know-how that it has acquired over the years available and searchable quickly enough. Indeed, knowledge is often concentrated in individual employees and is not universally accessible.
Potential in variant management
“Together with PROSTEP, we also took a close look at how to capture the information we receive from the customer when an order is placed in a well-structured manner during order processing,” says Winter. In principle, the aim is to structure the customer requirements functionally and to transfer the configuration with the desired options, at least in part automatically, into the mechatronic BOM and CAD structures. The idea is to manage the variant BOM in PLM in the future, derive an assembly-oriented production BOM from the configured design BOM, and then transfer it to the ERP system.
For historical reasons, the company’s product and module structure tended to be assembly-oriented in the past. As a result, the entire company is obliged to think in assembly-oriented terms. The Engineering department does not develop end-to-end in a function-oriented way, but instead develops the modules in the way they will subsequently be assembled. “Of course we still need an assembly-oriented BOM, but nowadays it really ought to be possible to derive it as a second mapping from a functional view,” says Röhm, adding: “PROSTEP has made it clear to us that our current approach wastes a lot of potential in development.”
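The "second mapping" idea can be sketched as follows: the function-oriented structure remains the master, and the assembly-oriented BOM is derived from a separate mapping of parts to assembly stations. The structures and part names below are invented purely to illustrate the regrouping, not taken from the company.

```python
from collections import defaultdict

# Function-oriented master structure: parts grouped by product function.
# All names are illustrative.
functional_bom = {
    "feeding": ["hopper", "conveyor"],
    "folding": ["fold_drum", "fold_tool"],
    "sealing": ["heat_bar"],
}

# Separate mapping of each part to its assembly station; this is the
# "second mapping" from which the assembly view is derived.
assembly_station = {
    "hopper": "station_1", "conveyor": "station_1",
    "fold_drum": "station_2", "fold_tool": "station_2",
    "heat_bar": "station_2",
}

def derive_assembly_bom(functional_bom, assembly_station):
    """Regroup the functional structure into an assembly-oriented view."""
    assembly_bom = defaultdict(list)
    for parts in functional_bom.values():
        for part in parts:
            assembly_bom[assembly_station[part]].append(part)
    return dict(assembly_bom)

print(derive_assembly_bom(functional_bom, assembly_station))
# {'station_1': ['hopper', 'conveyor'],
#  'station_2': ['fold_drum', 'fold_tool', 'heat_bar']}
```

The point of the design is that engineering can restructure the functional view freely; only the mapping table, not the engineering structure itself, encodes assembly logic.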
In addition, it is currently very difficult for Theegarten-Pactec to track the lifecycle of its machines once they have left the factory. There is no time-based view of the shipped machines in the form of an “as-maintained” view of the digital twin. Such a view, however, is also difficult to maintain since, for example, a food manufacturer with 100 machines in different development and delivery stages does not necessarily inform Theegarten-Pactec which spare part it has just ordered for which machine or which modifications it has made itself.
Concept for the PLM system development
In a second subproject with PROSTEP, employees from the various departments examined the question of which neutral formats could be used to provide information in the future. In this context, the topic of product manufacturing information (PMI) associated with CAD objects played just as much of a role as the derivation of simplified geometry models for creating electronic spare parts catalogs or project planning. “Our vision for the future is that all the information will be available in the 3D model, which is not the case at present,” says Röhm. With the exception of NC programming for mechanical machining and sheet metal working operations, which is performed on the basis of CAD models, the processes used for manufacturing and assembly are predominantly drawing-based. The tasks of deriving and preparing these drawings take up a lot of the company’s design engineering capacity.
In a third subproject, which is due to be completed by the end of the year, a concept for the structure of the future PLM system and a requirements specification for the selection of the system will be drawn up. A key component of the development plan is a central integration platform that will act as an intermediate layer permitting the more flexible interconnection of the ERP, PLM and authoring systems. As Röhm explains: “At present, the direct connection between the PDM module and the ERP system means that we have to update the CAx systems whenever there is an ERP system release change and vice versa. We want to decouple these components by means of the integration layer in order, among other things, to make it easier to replace certain software modules.”
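The decoupling Röhm describes is, in essence, an adapter layer: each backend system is wrapped behind a common interface, so a release change or replacement of one system means updating one adapter rather than every integration. The sketch below is a generic illustration of that pattern, not PROSTEP's or Theegarten-Pactec's actual architecture; all classes and fields are hypothetical.

```python
from abc import ABC, abstractmethod

class PartRepository(ABC):
    """Common interface the integration layer exposes to all consumers."""
    @abstractmethod
    def get_part(self, part_id: str) -> dict: ...

class ErpAdapter(PartRepository):
    # In reality this would call the ERP system's API; stubbed here.
    def get_part(self, part_id):
        return {"id": part_id, "source": "ERP", "cost": 12.5}

class PlmAdapter(PartRepository):
    # A release change in the PLM system only requires updating this class;
    # consumers of PartRepository are untouched.
    def get_part(self, part_id):
        return {"id": part_id, "source": "PLM", "cad_model": part_id + ".prt"}

def enriched_part(part_id, *repos: PartRepository):
    """Merge the views of all connected systems into one record."""
    merged = {}
    for repo in repos:
        merged.update(repo.get_part(part_id))
    return merged

print(enriched_part("4711", ErpAdapter(), PlmAdapter()))
# {'id': '4711', 'source': 'PLM', 'cost': 12.5, 'cad_model': '4711.prt'}
```

This is why the article speaks of making it "easier to replace certain software modules": the CAx, PLM and ERP systems no longer know about each other directly, only about the intermediate interface.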
PROSTEP’s experts will also help Theegarten-Pactec select an appropriate system. Röhm is extremely happy with the way the PLM consulting and software company is planning and conducting the projects. “PROSTEP understands the path ahead very clearly. However, we don’t want to merely follow blindly but instead want to work on this task together.” Those involved know very well that it will take several years to implement the initiated changes and that systematic implementation of the PLM approach will demand a realignment of the currently employed ERP applications and the relevant processes.