Archive for the ‘PLM’ Category
Friday, July 12th, 2019
Last year, Professor Dr. Jens C. Göbel succeeded Professor Martin Eigner as head of the Institute for Virtual Product Engineering at the University of Kaiserslautern. He has been working on PLM-related topics for many years, both at the research level and with regard to industrial applications. In an interview for the PROSTEP Newsletter, he explained what the future of PLM will look like.
Question: What are the technological trends that PLM users and vendors are going to have to get to grips with?
Göbel: In the future, smart product systems with integrated services that are very closely networked and form part of other systems (systems of systems) will play a much more central role in product lifecycle management. The end-to-end implementation of PLM concepts throughout the entire product lifecycle will, on the one hand, be made possible for the first time by this networking capability and, on the other, is also urgently required – for example in order to permit the integrated management of digital product twins. PLM is therefore a vital underpinning for the lifecycle of smart products. However, PLM will have to open up and develop in different directions if it is to achieve this aim. This applies, for example, to the integration of other specialist disciplines in the very early systems engineering phases, the inclusion of product instances in later product utilization phases, and the use of AI for the purposes of “engineering intelligence”.
Question: Where do you see the greatest obstacles to the implementation and optimization of PLM applications in industry?
Göbel: Many of the obstacles hindering PLM implementation continue to be of an organizational nature or relate to the field of project or acceptance management. At the same time, however, increasing functional scope and, most importantly, integration capabilities are making the implementation of PLM solutions even more complex. We will need new skills that encompass the digitalization of the products themselves and combine this with the digitalization of the processes in the product lifecycle. However, the progress of PLM implementation is being helped by industry’s increasing awareness of the correlation between standard PLM tasks and the success of digital transformation in the engineering field, as well as throughout the entire product lifecycle.
Question: Do companies have the necessary technical PLM skills, for example in order to develop smart products and services?
Göbel: Far from it! Offering smart products and services is directly linked to the enterprise-specific design of new business models and flexible collaboration models. At this level, it will be necessary to create PLM interfaces and extensions that reach through to the very early innovation and product planning processes. This will require a fundamental shift in mindset in individual specialist departments as well as in PLM and IT departments. Another current challenge, in particular in large companies, is the PLM-specific integration of and cooperation between somewhat traditional company departments and highly agile start-up divisions.
Question: What role do currently hyped-up topics such as the digital thread and the digital twin play with regard to the strategic importance of PLM in companies?
Göbel: Generally speaking, it can be seen that companies have recognized that PLM is key and forms a vital basis for these issues. That is why it is often becoming the focus of strategic enterprise initiatives to a much greater extent than before and is being considered from a number of very new perspectives. However, to avoid false expectations and unrealistic ideas of the potential benefits, we need to be clear about what these terms mean and consider them against the background of concrete application scenarios, that is to say not as an end in themselves. At the VPE, we have already shown that this is possible in a number of research projects, e.g. in the case of the use of digital product twins for service-based business models in the agricultural machinery and automotive industries.
Question: It is said that monolithic PLM systems are no longer in fashion. However, PLM vendors are constantly extending their solutions with new modules. Isn’t that a contradiction?
Göbel: PLM solutions must continue to develop radically at both the functional and technological levels if they are to keep up with the fast-moving dynamics that characterize products and business models. In this context, the integrative nature of PLM is of particular importance. It is becoming increasingly important that other discipline-specific IT systems and platforms be integrated quickly and easily throughout the entire product lifecycle, including for temporary applications. It is therefore vital for success that PLM vendors keep their promises regarding openness and press ahead even further in this direction, including in joint initiatives such as the prostep ivip Association’s Code of PLM Openness (CPO).
Question: You wrote your doctoral thesis on the harmonization of hybrid PLM environments. Has industry made progress in this area in recent years?
Göbel: Yes, it has been pleasing to see the visible progress and success achieved in the field of PLM harmonization in recent years. Businesses have learned from the past and pursued more methodical approaches that consider processes, methods, IT systems and company organization more holistically. However, this topic is still extremely relevant today. Here, too, technological further developments and additional PLM requirements are demanding new approaches, for example the use of semantic technologies for linking data; in this context, we refer to this as semantic PLM.
Question: Do we maybe need something like a meta-PLM system that links together the data from the different data silos?
Göbel: In principle, yes. But it mustn’t be rigid. Any such meta-PLM will have to keep pace with the ever faster dynamics of internal organizational structures and value-added networks. For example, in the AKKORD project, which was recently launched by the German Ministry of Education and Research (BMBF), we are working together with various industries to achieve intelligent data networking in the product lifecycle. In this project, we are attempting to flexibly integrate not only PLM but also ERP, CRM and other systems and perform AI-based product data analysis, for example in order to exclude possible sources of error and predict costs on the basis of quality data as early as the development phase.
Question: Hardly any other organization is working as hard as the VPE Institute to further develop the PLM concept in the direction of SysLM. With which of these terms do you identify most?
Göbel: For me, the content that stands behind the terms is more important than the terms themselves, which in any case are not used uniformly. We will orient the development of PLM in the direction of the aspects we have discussed – and in some areas will have to completely rethink it. The concept of SysLM primarily reflects the idea of an interdisciplinary systems approach, which is increasingly taking the place of our traditional product concept. As such, this term represents an important element of these further developments. However, it is not the only direction of PLM development that we are working on.
Question: In the future, where will you place the emphasis in terms of research and teaching? Is MBSE still at the very top of the curriculum?
Göbel: MBSE continues to be our central topic, in particular as an “enabler” for smart engineering. For example, we are currently taking part in a large interdisciplinary research project involving the knowledge-based roadmapping and cross-value chain modeling of overall system architectures in very early innovation phases in the automotive industry. A few weeks ago, we became one of the first German university research institutes to be included in the OMG’s SysML v2 Submission Team (SST), where we will help design the next SysML generation. We are currently incorporating the results of our research activities in a new cross-discipline “Smart Systems Engineering” study module which we, together with colleagues from the IT and Electronic Engineering departments, will offer in the upcoming winter semester. We opened the e4lab (engineering 4.0 lab) demonstration and testing laboratory in Kaiserslautern last July with the aim of making the potential applications of our research results tangible and accessible for industrial enterprises. These are just a few examples – many more will follow. Be ready to be surprised!
Question: The digitized engineering conference SYSLM2019 will be held in Kaiserslautern in October. Why has the name of this event been changed?
Göbel: We at the VPE have been working on the digitalization of engineering ever since the institute was founded by Professor Dankwort in 1994, that is to say for 25 years. I think that the name very accurately expresses this overarching core idea and also emphasizes the key topics we are addressing today. In the future, we want to direct even greater attention forward towards visionary ideas and trends that are also of relevance to industry in order to provide a guide for participants and stimulate an inspirational dialog about the future of digital engineering. I am very much looking forward to this year’s program, which contains some top-quality contributions, including the participation of PROSTEP AG, and is therefore ideal for promoting just such a dialog.
Mr. Göbel, thank you very much for talking to me.
(This interview was conducted by Michael Wendenburg)
About Jens C. Göbel
Professor Dr. Jens C. Göbel has headed up the Institute for Virtual Product Engineering (VPE) at the University of Kaiserslautern since 2018. Göbel studied industrial engineering (mechanical engineering) at the University of Siegen. After graduating, he worked on PLM-related topics at Bosch Rexroth, Keiper Recaro and Schmitz Cargobull and also conducted research into the fundamental principles of the lifecycle management of integrated product-service systems at Bochum University. There, he wrote his doctoral thesis on the harmonization of hybrid PLM environments at the Department of Information Technology in Mechanical Engineering under Professor Dr.-Ing. Michael Abramovici. From 2010, he served as research coordinator and senior engineer and headed the Lifecycle Management research group in the same department.
Monday, July 8th, 2019
Monolithic system architectures with static interfaces do not do justice to the dynamics in the development of smart networked products. PROSTEP has therefore developed a new integration approach that makes it possible to directly link the data of the various disciplines and domains instead of replicating them in a PLM cockpit. OpenCLM makes the status of interdisciplinary development projects easier to understand.
With the development of OpenCLM, PROSTEP is meeting the requirements of its customers who, when developing smart products, face the challenge of integrating new tools and methods into their system landscapes in an agile manner and replacing existing ones without losing transparency in the product development process. As the proportion of software and electronics increases, so do not only the dynamics of product changes but also the dynamics of the technologies used for their development and production. At the same time, the coordination of the disciplines and domains involved in product development requires a depth of integration that can no longer be achieved with classic integration approaches, or only with immense effort. This is all the more true as the engineering landscapes in most companies today are very heterogeneous and – as already mentioned – are constantly changing.
The development of smart products in this heterogeneous system landscape creates a huge amount of interrelated data that changes continuously. Project managers and their teams are finding it increasingly difficult to maintain an overview of the interrelationships and to document project progress comprehensively throughout all phases of the product development process. The manual collection of project results also involves a considerable amount of time spent on essentially repetitive work.
Snapshot of the digital master
OpenCLM automates this process by generating the currently valid view or configuration of the information statuses and their relationships to other artifacts as previously defined coordination points (milestones) approach. This view documents which requirements triggered which activities (tasks) and which deliverables were generated or are still to be generated. If a baseline – a snapshot of the digital master – is then drawn, it can, if required, be derived automatically in 3D PDF format.
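To illustrate the idea (this is a conceptual sketch in Python, not OpenCLM’s actual data model, which is not published here), such a baseline can be pictured as a frozen list of references to the revisions that are valid in the source systems at the time the snapshot is drawn:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass(frozen=True)
class ArtifactRef:
    """Reference to an information object in its source system."""
    system: str      # e.g. a PDM, ALM or ERP system
    object_id: str   # identifier in the source system
    revision: str    # revision/status valid at the time of the snapshot

@dataclass
class Baseline:
    """Frozen view of all linked artifacts at a milestone."""
    milestone: str
    drawn_at: datetime
    artifacts: list[ArtifactRef] = field(default_factory=list)

def draw_baseline(milestone: str, live_links: dict[str, ArtifactRef]) -> Baseline:
    # Freeze the currently valid revision of every linked artifact.
    return Baseline(milestone, datetime.now(), list(live_links.values()))
```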
Combining all product-relevant information for baselining in one backbone system would entail enormous effort for the integration of the authoring or supporting management systems (such as TDM, ALM, SDM) and would probably still not ensure the required depth of information. Most companies, for example, do not record requirements in their PLM systems at a level of granularity that would allow individual requirements to be linked to specific artifacts. As an alternative to data replication, data linking using OpenCLM is therefore recommended.
Link data instead of replicating it
With OpenCLM, PROSTEP has developed a lightweight and easy-to-configure web application for cross-discipline Configuration Lifecycle Management (CLM). It makes it possible to easily link data and documents from different source systems. Its basis is the integration platform OpenPDM, to which not only common TDM, PDM/PLM and ERP systems but also simulation data management systems, ALM systems (Application Lifecycle Management) and other enterprise applications can be connected via industry-proven connectors or standards such as OSLC. OpenCLM displays the linked information objects with metadata such as status, change date, owner, etc. in a clear and concise cockpit. There they can easily be compared with other data statuses.

While other linking concepts aim to automatically establish semantic references between the data at database level with the aid of intelligent search algorithms, PROSTEP focuses on targeted linking of the original data based on the specific process requirements and taking into account their status information in the respective management systems. Among other things, this has the advantage that OpenCLM can not only display the linked data but also enable write access. The project manager can then use the cockpit, for example, to add missing attributes or initiate a status change, provided he or she has the appropriate authorization in the process and in the source system and the write function is enabled in OpenCLM.
Project plan as starting point
The starting point for data linking is a concrete project plan that is oriented to the phases of the product development process. For each milestone, this plan specifies which artifacts from which source system have to be linked from one step to the next, what semantic meaning the relationships have and what quality the results to be generated should have. Predefined link types with permitted start and end artifacts specify how a requirement is linked to a certain task, such as a simulation step, what the work result is (e.g. a test report) and where it can be found, even if it does not yet exist or only exists as a placeholder at the time the links are defined. This distinguishes OpenCLM from AI-based linking approaches, which can only calculate these correlations afterwards.
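A minimal sketch of the concept of predefined link types and placeholder deliverables follows; all names and structures are invented for illustration and do not reflect OpenCLM’s actual configuration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class LinkType:
    """Predefined semantic relationship with its permitted artifact types."""
    name: str
    source_type: str   # permitted start artifact, e.g. "Requirement"
    target_type: str   # permitted end artifact, e.g. "TestReport"

@dataclass
class Link:
    link_type: LinkType
    source_id: str
    target_id: Optional[str] = None   # None: the deliverable exists only as a placeholder

VERIFIED_BY = LinkType("verified_by", "Requirement", "TestReport")

# Requirement REQ-042 will be verified by a test report that does not yet exist.
pending = Link(VERIFIED_BY, "REQ-042")                     # placeholder target
resolved = Link(VERIFIED_BY, "REQ-042", target_id="TR-7")  # filled in once the report is filed
```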
The effort involved in manually preparing the links is manageable, since the references in principle only have to be created once. OpenCLM offers the option of creating templates for the various project types with the specifications for the respective milestones or baselines, which the user links to the concrete information objects in the current project. OpenCLM then always provides a current version of the linked information for each milestone. Project managers can create the templates themselves, e.g. on the basis of their existing project structure plans, or have them created by PROSTEP as part of customizing.
Tool for project control
OpenCLM is first and foremost a tool for project managers and participants with which they and their teams can structure their work and control its progress. In the cockpit, the project team members see an overview of all the tasks and artifacts belonging to an upcoming milestone, together with their current status. They can navigate through the structure and arrange it according to different criteria, e.g. to display all the information belonging to a specific function across all disciplines. The software automatically identifies open points or artifacts that are still missing and can also automatically suggest their degree of maturity. In addition, project managers receive immediate visual feedback on problem points, e.g. when a particular simulation task has been performed on the basis of an outdated model.
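One way to picture the automatically suggested degree of maturity is as a simple status roll-up across a milestone’s deliverables. The status values and the percentage rule in the following sketch are assumptions made for illustration, not OpenCLM’s actual logic:

```python
# Degree of maturity of a milestone as the share of deliverables that have
# reached a "done" status in their source system; missing artifacts count as open.
DONE_STATES = {"released", "approved"}   # assumed status values

def milestone_maturity(deliverables: list[dict]) -> float:
    if not deliverables:
        return 0.0
    done = sum(1 for d in deliverables if d.get("status") in DONE_STATES)
    return done / len(deliverables)

tasks = [
    {"id": "SIM-01", "status": "released"},
    {"id": "SIM-02", "status": "in_work"},
    {"id": "DOC-03", "status": None},        # still missing: an open point
]
print(f"{milestone_maturity(tasks):.0%}")    # -> 33%
```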
The new PROSTEP solution ensures a high level of transparency and traceability of the development objects across all disciplines and domains. Because the relationship information is managed transiently in OpenCLM, the data models of the source systems do not need to be touched. Among other things, this has the advantage that the departments can use a different tool or data source in the next project if required, without the process or software having to be constantly adapted. The baselines of OpenCLM define which work results they have to deliver and when. This ensures consistent, comprehensible and rule-compliant product documentation, as required by SPICE, CMMI and DIN ISO 26262.
By Lutz Lämmer
Friday, July 5th, 2019
Theegarten-Pactec is the world’s leading manufacturer of continuous-motion packaging machines for confectionery, food and non-food products. In order to secure its market position, the Dresden-based company has decided to use PLM to speed up the development of new machines and its order-oriented design work. PROSTEP will be advising the company on the optimization of its PLM processes, requirements analysis and selection of the system.
Unlike clocked or discontinuous packaging machines, which are also part of the company’s product portfolio, continuous-motion packaging means that hard candies, chocolates and other products are constantly in motion. In an end-to-end process, they are wrapped or folded in packaging material by a number of tools mounted on a rotating drum. The advantages of this technology lie in the higher throughput and gentler packaging process, as Dr. Egbert Röhm, managing director of Theegarten-Pactec GmbH & Co. KG, explains: “Our continuous-motion machines package up to 2,300 products per minute; clocked machines don’t manage even half that number.”
“The story of our success is not a short one,” says the company’s home page. It is also a very German reunification success story. In Dresden, the company VEB Verpackungsmaschinenbau Dresden developed the first continuous-motion packaging machines to market maturity as early as 1981 and successfully exported them worldwide, even in the days of the GDR. At that time, the state-owned company formed part of the NAGEMA combine, which, since 1946, had bundled the potential of a number of Dresden-based mechanical engineering companies whose owners had been expropriated after the Second World War. Following German reunification, the combine was first transformed into a stock corporation, which subsequently gave rise to a number of individual companies, including the packaging machine division.
As of 1991, the newly founded Verpackungsmaschinenbau GmbH, which comprised various parts of the original combine, traded as Pactec Dresden GmbH. It employed only a small, high-performance workforce that represented approximately 10 percent of the original company, which previously had 3,000 employees. In 1994, Pactec Dresden was acquired by the Cologne-based family-owned company Rose-Theegarten, which had been manufacturing packaging machines for confectionery products since as early as 1934. Because the site in Dresden offered advantages both in terms of the usable industrial area and available skilled workers, the company’s headquarters were moved to Dresden a few years later, as Röhm recounts: “That is certainly something that didn’t happen very often in the history of German reunification.”
Series machines packed with engineering
In Dresden, Theegarten-Pactec now manufactures both clocked and continuous-motion packaging machines and systems for all types of small products that have to be packaged quickly: hard candies, soft caramels, chocolate products as well as stock cubes and dishwasher tablets. “We serve the global market with around 430 employees,” says Röhm. Nearly 100 of these work in development and design because the engineering effort is considerable. Even though, fundamentally, these are series machines that have become increasingly modular in design in recent years, the design engineers have to adapt them to the format of the product that is to be packaged, the desired fold type and other specific customer requirements. Each year, the mid-sized company ships between 100 and 120 machines and packaging systems.
The cross-departmental processes involved in handling customer orders are time-consuming and tie up significant capacities because the support provided by software applications is not optimal. The measure of all things is currently the ERP system. There is no independent PLM system, just a module integrated in the ERP system that manages the authoring systems’ CAD objects and data. “The ERP system also manages the order-neutral variant BOMs. This means that we have to check any number of dependencies every time there is a change,” explains Röhm. “Once we’ve got the BOM for the order, our processes work very well. However, the administrative effort required to create an order-specific BOM is huge because we force our designers to follow the production and assembly-oriented logic of the ERP processes. To speed up development, we are thinking about how we can decouple the development process more thoroughly from the downstream processes.” The aim is to speed up the offer and order process and to bring a new machine onto the market every year rather than every other year.
Analyzing the information flow
In 2018, as a first step, Theegarten-Pactec called in external consultants to shed some light on the shortcomings in the process and IT landscape. The Institute for Industrial Management (FIR) performed a detailed examination of the ERP-related processes, while the PLM consulting and software company PROSTEP took a close look at the PLM processes. “We weren’t thinking in terms of lofty Industry 4.0 issues but about how to better organize the existing processes involving our ERP system, as well as about topics such as variant and BOM management, how to cut down the administrative overhead for our design engineers, and how to make the knowledge we have about our machines more readily available,” emphasizes Röhm.
The consultants came to the conclusion that the interaction between ERP and PLM needed to be reorganized and that it was essential to implement clear-cut PLM processes with system support. “The highest priority, however, is to construct a PLM landscape that will make the existing information and information flows more transparent. We have placed the focus on PLM and on integrating this approach throughout the company in order to create a shared data store for all our processes,” explains Dr. Dirk Hagen Winter, project manager for change management within corporate management.
Before starting to look for a suitable PLM solution, Theegarten-Pactec commissioned the PLM experts to undertake an initial subproject with representatives from all the affected departments in order to set up an end-to-end information model. To do this, PROSTEP used a standardized method that makes it possible to identify redundancies, bottlenecks and discontinuities in the information flows. This showed that the main problem facing the company lies in making the comprehensive know-how that it has acquired over the years available and searchable quickly enough. Indeed, knowledge is often concentrated in individual employees and is not universally accessible.
Potential in variant management
“Together with PROSTEP, we also took a close look at how we can channel the information we receive from the customer when an order is placed into the order handling process in a well-structured manner,” says Winter. In principle, the aim is to structure the customer requirements functionally and to transfer the configuration with the desired options, at least in part automatically, into the mechatronic BOM and CAD structures. The idea is to manage the variant BOM in PLM in the future, then transfer the configured design BOM to an assembly-oriented production BOM and transfer that to the ERP system.
For historical reasons, the company’s product and module structure tended to be assembly-oriented in the past. As a result, the entire company is accustomed to thinking in assembly-oriented terms. The Engineering department does not develop end-to-end in a function-oriented way but instead develops the modules in the way they will subsequently be assembled. “Of course we still need an assembly-oriented BOM, but nowadays it really ought to be possible to derive it as a second mapping from a functional view,” says Röhm, adding: “PROSTEP has made it clear to us that our current approach wastes a lot of potential in development.”
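The following sketch illustrates the idea of deriving the assembly-oriented BOM as a second mapping from a functional view. The structures and the part-to-assembly mapping are invented purely for illustration and do not represent Theegarten-Pactec’s actual data:

```python
# Functional view: which parts realize which function of the machine.
functional_bom = {
    "wrapping": ["drum", "fold_tool", "drive"],
    "feeding":  ["conveyor", "drive"],
}

# Second mapping: which assembly consumes which part.
assembly_of = {
    "drum": "rotor_assembly",
    "fold_tool": "rotor_assembly",
    "drive": "base_frame",
    "conveyor": "infeed_unit",
}

def derive_assembly_bom(functional: dict[str, list[str]]) -> dict[str, list[str]]:
    """Regroup the functionally structured parts by assembly, without duplicates."""
    assembly_bom: dict[str, list[str]] = {}
    for parts in functional.values():
        for part in parts:
            station = assembly_of[part]
            if part not in assembly_bom.setdefault(station, []):
                assembly_bom[station].append(part)
    return assembly_bom

print(derive_assembly_bom(functional_bom))
# {'rotor_assembly': ['drum', 'fold_tool'], 'base_frame': ['drive'], 'infeed_unit': ['conveyor']}
```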
In addition, it is currently very difficult for Theegarten-Pactec to track the lifecycle of its machines once they have left the factory. There is no time-based view of the shipped machines in the form of an “as-maintained” view of the digital twin. Such a view, however, is also difficult to maintain since, for example, a food manufacturer with 100 machines at different development and delivery stages does not necessarily inform Theegarten-Pactec which spare part it has just ordered for which machine or which modifications it has made itself.
Concept for the PLM system development
In a second subproject with PROSTEP, employees from the various departments examined the question of which neutral formats could be used to provide information in the future. In this context, the topic of product manufacturing information (PMI) associated with CAD objects played just as much of a role as the derivation of simplified geometry models for creating electronic spare parts catalogs or for project planning. “Our vision for the future is that all the information will be available in the 3D model, which is not the case at present,” says Röhm. With the exception of NC programming for mechanical machining and sheet metal working operations, which is performed on the basis of CAD models, the processes used for manufacturing and assembly are predominantly drawing-based. The task of deriving and preparing these drawings takes up a lot of the company’s design engineering capacity.
In a third subproject, which is due to be completed by the end of the year, a concept for the structure of the future PLM system and a requirements specification for the selection of the system will be drawn up. A key component of the development plan is a central integration platform that will act as an intermediate layer permitting the more flexible interconnection of the ERP, PLM and authoring systems. As Röhm explains: “At present, the direct connection between the PDM module and the ERP system means that we have to update the CAx systems whenever there is an ERP system release change and vice versa. We want to decouple these components by means of the integration layer in order, among other things, to make it easier to replace certain software modules.”
PROSTEP’s experts will also help Theegarten-Pactec select an appropriate system. Röhm is extremely happy with the way the PLM consulting and software company is planning and conducting the projects. “PROSTEP understands the path ahead very clearly. However, we don’t want to merely follow blindly but instead want to work on this task together.” Those involved know very well that it will take several years to implement the initiated changes and that systematic implementation of the PLM approach will demand a realignment of the currently employed ERP applications and the relevant processes.
Tuesday, June 4th, 2019
PROSTEP AG has founded a new branch in Wroclaw, Poland (formerly Breslau). At the beginning of April, PROSTEP sp.z.o.o. started its business activities with the aim of supporting and strengthening the development team of the Berlin-based PROSTEP branch. The plan is to set up two Scrum teams that will develop specific solutions for major automotive customers.
PROSTEP decided to move to Poland because the country has a relatively large pool of qualified IT specialists, who are becoming increasingly difficult to find in Germany. The case for Wroclaw was strengthened by the fact that the city on the Oder River is the country’s second-largest university location and that, for historical reasons, more people there speak German than in other parts of the country. We are not moving development to Poland for cost reasons – Poland is no longer a low-wage country – but because we can find qualified personnel there with German language skills.
The Polish software developers will implement agile development projects on site in cooperation with their Berlin colleagues and in close coordination with the key developers and business analysts. The decentralized model has proven its worth in Berlin, where PROSTEP currently has a development team of more than 20 programmers.
The new branch in Poland will employ 15 to 20 people in the medium term. We are currently looking for personnel and intend to hire eight to ten new people by the end of the year. The first team leader will start work in September 2019. Filip Plochocki will be responsible for the new location, whose office will be located in a startup center in the center of Wroclaw during the first few months.
Saturday, June 1st, 2019
Peter Pfalzgraf, head of the Products business unit at PROSTEP AG, has been named president of the 3D PDF Consortium. The global initiative is committed to establishing and further developing 3D PDF, and the PDF format in general, as an open standard for visualization, data communication and long-term archiving. The appointment of Pfalzgraf underscores PROSTEP’s long-standing commitment to the dissemination of 3D PDF technology.
The 3D PDF Consortium was originally founded at the suggestion of Adobe to demonstrate the openness of the 3D PDF format. Thanks to its collaboration with the PDF Association and, in particular, recognition by the American National Standards Institute (ANSI) of the Consortium as the US TAG Administrator for the PDF ISO standard, it has become the world’s most important organization when it comes to PDF standardization. In addition to Adobe and other leading software vendors – including the PLM consulting and software company PROSTEP – its members include large industrial companies such as Boeing.
On the one hand, the 3D PDF Consortium concerns itself with the further development and consolidation of the various ISO PDF standards for archiving (PDF/A), engineering (PDF/E), electronic signatures (PAdES) and universal access (PDF/UA), which have been created within the framework of ISO 32000, the standard for full-function PDF. On the other hand, it provides software houses and industrial companies with support when implementing the 3D PDF standard. The 3D PDF Implementor Forum is a collaborative testing program designed to ensure the quality and usability of 3D PDF in engineering workflows by means of joint testing.
An important activity performed by the 3D PDF Consortium is the planned extension of the 3D PDF format that will allow STEP AP242 data to be embedded directly in 3D PDF documents without conversion to the internal PRC or U3D format and to be viewed with Adobe Reader. As Pfalzgraf says, “We expect this to lead to even broader acceptance of 3D PDF, especially with regard to the long-term archiving of 3D data. This option is of particular interest to manufacturers of products with very long lifecycles and strict obligations to provide documentation, such as the aerospace industry.” The 3D PDF Consortium therefore wants to work more closely with LOTAR International, an initiative for long-term archiving and retrieval launched by leading aircraft manufacturers.
Monday, May 6th, 2019
PROSTEP continues to expand the world’s leading PLM integration platform OpenPDM by adding standard connectors that allow a wide range of different PLM, ERP and other backend systems to be connected “as-is” and the data to be synchronized or migrated. The company recently created a seamless integration between the cloud PLM solution from Arena Solutions and SAP’s ERP system, which is already in productive use at customer sites in the USA.
Arena Solutions, a PLM vendor based in the USA, is a true pioneer when it comes to cloud PLM. The company has been offering its customers cloud-based SaaS applications for product lifecycle management and quality assurance for over 15 years. They are primarily used by smaller start-ups and leading high-tech companies in the USA, but also by customers in 80 other countries.
The company has created an integration solution between Arena PLM and SAP together with PROSTEP’s US subsidiary that makes it possible for customers to bring their new products to market faster. The OpenPDM-based integration ensures that manufacturing is always working with the latest version of BOMs, article master data, approved manufacturing lists and associated documents.
The OpenPDM connector accesses the Arena PLM module ERP Exchange, which exports product information like BOMs, change orders, article master data and parts master data as a PDX package in XML format. OpenPDM validates the data and any changes made to it and then automatically imports it into SAP. It is also possible to supply different instances of the ERP system with data. Data transfer is logged and can be monitored in a dashboard to enable immediate response to any errors that occur while the data is being transferred.
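The following sketch shows the validate-then-import pattern described above. The element names and the posting interface are placeholders; they do not reflect the actual PDX schema, the OpenPDM connector API or SAP interfaces:

```python
import xml.etree.ElementTree as ET

def read_bom_items(pdx_xml: str) -> list[dict]:
    """Extract and validate BOM items from a PDX-style XML package."""
    root = ET.fromstring(pdx_xml)
    items = []
    for item in root.iter("Item"):            # placeholder element name
        rec = {"number": item.get("number"), "quantity": item.get("quantity")}
        if not rec["number"]:                 # reject malformed packages early
            raise ValueError("BOM item without part number")
        items.append(rec)
    return items

def transfer_to_erp(items: list[dict], post) -> None:
    """post: a callable supplied by the integration layer; in practice each
    call would be logged so errors show up in a monitoring dashboard."""
    for rec in items:
        post("/bom/items", rec)
```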
With its OpenPDM connector for Arena PLM, PROSTEP demonstrates that the world’s leading integration platform can also be used to connect cloud-based PLM solutions with on-premise enterprise applications. The company is currently developing an enhanced connector to the cloud-based ERP solution SAP S/4HANA, which will allow OpenPDM to also support cloud-to-cloud scenarios in the future.
Thursday, May 2nd, 2019
The automotive and industrial supplier Schaeffler has decided to use PROSTEP’s OpenPDM integration platform to connect its existing PLM and ERP system landscape to the new Engineering Cockpit in Aras Innovator. We are also developing a new connector for Schaeffler that will connect the application lifecycle management (ALM) system PTC Integrity Lifecycle Management (ILM).
Aras Innovator is Schaeffler’s overarching PLM platform for mechatronic product development and model-based systems engineering (MBSE). The plan is to merge not only the mechanical and electrical/electronic product data in the new Engineering Cockpit but also the software development statuses. At Schaeffler, the latter are managed using PTC Integrity, which controls the entire software development process. Hence the request to connect the ALM system to the Engineering Cockpit.
Schaeffler decided last year not to develop the interfaces for connecting the various IT systems in-house but instead to implement OpenPDM as middleware. PROSTEP’s standards-based integration platform benefited from the fact that it offers maximum investment protection and – because tried-and-tested connectors for the existing CAD, PLM and ERP systems PTC Creo, PTC Windchill and SAP are already available – requires less effort to develop and maintain the integrations. As we did not yet offer an integration with PTC ILM, Schaeffler commissioned us to develop an appropriate connector.
The new PTC ILM connector allows us to expand our portfolio of standard integrations and take an important step towards supporting application lifecycle management. ALM is becoming increasingly important in the context of developing smart, connected products. The first release of the new connector was delivered to Schaeffler before Easter and can in the future also be used by other customers.
Friday, April 5th, 2019
At this year’s Schiff&Hafen Maritim 4.0 conference, PROSTEP’s shipbuilding experts outlined the challenges faced when it comes to end-to-end digitalization in the maritime supply chain and the benefits of a digital vessel twin. The choice of topic for their presentation was a good one, as the event is primarily attended by shipping experts who are interested in optimizing their working fleet.
Maritim 4.0 in Hamburg was well attended, with approximately 100 representatives from shipping companies, equipment manufacturers and classification societies. The event focused less on shipbuilding than on shipping, i.e. the challenges facing ship operators. We explained to the participants the role the digital vessel twin (DVT) plays in monitoring and optimizing the operation of vessels. The term DVT refers to a digital representation of the vessel that is linked to the physical asset and enables new services such as predictive maintenance or remote inspection – something mentioned by a representative from the classification society DNV GL. Without the DVT, the vision of autonomous vessels would also never be feasible.
In our presentation, we made it clear that the end-to-end provision of digital product information over the entire lifecycle of a vessel is a key prerequisite for the digital twin. Ship operators today face the challenge of making digital product information available to their various partners throughout the vessel lifecycle in an efficient, needs-oriented and purpose-related manner. We used practical application examples to demonstrate how companies in other industries make service-relevant information available via the OpenDXM CCenter collaboration platform, thus managing the balancing act between end-to-end digitalization and know-how protection.
Another reason our presentation met with great interest among participants was the fact that we were able to point out similarities to the digitalization efforts being made in other industries. Carmakers and automotive suppliers, for example, are trying to ensure the homologation of autonomous driving functions with the help of simulation-based validation and verification processes – an approach that could also be of interest to the maritime industry. Our DVT presentation made a strong impression at Maritim 4.0, and we established promising contacts with potential new customers.
In the panel discussion that followed, which was headed up by Prof. Dr. Uwe von Lukas from Fraunhofer IGD in Rostock, several participants raised the question of which standards are needed to create digital vessel twins in view of the large number of IT systems used. We think that the technical problems encountered in the context of the end-to-end digitalization of the DVT can be solved using a variety of standards, as demonstrated by examples from other industries. However, data continuity across company boundaries remains a challenge, given that organizational aspects and issues such as IP protection add to the requirements. Von Lukas suggested creating a maritime data space, i.e. an open, industry-specific platform for exchanging digital data.
The question of where data acquisition and documentation for the digital twin should take place aroused some controversy during the event. While equipment manufacturers see this taking place on land or in the cloud to ensure the provision of new services, shipping companies think it should accompany the ship on board in order to support the crews during operation. They are after all faced with the challenge of having to repeatedly prepare and document emission-specific data in line with different country-specific requirements, a process that is still paper-based. For us, this was an important input which allows us to further sharpen our DVT concept.
By Lars Wagner
Sunday, February 3rd, 2019
Digital twins make it possible to perform material flow simulations for plant layout and bottleneck analyses. Building digital twins of existing production systems is, however, extremely complicated. As part of the DigiTwin joint project, PROSTEP and three partners are developing a procedure for creating digital simulation models from 3D scans of production systems largely automatically.
Material flow simulations for bottleneck analyses, plant layout and inventory analyses help improve operational workflows. Up until now, developing corresponding simulation models was extremely complicated, making it difficult for small and medium-sized companies to use them. Digitalization, however, offers new possibilities for simulating and optimizing the real-life situation in production with the help of a digital twin. In the DigiTwin project, the Institute of Production Engineering and Machine Tools (IFW) at the University of Hanover, together with PROSTEP, isb – innovative software businesses and Bornemann Gewindetechnik, are examining how digital twins for existing production systems can be created more easily.
The research project, the full name of which is “DigiTwin – Effiziente Erstellung eines digitalen Zwillings der Fertigung” (Efficient Creation of a Digital Twin for Production), is being funded by the “SME innovation: Service research” initiative of the German Federal Ministry of Education and Research. Within the framework of the project, the partners are developing a service concept for deriving simulation models from scans of the factory floors largely automatically. The idea is to use object recognition to convert, with a maximum of automation, the 3D scan data from production into digital models that can be mapped one-to-one in the simulation software. The aim is to make both the layout of the production facilities and the logic of the production processes transparent.
In the project, PROSTEP is responsible for transforming dumb point clouds of machines, robots, transport equipment, etc. into intelligent CAD models that can then be used to simulate the manufacturing processes. With the help of methods from artificial intelligence and machine learning, the solution uses the point cloud, or the network geometry derived from it, to identify similar system components, which are stored in a library together with their CAD models.
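As a greatly simplified illustration of this matching step (real object recognition relies on far richer features and trained models than this), a scanned segment could be compared with the component library using crude geometric descriptors:

```python
import numpy as np

def descriptor(points: np.ndarray) -> np.ndarray:
    """points: (N, 3) point cloud; crude shape signature built from the sorted
    bounding-box extents and the sorted standard deviation per axis."""
    extents = np.sort(points.max(axis=0) - points.min(axis=0))
    spread = np.sort(points.std(axis=0))
    return np.concatenate([extents, spread])

def best_match(segment: np.ndarray, library: dict[str, np.ndarray]) -> str:
    """Return the most similar library component; each library entry is a
    reference point cloud stored together with its CAD model."""
    d = descriptor(segment)
    return min(library, key=lambda name: np.linalg.norm(d - descriptor(library[name])))
```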
It is intended that system components for which there is no equivalent in the library be converted into CAD models and parameterized with the help of feature recognition so that they can be prepared for simulation. This means that the simulation models can easily be adapted to take account of company-specific characteristics. PROSTEP’s data management team will make the services for object recognition, object harmonization and conversion available via the data logistics portal www.OpenDESC.com.
Production systems generally vary from company to company. Company-specific machine configurations and special chaining logic cannot, of course, be derived directly from the scan data, which is why the scientists at IFW query this data using standardized forms. This minimizes the amount of time and effort needed to adapt the simulation models, thus also ensuring that the concept remains attractive to small and medium-sized companies. The project partners need only a few days to create a digital twin that can also be adapted quickly in the event of changes to production. As this is a service concept, no programming knowledge is required on the part of the customer.
Friday, February 1st, 2019
In today’s world, agile methods are the state of the art when it comes to IT project management. Corporations also use them in large-scale IT projects that they implement together with service companies. The most common agile process model is Scrum, which has to a great extent replaced the conventional waterfall model and its variants. It does, however, require a redefinition of customer/supplier relationships.
In the past, large-scale IT projects involving external service providers were usually executed on the basis of service contracts and billed at a fixed rate. This model is incompatible with agile methods in that there is no longer a clearly defined scope of delivery that could be the subject of formal acceptance and billing at the start of the project. Instead there is a rough target (product vision), a timeframe and a team. This means that the customer’s management must make an investment decision without a detailed cost estimate, rather like a declaration of intent: We will invest the sum of x euros within a defined period of time in order to achieve a specific objective without knowing the detailed results in advance and without having a contractual basis for demanding the result should this become necessary.
This not only requires a change in thinking when it comes to project controlling but also has a direct impact on the remuneration arrangement as measurable criteria for evaluating supplier performance need to be identified. There are basically two variants: billing for the number of hours worked (time and materials) or for deliverables (agile fixed price). The time-and-materials model is easier to implement from an organizational perspective, but it shifts all the project risk to the client. The deliverables-based model is in principle a fixed-price model with very small billing units (user stories), whose constraints are regulated by a framework agreement. It requires significantly more organizational effort when it comes to acceptance and billing but results in greater transparency and a more even distribution of risks.
Conflicts over the effort estimates for user stories can become problematic because the parties involved are more or less at an impasse. If the supplier estimates the effort for a user story to be significantly higher than the client is willing to accept, neither side can force the other to accept its view of the situation. This means that estimation conflicts must be resolved constructively within the framework of the collaboration. A general feature of agile collaboration models becomes particularly apparent here: they require a high level of mutual trust and great willingness to resolve conflicts constructively. Proponents of agile principles will counter that this applies to the successful implementation of IT projects in general and that agile process models such as Scrum merely demonstrate a methodical way of dealing with them professionally and in a results-oriented fashion.
Scrum is in itself only designed for teams of up to nine people. However, industrially relevant software such as PLM systems requires much larger development teams, thus giving rise to the question of how collaboration between a potentially large number of Scrum teams can be organized efficiently.
The approaches proposed in the literature, such as LeSS and SAFe, are typically based on a multi-level model, with operational work taking place on the lower level and coordination on the upper levels. LeSS, for example, aims to minimize overhead. Here, the operational teams send representatives to the upper level teams and the Scrum process is only modified slightly. SAFe, which is currently the most widely used approach, introduces a total of three levels and numerous new roles and events.
There are differing views on how well Scrum can actually be scaled in practice. There is no simple solution to the main problems that arise when coordinating creative work in large organizations. It is, however, becoming evident that, as the size of the project teams increases, binding content-related and technical definitions that are valid for all actors and a deep, shared understanding of the overall objective become critical success factors. Conveying them is more important than finding the supposedly perfect scaling method.
Using an agile process model poses certain risks if more comprehensive digitalization measures are to be implemented in a company’s extensive IT landscape. It could be tempting to launch larger projects without sufficiently clarifying the objectives and constraints and hope that it will be possible to clarify these points during the course of the project. This often leads to expensive dead ends. Unfortunately, the methodology does not provide any clear recommendations as to what should be clarified and defined in advance and with what level of detail. The project participants must therefore make this decision based on their own experience and strike a balance between creative chaos and over-specification à la the waterfall model.
Experience shows that a lack of process knowledge and inadequate analysis of the data model in particular can result in expensive abortive developments, which could be avoided or at least reduced with a little more conceptual preparatory work. It can therefore be concluded that sufficient attention also has to be paid to the process and data model description when using an agile approach in order to ensure that a solid conceptual basis for implementing the desired functionality is available at an early stage. This information should then be communicated to all project participants so that the creative scope that an agile approach creates can also be exploited in the context of the overall objective.
By Norbert Lotter, PROSTEP