Archive for the ‘PLM Consulting’ Category
Saturday, July 27th, 2019
IT landscapes in the maritime industry are characterized by the fact that specialized shipbuilding applications are often used for initial, basic and detail design as well as for production preparation. Although these applications enable the specialist departments to carry out their work particularly efficiently, from a business perspective they require powerful integrations so that the digital information can be used throughout the process. Consistent processes and information flows are the prerequisite for building a digital ship model that can accompany the entire ship lifecycle.
The development of proprietary interfaces is not only time-consuming and cost-intensive, it also complicates the replacement of existing applications and the rapid integration of new ones. Based on the proven OpenPDM technology, which is used by many companies for the integration of different enterprise systems, data migration and cross-company collaboration, PROSTEP has therefore created an extension of this integration platform specially designed for the maritime industry. Essential components of this platform are standards-based connectors that simplify both horizontal integration between different authoring systems and their vertical integration into the enterprise systems that manage the product structures (PDM, PLM, ERP, etc.).
OpenPDM SHIP enables data exchange between specialized shipbuilding applications such as NAPA, AVEVA Marine, CADMATIC or ShipConstructor and mechanical CAD systems such as CATIA or NX. The latter are often used for the design of complex interiors, e.g. for public areas in cruise ships or large yachts. When transferring information from mechanical to shipbuilding-specific CAD applications and vice versa, the integration platform maps the different data models to each other. This allows companies to use the CAD tool of their choice for any task in the ship development process without losing valuable information during conversion and data exchange.
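The principle behind such a data model mapping can be sketched in a few lines. The element types and attributes below are hypothetical and greatly simplified (they are not the actual OpenPDM SHIP data model), but they illustrate how a shipbuilding-specific object can be translated into a form that a mechanical CAD or PDM environment can consume without losing its ship-specific metadata.

```python
from dataclasses import dataclass, field

@dataclass
class HullPlate:
    """Simplified shipbuilding-specific element (hypothetical)."""
    plate_id: str
    material: str
    thickness_mm: float
    frame_positions: list           # topological references rather than explicit geometry

@dataclass
class McadPart:
    """Simplified mechanical CAD / PDM representation (hypothetical)."""
    part_number: str
    name: str
    attributes: dict = field(default_factory=dict)

def map_plate_to_mcad(plate: HullPlate) -> McadPart:
    """Translate a shipbuilding element into an MCAD part, keeping the
    ship-specific data as attributes so nothing is lost on the way back."""
    return McadPart(
        part_number=f"HULL-{plate.plate_id}",
        name=f"Hull plate {plate.plate_id}",
        attributes={
            "material": plate.material,
            "thickness_mm": plate.thickness_mm,
            "frame_positions": plate.frame_positions,
            "source_system": "shipbuilding-cad",
        },
    )

part = map_plate_to_mcad(HullPlate("P-101", "AH36", 12.0, [34, 35, 36]))
print(part.attributes["frame_positions"])   # [34, 35, 36]
```

A reverse mapping would read these attributes back into the shipbuilding model, which is what allows round trips between the two worlds.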
In addition, OpenPDM SHIP supports the creation of complex CAx process chains with arbitrary synchronization points, from initial design in NAPA or NAPA Steel through basic and detail design in AVEVA Marine or CADMATIC to production preparation, for which some shipyards use the NESTIX software. The challenge with CAx integration is that the coherent ship geometry must be broken down into manufacturable components for the downstream processes and systems and transferred together with the production-relevant information. The integration platform supports this process and enables the consistent use of digital information in all phases of the ship development process.
OpenPDM SHIP also provides connectors to common PDM/PLM and ERP systems (3D Experience, ARAS Innovator, Teamcenter, SAP, Windchill, etc.) in order to merge the CAx data from different source systems into a digital ship model and control this model throughout the ship lifecycle. The vertical integration of the authoring systems into the data and process management environment is a prerequisite for traceable ship development processes and for the consistent management of all the information generated. At the same time, the integration platform offers the possibility of linking the digital ship model with real operating data for digital twin applications.
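Conceptually, each connector implements the same contract against a neutral, canonical product structure model, so that any source system can be synchronized with any target system without point-to-point interfaces. The following sketch is a hypothetical illustration of that idea, not PROSTEP’s actual API; the class and method names are invented.

```python
from abc import ABC, abstractmethod

class Connector(ABC):
    """Hypothetical connector contract: every backend system (PDM, PLM, ERP
    or an authoring tool) is wrapped behind the same neutral interface."""

    @abstractmethod
    def read_structure(self, root_id: str) -> dict:
        """Return a product structure in a canonical, system-neutral form."""

    @abstractmethod
    def write_structure(self, structure: dict) -> None:
        """Create or update that structure in the wrapped system."""

class ExamplePlmConnector(Connector):
    """Stub standing in for a concrete PLM or ERP connector."""

    def read_structure(self, root_id: str) -> dict:
        # A real connector would call the system's API here and translate
        # the result into the canonical model.
        return {"id": root_id, "children": []}

    def write_structure(self, structure: dict) -> None:
        # A real connector would translate the canonical model into
        # system-specific create/update calls.
        print(f"writing structure {structure['id']}")

def synchronize(source: Connector, target: Connector, root_id: str) -> None:
    """Move a structure from one system to another via the neutral model,
    without the two systems knowing anything about each other."""
    target.write_structure(source.read_structure(root_id))

synchronize(ExamplePlmConnector(), ExamplePlmConnector(), "SHIP-0001")
```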
OpenPDM SHIP is available now and will be continuously expanded with new integrations. In cooperation with SSI and SSI’s European sales partner NDAR (Nick Danese Advanced Research), PROSTEP is currently developing an OpenPDM SHIP connector for the SSI Enterprise platform. This connector will form the basis for exchanging CAD models between NAPA Steel and the Autodesk-based ship development platform ShipConstructor and for importing ShipConstructor data into common PDM/PLM systems.
Friday, July 12th, 2019
Last year, Professor Dr. Jens C. Göbel took over from Professor Martin Eigner as head of the Institute for Virtual Product Engineering at the University of Kaiserslautern. He has been working on PLM-related topics for many years, both at the research level and with regard to industrial applications. In an interview for the PROSTEP Newsletter, he explained what the future of PLM will look like.
Question: What are the technological trends that PLM users and vendors are going to have to get to grips with?
Göbel: In the future, smart product systems with integrated services that are very closely networked and form part of other systems (systems of systems) will play a much more central role in product lifecycle management. The end-to-end implementation of PLM concepts throughout the entire product lifecycle will, on the one hand, be made possible for the first time by this networking capability and, on the other, is also urgently required – for example in order to permit the integrated management of digital product twins. PLM is therefore a vital underpinning for the lifecycle of smart products. However, PLM will have to open out and develop in different directions if it is to achieve this aim. This applies, for example, to the integration of other specialist disciplines in the very early systems engineering phases, the inclusion of product instances in later product utilization phases, and the use of AI for the purposes of “engineering intelligence”.
Question: Where do you see the greatest obstacles to the implementation and optimization of PLM applications in industry?
Göbel: Many of the obstacles hindering PLM implementation continue to be of an organizational nature or relate to the field of project or acceptance management. At the same time, however, the increasing functional scope and, most importantly, integration capabilities are making the implementation of PLM solutions even more complex. We will need new skills that include the digitalization of the products themselves and combine this with the digitalization of the processes in the product lifecycle. However, the progress of PLM implementation is being helped by industry’s increasing awareness of the correlation between standard PLM tasks and the success of digital transformation in the engineering field, as well as throughout the entire product lifecycle.
Question: Do companies have the necessary technical PLM skills, for example in order to develop smart products and services?
Göbel: Far from it! Offering smart products and services is directly linked to the enterprise-specific design of new business models and flexible collaboration models. At this level, it will be necessary to create PLM interfaces and extensions that reach through to the very early innovation and product planning processes. This will require a fundamental shift in mindset in individual specialist departments as well as in PLM and IT departments. Another current challenge, in particular in large companies, is the PLM-specific integration of and cooperation between somewhat traditional company departments and highly agile start-up divisions.
Question: What role do currently hyped-up topics such as the digital thread and the digital twin play with regard to the strategic importance of PLM in companies?
Göbel: Generally speaking, it can be seen that companies have recognized that PLM is key and forms a vital basis for these issues. That is why it is often becoming the focus of strategic enterprise initiatives to a much greater extent than before and is being considered from a number of very new perspectives. However, to avoid false expectations and unrealistic ideas of the potential benefits, we need to be clear about what these terms mean and consider them against the background of concrete application scenarios, that is to say not as an end in themselves. At the VPE, we have already shown that this is possible in a number of research projects, e.g. in the case of the use of digital product twins for service-based business models in the agricultural machinery and automotive industries.
Question: It is said that monolithic PLM systems are no longer in fashion. However, PLM vendors are constantly extending their solutions with new modules. Isn’t that a contradiction?
Göbel: PLM solutions must continue to develop radically at both the functional and technological levels if they are to keep up with the fast-moving dynamics that characterize products and business models. In this context, the integrative nature of PLM is of particular importance. It is becoming increasingly important that other discipline-specific IT systems and platforms be integrated quickly and easily throughout the entire product lifecycle, including for temporary applications. It is therefore vital for success that PLM vendors keep their promises regarding openness and press ahead even further in this direction, including in joint initiatives such as the prostep ivip Association’s Code of PLM Openness (CPO).
Question: You wrote your doctoral thesis on the harmonization of hybrid PLM environments. Has industry made progress in this area in recent years?
Göbel: Yes, it has been pleasing to see the visible progress and success achieved in the field of PLM harmonization in recent years. Businesses have learned from the past and pursued more methodical approaches that consider processes, methods, IT systems and company organization more holistically. However, this topic is still extremely relevant today. Here, too, technological further developments and additional PLM requirements are demanding new approaches, for example the use of semantic technologies for linking data; in this context, we refer to this as semantic PLM.
Question: Do we maybe need something like a meta-PLM system that links together the data from the different data silos?
Göbel: In principle, yes. But it mustn’t be rigid. Any such meta-PLM will have to keep pace with the ever faster dynamics of internal organizational structures and value-added networks. For example, in the AKKORD project, which was recently launched by the German Ministry of Education and Research (BMBF), we are working together with various industries to achieve intelligent data networking in the product lifecycle. In this project, we are attempting to flexibly integrate not only PLM but also ERP, CRM and other systems and perform AI-based product data analysis, for example in order to exclude possible sources of error and predict costs on the basis of quality data as early as the development phase.
Question: No other organization is working harder to further develop the PLM concept in the direction of SysLM than the VPE Institute. With which of these terms do you identify most?
Göbel: For me, the content that stands behind the terms is more important than the terms themselves, which in any case are not used uniformly. We will orient the development of PLM in the direction of the aspects we have discussed – and in some areas will have to completely rethink it. The concept of SysLM primarily reflects the idea of an interdisciplinary systems approach, which is increasingly taking the place of our traditional product concept. As such, this term represents an important element of these further developments. However, it is not the only direction of PLM development that we are working on.
Question: In the future, where will you place the emphasis in terms of research and teaching? Is MBSE still at the very top of the curriculum?
Göbel: MBSE continues to be our central topic, in particular as an “enabler” for smart engineering. For example, we are currently taking part in a large interdisciplinary research project involving the knowledge-based roadmapping and cross-value chain modeling of overall system architectures in very early innovation phases in the automotive industry. A few weeks ago, we became one of the first German university research institutes to be included in the OMG’s SysML v2 Submission Team (SST), where we will help design the next SysML generation. We are currently incorporating the results of our research activities in a new cross-discipline “Smart Systems Engineering” study module which we, together with colleagues from the IT and Electronic Engineering departments, will offer in the upcoming winter semester. We opened the e4lab (engineering 4.0 lab) demonstration and testing laboratory in Kaiserslautern last July with the aim of making the potential applications of our research results tangible and accessible for industrial enterprises. These are just a few examples – many more will follow. Be ready to be surprised!
Question: The digitized engineering conference SYSLM2019 will be held in Kaiserslautern in October. Why has the name of this event been changed?
Göbel: We at the VPE have been working on the digitalization of engineering ever since the institute was founded by Professor Dankwort in 1994, that is to say for 25 years. I think that the name very accurately expresses this overarching core idea and also emphasizes the key topics we are addressing today. In the future, we want to direct even greater attention towards visionary ideas and trends that are also of relevance to industry in order to provide a guide for participants and stimulate an inspirational dialog about the future of digital engineering. I am very much looking forward to this year’s program, which contains some top-quality contributions, including the participation of PROSTEP AG, and is therefore ideal for promoting just such a dialog.
Mr. Göbel, thank you very much for talking to me.
(This interview was conducted by Michael Wendenburg)
About Jens C. Göbel
Professor Dr. Jens C. Göbel has headed up the Institute for Virtual Product Engineering (VPE) at the University of Kaiserslautern since 2018. Göbel studied industrial engineering (mechanical engineering) at the University of Siegen. After graduating, he worked on PLM-related topics at Bosch Rexroth, Keiper Recaro and Schmitz Cargobull and also conducted research into the fundamental principles of the lifecycle management of integrated product-service systems at Bochum University. There, he wrote his doctoral thesis on the harmonization of hybrid PLM environments at the Department of Information Technology in Mechanical Engineering under Professor Dr.-Ing. Michael Abramovici. From 2010, he served as research coordinator and senior engineer and headed the Lifecycle Management research group in the same department.
Friday, July 5th, 2019
Theegarten-Pactec is the world’s leading manufacturer of continuous-motion packaging machines for confectionery, food and non-food products. In order to secure its market position, the Dresden-based company has decided to use PLM to speed up the development of new machines and its order-oriented design work. PROSTEP will be advising the company on the optimization of its PLM processes, requirements analysis and selection of the system.
Unlike clocked or discontinuous packaging machines, which are also part of the company’s product portfolio, continuous-motion packaging means that hard candies, chocolates and other products are constantly in motion. In an end-to-end process, they are wrapped or folded in packaging material by a number of tools mounted on a rotating drum. The advantages of this technology lie in the higher throughput and the gentler packaging process, as Dr. Egbert Röhm, managing director of Theegarten-Pactec GmbH & Co. KG, explains: “Our continuous-motion machines package up to 2,300 products per minute; clocked machines don’t even manage half that.”
“The story of our success is not a short one,” says the company’s home page. It is also a very German reunification success story. In Dresden, the company VEB Verpackungsmaschinenbau Dresden developed the first continuous-motion packaging machines to market maturity as early as 1981 and successfully exported them worldwide, even in the days of the GDR. At that time, the state-owned company formed part of the NAGEMA combine and, since 1946 had bundled the potential of a number of Dresden-based mechanical engineering companies whose owners had been expropriated after the Second World War. Following German reunification, the combine was first transformed into a stock corporation, which subsequently gave rise to a number of individual companies, including the packaging machine division.
As of 1991, the newly founded Verpackungsmaschinenbau GmbH, which comprised various parts of the original combine, traded as Pactec Dresden GmbH. It employed only a small, high-performance workforce that represented approximately 10 percent of the original company, which previously had 3,000 employees. In 1994, Pactec Dresden was acquired by the Cologne-based family-owned company Rose-Theegarten, which had been manufacturing packaging machines for confectionery products since as early as 1934. Because the site in Dresden offered advantages both in terms of the usable industrial area and available skilled workers, the company’s headquarters were moved to Dresden a few years later, as Röhm recounts: “That is certainly something that didn’t happen very often in the history of German reunification.”
Series machines packed with engineering
In Dresden, Theegarten-Pactec now manufactures both clocked and continuous-motion packaging machines and systems for all types of small products that have to be packaged quickly: hard candies, soft caramels, chocolate products as well as stock cubes and dishwasher tablets. “We service the global market with around 430 employees,” says Röhm. Nearly 100 of these work in development and design because the engineering effort is considerable. Even though, fundamentally, these are series machines that have become increasingly modular in design in recent years, the design engineers have to adapt them to the format of the product that is to be packaged, the desired fold type and other specific customer requirements. Each year, the mid-sized company ships between 100 and 120 machines and packaging systems.
The cross-departmental processes involved in handling customer orders are time-consuming and tie up significant capacities because the support provided by software applications is not optimal. The measure of all things is currently the ERP system. There is no independent PLM system, just a module integrated in the ERP system that manages the authoring systems’ CAD objects and data. “The ERP system also manages the order-neutral variant BOMs. This means that we have to check any number of dependencies every time there is a change,” explains Röhm. “Once we’ve got the BOM for the order, our processes work very well. However, the administrative effort required to create an order-specific BOM is huge because we force our designers to follow the production and assembly-oriented logic of the ERP processes. To speed up development, we are thinking about how we can decouple the development process more thoroughly from the downstream processes.” The aim is to speed up the offer and order process and to bring a new machine onto the market every year rather than every other year.
Analyzing the information flow
In 2018, as a first step, Theegarten-Pactec called in external consultants to shed some light on the shortcomings in the process and IT landscape. The Institute for Industrial Management (FIR) performed a detailed examination of the ERP-related processes, while the PLM consulting and software company PROSTEP took a close look at the PLM processes. “We weren’t thinking in terms of lofty Industry 4.0 issues but about how to better organize the existing processes involving our ERP system, as well as about topics such as variant and BOM management, how to cut down the administrative overhead for our design engineers, and how to make the knowledge we have about our machines more readily available,” emphasizes Röhm.
The consultants came to the conclusion that the interaction between ERP and PLM needed to be reorganized and that it was essential to implement clear-cut, system-supported PLM processes. “The highest priority, however, is to construct a PLM landscape that will make the existing information and information flows more transparent. We have placed the focus on PLM and on integrating this approach throughout the company in order to create a shared data store for all our processes,” explains Dr. Dirk Hagen Winter, project manager for Change Management in Corporate Management.
Before starting to look for a suitable PLM solution, Theegarten-Pactec commissioned the PLM experts to undertake an initial subproject with representatives from all the affected departments in order to set up an end-to-end information model. To do this, PROSTEP used a standardized method that makes it possible to identify redundancies, bottlenecks and discontinuities in the information flows. This showed that the main problem facing the company lies in making the comprehensive know-how that it has acquired over the years available and searchable quickly enough. Indeed, knowledge is often concentrated in individual employees and is not universally accessible.
Potential in variant management
“Together with PROSTEP, we also took a close look at how we can feed the information we receive from the customer when an order is placed into the order handling process in a well-structured manner,” says Winter. In principle, the aim is to structure the customer requirements functionally and to transfer the configuration with the desired options, at least in part automatically, into the mechatronic BOM and the CAD structures. The idea is to manage the variant BOM in PLM in the future, derive an assembly-oriented production BOM from the configured design BOM, and then transfer it to the ERP system.
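The intended flow from a functionally structured design BOM to an assembly-oriented production BOM can be illustrated with a minimal sketch. The module names, parts and assembly stations below are invented for illustration and do not reflect Theegarten-Pactec’s actual structures.

```python
DESIGN_BOM = {                      # functional view (invented example data)
    "wrapping-unit": ["drum-std", "fold-tool-A"],
    "feeding-unit": ["chain-feeder", "sensor-pack"],
}

ASSEMBLY_MAP = {                    # where each part is physically assembled
    "drum-std": "station-10",
    "fold-tool-A": "station-10",
    "chain-feeder": "station-20",
    "sensor-pack": "station-30",
}

def to_production_bom(design_bom: dict) -> dict:
    """Regroup the functional design BOM by assembly station so it can be
    handed over to the ERP system."""
    production = {}
    for parts in design_bom.values():
        for part in parts:
            production.setdefault(ASSEMBLY_MAP[part], []).append(part)
    return production

print(to_production_bom(DESIGN_BOM))
# {'station-10': ['drum-std', 'fold-tool-A'],
#  'station-20': ['chain-feeder'], 'station-30': ['sensor-pack']}
```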
For historical reasons, the company’s product and module structure has tended to be assembly-oriented. As a result, the entire company is used to thinking in assembly-oriented terms. The Engineering department does not develop end-to-end in a function-oriented way, but instead develops the modules in the way they will subsequently be assembled. “Of course we still need an assembly-oriented BOM, but nowadays it really ought to be possible to derive it as a second mapping from a functional view,” says Röhm. “PROSTEP has made it clear to us that our current approach wastes a lot of potential in development.”
In addition, it is currently very difficult for Theegarten-Pactec to track the lifecycle of its machines once they have left the factory. There is no time-based view of the shipped machines in the form of an “as-maintained” view of the digital twin. Such a view is, however, also difficult to maintain since, for example, a food manufacturer with 100 machines at different development and delivery stages does not necessarily inform Theegarten-Pactec which spare part it has just ordered for which machine or which modifications it has made itself.
Concept for the PLM system development
In a second subproject with PROSTEP, employees from the various departments examined the question of which neutral formats could be used to provide information in the future. In this context, the topic of product manufacturing information (PMI) associated with CAD objects played just as much of a role as the derivation of simplified geometry models for creating electronic spare parts catalogs or for project planning. “Our vision for the future is that all the information will be available in the 3D model, which is not the case at present,” says Röhm. With the exception of NC programming for mechanical machining and sheet metal working operations, which is performed on the basis of CAD models, the processes used for manufacturing and assembly are predominantly drawing-based. The task of deriving and preparing these drawings takes up a lot of the company’s design engineering capacity.
In a third subproject, which is due to be completed by the end of the year, a concept for the structure of the future PLM system and a requirements specification for the selection of the system will be drawn up. A key component of the development plan is a central integration platform that will act as an intermediate layer permitting the more flexible interconnection of the ERP, PLM and authoring systems. As Röhm explains: “At present, the direct connection between the PDM module and the ERP system means that we have to update the CAx systems whenever there is an ERP system release change and vice versa. We want to decouple these components by means of the integration layer in order, among other things, to make it easier to replace certain software modules.”
PROSTEP’s experts will also help Theegarten-Pactec select an appropriate system. Röhm is extremely happy with the way the PLM consulting and software company is planning and conducting the projects. “PROSTEP understands the path ahead very clearly. However, we don’t want to merely follow blindly but instead want to work on this task together.” Those involved know very well that it will take several years to implement the initiated changes and that systematic implementation of the PLM approach will demand a realignment of the currently employed ERP applications and the relevant processes.
Friday, February 1st, 2019
In today’s world, agile methods are state of the art when it comes to IT project management. Corporations also use them in large-scale IT projects that they implement together with service companies. The most common agile process model is Scrum, which has to a great extent replaced the conventional waterfall model and its variants. It does, however, require a redefinition of customer/supplier relationships.
In the past, large-scale IT projects involving external service providers were usually executed on the basis of service contracts and billed at a fixed rate. This model is incompatible with agile methods in that there is no longer a clearly defined scope of delivery that could be the subject of formal acceptance and billing at the start of the project. Instead there is a rough target (product vision), a timeframe and a team. This means that the customer’s management must make an investment decision without a detailed cost estimate, rather like a declaration of intent: We will invest the sum of x euros within a defined period of time in order to achieve a specific objective without knowing the detailed results in advance and without having a contractual basis for demanding the result should this become necessary.
This not only requires a change in thinking when it comes to project controlling but also has a direct impact on the remuneration arrangement as measurable criteria for evaluating supplier performance need to be identified. There are basically two variants: billing for the number of hours worked (time and materials) or for deliverables (agile fixed price). The time-and-materials model is easier to implement from an organizational perspective, but it shifts all the project risk to the client. The deliverables-based model is in principle a fixed-price model with very small billing units (user stories), whose constraints are regulated by a framework agreement. It requires significantly more organizational effort when it comes to acceptance and billing but results in greater transparency and a more even distribution of risks.
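A small worked example with purely hypothetical figures makes the difference between the two variants tangible: under time and materials the client pays for the hours actually worked, whereas under an agile fixed price only the user stories accepted in the billing period are paid for, at the prices agreed in the framework agreement.

```python
# All figures are invented for illustration.
HOURLY_RATE = 120                 # EUR per hour (time and materials)
HOURS_WORKED = 400                # effort actually spent by the supplier

STORY_PRICES = {                  # agile fixed price: the framework agreement
    "small": 2_000,               # maps story sizes to fixed prices in EUR
    "medium": 5_000,
    "large": 12_000,
}
ACCEPTED_STORIES = ["small", "small", "medium", "large"]

time_and_materials = HOURLY_RATE * HOURS_WORKED
agile_fixed_price = sum(STORY_PRICES[size] for size in ACCEPTED_STORIES)

print(time_and_materials)         # 48000 -> the client carries the estimation risk
print(agile_fixed_price)          # 21000 -> only accepted deliverables are billed
```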
Conflicts can become problematic when estimating the effort involved in user stories because the parties involved are more or less at an impasse. If the supplier estimates the effort for a user story to be significantly higher than the client is willing to accept, neither side can force the other to accept its view of the situation. This means that estimation conflicts must be resolved constructively within the framework of the collaboration. A general feature of agile collaboration models becomes particularly apparent here: they require a high level of mutual trust and a great willingness to resolve conflicts constructively. Proponents of agile principles will counter that this applies to the successful implementation of IT projects in general and that agile process models such as Scrum merely provide a methodical way of dealing with such conflicts professionally and in a results-oriented fashion.
Scrum itself is only designed for teams of up to nine people. However, industrially relevant software such as PLM systems requires much larger development teams, which raises the question of how collaboration between a larger number of Scrum teams can be organized efficiently.
The approaches proposed in the literature, such as LeSS and SAFe, are typically based on a multi-level model, with operational work taking place on the lower level and coordination on the upper levels. LeSS, for example, aims to minimize overhead. Here, the operational teams send representatives to the upper level teams and the Scrum process is only modified slightly. SAFe, which is currently the most widely used approach, introduces a total of three levels and numerous new roles and events.
There are differing views on how well Scrum can actually be scaled in practice. There is no simple solution to the main problems that arise when coordinating creative work in large organizations. It is, however, becoming evident that, as the size of the project teams increases, binding content-related and technical definitions that are valid for all actors and a deep, shared understanding of the overall objective become critical success factors. Conveying them is more important than finding the supposedly perfect scaling method.
Using an agile process model poses certain risks if more comprehensive digitalization measures are to be implemented in a company’s extensive IT landscape. It could be tempting to launch larger projects without sufficiently clarifying the objectives and constraints and hope that it will be possible to clarify these points during the course of the project. This often leads to expensive dead ends. Unfortunately, the methodology does not provide any clear recommendations as to what should be clarified and defined in advance and with what level of detail. The project participants must therefore make this decision based on their own experience and strike a balance between creative chaos and over-specification à la the waterfall model.
Experience shows that a lack of process knowledge and inadequate analysis of the data model in particular can result in expensive abortive developments, which could be avoided or at least reduced with a little more conceptual preparatory work. It can therefore be concluded that sufficient attention also has to be paid to the process and data model description when using an agile approach in order to ensure that a solid conceptual basis for implementing the desired functionality is available at an early stage. This information should then be communicated to all project participants so that the creative scope that an agile approach creates can also be exploited in the context of the overall objective.
By Norbert Lotter, PROSTEP
Friday, February 1st, 2019
This year, PROSTEP is sponsoring the COMPIT Conference in Tullamore, Ireland, where it will give a presentation on the integration of shipbuilding design tools in the early development phase. Specifically, the new NAPA-AVEVA interface will be presented, along with the improved possibilities it offers for the continuous integration of data and processes in shipbuilding.
Companies in the marine and offshore industries are spoilt for choice when it comes to digitalizing their business processes: they can either opt for a Best of Suite approach, i.e. an integrated solution for ship design from a single vendor that may not cover all the functional requirements of the various disciplines and areas, or they can use the best solution for each task (Best of Breed) and are then faced with the question of how to integrate the digital tool chain in such a way that data flows as consistently as possible.
Both approaches have their advantages and disadvantages, but in most cases shipbuilders opt for the Best of Breed approach because of their historically evolved IT landscapes. However, this approach will only pay off if the cost of integrating the tool landscape is lower than the additional effort resulting from the functional limitations of a Best of Suite solution. As a specialist in CAD and PLM integration, PROSTEP will explain to COMPIT participants which requirements they should consider when integrating the tool chain, both from a business perspective and with a view to the IT organization.
Implementing an interface is not enough on its own. First of all, it must be clarified whether the data should flow in one or both directions, whether all the information required by the target system is available in the source system, whether the data models of the two systems are compatible, and whether native data or only geometric representations are required. Aspects such as the use of catalog part libraries, which may have to be standardized, are also important. In addition, the questions arise as to how often data synchronization must take place, which data volumes must be synchronized, and whether the exchange process should be permanently monitored. Only then does it make sense to start looking for standard interfaces for importing and exporting data.
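These questions can be captured in a simple integration profile per tool pair before any interface work starts. The structure and the values below are a hypothetical sketch, not a PROSTEP artifact or an OpenPDM data model, but they show the kind of decisions that should be documented and agreed up front.

```python
from dataclasses import dataclass

@dataclass
class IntegrationProfile:
    """Hypothetical per-tool-pair checklist mirroring the questions above."""
    source: str
    target: str
    bidirectional: bool
    required_data_available: bool      # does the source hold everything the target needs?
    native_data_required: bool         # native models vs. geometric representations only
    shared_catalogs: tuple             # catalog part libraries that must be harmonized first
    sync_interval_hours: int           # how often synchronization has to run
    expected_volume_mb_per_sync: int   # data volume to be moved per run
    monitoring_required: bool          # should the exchange process be permanently monitored?

# Example values are invented for illustration only.
profile = IntegrationProfile(
    source="NAPA Steel",
    target="AVEVA Marine",
    bidirectional=False,
    required_data_available=True,
    native_data_required=True,
    shared_catalogs=("profiles", "plate materials"),
    sync_interval_hours=24,
    expected_volume_mb_per_sync=500,
    monitoring_required=True,
)
print(profile)
```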
Using the new NAPA Steel / AVEVA Marine interface as an example, PROSTEP will then explain the challenges of integrating two shipbuilding-specific CAD applications in more detail. PROSTEP presented the interface concept for the first time last autumn at the NAPA User Conference in Helsinki, where it met with great interest.
NAPA Steel is mainly used in the early design phase, e.g. for calculating buoyancy, drag and maneuverability. However, most shipyards use AVEVA Marine software to design their vessels’ steel structures and outfitting. Although both are intent-driven systems that do not primarily generate explicit geometry but instead describe how it is generated, parametrically and via topological relationships to other elements, the data from NAPA Steel cannot be used one-to-one in AVEVA Marine. The different semantics of the two systems must be mapped to each other in such a way that topology definitions, parametric penetrations (cutouts) and other features can be transferred in the best possible quality or rebuilt in the target system.
The special feature of the import strategy developed by PROSTEP is its high error tolerance: the interface is able to transfer even inaccurately defined topology information at a lower quality level, e.g. as pure geometry. Users do not have to work through long error reports to understand which data has which quality; instead, they can see it at a glance from different color shades. The interface is being developed on behalf of NAPA and, at the current stage of development, already supports the transfer of 80 percent of the components.
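The fallback idea behind this error tolerance can be sketched as follows. The code is purely illustrative and not the actual interface implementation: each element is imported at the best quality level that can be achieved, and the achieved level is recorded so that it can be shown to the user, for example as a color, instead of being buried in an error report.

```python
from enum import Enum

class Quality(Enum):
    TOPOLOGY = "green"      # full intent: topology and parametrics rebuilt
    PARAMETRIC = "yellow"   # parameters transferred, topology approximated
    GEOMETRY = "red"        # fallback: explicit geometry only

def rebuild_with_topology(element: dict) -> dict:
    if "topology" not in element:
        raise ValueError("no usable topology information")
    return {"kind": "topological", **element}

def rebuild_parametrically(element: dict) -> dict:
    if "parameters" not in element:
        raise ValueError("no usable parameters")
    return {"kind": "parametric", **element}

def tessellate(element: dict) -> dict:
    return {"kind": "geometry-only", **element}

def import_element(element: dict) -> tuple:
    """Transfer one element at the best achievable quality level instead of
    rejecting it (illustrative fallback logic only)."""
    try:
        return rebuild_with_topology(element), Quality.TOPOLOGY
    except ValueError:
        pass
    try:
        return rebuild_parametrically(element), Quality.PARAMETRIC
    except ValueError:
        pass
    return tessellate(element), Quality.GEOMETRY

_, quality = import_element({"id": "PLATE-7", "parameters": {"thickness": 12}})
print(quality.value)        # "yellow" -> shown to the user as a color shade, not an error
```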
Wednesday, January 30th, 2019
In January, PROSTEP AG celebrated its 25th anniversary together with approximately 300 employees, former employees and friends of the company at Centralstation, a cultural center in Darmstadt. Top-class speakers from industry and the research community, including representatives from BMW, Bosch, Daimler, Schaeffler and the University of Stuttgart, took a look at what PLM might look like in the future. In their presentations, they outlined the challenges arising in the age of digital transformation, challenges that the PLM consulting and software house is well equipped to handle.
Dr. Bernd Pätzold, CEO of PROSTEP AG, welcomed around 300 employees, former employees and friends of the company to the anniversary celebrations, which were held in what was once one of the city’s power stations and now serves as a cultural center. But, as Pätzold said, the company wasn’t interested in talking about the past but about the future. He nevertheless used the celebrations as an opportunity to thank the company’s founders and long-serving employees for the commitment they have shown. The guests partied until late into the night, enjoying lively musical entertainment that enticed a number of people onto the dance floor.
“The next 25 years will be even better,” said Pätzold, even though he has no magic formula for the future. Dr. Martin Holland, responsible for Business Development at PROSTEP, described to those present where the journey could take them. Sometimes it is better to first make a decision and then plan how to implement it, he said, alluding to Caesar’s decision to cross the Rubicon. The Internet of Things, the cloud, blockchain and artificial intelligence are all topics of the future that PROSTEP is examining closely.
In his talk, Holland presented a 1:1 model of a Mars Rover that the company had built together with student employees with the aim of creating a “testing ground” for topics such as IoT integration and the linking of digital twins and VR technology using the Mission Control Center. The winners of the Fraunhofer blockchain hackathon sponsored by PROSTEP illustrated at one of the exhibition stands how blockchain could be used to make paying for coffee fairer. And they also offered the best coffee in the room.
The future of PLM has already begun
Professor Oliver Riedel, head of the Institute for Control Engineering of Machine Tools and Manufacturing Units (ISW) at the University of Stuttgart and a member of the board of directors of the Fraunhofer IAO, took a wide-sweeping look at the PLM of the future. In addition to digital transformation, globalization and glocalization, the key megatrends impacting “PLM 2040 and Beyond” are mankind and society, and in particular demographic change, which will exacerbate the shortage of skilled workers.
“The level of complexity will continue to increase, not decrease,” predicted Riedel. And if this complexity is to be made manageable, everything needs to be interlinked more efficiently. This applies, for example, to automation technology, which blurs the functional boundaries between ERP and MES, but also to the linking of digital shadows as representations of the past with digital twins as representations of the product as a living entity in the future. PLM will also require greater flexibility through the merging of development and operations. There are examples that demonstrate that DevOps is making more agile development possible not only in the software sector but for all products.
Agility was the watchword for Ralf Waltram, head of IT Delivery at the BMW Group, who presented the carmaker’s showcase projects involving the implementation of agile approaches in corporate IT to the guests. Becoming 100% agile is BMW’s answer to the disruptive changes brought about by digitalization and it has four main thrusts: processes, technology, organization and culture. IT is no longer organized according to projects but instead is organized in agile, product-oriented teams that are responsible for the solutions’ development and operations (DevOps). Back2Code is the maxim, i.e. BMW is once again developing more software itself. Waltram said that the company’s experiences with agile methods in SAP environments have also been good. User satisfaction has improved dramatically thanks to the faster provision of new functions, and the number of open tickets has been reduced by 72 percent.
Waltram went on to say that a more open, intuitive IT landscape is an important prerequisite for implementing agile methods. This was also made clear in the presentation given by Dirk Spindler, head of R&D Processes, Methods and Tools at Schaeffler. As Spindler said, the company is very diversified, has a huge portfolio comprising approximately 20,000 products and uses different business models. The desire to standardize the IT landscape is an illusion that has devoured an enormous amount of money. Schaeffler intends to achieve its objective of fully integrated PLM by setting up a model-based product engineering process and linking the data on an overarching platform. The aim is to ensure that the workplace of the future provides users, in a cockpit, with only the functions and information they need for their work.
On the road to digital transformation
While many companies are still busy digitalizing their business processes, Daimler is already giving thought to the digital transformation of its business models. “Connected, Autonomous, Shared Services and Electric (CASE) are closely interwoven new topics that are bringing about a radical change in the way work is performed, in methods and in business models,” said Dr. Siegmar Haasis, CIO of R&D Cars. But they also have to be financed by the company’s core business. As far as IT is concerned, this means not only agile approaches but also an even higher level of digital frontloading in order to get the vehicles on the road faster and make them right the first time round: “When it comes to autonomous driving, the race will not be won on the road but in the computer center,” said Haasis. The supreme discipline here is the digital twin in the context of a holistic approach that supports different variants.
Digital transformation cannot be achieved by a company on its own, but only together with partners. Jochen Breh, who works in Governance IT Architecture at Bosch, underscored this point in his talk on the IoT platforms and ecosystems of the future. “Even the best solution will fail without an ecosystem,” Breh explained. On the other hand, an ecosystem can only function properly if all the parties involved, from the owner of the ecosystem to the developers to the customers, benefit from it. IoT platforms provide the technological basis for these ecosystems because, as Breh went on to explain, this is where solutions are created. And because distributed ecosystems are going to grow closer together, open source and microservices are the key to building platforms quickly and connecting them with each other.
At the same time, platforms and ecosystems are a prerequisite for the successful transformation of business models. PROSTEP sees itself as a guide accompanying customers on this journey. In a round table at the end of the program of presentations, the executive board and managers of the various business units explained to the audience just how broadly the company is positioned, thus ensuring that it is well equipped to tackle the challenges of the future. PLM strategy consulting, Industry 4.0 readiness, MBSE and validation, collaboration, blockchain and agility are the aces they hold when it comes to shaping the digital future. Not to mention major projects like updating Daimler’s Smaragd installation. “I take off my hat to you and want to thank you all for everything you are doing,” Dr. Siegmar Haasis said in his presentation. That was certainly the best anniversary gift that PROSTEP’s employees could have received.
Friday, January 11th, 2019
Dirk Spindler, Head of R&D Processes, Methods and Tools at Schaeffler, is a relative newcomer to the field of PLM. Up until mid-2016, he was head of development for the company’s Industrial division, which manufactures system components for engines, transmissions and chassis as well as rolling and plain bearings for industrial applications. In this interview, he explains how he intends to digitalize Schaeffler’s development process and tool chain.
Question: The title of your presentation at PROSTEP’s anniversary event was "The Future of Industry". Does the automotive industry actually have a future?
Spindler: Yes, provided that it succeeds in making the transition to digitalization and e-mobility, which is something it has clearly identified. New mobility concepts must be developed under the leadership of the carmakers; otherwise there is a danger that the car will become a standard product.
Question: PLM was also written off as a commodity for a long time, but now it seems to be “in” again. What is your take on this?
Spindler: Yes, that is absolutely the case. To put it rather provocatively, PLM in the form we want to do it today has never existed before. It has always been very much focused on mechanical aspects and what I would call PDM. Digitalization and topics such as systems engineering have given PLM a considerable boost in the past two or three years. Nowadays, products have to be described very differently and aspects such as service and after sales have to be taken into account. We also need PLM more than ever for the digital twin and model-based systems engineering. But the IT solution providers need to rethink how the term PLM fits in.
Question: How are you getting on with building the model-based product creation process (PCP), for which Schaeffler laid the foundations in the MecPro2 project?
Spindler: We turned MecPro2 into a separate project called MecPro4You with the aim of making further progress in applying the findings. But we are still very much in the initial stages, among other things because the necessary standardized data exchange between different systems is not currently possible. We are currently using models within the system context to map architectures.
Question: But you are already thinking about an autonomous product creation process. What does this involve?
Spindler: The next step after model-based systems engineering will be a more automated PCP. Automated in the sense that we take the evaluation of, for example, customer inquiries and the resulting solutions that we develop and use artificial intelligence to provide new solutions for our customers largely automatically, and ultimately autonomously. You could also call this a smart configurator. However, the models I’ve already mentioned are a prerequisite.
Question: In such a diversified organization as Schaeffler, is it even possible to have a uniform PCP given that you have such different customers?
Spindler: Yes, if you describe the PCP as a collection of different process modules that are combined within certain constraints to form a custom process for a project. It is thus not a uniform process in the sense of one that is always identical but rather uniform with regard to the process modules, currently around 200, which include appropriate methods and suitable IT solutions. We have also preconfigured various product creation processes depending on the business model.
Question: How agile are you within the processes today? Do you, like the BMW Group, want to become one hundred percent agile?
Spindler: We are currently bi-modal and tend towards a hybrid approach to product development. We are now largely agile in software development. But even with agile development, you ultimately need fixed gates defined in the project, at which a given deliverable must be available. That’s why we stick to our overall project plan with milestones or quality gates and fixed targets, but work on an agile basis in between. As far as I see it, this hybrid approach is (still) the reality in most companies today.
Question: Are you still digitalizing your business processes or are you already on the way to digital transformation?
Spindler: (laughing) I guess we’re still digitalizing. But I am always a little wary of such buzzwords because you should use formulations that the team understands and with which they can associate concrete content. Cultural change also means understanding each other’s language. Digital transformation is a catchphrase that quickly morphs into a buzzword. At some point, nobody can really explain what it actually is and where the transformation is going to lead.
Question: To the development of new, digital business models, for example. What progress is Schaeffler making there?
Spindler: We already have new business models in our Industrial division. For example, we are talking about monitoring gear trains in wind turbines, i.e. the whole issue of predictive maintenance or residual lifetime prediction and maintenance planning. In the automotive sector, we are probably still too far from the OEM or end customer. But the bio hybrid, which we presented as a Schaeffler product innovation at CES in Las Vegas and will industrialize in the next few years, gives some indication of where the journey might take us.
Question: In the Industrial division, you undoubtedly use an IoT platform for collecting and evaluating sensor data. Have you already established an ecosystem as well?
Spindler: Yes, in fact, we have. It’s a concept that we call the Smart Ecosystem. It primarily involves collecting and evaluating data from the use of our products in the field in order to generate added value for our customers. For example, we have a rolling bearing with integrated sensor technology that measures data relevant to calculating the service life. This data is then transferred to the platform of a gear train manufacturer, for example, where it is processed further.
Question: What does the platform concept for your product development look like? Haven’t you turned the concept on its head?
Spindler: That’s right. I’m often asked that question. At Schaeffler, the “platform” is at the top. The Engineering Cockpit serves as an overarching PLM backbone, which we use to allow access to the information in the various authoring systems and orchestrate interaction between the systems, e.g. in change and configuration management.
Question: Can developers directly access information that is stored in the authoring systems of other cockpits from their own Engineering Cockpit, for example?
Spindler: No, the current version of the Engineering Cockpit only accesses the information in its “own” authoring systems and makes it available in a neutral format. In a later version, it will be possible to implement access to other IT systems using appropriate interfaces or via other cockpits.
Question: Is the Engineering Cockpit already being used productively?
Spindler: It’s being developed on an agile basis, so some functions are already live. In the course of this year, we will gradually be adding additional functionality such as engineering change management.
Question: Will certain functions in the existing systems be switched off at the same time?
Spindler: No, we won’t be mapping any functions in the Engineering Cockpit that have already been implemented in other systems. Each user remains in the domain they are familiar with. Our mechanical developers, for example, are familiar with Windchill and Creo, and I don’t see much use in replacing these with a different tool. Above all, the Engineering Cockpit is a system-level tool on which the configurations are documented to a given maturity level and from which the so-called mechatronic BOM and implementation jobs for the various disciplines are generated.
Question: Is your PLM platform still installed on premises or is it already running in the cloud?
Spindler: At the moment everything is still installed on premises because we are still developing the solution and only have a few users on the system. But I assume that we will gradually move all our applications to the cloud, among other things because the IT vendors are moving in this direction. It will probably not be a single cloud, though – we are pursuing a multi-cloud strategy that uses the most appropriate cloud for each application. I’m not too concerned either way.
Question: You’re not worried about cloud-to-cloud connectivity?
Spindler: No, because we already have that in many areas of application today. I believe that we have solved many problems over recent years through the use of interfaces and standards. Of course, PROSTEP has played a major role in this. Undoubtedly, we will need even more standards and initiatives such as the Code of PLM Openness, but the technical problems can be solved.
Mr. Spindler, thank you very much for talking to us. (This interview was conducted by Michael Wendenburg)
About Dirk Spindler
Dirk Spindler (born in 1964) has been working for the Schaeffler Group in various positions in product development at home and abroad since 1990. He has headed up R&D Processes, Methods and Tools at Schaeffler AG since 2017. Spindler studied Production Engineering and Precision Engineering at the TU Kaiserslautern. He is married with two sons.
Sunday, January 6th, 2019
PROSTEP and NAPA, the leading maritime software, services and data analysis provider, have successfully completed a joint ship design solution development project for shipbuilder MEYER TURKU. Building on this success, the two companies have signed a formal cooperation agreement to work more closely together in the development of PLM software and CAE interfaces, as well as in the field of PLM consulting.
Increasingly, shipyards function as integrators between multiple parties in the design process. They need efficient and effective software to successfully drive a project from inception to completion. To help MEYER TURKU make and implement its shipbuilding decisions more effectively, NAPA and PROSTEP have combined their respective expertise to develop an interface for the 3D modelling and design software NAPA Designer that meshes perfectly with MEYER TURKU’s CAD environment.
MEYER TURKU is one of the world’s leading shipbuilders and specializes in building cruise ships, car-passenger ferries and special vessels. The efficient design and construction of these vessels requires the seamless interaction of many commercial partners and software providers. MEYER TURKU uses NAPA software as part of its design toolbox. “Together with PROSTEP, NAPA has delivered us a CAD solution which has realised substantially improved efficiency and time-savings for us during the design and production process,” comments Ari Niemelä, Hull Basic Design Manager at MEYER TURKU.
As a result of this success, NAPA and PROSTEP have signed a formal cooperation agreement covering the development of PLM software and CAE interfaces as well as PLM consulting. By combining their capabilities in these areas, the two companies will be able to increase operational efficiency and improve customer satisfaction. Through our cooperation with NAPA, we are strengthening our market position in this important industry as an IT systems integrator and closing another important gap in the digital thread along the ship development process, namely between NAPA and the downstream detail design CAD systems.
Tapio Hulkkonen, Director, NAPA Design Solutions Development, commented, “Thanks to our cooperation agreement with PROSTEP, our customers around the world benefit from a comprehensive, market-leading PLM know-how that is bringing improvements in their ship design process. NAPA looks forward to working with PROSTEP to drive further gains and profitability.”
By Matthias Grau