 PROSTEP INC Blog
Joseph Lopez
Joseph is an experienced marketing professional with a demonstrated history of working in the engineering, information technology and services industry. He is skilled in marketing strategy, search engine optimization (SEO), copywriting and web design. With a Master of Computer Information …

PROSTEP informs customers about blockchain technology

 
May 17th, 2020 by Joseph Lopez

Now that the SAMPL project has been successfully completed, PROSTEP has made the blockchain-based solution for the forgery-proof exchange of 3D print data part of its consulting portfolio. In an online discussion called TechTALK, our experts informed customers and interested parties about the blockchain and possible use cases in the PLM environment.

As a basis for the discussion, Dr. Martin Holland first gave the participants a brief overview of how the blockchain works. Like the Internet, it is a disruptive technology, but instead of the decentralized provision of information, it is concerned with the decentralized storage and transmission of value. What distinguishes the blockchain is that information, once recorded, cannot be changed retrospectively: every miner holds a copy of the chain, and each block is inseparably linked to the next by a cryptographic hash.
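The tamper-evidence Holland described can be illustrated with a minimal Python sketch. This is illustrative only, not PROSTEP's implementation: each block stores the hash of its predecessor, so any retroactive edit breaks every later link.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash over a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: dict) -> None:
    """Link a new block to the chain via the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def verify(chain: list) -> bool:
    """A chain is valid only if every stored link matches a recomputed hash."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
append_block(chain, {"tx": "send 3D model v1"})   # hypothetical payloads
append_block(chain, {"tx": "print license granted"})
assert verify(chain)

chain[0]["data"]["tx"] = "tampered"   # retroactive edit...
assert not verify(chain)              # ...is detected immediately
```

In a real blockchain the copies held by many miners make it practically impossible to rewrite all links consistently.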

In order to create new blocks, the miners have to solve cryptographic puzzles. The consensus process used by PROSTEP requires significantly less energy than, for example, the proof-of-work mining used by Bitcoin. In addition, it is also possible to embed executable software code, so-called smart contracts, in the blocks in order to map more complex processes, e.g. enabling a vehicle as soon as the leasing installment is paid.
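To make the "cryptographic puzzle" concrete, here is a toy Bitcoin-style proof-of-work search. Note that this models the energy-intensive approach the article contrasts against, not PROSTEP's lighter-weight process; the data and difficulty are arbitrary.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce so the block's SHA-256 hash starts with
    `difficulty` zero hex digits; work grows 16x per extra digit."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

nonce = mine("example block", difficulty=3)
digest = hashlib.sha256(f"example block{nonce}".encode()).hexdigest()
assert digest.startswith("000")
```

The only way to find a valid nonce is brute force, which is what makes proof-of-work costly; alternative consensus schemes avoid this search entirely.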

PROSTEP has integrated blockchain technology into the OpenDXM GlobalX data exchange solution as part of the SAMPL (Secure Additive Manufacturing Platform) project. This end-to-end solution for the forgery-proof exchange of 3D print data ensures that the exchanged data cannot be misused and that printed components can be identified beyond any doubt.

A prerequisite for this is the use of certified 3D printers that support this technology and, for example, report back the printing status, as Holland said in response to a question from a participant.

The blockchain not only enables the licensed production of a limited number of spare parts and traceable documentation of where they were built, which Holland demonstrated using a use case at Airbus Spares. It also supports new business ideas such as the “Earn as you ride” program developed by automotive supplier Continental on the basis of a blockchain platform. Drivers receive mini payments for information on free parking spaces or weather data, which their vehicles report back in encrypted form.

Holland invited the online TechTALK participants to consider possible applications of the blockchain in their own companies. PROSTEP has developed a number of criteria for identifying potential use cases: the blockchain makes sense whenever many parties are to use or change data, decentralized data storage is desired, an intermediary is not, documentation must be unalterable, and participants interact in processes where transaction time is critical. If four of these criteria are met, the blockchain promises added value.
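PROSTEP's rule of thumb can be sketched as a simple checklist. The criteria are paraphrased from the text; the function name and the way the criteria are split into five items are illustrative, not PROSTEP's actual assessment tool.

```python
# Criteria paraphrased from PROSTEP's use-case assessment.
CRITERIA = [
    "many parties use or change the data",
    "decentralized data storage is desired",
    "no intermediary is wanted",
    "documentation must be unalterable",
    "transaction time is critical",
]

def blockchain_adds_value(met: set, threshold: int = 4) -> bool:
    """Per the rule of thumb: added value is promised
    if at least `threshold` criteria are met."""
    return len(met & set(CRITERIA)) >= threshold

# Four criteria met -> blockchain worth considering.
assert blockchain_adds_value(set(CRITERIA[:4]))
# Only two met -> probably not.
assert not blockchain_adds_value({CRITERIA[0], CRITERIA[4]})
```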

By Daniel Wiegand

OpenDXM GlobalX awarded the Industry Prize 2020

 
May 4th, 2020 by Joseph Lopez

Our OpenDXM GlobalX data exchange platform has once again received the industry award from the Initiative Mittelstand. In addition to specific improvements such as the intuitive web interface or blockchain integration, the jurors particularly praised the innovative capacity of the software, which can now also be used as a service from a highly secure cloud infrastructure.

OpenDXM GlobalX, the world’s leading data exchange platform, has once again been awarded this year’s Industriepreis (Industry Award) by the “Initiative Mittelstand” as one of the best products of the year. We owe this award primarily to the fact that we are constantly setting new standards in terms of innovation in the further development of our software. By integrating blockchain technology, for example, we have set the course for the use of the solution in new fields of application in which, in addition to secure exchange, the integrity of the exchanged data and its controlled use for the intended purpose must be guaranteed.

We have also made the software cloud-enabled so that it can be used as a service from the cloud without any installation or operating costs. A key prerequisite for cloud deployment was the new, web-based interface, which makes the data exchange platform much easier to use, not only for users but also for administrators.

All these factors were taken into account in the evaluation for the Industry Prize, which is awarded annually by Huber Verlag für Neue Medien. It recognizes the economic, ecological, technological and social benefits of innovative industrial products in various categories. Among others, IT and software solutions that help simplify and automate processes and workflows in industrial companies are honored. The products and solutions are evaluated by an independent jury of experts consisting of professors, scientists, industry representatives and trade journalists. The prizes are usually awarded during the Hannover Fair, but this year they had to be presented online due to COVID-19.

By Udo Hering

There’s digitalization – and then there’s digitalization

 
May 1st, 2020 by Joseph Lopez

In the last newsletter, I wrote “To stand still is no option for us” with reference to the difficulties facing the management of companies in an age of global uncertainties. And then came the lockdown and suddenly (almost) everything ground to a halt. Admittedly, the coronavirus pandemic was not entirely unforeseeable, but we were unprepared for the scale with which it hit us. I do, however, feel that one point I made has been confirmed: The situation can only be mastered using an agile approach. And with an even greater level of digitalization, I would now add.

PROSTEP has been agile in its response to the lockdown. Our employees have been working from home from day one and can be contacted by customers. They are probably even easier to reach and are able to work more efficiently due to the fact that they are saving time they would normally spend visiting customers or attending events. Our software solutions support remote maintenance irrespective of location – if customers are not already using them as a cloud-based service. Thanks to the use of appropriate IT tools and methods, we are even able to conduct consulting workshops online. I’m surprised how well they work, even with new customers, with whom we first need to establish a sense of trust. It is possible to do more online than I anticipated, even if we cannot and do not want to dispense with face-to-face meetings entirely in the future.

The coronavirus crisis has shown us just how important digital technologies are when it comes to staying in touch with colleagues, partners and customers, and working together with them efficiently despite the lockdown. The crisis has not only provided a boost to digitalization in companies but also in official agencies and authorities, schools and medical facilities that we would never have been able to imagine a few months ago. And despite years of complaints about a lack of Internet bandwidth in Germany, everything is working surprisingly well.

The digital progress made over the last few months will irrevocably change the way we work and our mobility behavior, especially as the virus will be around for some time to come. There’s digitalization – and then there’s digitalization. The boost to digitalization triggered by the coronavirus applies in particular to communication processes, which can be digitalized relatively easily with Teams, Skype or Zoom and a good Internet connection. However, it is not yet possible to predict how long-lasting this boost will be for other business processes in which the end-to-end utilization of data and information is particularly important. Because in these cases simply introducing a few new tools is not enough.

The fundamental problems with end-to-end digitalization in product development and manufacturing cannot be solved by digital communication processes. Digital information flows are still hindered by heterogeneous system landscapes involving a large number of individual data silos and poorly integrated processes. The solution to these problems requires not only technical answers but also changes to the organization and to the process landscapes of the companies and, more importantly, a long-term digitalization strategy.

One of the most important lessons learned from the numerous strategy consulting projects that we have carried out in recent years is that companies are not fully exploiting the potential offered by their existing PLM landscapes. The reason for this is not necessarily the PLM systems, which have also become increasingly powerful in recent years, but rather the way users work with them. In many cases, they are performing their work the same way they did prior to the introduction of PLM instead of rethinking their processes and methods and adapting them to take advantage of the new possibilities. Sticking with old approaches leads to highly customized PLM solutions. This not only has a negative impact on the ability to update the solutions but also makes it more difficult to respond agilely to new demands placed on PLM landscapes due, for example, to the increasing networking of products and new service-oriented business models.

My hope is that once the coronavirus crisis is over, companies will not immediately return to business as usual but instead will use the time during which business is still somewhat slower to lay the foundation for the digital transformation of their business processes. Regardless of which IT systems they are using, they should determine what information they need for which processes and in what form it needs to be available in order to be able to use it consistently throughout the whole product lifecycle. Thinking about the flow of information from the perspective of the end of the product lifecycle can be useful, especially when it comes to providing support for new service models.

The coronavirus crisis offers companies an opportunity to put their processes and methods to the test, to better integrate their system landscapes and, if necessary, to even roll out new IT tools. They should seize this opportunity to emerge from the crisis digitally stronger. We can provide them with effective support. Based on the analysis of their existing and future PLM capabilities, our strategy consultants identify gaps and potential in the process and system landscapes and, together with the customer, design a PLM infrastructure that will hopefully also be able to withstand the next crisis.

By Karsten Theis

The right advice when choosing a PDM system

 
April 24th, 2020 by Joseph Lopez

Finding the right PDM system is a challenge for companies with limited IT resources and with no expertise in the field of PDM. This is why Oberhausen-based GHH-Radsatz GmbH, which is part of the GHH-BONATRANS Group, brought in the PLM consultants from PROSTEP. They not only supported the company in defining the requirements and selecting the system, but also accompanied the pilot implementation.

Wheelsets have been made in Oberhausen for more than 200 years, although Gutehoffnungshütte Radsatz GmbH was only founded in 1994. Since 2014, the company has been part of the GHH-BONATRANS Group. With its global workforce of 1,700 and sales of over 300 million euros, the group is Europe’s largest manufacturer of wheelsets for all types of rail vehicles. In addition to its two development and production sites in Oberhausen and Bohumín in the Czech Republic, GHH-BONATRANS has a further production site in India and a sales office in Hong Kong.

The 280 employees at the Oberhausen plant primarily develop, manufacture and sell light-rail applications with rubber-sprung resilient wheels for trams around the world, but also wheelsets for heavy-rail applications ranging from underground and metro to high-speed trains and railway construction equipment. The staff at the Czech sister company BONATRANS are responsible for developing and producing wheelsets with solid wheels for conventional trains, high-speed trains, locomotives and freight cars. Their own hot forming facilities also allow them to supply forged parts for wheels and shafts, which are machined in Oberhausen and fitted in the wheelsets.

Every year, GHH-Radsatz supplies around 6,000 wheelsets and 40,000 wheels to rail vehicle manufacturers such as Alstom, Bombardier, Stadler, Skoda and Siemens as well as to rail operators. According to Dr. Sven Jenne, Director of Engineering and Research & Development in Oberhausen, Germany, around half of the business is in the aftermarket segment, since wheels are wearing parts. “There is an extremely large range of variants. This is because wheels and wheelsets have to be tailored to each vehicle project and adapted to the infrastructure. This is also our strength, because otherwise we would not be able to assert ourselves in the highly competitive market against competitors from Eastern Europe and increasingly also from China.”

Increasing effort invested in documentation

Compared to solid wheels, the amount of engineering effort needed to adapt the rubber-sprung tram wheels is greater, as their design is more complex. A V-shaped rubber ring between the wheel body and the tire ensures greater ride comfort. Design engineers must therefore always achieve a balance between strength, cushioning and mountability of the wheels. And Jenne explains that the V-shaped cushioning is unique to GHH-Radsatz. “Our GHH® V60 is the most widely used rubber-sprung resilient wheel in Europe.”

Every year, the company handles many concurrent projects, each of which can last between six and 24 months. The aim is to improve punctual delivery by detecting discrepancies in good time. Design engineers are under great time pressure, especially for new vehicle projects, as the time between order placement and delivery is becoming ever shorter, while delivery times for long-running items such as the forged parts are often beyond their control.

At the same time, the complexity of the projects and the amount of documentation needed are growing. Jenne: “Wheelsets are safety-relevant components, and the requirements with regard to traceability and also the volume of documents for each project and order have increased significantly in recent years.” The documents must be kept for 30 years or more because wheelsets have very long lifecycles and are constantly being reordered.

Time-consuming information retrieval

The file-based archive system made it increasingly difficult for users and company managers to keep track of the status of projects and, in the case of aftermarket projects, to trace which documents were actually valid. “Our staff spends a lot of time hunting down and collating information. That’s why we want to make it accessible to all those involved, regardless of the archive systems used in the individual departments, and in the process firm up the ‘memory’ of the company,” says Jenne, explaining the purpose of the PDM project.

In consultation with its Czech sister company, GHH-Radsatz decided to replace the archive system with a database-driven product data management system. When they embarked on the search for a suitable solution, however, it soon became apparent that the company was not in a position to get a clear picture of the multitude of solutions on offer and to assess their capabilities.

Jenne: “At times, we had the impression that we were using a sledgehammer to crack a nut, as we initially only needed part of the functionality offered by PDM.” That’s why PROSTEP was called in as a vendor-independent helper. The company’s PLM consultants not only know the systems and the vendors, but also bring along a wealth of experience from other selection projects.

As a first step, PROSTEP supported the project team in completing the requirements, structuring them clearly and creating a proper requirements specification. One of the most important requirements was the interaction with the Infor Smart Office M3 ERP system, which is currently critical for the creation of articles and BOMs and for order processing, and is intended to remain so. It was also important for the PDM system to offer a good interface to the SolidWorks design system, which is used in Oberhausen on 18 CAD workstations, and it should also be possible to connect it to the CAQ solution. In addition to the system’s integration capabilities, GHH-Radsatz also attaches great importance to simple system administration and the ability to further develop it in-house without the need for programming.

Benchmarking with three system vendors

Even though the first priority is to connect the existing systems and make information more readily available, the company has more far-reaching plans that PROSTEP also took into consideration when selecting the system. For example, the engineering change process, which is currently still entirely paper-based, is to be mapped to an electronic workflow. Jenne would also like to see greater digitalization of the entire order flow from the request for quotation, through design, material procurement and production, right up to dispatch and invoicing. This could be achieved by parallelizing certain tasks through something akin to PDM-driven project management. It would also help management monitor the status of projects and respond to discrepancies more rapidly.

In a professional selection process with transparent parameters, PROSTEP initially selected five candidates from a total of ten potential vendors. These were then invited to submit an offer. After the offers had been evaluated and discussions had taken place with the vendors, three candidates were shortlisted and were given the opportunity to demonstrate their programs on the basis of two use cases: engineering change and order processing. This methodical approach to system selection ensured that the results were comparable. “We were always able to explain to management how we came to our decision,” explains Jenne.

Ultimately, the choice fell on the PRO.FILE software from PROCAD, although Jenne stresses that all three suppliers made a very good impression. The decisive factor was not only better value for money in terms of the costs of purchasing, rolling out and maintaining the software, but also the ease of configuration. “I was very impressed with how easily my colleagues were able to program, or rather configure, some wonderful things. This gives me confidence that we will easily be able to extend the solution in the future.”

The consulting service paid off

GHH-Radsatz spent about 10 to 15 percent of its total budget (excluding internal expenses) on consulting. According to Jenne, this was a wise investment because the company is confident that it has taken a decision that is sustainable in the long term. He would particularly recommend external consulting to smaller companies that are less familiar with PDM. “Thanks to the collaboration with PROSTEP, the vendors immediately realized that we knew what we were talking about. And I have the feeling that the consultants’ knowledge of the market also had a positive impact on the price negotiations.”

The company will begin rolling out the system this March. Rollout will be based on an existing prototype that PROCAD set up last year and which essentially maps all the planned functions including change management and order workflow. However, implementation of the latter is not planned until next year in order not to overburden users. The plan is to initially enable CAD data and document management with read access to ERP and CAQ systems. Before this can happen, however, large amounts of existing data from the various file archive structures will have to be migrated. Jenne explains: “We have already held a large number of workshops with PROCAD and PROSTEP on this aspect.”

In the long term, he expects the use of PDM to bring considerable benefits. Users will be more productive because they will spend less time searching for information. Processes will be accelerated by working in parallel, which will reduce throughput times. In addition, the status of projects will become more transparent, so that management can intervene more rapidly in the event of delays.

By Michael Manderfeld

We have to learn how to deal with artificial intelligence: An interview with Professor Frank Kirchner

 
April 21st, 2020 by Joseph Lopez

For companies in all industries, artificial intelligence is becoming a key driver of competition. In this interview, Professor Frank Kirchner explains what it can and cannot do and where the challenges lie when implementing AI applications. Kirchner studied computer science and neurosciences and has been exploring how AI can be used in the real world for 25 years.

Question: You once said that you came to artificial intelligence through music. How did that happen?

Kirchner: I’ve always liked making music. (Kirchner plays guitar and piano.) When I started studying, I wasn’t in a band at first and tried playing along with drum computers and synthesizers. What bothered me was that the rhythms from the computer were very sterile back then. When a human being plays the drums, there are always slight delays because the player gets emotionally involved. It is barely perceptible, but it has a huge impact on the music. So I started trying to teach my computer to vary the precision slightly at certain points in songs. In the end, I didn’t succeed; it just sounded sterile in a different way. It was only later that I realized that this was an AI problem, but the exercise of programming taught me how creative computer science can be.

Question: How intelligent is AI in reality and where are its limits?

Kirchner: It’s not yet possible to develop AI that acts like a human drummer or guitarist, perceiving or even producing emotional states and then adapting their playing accordingly. But with today’s methods, we can mimic human emotions or playing styles by giving the algorithms thousands of examples. This works not only with music, but also with painting. You can train machine learning algorithms to reproduce pictures in the style of certain painters.

Question: What aspects of AI are you currently working on at the Robotics Innovation Center?

Kirchner: We are developing robotic systems with AI algorithms for various fields of application, ranging from exoskeletons for the rehabilitation of stroke patients to autonomous underwater robots for inspecting offshore wind turbines, production robots that can be deployed alongside human workers in tomorrow’s production facilities and space applications. For example, we are currently building robotic systems for the European Space Agency that will autonomously map the lunar surface to detect cavities in the lava layers that can be used to build a lunar base.

Question: What fields of application do you see as deriving the greatest potential benefit from AI?

Kirchner: I see a massive benefit in the field of medicine, which is currently facing enormous pressure as a result of the coronavirus pandemic. In particular, machine learning processes could support human diagnosis or relieve hard-pressed medical staff of routine tasks that can be done by AI-based robots. In agriculture, which has a huge problem with the lack of seasonal workers, AI could automate the picking of strawberries or asparagus. Simple automation technology cannot cope with these jobs because each plant grows differently. You need machines with a certain amount of intelligence to recognize the context.

Question: Did you deliberately not mention the field of production automation?

Kirchner: Of course, that’s also a field of application in which AI-based robotics plays an important role. For example, we are working with VW on hybrid teams of humans and robots to get away from the traditional production lines with “dumb” robots that always do the same thing. The aim is to create robotic systems that can be deployed flexibly and act as assistants to humans, even if they are only positioning workpieces, thus relieving them of heavy manual work.

Question: In which industries is AI currently used most intensively?

Kirchner: In Germany, as in other countries, it has been in use for some time – and very intensively – in the financial sector. AI methods are used in office automation for text, voice and image recognition or in the security sector, e.g. at airports, although this is not always apparent. They are becoming increasingly widespread in medicine and have also made their mark in production in the context of Industry 4.0. The networking of machines using AI algorithms provides the basis for increasing productivity.

Question: What are the difficulties in implementing industrial AI applications?

Kirchner: I think one of the greatest bottlenecks is the lack of digital infrastructure. Alarmingly, a large proportion of German companies are still living in the analog world. The infrastructure for collecting data from production, logistics, administration and even management is not particularly well developed. We have some catching up to do in this area. Although many German SMEs and smaller companies have begun to wake up to the problem, they don’t know where to start with digitalization, as the rest of the infrastructure in Germany is not a great deal of help. We’ve been talking about the nationwide rollout of fiber broadband for 20 years, and nothing has happened. This is a real competitive disadvantage.

Question: What opportunities does AI offer SMEs in particular?

Kirchner: I see massive potential there, which absolutely has to be exploited if SMEs want to remain successful. Because ultimately, they too are global players, and their development, production and logistics have to be very fast and cost-effective to survive in the global market. And they have to be able to react quickly to varying market situations. The problem lies primarily with small companies, which often lack their own research capabilities. Our experts support them in the development of AI-based solutions in fields ranging from the automotive sector to mining – something that distinguishes us from other research institutes.

Question: You worked in Boston for several years. Are people more open to AI there?

Kirchner: In the USA, but also in China, people recognize the benefits of AI, whereas in Europe we tend to emphasize the risks. Leaving aside which is the better approach, it is vital that we Europeans be at the forefront of AI development. Only by playing with the big boys can we influence how it is developed and, above all, how it is used. Otherwise we will become dependent, with all the negative consequences that we are seeing in the pandemic. We need greater digital sovereignty.

Question: What are the challenges currently being faced in AI research?

Kirchner: One of the challenges lies in integrating the various AI methods. On the one hand, there is the area of symbolic AI in traditional, logic-based methods. These have weaknesses when it comes to physical phenomena in the real world. In such areas, sub-symbolic AI methods such as machine learning, neural networks and so on work very well. Then there is a third area that I call physical AI, i.e. the embedding of all these methods in robots or other objects of the physical world. The challenge is to integrate these three areas to form a hybrid overall system. At the same time, this forms a basis to allow AI decisions to be explained and become transparent, which is important if people are to trust the technology.

Question: Is it true that self-learning systems are trained with historical data and make decisions that are often not comprehensible?

Kirchner: That’s correct. On the one hand, we must provide the computer scientists who train these algorithms with the appropriate skills. They need a very high level of knowledge about the data they are using, where it comes from and how it was obtained. These data skills must be firmly anchored in the computer science curricula. The second issue is that the AI algorithms, for example when evaluating MRI scans, must also give the doctor an explanation as to why they have identified a carcinoma. Only then does the doctor have a basis for accepting the decision. It is precisely this kind of transparency that we have to incorporate.

Question: Do we need something like ethical rules for the use of AI?

Kirchner: Yes, of course. But we have to develop them on the basis of the ethical values that we already have and which apply to all technologies. I don’t see that AI raises new ethical issues and I don’t see any way in which this could be incorporated in the technology. It primarily concerns people.

Prof. Kirchner, thank you very much for the interview.
(The interview was conducted by Michael Wendenburg)


About Frank Kirchner:

Professor Frank Kirchner has headed up the Robotics Group in the Faculty of Mathematics and Computer Science at the University of Bremen since 2002. He is also spokesperson for the German Research Center for Artificial Intelligence (DFKI) in Bremen and is in charge of the Robotics Innovation Center research department. Kirchner studied and obtained his doctorate in Bonn. He worked as a researcher for several years at the Northeastern University in Boston (USA) and took charge of establishing the Brazilian Institute of Robotics in Salvador de Bahia, which was founded in 2013 and was modeled on the DFKI. Kirchner is one of the leading experts in the field of AI-based robotics and has more than 350 publications on robotics and AI to his name.

If You’re a Remote Worker and Exchange CAD data or other Intellectual Property, These Tools will Help

 
April 13th, 2020 by Joseph Lopez

The impact of COVID-19 has left much of the manufacturing and supply chain industry at a standstill. Stock markets have fallen to record lows, unemployment is at an all-time high, entire industries such as the airlines are asking for bailouts, retail shops have closed, and most states have imposed stay-at-home orders.

Prior to the outbreak of the coronavirus in the US, CEOs, CIOs/CTOs and upper leadership sought to drive digital transformation and push towards efficiency and optimization in the workplace. IoT, blockchain and 3D printing were just some of the cutting-edge technologies our leaders focused on with a vision of the “factory of the future”.

Perhaps the biggest driving factor now spearheading change isn’t the C-level executive, but rather a global pandemic. The coronavirus has challenged us to quickly adapt to an ever-changing workplace. From remote working and collaboration to staying in touch and keeping a sense of ‘normality’, all aspects of our daily work routines have been affected.

Read the rest of If You’re a Remote Worker and Exchange CAD data or other Intellectual Property, These Tools will Help

No Digital Twin without Digital Thread

 
April 7th, 2020 by Joseph Lopez

Digital Twins offer the possibility to simulate the behavior of physical assets, to monitor them during operation and to continuously improve them. The data and models from planning and development form the context in which the operating data can be interpreted correctly. Putting them together from the wealth of available information is an essential prerequisite for the use of digital twin applications.

The Digital Twin is the digital image of a physical object or system, which can be a product, a production plant, but also a company or a process. The Digital Twin connects virtual planning and development models with the real product or production world in order to give people a better insight into the system and its condition or behavior. A vision in the sense of Industrie 4.0 is to enable technically complex systems to control themselves autonomously and behave more intelligently through digital algorithms, virtual models and status information.

The functional relationships of a product or a production plant are defined in product planning and development on the basis of customer requirements and a multitude of legal requirements. Without knowledge of these interrelationships, the operating data that the real asset captures and provides later in its product life cannot be interpreted correctly. If you do not know how a machine or system is actually supposed to function, you cannot identify the causes of deviations from this target state or behavior beyond doubt and take appropriate countermeasures. At the same time, knowledge of the asset’s history is important in order to assess why, for example, a bearing has failed and which other machines could be affected by the same problem.

This connection between the real asset and the development and planning models describing its history is called the Digital Thread. It is the digital “red thread” that links the information about a real product instance across processes and IT systems. On the one hand, this makes it possible to bring together all the information from the life cycle of the product instance or the real asset and thus forms the basis for creating the Digital Twin. Without a Digital Thread, the Digital Twin can be assembled manually, but it is difficult or impossible to keep it up to date. On the other hand, traceability along the Digital Thread allows decisions made in development and production to be questioned and optimization potential to be identified with the help of the operating data.
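The linking idea behind the Digital Thread can be illustrated with a minimal sketch. The system names, fields, and serial numbers below are invented for illustration, not PROSTEP’s actual data model: records scattered across several IT systems are tied together by the ID of one product instance.

```python
# Minimal sketch of a digital thread: records from several (hypothetical)
# IT systems are linked by the serial number of one product instance.

records = [
    {"system": "PLM", "serial": "A-1001", "data": "EBOM rev C"},
    {"system": "ERP", "serial": "A-1001", "data": "delivery 2020-03-01"},
    {"system": "MES", "serial": "A-1001", "data": "assembly station 4"},
    {"system": "IoT", "serial": "A-1001", "data": "bearing temp 71 C"},
    {"system": "PLM", "serial": "A-1002", "data": "EBOM rev D"},
]

def digital_thread(serial: str) -> list:
    """Collect all records belonging to one product instance."""
    return [r for r in records if r["serial"] == serial]

thread = digital_thread("A-1001")
print([r["system"] for r in thread])  # → ['PLM', 'ERP', 'MES', 'IoT']
```

The point of the sketch is only the join key: every life-cycle record carries the instance ID, so the twin can be assembled and kept current automatically instead of by hand.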

Management of product configurations

From a PLM point of view, the starting point of the digital twin is a specific configuration of the product or production system, for example the asset in its delivered state. This includes not only mechanical, electrical/electronic and software components with their models, but perhaps also service-relevant information, such as the service life of certain components. Bringing this information together and maintaining it manually is time-consuming and error-prone, especially since the configuration changes over the course of the product’s life, whether through software updates or other measures in the context of maintenance or further development of the asset. The expectation of today’s PLM systems is to automatically extract the configuration for the Digital Twin and keep it up-to-date.

We speak here of the concept of Configuration Lifecycle Management (CLM), which makes it possible to generate temporally valid views of the product across IT system boundaries and to manage product configurations across all phases of the product lifecycle. The main function of CLM is to create and keep consistent the various views of the digital product model during the life cycle, and to document their validity over time. To do this, it uses cross-system and cross-discipline baselines. These baselines document the state of the configuration at a certain point in time or maturity level and thus also control the representation of the Digital Twin. They enable companies to immediately and reliably answer the question at any point in the process whether and how the product or asset meets the requirements placed on it or in what state the asset was at a defined point in time, for example, which product configuration was delivered to the customer.
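The baseline mechanism described above can be sketched in a few lines. This is a simplified illustration under assumed data (component names, dates, and revisions are invented), not a CLM implementation: each baseline freezes the configuration at a date, and the latest baseline not after a query date answers “what state was the asset in at that time?”

```python
from datetime import date

# Hypothetical cross-system baselines: each freezes the configuration
# of an asset at one point in time.
baselines = [
    {"date": date(2019, 6, 1),   "config": {"pump": "rev A", "controller": "fw 1.0"}},
    {"date": date(2019, 11, 15), "config": {"pump": "rev B", "controller": "fw 1.2"}},
    {"date": date(2020, 3, 10),  "config": {"pump": "rev B", "controller": "fw 2.0"}},
]

def config_as_of(query_date: date) -> dict:
    """Return the configuration valid at the given date
    (the latest baseline not after it)."""
    valid = [b for b in baselines if b["date"] <= query_date]
    if not valid:
        raise ValueError("no baseline exists before this date")
    return max(valid, key=lambda b: b["date"])["config"]

# Which configuration was delivered to the customer on 2020-01-20?
print(config_as_of(date(2020, 1, 20)))  # → {'pump': 'rev B', 'controller': 'fw 1.2'}
```

In a real CLM system the baselines span many IT systems and disciplines, but the temporal-validity lookup works on the same principle.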

In order to manage the configuration of a product along its entire life cycle in a traceable manner, the use of a powerful PLM integration platform with connectors to all IT systems involved is required. As an intermediate layer spanning all IT systems, it creates the prerequisite for bringing together the information from the individual IT systems in a way that corresponds to the digital thread concept.

Cross-company collaboration

In industries such as mechanical and plant engineering or shipbuilding, companies face the challenge that the manufacturer who builds and provides the Digital Twin is not necessarily the operator and user who feeds it with operational data. Both the digital data and the operating data, or at least part of it, must therefore be exchanged and synchronized across companies in order to keep the Digital Twin up to date and to be able to use the operating data for the continuous improvement of real assets. Questions such as data security, protection of intellectual property and ownership of the data therefore play a very central role in the development and use of a digital twin application.

More and more customers today require their suppliers to deliver digital data and models to support Digital Twin applications along with the physical assets. CLM can be used to control not only the amount of information provided, but also the level of detail of the information and the formats in which it is delivered. The information can be compiled largely automatically and made available to the customer as a data package, for example in 3D PDF format.

In order to maintain digital consistency in cooperation across company boundaries, the exchange partners must first agree on the scope of the information to be exchanged and on common standards for handling this information. The central question, however, is where the Digital Twin should live. PROSTEP is convinced that it is advisable to set up a joint collaboration platform for this purpose, which becomes part of the information model. This platform provides customers with the information they need to build their Digital Twin application while the development process is still underway and also allows them to synchronize changes to the master models during operation if necessary. The common platform can also be used to link parts of the operating data required by the manufacturer for new service offerings such as predictive maintenance or product improvements with the Digital Thread.

Three building blocks for the Digital Twin

The foundations for the Digital Twin are already laid in product development and production planning. To bring it to life and keep it alive, the digital umbilical cord must not be cut. This is why an integration platform is needed that makes the digital information from the various authoring and data management systems available at any time. A powerful configuration management system that manages the relationships between the information scopes and their validity is essential for building a Digital Twin. However, digital consistency is not a one-way street. In order to derive maximum benefit from the product twin in terms of closed loop engineering, traceability between Digital Twin and Digital Thread must be ensured. The creation of a collaboration platform maintains digital consistency even beyond company boundaries.

By Lars Wagner

Clean ERP/PLM migration with the “data washing machine”

 
April 3rd, 2020 by Joseph Lopez

In one fell swoop, the robotics and automation specialist KUKA has migrated its SAP installation, introduced Teamcenter as its new PLM system and reorganized the entire engineering-to-order process. Crucial to the project’s success were the soft PLM migration, during which the legacy and new system coexisted for a short period, and the consistent cleansing of the data, which KUKA undertook with the assistance of PROSTEP AG. PROSTEP also accompanied KUKA during the changeover to the current Teamcenter version.

KUKA, which is headquartered in Augsburg, is one of the world’s leading suppliers of automation solutions. KUKA offers customers everything from a single source: from robots and cells to fully automated systems and their networking. The company, which was founded over 120 years ago, employs around 14,200 people worldwide and generated revenues of 3.2 billion euros in the 2018 financial year.

The first step in the company’s transformation program, “Power ON KUKA 2020”, was to standardize the process and system landscape in the engineering-to-order (ETO) sector. ETO is the term KUKA uses to describe everything relating to the development of custom-built production systems for the automation of manufacturing processes, in contrast to its configure-to-order (CTO) business involving robotic components and systems. The PLM migration project was driven first and foremost by the ETO sector, as Project Manager Matthias Binswanger affirms. However, the project also had to be synchronized with the consolidation of the global ERP landscape that was taking place at the same time.

KUKA previously had a very heterogeneous ERP and PLM landscape, which was partly due to the increasing scope of the group structures. For example, the ETO specialists in Augsburg worked with a local SAP instance and an older version of the former Eigner system, Oracle Agile e6. After an in-depth system selection process, KUKA decided to implement the Teamcenter PLM system from Siemens Digital Industries Software as the global solution for all its ETO locations.

Teamcenter is intended to support the future product engineering process, including functional engineering, manufacturing process planning and simulation, as well as engineering change management. To do this, it has to be familiar with the relationships between the mechanical, electrical and fluid components of the functional units (for example valves, sensors and their processes), which were mapped in a separate application in the old world. Changes are part of the ETO sector’s everyday business because the systems are often designed before the products to be manufactured on them are fully defined. “One major challenge is the complexity that results from the sheer volume of changes to thousands of components,” explains Binswanger.

PLM implementation was already underway when KUKA launched the parallel consolidation of the heterogeneous ERP landscape in order to give greater transparency to its project activities. The simultaneous changeover to SAP S/4HANA considerably increased the complexity of the PLM migration, as Binswanger explains: “To introduce the new solutions, we made use of a clear project control mechanism with a flexible, multi-stage project structure that did not previously exist in this form. This went hand-in-hand with changes to the engineering processes and methods, which in turn had repercussions for the PLM landscape and therefore also had a big impact on PLM migration.”

To migrate the PLM system, the project team called on the services of the experts from PROSTEP, who brought to the project not only their PLM expertise and many years of experience in performing migrations but also PROSTEP’s proven OpenPDM integration platform. “There aren’t many companies that have certified connectors to Agile e6 and Teamcenter. As a result, there was really no alternative to PROSTEP,” explains Binswanger. The PLM consulting and software company also assisted the customer during the cleansing of the master data prior to the start of the migration. When considering this step, it is important to understand that at KUKA materials, BOMs, etc. are currently created in the PLM system, or in both systems, and then published to the ERP system.

While the changeover to SAP S/4HANA was to follow the “big bang” approach, KUKA chose the soft route for its PLM migration, with the legacy and new systems temporarily coexisting. Although Teamcenter is the target system for the new architecture, the idea was to conclude any open projects in the old PLM environment. Binswanger explains that migrating them all in one fell swoop would have required enormous effort. Agile only works with documents, materials, BOMs and structures, whereas the CAD data is managed using a file-based approach or in containers. Teamcenter, on the other hand, provides interfaces to all the CAD systems, system versions and releases used at KUKA, which means that CAD files in different formats can be stored together with the materials for the first time.

Direct synchronization of the PLM data

The changeover to SAP S/4HANA and the temporary coexistence of the two PLM systems meant that the migration resembled a billiards shot across three cushions. First of all, Agile e6 had to be updated and interfaced with the new ERP system so that materials and BOMs could be correctly linked to the new project structure. It was then necessary to connect the two PLM systems in order to achieve the cross-system synchronization of standard parts, catalog parts and other materials. Binswanger explains why it was not sufficient to simply synchronize them via SAP: “PLM data with no logistical relevance is not published to the ERP system in the first place. However, this data is important for the Teamcenter users so that they can re-use the materials stored in Agile.”

The OpenPDM integration platform provides the basis for PLM data synchronization. It is designed to transfer all the materials between the two system environments and not only the standard and catalog parts. PROSTEP adapted the Teamcenter connector a number of times in order to take account of changes in the data model. All types of documents are now also transferred together with the PLM metadata. Automatic quality checks ensure that the documents meet the requirements of the Teamcenter data model. “We have an activity-driven application which automatically synchronizes the data sent to Teamcenter every five minutes, that is to say it creates new materials together with their attributes, structures and documents or updates modified ones,” says Binswanger.
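The create-or-update pattern Binswanger describes can be illustrated with a small sketch. The dictionaries below stand in for the Agile e6 source and Teamcenter target; the real OpenPDM connectors, their APIs, and the five-minute scheduler are not shown, and the material IDs and fields are invented:

```python
# Illustrative sketch of one synchronization cycle: new materials are
# created in the target, modified ones are updated, unchanged ones skipped.
source = {
    "M100": {"name": "valve", "rev": 2},
    "M200": {"name": "sensor", "rev": 1},
}
target = {
    "M100": {"name": "valve", "rev": 1},  # outdated revision in the target
}

def sync_once(src: dict, dst: dict) -> list:
    """One sync cycle: copy new materials, update changed ones; report actions."""
    actions = []
    for mat_id, data in src.items():
        if mat_id not in dst:
            dst[mat_id] = dict(data)
            actions.append(f"created {mat_id}")
        elif dst[mat_id] != data:
            dst[mat_id] = dict(data)
            actions.append(f"updated {mat_id}")
    return actions

print(sync_once(source, target))  # → ['updated M100', 'created M200']
```

Because each cycle only touches materials that are new or changed, the job is idempotent: running it again immediately performs no actions, which is what makes a short polling interval practical.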

Contrary to the original planning, KUKA decided to actively shut down the legacy system rather than simply phasing it out gradually. This allows the company to save on the high license and maintenance costs involved in operating two systems. In order to meet requirements regarding traceability, the documents relating to long since completed projects also have to be migrated to Teamcenter. Binswanger explains that in order to do this, it will be necessary to relax the quality requirements a little and store the documents uncleansed in a separate archive, where they can be accessed only for reading and printing.

Data selection and cleansing

Due to the simultaneous changeover to SAP S/4HANA, the PLM migration in Augsburg started later than planned but with considerably higher-quality input data. The project team took advantage of the delay to implement a clearly structured, documented OpenPDM-based process for cleansing the data. One clear specification was that, of the 3.3 million data records in the old SAP solution, only those materials that are relevant for future projects should be transferred to the new environment. Therefore, it was first necessary to identify the data that needed to be migrated.

On the basis of over a dozen criteria and taking account of various attributes, PROSTEP calculated the so-called Total Article List (TAL) from the 3.3 million data records in SAP and Agile. The TAL is a list of all the articles that have been ordered or installed in systems, used for service purposes in recent years or are still in stock. It now comprises “only” 1.2 million articles. According to Binswanger, PROSTEP’s ability to resolve the structures and identify the components for any given article was of decisive importance.
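The TAL selection amounts to filtering the article records by relevance criteria. The sketch below is a simplification under invented data: the real selection applied over a dozen criteria and resolved product structures, whereas here only four flat flags are checked.

```python
# Simplified sketch of deriving a Total Article List (TAL): keep only
# articles that were ordered, installed, used for service, or are in stock.
# The article IDs and criteria fields are illustrative.
articles = [
    {"id": "A1", "ordered": True,  "installed": False, "service_use": False, "stock": 0},
    {"id": "A2", "ordered": False, "installed": True,  "service_use": False, "stock": 0},
    {"id": "A3", "ordered": False, "installed": False, "service_use": False, "stock": 0},
    {"id": "A4", "ordered": False, "installed": False, "service_use": True,  "stock": 5},
]

def total_article_list(records: list) -> list:
    """Select article IDs matching at least one relevance criterion."""
    return [
        a["id"] for a in records
        if a["ordered"] or a["installed"] or a["service_use"] or a["stock"] > 0
    ]

print(total_article_list(articles))  # → ['A1', 'A2', 'A4']
```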

The TAL controlled not only the big-bang migration of the SAP data but also acted as master for the selective cleansing and migration of the PLM data. In particular, the repeat parts (standard parts, purchased parts, semi-finished products, etc.) had to be augmented with additional data and classified before being imported into Teamcenter. To do this, KUKA used the software classmate from simus systems together with other solutions. OpenPDM controlled the entire cleansing process, from the extraction of the data to manual or automatic cleansing through to validation of the results, and also generated the corresponding quality reports. A total of approximately 80,000 articles passed through one or the other of the programs in the “data washing machine”. Only the data that ultimately met all the quality criteria was automatically imported into Teamcenter.
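The “data washing machine” idea — cleanse, validate, import only what passes — can be sketched as a small pipeline. The cleansing step, quality checks, and field names below are invented for illustration and do not reflect the actual Teamcenter data model requirements:

```python
# Hedged sketch of a cleanse-validate-import pipeline: each article is
# cleansed automatically, then only records passing every quality check
# are imported; the rest are reported for manual rework.

def cleanse(article: dict) -> dict:
    """Automatic cleansing step: normalize the name, strip whitespace."""
    article = dict(article)
    article["name"] = (article.get("name") or "").strip().lower()
    return article

def passes_quality(article: dict) -> bool:
    """Validation step: assume the target model requires a name and a class."""
    return bool(article["name"]) and bool(article.get("class"))

raw = [
    {"id": "M1", "name": "  Hex Bolt ", "class": "standard part"},
    {"id": "M2", "name": "",            "class": "purchased part"},
    {"id": "M3", "name": "Seal",        "class": None},
]

imported = [a for a in map(cleanse, raw) if passes_quality(a)]
rejected = [a["id"] for a in map(cleanse, raw) if not passes_quality(a)]

print([a["id"] for a in imported], rejected)  # → ['M1'] ['M2', 'M3']
```

The rejected list corresponds to the quality reports mentioned above: it tells the team exactly which records still need manual cleansing before they can enter the new system.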

In Augsburg, SAP S/4HANA, a new Agile version and Teamcenter all went live on the same day, an important milestone for KUKA. According to Binswanger, PROSTEP, its OpenPDM software platform and its expertise played a key role. KUKA successfully took advantage of the migration project to cleanse its database of unnecessary clutter.

The Teamcenter application was continuously further developed after the go-live. This repeatedly required adaptations to OpenPDM, which PROSTEP implemented in agile sprints. One major challenge was to migrate the documents from ongoing Agile projects because the data models in the two systems are very different. The last hurdle for the time being was the changeover to the new Teamcenter version 12, which required a change of integration platform version. Thanks to PROSTEP’s support, the company was also able to surmount this hurdle without any problems.

By Andreas Hoffmann

PROSTEP completes the new OpenPDM architecture

 
March 2nd, 2020 by Joseph Lopez

PROSTEP has released Version 9.1 of its PLM integration platform OpenPDM, which is designed to help customers find their way into the cloud. Its new MicroServices architecture with independent connectors to common PLM systems makes it particularly suitable for hybrid PLM scenarios in on-premise and cloud infrastructures.

To enable the use of OpenPDM in distributed software architectures, PROSTEP has broken down the integration platform into smaller software components and designed the mapping and process engine as independent MicroServices. In addition, the import and export functions are now part of the connectors, which the system administrator can configure individually via a web-based interface. Thanks to the consistent use of REST interfaces, each OpenPDM connector can thus run independently and can be used, for example, with message brokers such as Apache Kafka in conjunction with ESB (Enterprise Service Bus) architectures. In addition, PROSTEP has integrated Docker technology so that OpenPDM can be run in a cloud-based container environment such as OpenShift.

The neutral OpenPDM data model also had to be extended for the new software architecture. Process modeling is now carried out with the proven workflow engine Camunda, which is also used by customers such as NASA and T-Mobile. The Camunda Modeler has a graphical user interface that enables intuitive modeling of BPMN (Business Process Model and Notation) workflows and DMN (Decision Model and Notation) decisions.

The new architecture allows customers to flexibly use the proven OpenPDM functions for PLM integration, migration and collaboration in hybrid PLM scenarios. Version 9.1 currently offers MicroService-based connectors to the PLM systems 3DEXPERIENCE R2019x and R2020x from Dassault Systèmes, PTC Windchill R11.1 and R11.2, SAP PLM R3, R3 EHP and S4 (on premise) and the IoT platform PTC ThingWorx 10.x. PROSTEP will migrate additional connectors to the new architecture as required.

The new OpenPDM version is not backward compatible with versions 8.x. PROSTEP is thus primarily addressing new customers who want to use PLM and/or ERP systems from the cloud and integrate them with their existing IT infrastructure. Existing customers with complex integration, migration or collaboration scenarios based on OpenPDM 8.x do not necessarily have to migrate their installations. PROSTEP will also continue to develop its existing software and will soon launch a new version 8.7 with connectors to the current versions of all connected PDM/PLM and ERP systems.

By Udo Hering

PROSTEP builds digital twin in the ProProS research project

 
February 9th, 2020 by Joseph Lopez

The Bremen-headquartered Lürssen shipyard group, the Machine Tool Laboratory (WZL) at RWTH Aachen University and PROSTEP have launched the ProProS research project. The aim of the project is to create a digital twin for the manufacturing and assembly processes at shipyards and use it for status monitoring and optimizing shipbuilding. The shipbuilders want to minimize delays in the processes.

Lürssen, a family-owned company, expects digitalization to improve transparency in production and reduce throughput times, says Dr. Bernhard Urban, Head of Development & Innovation: “The joint research project with PROSTEP and WZL provides the basis for increased digitalization in our manufacturing and assembly processes. We hope that the development program will help us drive the broad-based digitalization processes at our company forward in a targeted manner and thus do justice to the leadership claim regarding performance and quality formulated by the founder of our company, Friedrich Lürssen.”

As part of the project, PROSTEP is working together with the WZL’s manufacturing experts, who will be responsible for developing the production technology logic, to develop the demonstrator for a digital twin. It maps the planning data from the target process (product structure, work orders, assembly sequences, scheduling, etc.) in an end-to-end data model and compares it in real time with the actual data from production and assembly.

The first step involves detecting disruptions in the process flow, e.g. those caused by a missing or unfinished component, at an early stage on the basis of a unique component ID and assessing their impact on the schedule. The digital twin is also intended to perform control tasks and help avoid or minimize delays by simulating alternative production and assembly sequences.
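The disruption-detection step can be illustrated with a minimal plan-versus-actual comparison. Component IDs, dates, and the flat plan structure below are purely illustrative stand-ins for the end-to-end data model the project is building:

```python
from datetime import date

# Planned completion dates per component ID (hypothetical target process data)
plan = {
    "HULL-S01": date(2021, 3, 1),
    "HULL-S02": date(2021, 4, 1),
    "PIPE-117": date(2021, 3, 15),
}
# Actual completion reports from production (PIPE-117 and HULL-S02 still open)
actual_done = {"HULL-S01": date(2021, 2, 27)}

def delayed(plan: dict, done: dict, today: date) -> list:
    """Components past their planned date that are not yet reported finished."""
    return sorted(cid for cid, due in plan.items()
                  if cid not in done and due < today)

print(delayed(plan, actual_done, date(2021, 3, 20)))  # → ['PIPE-117']
```

Flagging a late component by its ID is only the detection half; assessing the schedule impact would additionally require the assembly-sequence dependencies from the target process, which this sketch omits.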

The project, which has an overall budget of 3.2 million euros, will run until 2022 and is supported by the German Federal Ministry for Economic Affairs and Energy (BMWi). The Lürssen shipyard group, which specializes in building yachts and naval vessels, is the project coordinator.

By Carsten Zerbst



© 2024 Internet Business Systems, Inc.