Archive for April, 2020
Friday, April 24th, 2020
Finding the right PDM system is a challenge for companies with limited IT resources and no in-house PDM expertise. This is why Oberhausen-based GHH-Radsatz GmbH, which is part of the GHH-BONATRANS Group, brought in the PLM consultants from PROSTEP. They not only supported the company in defining the requirements and selecting the system, but also accompanied the pilot implementation.
Wheelsets have been made in Oberhausen for more than 200 years, although Gutehoffnungshütte Radsatz GmbH was only founded in 1994. Since 2014, the company has been part of the GHH-BONATRANS Group. With its global workforce of 1,700 and sales of over 300 million euros, the group is Europe’s largest manufacturer of wheelsets for all types of rail vehicles. In addition to its two development and production sites in Oberhausen and Bohumín in the Czech Republic, GHH-BONATRANS has a further production site in India and a sales office in Hong Kong.
The 280 employees at the Oberhausen plant primarily develop, manufacture and sell light-rail applications with rubber-sprung resilient wheels for trams around the world, but also wheelsets for heavy-rail applications ranging from underground and metro to high-speed trains and railway construction equipment. The staff at the Czech sister company BONATRANS are responsible for developing and producing wheelsets with solid wheels for conventional trains, high-speed trains, locomotives and freight cars. Their own hot forming facilities also allow them to supply forged parts for wheels and shafts, which are machined in Oberhausen and fitted in the wheelsets.
Every year, GHH-Radsatz supplies around 6,000 wheelsets and 40,000 wheels to rail vehicle manufacturers such as Alstom, Bombardier, Stadler, Skoda and Siemens as well as to rail operators. According to Dr. Sven Jenne, Director of Engineering and Research & Development in Oberhausen, Germany, around half of the business is in the aftermarket segment, since wheels are wearing parts. “There is an extremely large range of variants. This is because wheels and wheelsets have to be tailored to each vehicle project and adapted to the infrastructure. This is also our strength, because otherwise we would not be able to assert ourselves in the highly competitive market against competitors from Eastern Europe and increasingly also from China.”
Increasing effort invested in documentation
Compared to solid wheels, the amount of engineering effort needed to adapt the rubber-sprung tram wheels is greater, as their design is more complex. A V-shaped rubber ring between the wheel body and the tire ensures greater ride comfort. Design engineers must therefore always achieve a balance between strength, cushioning and mountability of the wheels. And Jenne explains that the V-shaped cushioning is unique to GHH-Radsatz. “Our GHH® V60 is the most widely used rubber-sprung resilient wheel in Europe.”
Every year, the company handles many concurrent projects, each of which can last between six and 24 months. The aim is to improve delivery punctuality by detecting discrepancies early. Design engineers are under great time pressure, especially for new vehicle projects, as the time between order placement and delivery is becoming ever shorter, while delivery times for long lead-time items such as the forged parts are often beyond their control.
At the same time, the complexity of the projects and the amount of documentation needed are growing. Jenne: “Wheelsets are safety-relevant components, and the requirements with regard to traceability and also the volume of documents for each project and order have increased significantly in recent years.” The documents must be kept for 30 years or more because wheelsets have very long lifecycles and are constantly being reordered.
Time-consuming information retrieval
The file-based archive system made it increasingly difficult for users and company managers to keep track of the status of projects and, in the case of aftermarket projects, to trace which documents were actually valid. “Our staff spends a lot of time hunting down and collating information. That’s why we want to make it accessible to all those involved, regardless of the archive systems used in the individual departments, and in the process firm up the ‘memory’ of the company,” says Jenne, explaining the purpose of the PDM project.
In consultation with its Czech sister company, GHH-Radsatz decided to replace the archive system with a database-driven product data management system. When they embarked on the search for a suitable solution, however, it soon became apparent that the company was not in a position to get a clear picture of the multitude of solutions on offer and to assess their capabilities.
Jenne: “At times, we had the impression that we were using a sledgehammer to crack a nut, as we initially only needed part of the functionality offered by PDM.” That’s why PROSTEP was called in as a vendor-independent helper. The company’s PLM consultants not only know the systems and the vendors, but also bring along a wealth of experience from other selection projects.
As a first step, PROSTEP supported the project team in completing the requirements, structuring them clearly and creating a proper requirements specification. One of the most important requirements was the interaction with the Infor Smart Office M3 ERP system, which is currently critical for the creation of articles and BOMs and for order processing, and is intended to remain so. It was also important for the PDM system to offer a good interface to the SolidWorks design system, which is used in Oberhausen on 18 CAD workstations, and it should be possible to connect it to the CAQ solution as well. In addition to the system’s integration capabilities, GHH-Radsatz also attaches great importance to simple system administration and the ability to further develop it in-house without the need for programming.
Benchmarking with three system vendors
Even though the first priority is to connect the existing systems and make information more readily available, the company has more far-reaching plans that PROSTEP also took into consideration when selecting the system. For example, the engineering change process, which is currently still entirely paper-based, is to be mapped to an electronic workflow. Jenne would also like to see greater digitalization of the entire order flow from the request for quotation, through design, material procurement and production, right up to dispatch and invoicing. Certain tasks would be parallelized using something akin to PDM-driven project management. This would also help management to monitor the status of the projects and respond to discrepancies more rapidly.
In a professional selection process with transparent parameters, PROSTEP initially selected five candidates from a total of ten potential vendors. These were then invited to submit an offer. After the offers had been evaluated and discussions had taken place with the vendors, three candidates were shortlisted and were given the opportunity to demonstrate their programs on the basis of two use cases: engineering change and order processing. This methodical approach to system selection ensured that the results were comparable. “We were always able to explain to management how we came to our decision,” explains Jenne.
Ultimately, the choice fell on the PRO.FILE software from PROCAD, although Jenne stresses that all three suppliers made a very good impression. The decisive factor was not only better value for money in terms of the costs of purchasing, rolling out and maintaining the software, but also the ease of configuration. “I was very impressed with how easily my colleagues were able to program, or rather configure, some wonderful things. This gives me confidence that we will easily be able to extend the solution in the future.”
The consulting service paid off
GHH-Radsatz spent about 10 to 15 percent of its total budget (excluding internal expenses) on consulting. According to Jenne, this was a wise investment because the company is confident that it has taken a decision that is sustainable in the long term. He would particularly recommend external consulting to smaller companies that are less familiar with PDM. “Thanks to the collaboration with PROSTEP, the vendors immediately realized that we knew what we were talking about. And I have the feeling that the consultants’ knowledge of the market also had a positive impact on the price negotiations.”
The company will begin rolling out the system this March. Rollout will be based on an existing prototype that PROCAD set up last year and which essentially maps all the planned functions including change management and order workflow. However, implementation of the latter is not planned until next year in order not to overburden users. The plan is to initially enable CAD data and document management with read access to ERP and CAQ systems. Before this can happen, however, large amounts of existing data from the various file archive structures will have to be migrated. Jenne explains: “We have already held a large number of workshops with PROCAD and PROSTEP on this aspect.”
In the long term, he expects the use of PDM to bring considerable benefits. Users will be more productive because they will spend less time searching for information. Processes will be accelerated by working in parallel, which will reduce throughput times. In addition, the status of projects will become more transparent, so that management can intervene more rapidly in the event of delays.
By Michael Manderfeld
Tuesday, April 21st, 2020
For companies in all industries, artificial intelligence is becoming a key driver of competition. In this interview, Professor Frank Kirchner explains what it can and cannot do and where the challenges lie when implementing AI applications. Kirchner studied computer science and neurosciences and has been exploring how AI can be used in the real world for 25 years.
Question: You once said that you came to artificial intelligence through music. How did that happen?
Kirchner: I’ve always liked making music. (Kirchner plays guitar and piano.) When I started studying, I wasn’t in a band at first and tried playing along with drum computers and synthesizers. What bothered me was that the rhythms from the computer were very sterile back then. When a human being plays the drums, there are always slight delays because the player gets emotionally involved. It is barely perceptible, but it has a huge impact on the music. So I started trying to teach my computer to vary the precision slightly at certain points in songs. In the end, I didn’t succeed; it just sounded sterile in a different way. It was only later that I realized that this was an AI problem, but the exercise of programming taught me how creative computer science can be.
Question: How intelligent is AI in reality and where are its limits?
Kirchner: It’s not yet possible to develop AI that acts like a human drummer or guitarist, perceiving or even producing emotional states and then adapting their playing accordingly. But with today’s methods, we can mimic human emotions or playing styles by giving the algorithms thousands of examples. This works not only with music, but also with painting. You can train machine learning algorithms to reproduce pictures in the style of certain painters.
Question: What aspects of AI are you currently working on at the Robotics Innovation Center?
Kirchner: We are developing robotic systems with AI algorithms for various fields of application, ranging from exoskeletons for the rehabilitation of stroke patients to autonomous underwater robots for inspecting offshore wind turbines, production robots that can be deployed alongside human workers in tomorrow’s production facilities and space applications. For example, we are currently building robotic systems for the European Space Agency that will autonomously map the lunar surface to detect cavities in the lava layers that can be used to build a lunar base.
Question: What fields of application do you see as deriving the greatest potential benefit from AI?
Kirchner: I see a massive benefit in the field of medicine, which is currently facing enormous pressure as a result of the coronavirus pandemic. In particular, machine learning processes could support human diagnosis or relieve hard-pressed medical staff of routine tasks that can be done by AI-based robots. In agriculture, which has a huge problem with the lack of seasonal workers, AI could automate the picking of strawberries or asparagus. Simple automation technology cannot cope with these jobs because each plant grows differently. You need machines with a certain amount of intelligence to recognize the context.
Question: Did you deliberately not mention the field of production automation?
Kirchner: Of course, that’s also a field of application in which AI-based robotics plays an important role. For example, we are working with VW on hybrid teams of humans and robots to get away from the traditional production lines with “dumb” robots that always do the same thing. The aim is to create robotic systems that can be deployed flexibly and act as assistants to humans, even if they are only positioning workpieces, thus relieving them of heavy manual work.
Question: In which industries is AI currently used most intensively?
Kirchner: In Germany, as in other countries, it has been in use for some time – and very intensively – in the financial sector. AI methods are used in office automation for text, voice and image recognition or in the security sector, e.g. at airports, although this is not always apparent. They are becoming increasingly widespread in medicine and have also made their mark in production in the context of Industry 4.0. The networking of machines using AI algorithms provides the basis for increasing productivity.
Question: What are the difficulties in implementing industrial AI applications?
Kirchner: I think one of the greatest bottlenecks is the lack of digital infrastructure. Alarmingly, a large proportion of German companies are still living in the analog world. The infrastructure for collecting data from production, logistics, administration and even management is not particularly well developed. We have some catching up to do in this area. Although many German SMEs and smaller companies have begun to wake up to the problem, they don’t know where to start with digitalization, as the rest of the infrastructure in Germany is not a great deal of help. We’ve been talking about the nationwide rollout of fiber broadband for 20 years, and nothing has happened. This is a real competitive disadvantage.
Question: What opportunities does AI offer SMEs in particular?
Kirchner: I see massive potential there, which absolutely has to be exploited if SMEs want to remain successful. Because ultimately, they too are global players, and their development, production and logistics have to be very fast and cost-effective to survive in the global market. And they have to be able to react quickly to varying market situations. The problem lies primarily with small companies, which often lack their own research capabilities. Our experts support them in the development of AI-based solutions in fields ranging from the automotive sector to mining – something that distinguishes us from other research institutes.
Question: You worked in Boston for several years. Are people more open to AI there?
Kirchner: In the USA, but also in China, people recognize the benefits of AI, whereas in Europe we tend to emphasize the risks. Leaving aside which is the better approach, it is vital that we Europeans be at the forefront of AI development. Only by playing with the big boys can we influence how it is developed and, above all, how it is used. Otherwise we will become dependent, with all the negative consequences that we are seeing in the pandemic. We need greater digital sovereignty.
Question: What are the challenges currently being faced in AI research?
Kirchner: One of the challenges lies in integrating the various AI methods. On the one hand, there is the area of symbolic AI in traditional, logic-based methods. These have weaknesses when it comes to physical phenomena in the real world. In such areas, sub-symbolic AI methods such as machine learning, neural networks and so on work very well. Then there is a third area that I call physical AI, i.e. the embedding of all these methods in robots or other objects of the physical world. The challenge is to integrate these three areas to form a hybrid overall system. At the same time, this forms a basis to allow AI decisions to be explained and become transparent, which is important if people are to trust the technology.
Question: Is it true that self-learning systems are trained with historical data and make decisions that are often not comprehensible?
Kirchner: That’s correct. On the one hand, we must provide the computer scientists who train these algorithms with the appropriate skills. They need a very high level of knowledge about the data they are using, where it comes from and how it was obtained. These data skills must be firmly anchored in the computer science curricula. The second issue is that the AI algorithms, for example when evaluating MRI scans, must also give the doctor an explanation as to why they have identified a carcinoma. Only then does the doctor have a basis for accepting the decision. It is precisely this kind of transparency that we have to incorporate.
Question: Do we need something like ethical rules for the use of AI?
Kirchner: Yes, of course. But we have to develop them on the basis of the ethical values that we already have and which apply to all technologies. I don’t see that AI raises new ethical issues and I don’t see any way in which this could be incorporated in the technology. It primarily concerns people.
Prof. Kirchner, thank you very much for the interview.
(The interview was conducted by Michael Wendenburg)
About Frank Kirchner:
Professor Frank Kirchner has headed up the Robotics Group in the Faculty of Mathematics and Computer Science at the University of Bremen since 2002. He is also spokesperson for the German Research Center for Artificial Intelligence (DFKI) in Bremen and is in charge of the Robotics Innovation Center research department. Kirchner studied and obtained his doctorate in Bonn. He worked as a researcher for several years at Northeastern University in Boston (USA) and took charge of establishing the Brazilian Institute of Robotics in Salvador da Bahia, which was founded in 2013 and was modeled on the DFKI. Kirchner is one of the leading experts in the field of AI-based robotics and has more than 350 publications on robotics and AI to his name.
Monday, April 13th, 2020
The impact of COVID-19 has left much of the manufacturing and supply chain industry at a standstill. Stock markets have fallen sharply, unemployment is at an all-time high, entire industries such as the airlines are asking for bailouts, retail shops have closed, and most states have imposed stay-at-home orders.
Prior to the outbreak of the coronavirus in the US, CEOs, CIOs/CTOs and upper leadership sought to drive digital transformation and push towards efficiency and optimization in the workplace. IoT, blockchain and 3D printing were just some of the cutting-edge technologies our leaders focused on with a vision of the “factory of the future”.
Perhaps the biggest driving factor now spearheading change isn’t the C-level executive, but rather a global pandemic. The coronavirus has challenged us to quickly adapt to an ever-changing workplace. From remote working and collaboration to staying in touch and keeping a sense of ‘normality’, all aspects of our daily work routines have been affected.
Tuesday, April 7th, 2020
Digital Twins make it possible to simulate the behavior of physical assets, to monitor them during operation and to improve them continuously. The data and models from planning and development form the context in which the operating data can be interpreted correctly. Compiling these data and models from the wealth of available information is an essential prerequisite for the use of digital twin applications.
The Digital Twin is the digital image of a physical object or system, which can be a product or a production plant, but also a company or a process. The Digital Twin connects virtual planning and development models with the real product or production world in order to give people better insight into the system and its condition or behavior. One vision in the sense of Industrie 4.0 is to enable technically complex systems to control themselves autonomously and behave more intelligently through digital algorithms, virtual models and status information.
The functional relationships of a product or a production plant are defined during product planning and development on the basis of customer requirements and in consideration of a multitude of legal requirements. Without knowledge of these interrelationships, the operating data that the real asset captures and provides later in its product life cannot be interpreted correctly. If you do not know how a machine or system is actually supposed to function, it is not possible to identify beyond doubt the causes of deviations from this target state or behavior and take appropriate countermeasures. At the same time, knowledge of the asset’s design history is important in order to be able to assess why, for example, a bearing has failed and which other machines could be affected by the same problem.
This connection between the real asset and the development and planning models describing its history is called a digital thread. It is the digital “red thread” that links the information about a real product instance across processes and IT systems. On the one hand, this makes it possible to bring together all the information from the life cycle of the product instance or the real asset and thus forms the basis for the creation of a Digital Twin. Without a digital thread, the digital twin can be reproduced manually, but it is difficult or impossible to keep it up to date. On the other hand, traceability along the Digital Thread allows decisions in development and production to be questioned and optimization potential to be identified with the help of the operating data.
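To make the concept a little more tangible, the following minimal sketch models a digital thread as a chain of links from one real product instance to artifacts in different IT systems. All class, field and system names are invented for this illustration and are not taken from any specific PLM product:

```python
from dataclasses import dataclass, field

@dataclass
class ThreadLink:
    """One artifact referenced by the digital thread (names are illustrative)."""
    system: str        # source IT system, e.g. "PLM", "ERP", "IoT platform"
    artifact_id: str   # identifier of the artifact within that system
    kind: str          # e.g. "requirement", "CAD model", "BOM item", "sensor log"

@dataclass
class DigitalThread:
    """Links all information about one real product instance across systems."""
    serial_number: str
    links: list = field(default_factory=list)

    def trace(self, kind: str):
        """Follow the thread back to all artifacts of a given kind."""
        return [link for link in self.links if link.kind == kind]

# Tracing a bearing failure back to the models that describe the asset:
thread = DigitalThread(serial_number="ASSET-4711")
thread.links += [
    ThreadLink("PLM", "REQ-102", "requirement"),
    ThreadLink("PLM", "CAD-889", "CAD model"),
    ThreadLink("ERP", "MAT-553", "BOM item"),
    ThreadLink("IoT platform", "LOG-2020-04", "sensor log"),
]
print(thread.trace("CAD model"))   # the design context for interpreting the logs
```

Querying the thread for a given kind of artifact, as in the last line, is exactly the kind of backward trace that allows operating data to be interpreted in the context of the original design models.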
Management of product configurations
From a PLM point of view, the starting point of the digital twin is a specific configuration of the product or production system, for example the asset in its delivered state. This includes not only mechanical, electrical/electronic and software components with their models, but perhaps also service-relevant information, such as the service life of certain components. Bringing this information together and maintaining it manually is time-consuming and error-prone, especially since the configuration changes over the course of the product’s life, whether through software updates or other measures in the context of maintenance or further development of the asset. The expectation of today’s PLM systems is to automatically extract the configuration for the Digital Twin and keep it up-to-date.
We speak here of the concept of Configuration Lifecycle Management (CLM), which makes it possible to generate temporally valid views of the product across IT system boundaries and to manage product configurations across all phases of the product lifecycle. The main function of CLM is to create the various views of the digital product model, keep them consistent during the life cycle, and document their validity over time. To do this, it uses cross-system and cross-discipline baselines. These baselines document the state of the configuration at a certain point in time or maturity level and thus also control the representation of the Digital Twin. They enable companies to answer, immediately and reliably, at any point in the process, whether and how the product or asset meets the requirements placed on it, and in what state the asset was at a defined point in time, for example which product configuration was delivered to the customer.
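As a hedged illustration of the baseline idea, the sketch below answers the question “which configuration was valid at a given point in time?” by picking the most recent baseline preceding that date. The data model is deliberately simplified and invented for this example:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Baseline:
    """A cross-system snapshot of a product configuration at one point in time."""
    valid_from: date
    items: tuple           # (component, revision) pairs frozen in this baseline

baselines = [
    Baseline(date(2019, 6, 1), (("body", "A"), ("firmware", "1.0"))),   # as delivered
    Baseline(date(2020, 2, 15), (("body", "A"), ("firmware", "1.1"))),  # after update
]

def configuration_at(when: date) -> dict:
    """Return the configuration that was valid at a given point in time."""
    valid = [b for b in baselines if b.valid_from <= when]
    if not valid:
        raise ValueError("no baseline exists before this date")
    return dict(max(valid, key=lambda b: b.valid_from).items)

# Which configuration was delivered to the customer, i.e. valid on 2020-01-01?
print(configuration_at(date(2020, 1, 1)))   # {'body': 'A', 'firmware': '1.0'}
```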
In order to manage the configuration of a product along its entire life cycle in a traceable manner, the use of a powerful PLM integration platform with connectors to all IT systems involved is required. As an intermediate layer spanning all IT systems, it creates the prerequisite for bringing together the information from the individual IT systems in a way that corresponds to the digital thread concept.
Cross-company collaboration
In industries such as mechanical and plant engineering or shipbuilding, companies face the challenge that the manufacturer who builds and provides the Digital Twin is not necessarily the operator and user who feeds it with operational data. Both the digital data and the operating data, or at least part of it, must therefore be exchanged and synchronized across companies in order to keep the Digital Twin up to date and to be able to use the operating data for the continuous improvement of real assets. Questions such as data security, protection of intellectual property and ownership of the data therefore play a very central role in the development and use of a digital twin application.
More and more customers today require their suppliers to deliver digital data and models to support Digital Twin applications along with the physical assets. CLM can be used to control not only the amount of information provided, but also the level of detail of the information and the formats in which it is delivered. This information can be compiled largely automatically and made available to the customer as a data package, for example in 3D PDF format.
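What such an agreed delivery scope might look like in machine-readable form is sketched below; every key and value is an assumption made for illustration and does not correspond to any actual CLM product format:

```python
# Illustrative exchange agreement between manufacturer and operator;
# all keys and values are invented for the sake of the example.
delivery_scope = {
    "disciplines": ["mechanical", "electrical", "software"],
    "level_of_detail": "assembly",                  # e.g. envelope geometry only
    "formats": {"geometry": "3D PDF", "metadata": "XML"},
    "ip_protection": {"strip_internal_attributes": True},
    "operating_data_backchannel": ["bearing_temperature", "vibration"],
}
```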
In order to maintain digital consistency in cooperation across company boundaries, the exchange partners must first agree on the scope of the information to be exchanged and on common standards for handling it. The central question, however, is where the Digital Twin should live. PROSTEP is convinced that it is advisable to set up a joint collaboration platform for this purpose, which will become part of the information model. This platform will provide customers with the information they need to build their Digital Twin application while the development process is still underway and will also allow them to synchronize changes to the master models during operation if necessary. The common platform can also be used to link parts of the operating data required by the manufacturer for new service offers, such as predictive maintenance or product improvements, with the Digital Thread.
Three building blocks for the Digital Twin
The foundations for the Digital Twin are already laid in product development and production planning. To bring it to life and keep it alive, the digital umbilical cord must not be cut. This is why an integration platform is needed that makes the digital information from the various authoring and data management systems available at any time. A powerful configuration management system that manages the relationships between the information scopes and their validity is essential for building a Digital Twin. However, digital consistency is not a one-way street. In order to derive maximum benefit from the product twin in terms of closed-loop engineering, traceability between the Digital Twin and the Digital Thread must be ensured. The creation of a collaboration platform maintains digital consistency even beyond company boundaries.
By Lars Wagner
Friday, April 3rd, 2020
In one fell swoop, the robotics and automation specialist KUKA has migrated its SAP installation, introduced Teamcenter as its new PLM system and reorganized the entire engineering-to-order process. Crucial to the project’s success were the soft PLM migration, during which the legacy and new systems coexisted for a short period, and the consistent cleansing of the data, which KUKA undertook with the assistance of PROSTEP AG. PROSTEP also accompanied KUKA during the changeover to the current Teamcenter version.
KUKA, which is headquartered in Augsburg, is one of the world’s leading suppliers of automation solutions. KUKA offers customers everything from a single source: from robots and cells to fully automated systems and their networking. The company, which was founded over 120 years ago, employs around 14,200 people worldwide and generated revenues of 3.2 billion euros in the 2018 financial year.
The first step in the company’s transformation program, “Power ON KUKA 2020”, was to standardize the process and system landscape in the engineering-to-order (ETO) sector. ETO is the term KUKA uses to describe everything relating to the development of custom-built production systems for the automation of manufacturing processes, in contrast to its configure-to-order (CTO) business involving robotic components and systems. The PLM migration project was driven first and foremost by the ETO sector, as Project Manager Matthias Binswanger affirms. However, the project also had to be synchronized with the consolidation of the global ERP landscape that was taking place at the same time.
KUKA previously had a very heterogeneous ERP and PLM landscape, which was partly due to the increasing scope of the group structures. For example, the ETO specialists in Augsburg worked with a local SAP instance and an older version of the former Eigner system, Oracle Agile e6. After an in-depth system selection process, KUKA decided to implement the Teamcenter PLM system from Siemens Digital Industries Software as the global solution for all its ETO locations.
Teamcenter is intended to support the future product engineering process, including functional engineering, manufacturing process planning and simulation, as well as control engineering change management. To do this, it has to be familiar with the relationships between the mechanical, electrical and fluid components of the functional units (for example valves, sensors and their processes), which were mapped in a separate application in the old world. Changes are part of the ETO sector’s everyday business because the systems are often designed before the products to be manufactured on them are fully defined. “One major challenge is the complexity that results from the sheer volume of changes to thousands of components,” explains Binswanger.
PLM implementation was already underway when KUKA launched the parallel consolidation of the heterogeneous ERP landscape in order to give greater transparency to its project activities. The simultaneous changeover to SAP S/4HANA considerably increased the complexity of the PLM migration, as Binswanger explains: “To introduce the new solutions, we made use of a clear project control mechanism with a flexible, multi-stage project structure that did not previously exist in this form. This went hand-in-hand with changes to the engineering processes and methods, which in turn had repercussions for the PLM landscape and therefore also had a big impact on PLM migration.”
To migrate the PLM system, the project team called on the services of the experts from PROSTEP, who brought to the project not only their PLM expertise and many years of experience in performing migrations but also PROSTEP’s proven OpenPDM integration platform. “There aren’t many companies that have certified connectors to Agile e6 and Teamcenter. As a result, there was really no real alternative to PROSTEP,” explains Binswanger. The PLM consulting and software company also assisted the customer during the cleansing of the master data prior to the start of the migration. When considering this step, it is important to understand that at KUKA, materials, BOMs, etc. are currently created in the PLM system, or in both systems, and then published to the ERP system.
While the changeover to SAP S/4HANA was to follow the “big bang” approach, KUKA chose the soft route for its PLM migration, with the legacy and new systems temporarily coexisting. Although Teamcenter is the target system for the new architecture, the idea was to conclude any open projects in the old PLM environment. Binswanger explains that migrating them all in one fell swoop would have required enormous effort. Agile only works with documents, materials, BOMs and structures, whereas the CAD data is managed using a file-based approach or in containers. Teamcenter, on the other hand, provides interfaces to all the CAD systems, system versions and releases used at KUKA, which means that CAD files in different formats can be stored together with the materials for the first time.
Direct synchronization of the PLM data
The changeover to SAP S/4HANA and the temporary coexistence of the two PLM systems meant that the migration resembled a billiards shot across three cushions. First of all, Agile e6 had to be updated and interfaced with the new ERP system so that materials and BOMs could be correctly linked to the new project structure. It was then necessary to connect the two PLM systems in order to achieve the cross-system synchronization of standard parts, catalog parts and other materials. Binswanger explains why it was not sufficient to simply synchronize them via SAP: “PLM data with no logistical relevance is not published to the ERP system in the first place. However, this data is important for the Teamcenter users so that they can re-use the materials stored in Agile.”
The OpenPDM integration platform provides the basis for PLM data synchronization. It is designed to transfer all the materials between the two system environments and not only the standard and catalog parts. PROSTEP adapted the Teamcenter connector a number of times in order to take account of changes in the data model. All types of document are now also transferred together with the PLM metadata. Automatic quality checks ensure that the documents meet the requirements of the Teamcenter data model. “We have an activity-driven application which automatically synchronizes the data sent to Teamcenter every five minutes, that is to say it creates new materials together with their attributes, structures and documents or updates modified ones,” says Binswanger.
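In principle, such a loop can be pictured as watermark-based polling: each cycle transfers everything modified since the last successful run. In the sketch below, plain dictionaries stand in for Agile e6 and Teamcenter, and all function names are our own invention rather than OpenPDM code:

```python
import time
from datetime import datetime

source_system = {}   # stand-in for Agile e6: id -> (payload, last_modified)
target_system = {}   # stand-in for Teamcenter: id -> payload

def sync_once(since: datetime) -> datetime:
    """Create or update everything modified after `since`; return new watermark."""
    watermark = since
    for material_id, (payload, modified) in source_system.items():
        if modified > since:
            target_system[material_id] = payload   # attributes, structures, documents
            watermark = max(watermark, modified)
    return watermark

def run_sync_loop():
    """Poll every five minutes, matching the cycle described above."""
    watermark = datetime.min
    while True:
        watermark = sync_once(watermark)
        time.sleep(300)
```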
Contrary to the original planning, KUKA decided to actively shut down the legacy system rather than simply phasing it out gradually. This allows the company to save on the high license and maintenance costs involved in operating two systems. In order to meet requirements regarding traceability, the documents relating to long since completed projects also have to be migrated to Teamcenter. Binswanger explains that in order to do this, it will be necessary to relax the quality requirements a little and store the documents uncleansed in a separate archive, where they can be accessed only for reading and printing.
Data selection and cleansing
Due to the simultaneous changeover to SAP S/4HANA, the PLM migration in Augsburg started later than planned but with considerably higher-quality input data. The project team took advantage of the delay to implement a clearly structured, documented OpenPDM-based process for cleansing the data. One clear specification was that, of the 3.3 million data records in the old SAP solution, only those materials that are relevant for future projects should be transferred to the new environment. Therefore, it was first necessary to identify the data that needed to be migrated.
On the basis of over a dozen criteria and taking account of various attributes, PROSTEP calculated the so-called Total Article List (TAL) from the 3.3 million data records in SAP and Agile. The TAL is a list of all the articles that have been ordered or installed in systems, used for service purposes in recent years or are still in stock. It now comprises “only” 1.2 million articles. According to Binswanger, PROSTEP’s ability to resolve the structures and identify the components for any given article was of decisive importance.
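The decisive step, resolving product structures down to their components, can be pictured as a recursive BOM traversal. The following sketch uses an invented toy BOM and illustrates only the principle, not PROSTEP’s actual implementation:

```python
# Toy BOM: article -> list of components (all data invented for illustration).
bom = {
    "cell": ["robot", "conveyor"],
    "robot": ["gripper", "controller"],
    "conveyor": [], "gripper": [], "controller": [],
    "legacy fixture": ["legacy clamp"], "legacy clamp": [],
}

def resolve(article, seen=None):
    """Return the article plus every component reachable through the BOM."""
    seen = set() if seen is None else seen
    if article not in seen:
        seen.add(article)
        for child in bom.get(article, []):
            resolve(child, seen)
    return seen

# Only articles matching the selection criteria (ordered, installed, in stock)
# seed the list; their components are pulled in automatically.
relevant = {"cell"}
tal = set().union(*(resolve(a) for a in relevant))
print(sorted(tal))   # "legacy fixture" never enters the list and is not migrated
```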
The TAL controlled not only the big-bang migration of the SAP data but also acted as master for the selective cleansing and migration of the PLM data. In particular, the repeat parts (standard parts, purchased parts, semi-finished products, etc.) had to be augmented with additional data and classified before being imported into Teamcenter. To do this, KUKA used the software classmate from simus systems together with other solutions. OpenPDM controlled the entire cleansing process, from the extraction of the data to manual or automatic cleansing through to validation of the results, and also generated the corresponding quality reports. A total of approximately 80,000 articles passed through one or other of the programs in the “data washing machine”. Only the data that ultimately met all the quality criteria was automatically imported into Teamcenter.
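A much simplified picture of such a “data washing machine” is sketched below; the cleansing steps, quality checks and field names are invented stand-ins for the real classmate and OpenPDM functionality:

```python
def normalize_designation(record):
    """Cleansing step: trim and standardize the designation field."""
    record["designation"] = record["designation"].strip().upper()
    return record

def has_classification(record):
    """Quality check: repeat parts must be classified before import."""
    return bool(record.get("class"))

records = [
    {"id": "MAT-1", "designation": " hex bolt ", "class": "standard part"},
    {"id": "MAT-2", "designation": "bracket", "class": ""},
]

imported, rejected = [], []
for record in records:
    for step in (normalize_designation,):              # the cleansing stage
        record = step(record)
    if all(check(record) for check in (has_classification,)):
        imported.append(record)                        # goes into Teamcenter
    else:
        rejected.append(record)                        # ends up in the quality report

print(f"quality report: {len(imported)} imported, {len(rejected)} rejected")
```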
In Augsburg, SAP S/4HANA, a new Agile version and Teamcenter all went live on the same day, an important milestone for KUKA. According to Binswanger, PROSTEP, its OpenPDM software platform and its expertise played a key role. KUKA successfully took advantage of the migration project to cleanse its database of unnecessary clutter.
The Teamcenter application was continuously further developed after the go-live. This repeatedly required adaptations to OpenPDM, which PROSTEP implemented in agile sprints. One major challenge was to migrate the documents from ongoing Agile projects because the data models in the two systems are very different. The last hurdle for the time being was the changeover to the new Teamcenter version 12, which required a change of integration platform version. Thanks to PROSTEP’s support, the company was also able to surmount this hurdle without any problems.
By Andreas Hoffmann