Archive for April, 2021
Wednesday, April 21st, 2021
The digital twin is an important, if not the most important, enabler for the digital transformation of business processes and the development of data-driven business models. This is why it is the focus of numerous digitalization initiatives in a wide range of industries. Companies, however, face a number of challenges when it comes to implementing the digital twin. One of these challenges, albeit not the biggest, is the fact that their existing PLM capabilities are most likely insufficient for this purpose and need to be expanded.
Everyone is talking about the digital twin, or perhaps I should say almost everyone. At CLAAS, a manufacturer of agricultural technology, the term is deliberately avoided so as not to alienate users with grand-sounding terminology, as Kai Korthals explains in an interview with the PROSTEP Newsletter. For many companies, the term digital twin is still a buzzword that everyone takes to mean something different. Even in the shipbuilding industry, despite its many digital twin projects, there is still no common industry-wide understanding of what a digital twin is, as indicated by a recent survey conducted by PROSTEP.
A concept study that we prepared together with 3DSE for Airbus Defense & Space significantly sharpened my understanding of the digital twin. The findings have been incorporated in a white paper that I recommend you read. A key insight is that there is, or should be, something along the lines of a generic digital twin that accompanies the product or production system throughout all the phases of its lifecycle – from its as-designed/planned/manufactured through to its as-operated/maintained state.
The phase-specific configurations of the digital twin have a shared data basis, which is also used to derive configurations for the use cases to be supported. Creating a special digital twin for each use case would not be a viable solution, as it would create isolated solutions and data silos. The aim must be to keep the digital twin as redundancy-free as possible across all products and variants, which is why the digital twin places much more demanding requirements on configuration management.
From the very start, we need to think about how we can weave the digital twin from the digital thread. I would even go so far as to turn the tables and say that it is ultimately the digital twin that determines the requirements when it comes to end-to-end digitalization. It determines which information we need to link for which use case and with which level of granularity. In my opinion, approaches based on data lakes do not work. It must be possible to connect field data collected while a product is in use to the correct development data in a transparent manner. Establishing these relationships at a later date using semantic searches or AI is, at most, an 80 percent solution and always prone to errors.
The key requirement for the digital twin is access to the “core data” in its “atomic” form. This means that we need to move away from file-based product lifecycle management toward granular access to all the information objects in the product development process. Freezing a bunch of documents at specific baselines might improve auditability, but it does not constitute a digital twin.
Developers need to know the relationships between individual objects, for example in order to understand what impact changing a requirement will have on a particular function, on the costs, on the manufacturing process, etc. Knowing which circuit diagram is affected is of no help, because hundreds of functions can be described in a single circuit diagram. No PLM concept available today provides appropriate support, either in technological or methodological terms. Extending the PLM concept to include additional PLM capabilities is therefore an essential prerequisite for the digital twin and one of the challenges that Airbus is addressing with its Shared digital Enterprise Services.
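To make the idea of granular, linked information objects a little more tangible, here is a minimal sketch in Python of how an impact query could traverse typed links from a requirement to the functions, circuit diagrams and cost items it affects. The object types, IDs and links are purely illustrative assumptions, not an actual PLM schema:

```python
# Minimal sketch of granular, linked information objects. Object types,
# IDs and links are illustrative assumptions, not an actual PLM schema.
from collections import defaultdict

class TraceGraph:
    """Directed graph of typed PLM objects and the links between them."""

    def __init__(self):
        self.types = {}                # object id -> object type
        self.links = defaultdict(set)  # object id -> directly linked ids

    def add(self, obj_id, obj_type):
        self.types[obj_id] = obj_type

    def link(self, src, dst):
        self.links[src].add(dst)

    def impact_of(self, obj_id):
        """All objects transitively reachable from obj_id (breadth-first)."""
        seen, queue = set(), [obj_id]
        while queue:
            for nxt in self.links[queue.pop(0)]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return {(o, self.types[o]) for o in seen}

g = TraceGraph()
for obj_id, obj_type in [("REQ-42", "requirement"), ("FUN-7", "function"),
                         ("CD-3", "circuit diagram"), ("COST-1", "cost item")]:
    g.add(obj_id, obj_type)
g.link("REQ-42", "FUN-7")     # the requirement is realized by a function
g.link("FUN-7", "CD-3")       # the function is described in a circuit diagram
g.link("FUN-7", "COST-1")     # the function drives a cost item
print(g.impact_of("REQ-42"))  # everything a change to REQ-42 touches
```

The point of the sketch is the granularity: the query lands on the affected function and cost item directly, rather than on an entire frozen document.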
However, the biggest obstacles standing in the way of digital twin initiatives are not of a technical nature. For one thing, companies have their own “fiefdoms” with separate system structures and methodologies, which in the short term derive no direct benefit from end-to-end digitalization at the cross-domain level. The initiative should therefore be driven forward strategically by someone above domain level. In addition, many companies today make a lot of money from service-related activities. A digital twin that results in customers needing fewer services is to a certain extent counterproductive. A major problem when it comes to end-to-end digitalization from development through to operation is the change of ownership of the physical product. As a result, manufacturers no longer have access, or have only limited access, to the operating data that would allow them to gain insight into product behavior.
Offering your products as a service provides an elegant solution to this problem. But you might not want to wait that long before you launch your digital twin initiative. We recommend that you tackle concrete projects that offer economic added value as soon as possible. PROSTEP can provide you with effective support in this context. We have the required expertise and a wealth of experience implementing digital twin concepts in a variety of different industries.
By Karsten Theis
Saturday, April 17th, 2021
The products from agricultural machinery manufacturer CLAAS are becoming ever more complex, and this is also true of the associated development processes. Talking to the PROSTEP Newsletter, Dr. Kai Korthals, Head of Digital Product Engineering, explains how CLAAS intends to master this increasing complexity and looks at the role that PDM/PLM is playing in the company’s digitalization strategy.
Question: How important a role are product-service systems playing in CLAAS’s product portfolio now?
Korthals: That depends on what you mean by a product-service system. If you mean supplementary services such as predictive maintenance as part of the after-sales service, or features such as software updates over the air, it is an area of growing importance for us and represents a major challenge.
Question: What new requirements arise from this with regard to product development?
Korthals: In particular, we have to take solution-neutral customer requirements as a basis for integrating software development, balancing processes and methods from the very beginning. Which is why model-based systems engineering (MBSE) is a very important topic for us. In addition, seemingly mundane topics such as the quality of master data, which we have been working on ever since the advent of PDM, are enjoying something of a renaissance. Even the topic of the end-to-end use of 3D, which is not in itself new, is taking on a new dimension. Suddenly, we find ourselves collaborating with game vendors like GIANTS, who use our 3D models for their farm simulators and in return provide us with rendered models for our sales activities. The coronavirus pandemic in particular has increased demand for virtual sales meetings and training sessions with customers, for example, where we use animated renderings to show them how to get into the cab.
Question: What does this mean for your digitalization strategy? Where are the key fields of activity?
Korthals: There are a number of pillars to our digitalization strategy: modeling and connecting with MBSE, visualization, i.e. the issue of digital continuity with a focus on 3D, and validation using simulations, which is an aspect that should not be overlooked in the wider discussion about digital transformation. In other words, the basic topics remain the same as they were ten years ago. What has changed is the way in which these topics and, indeed, the data models are interlinked. I can map these connections using MBSE, but I also need the link to the tasks in project management or to the configuration for production. Which brings us on to the issue of traceability.
Question: Is traceability driven more by the complexity of product development or the legal burden of proof?
Korthals: Traceability remains important in the context of functional safety, but with regard to the product service systems already mentioned or to autonomous systems, mastery of the technical, process-related and organizational complexity is becoming increasingly important. You can’t negate the complexity. Instead, you have to make it manageable. For this, we need MBSE and configuration management throughout the lifecycle in order to make the interrelationships easier to understand.
Question: You just said that MBSE is an important topic for you. What do you see as the main drivers of this approach?
Korthals: There are undoubtedly a number of different drivers, but they can be grouped together under the term complexity. Ultimately, it is the growing proportion of software, the interconnected development of cross-product features and globally distributed development that lead to increasing complexity at the product, process and organizational level.
Question: You get the impression that CLAAS is to a certain extent pioneering the use of MBSE. Is this the case?
Korthals: That is for others to judge, but we are of course represented in a large number of working groups and we see what other companies are doing. So I think I have some justification in saying that we have a very holistic approach to the topic and have already made considerable progress.
Question: To what extent has what you have validated in the Systems Engineering Solution Lab already been implemented in the product development process?
Korthals: The various aspects are at present being rolled out one by one. We are currently rolling out validation management. But we are not migrating all ongoing development projects to the new process and the new tool environment in a big bang, because that would inhibit the projects. Instead, we are introducing it gradually across the projects until we reach a tipping point, as the users in cross-sectional functions have a vested interest in avoiding the use of parallel systems.
Question: Does PDM/PLM still play an important role in your digitalization strategy?
Korthals: Yes, absolutely. Our digitalization strategy has three major pillars. Firstly, we want to digitalize our interaction at the point of contact with the customer and thus make it independent of time and place. The second is the empowerment of our employees, i.e. we want to drive digital transformation as a change process. And the third major pillar is the topic of the digital enterprise, which we break down to the level of Industry 4.0. PDM/PLM is in many cases the enabler that brings together the internal view, the external view and empowerment. Without this foundation, digital transformation simply collapses like a house of cards.
Question: You are working very closely with Dassault Systèmes to implement your digitalization strategy. Is your broad PDM/PLM vision feasible with a monolithic system landscape?
Korthals: Your question is understandable. PLM experts have for years been promoting the idea that monolithic systems are dead. We are aware of the fact that we are to a certain extent placing ourselves in a position of dependency, but we have done very well with Dassault so far. You have to remember that we are not just a customer, but a strategic R&D partner and write user stories for the developers in Vélizy, so our needs are implemented very quickly. Not only that, we don’t source all of our expert systems and authoring systems from Dassault. There are certain environments for software development and simulation that we will not be replacing. At the system level, however, the opportunities offered by a monolithic approach outweigh the risks.
Question: In what use cases are you already using digital twins?
Korthals: We don’t use the term ‘digital twin’ at CLAAS yet, partly because of our experience with the introduction of systems engineering. When we started using it five years ago, we tried to avoid coining some grand, new term without any concrete benefit for the user being apparent, because that simply discourages people. Although we had our strategy in mind, we approached the issue very much on the basis of use cases. And we’re doing something similar with the digital twin. We have plenty of very concrete use cases, for instance moving maintenance documentation to a kinematic DMU to show a service technician in Uzbekistan how to change the oil filter without the need for words. But we don’t refer to this as a digital twin.
Question: How important is the East Westphalia cluster for CLAAS?
Korthals: The cluster is extremely important to us. From our Systems Engineering Solutions Lab, we have started research projects together with Fraunhofer IEM, one of which has been merged into it’s OWL. And then there is the MOSYS project for human-oriented systems engineering, which is funded by the German Ministry of Education and Research. Collaboration with other it’s OWL partners such as Miele allows us to discuss our future needs and system requirements. The research projects have also allowed us to hire additional staff for our Solution Labs, which helps us to become faster.
Question: What needs and system requirements do you see in the future?
Korthals: As I have already said, connecting and visualizing information is currently a big driver for us. It is only this combination that makes complexity really comprehensible and thus manageable. In our Solution Labs, we have found that we can build databases that connect artifacts with one another: requirements with test cases, with architecture models, with CAD models, with circuit diagrams and so on. But the problem is that, at the end of it all, no normal person has any hope of understanding it. That’s why we have to be able to pull out these relationships in a way that is specific to the application and the user and, where possible, visualize them in the 3D model in order to quickly make the complexity understandable across different locations, languages and roles.
Mr. Korthals, thank you very much for talking to us. (This interview was conducted by Michael Wendenburg)
About Kai Korthals
Dr. Kai Korthals has worked for agricultural machinery manufacturer CLAAS since 2014 and is currently Head of Digital Product Engineering. In this role, he and his team are responsible for the CLAAS engineering system, which comprises the engineering processes, methods, data models and applications for CAD, PDM/PLM and systems engineering. Korthals studied industrial engineering, majoring in mechanical engineering and production technology, at RWTH Aachen University. He subsequently completed his doctorate at the RWTH’s machine tool laboratory in the field of production-oriented product development.
Friday, April 9th, 2021
When exchanging CAD data, it is important to ensure the transfer of complete and valid assemblies, which may be used for downstream operations such as conversion. PROSTEP has therefore expanded its OpenDXM GlobalX data exchange platform to include intelligent CAD analysis functions that efficiently analyze and structure assemblies before they are transferred.
CAD Analyst is available to users in their send client and is activated in Windows Explorer via the CAD send function in the right-click context menu. Depending on the selection made by the user, the CAD data is analyzed and structured in such a way that assemblies are transferred for post-processing as separate data sets.
When a CAD file is selected, an intelligent algorithm checks whether the file contains an assembly for which dependent CAD files exist. All the data belonging to an assembly is transferred as a single data set. If multiple files are selected, analysis is performed for each of the files and the corresponding number of data sets is created.
If a directory is selected, CAD Analyst first checks whether the directory includes one or more root elements for assemblies. If this is the case, these assemblies and all dependent components are entered in the send job as separate data sets. Once all the assemblies have been processed, the algorithm repeats the process for all individual files, which are also entered in the job as separate data sets.
If neither method allows CAD Analyst to find all the referenced files, the user receives a warning that the assembly is incomplete and can then decide whether or not to start the transfer. Once the analysis has been completed, a 3D preview can be generated for each data set and viewed in the web browser. This preview file can be included in the transfer, making it possible for the recipient to quickly view the files they receive.
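To illustrate the principle behind this grouping, here is a hedged Python sketch. The reference map stands in for the format-specific CAD dependency analysis, and the function is an illustration of the two-pass approach only, not the actual CAD Analyst implementation:

```python
# Hedged sketch of grouping a user selection into send-job data sets; the
# reference map stands in for the format-specific CAD dependency analysis.
# This illustrates the principle only, not PROSTEP's actual implementation.
from pathlib import Path

def build_data_sets(selection, references, existing):
    """One data set per root assembly plus one per remaining single file.

    selection  -- files the user selected (or a directory listing)
    references -- file -> set of files it depends on (from CAD analysis)
    existing   -- files actually present and available for transfer
    """
    data_sets, consumed = [], set()
    for f in selection:                          # pass 1: root assemblies
        deps = references.get(f, set())
        if deps:
            missing = deps - existing
            data_sets.append({
                "root": f,
                "files": [f, *sorted(deps & existing)],
                "incomplete": bool(missing),     # triggers a user warning
            })
            consumed |= deps | {f}
    for f in selection:                          # pass 2: leftover files
        if f not in consumed:
            data_sets.append({"root": f, "files": [f], "incomplete": False})
    return data_sets

# Example: an assembly referencing two parts, one of which is missing on disk.
asm, p1, p2 = Path("mount.asm"), Path("bracket.prt"), Path("bolt.prt")
sets = build_data_sets([asm, p1], {asm: {p1, p2}}, existing={asm, p1})
print(sets[0]["incomplete"])  # True -> warn that the assembly is incomplete
```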
CAD Analyst supports the analysis of files in widely used CAD formats such as CATIA V5, Creo (Pro/E), Autodesk Inventor, JT, NX, Solid Edge and SolidWorks. There is no need to install CAD software in order to use the analysis and viewing functions. We deliver the libraries needed to do this with the Windows integration for OpenDXM GlobalX. The new analysis module is an optional extension to the basic OpenDXM GlobalX module and can be expanded to include CAD extensions for processing, converting and checking CAD data.
Find out more about creating send jobs with CAD Analyst in this video.
By Daniel Wiegand
Monday, April 5th, 2021
January saw the launch of the joint project “ICT-enabled model-based impact analysis in product development”, or ImPaKT for short, which is being funded with the support of the German Federal Ministry of Education and Research (BMBF). Within the framework of the project, PROSTEP will be extending its OpenPDM family to include a software module for the cross-domain coordination of changes and validating the functionality of the solution together with industry partners.
The more complex and variant-rich products become, the more time-consuming it is for companies to reliably analyze and evaluate the technical and financial impact of changes. The challenges grow when a large number of partners and domains are involved in the product development process. Impact analyses are designed to help companies identify the possible impact of product changes in advance.
A consortium of research institutes, software vendors and user companies, under the leadership of the Heinz Nixdorf Institute at the University of Paderborn, is developing a model- and IT-based approach with the aim of making this type of analysis in product development easier. The joint project, which was launched in January, will run for three years and has a project volume of approximately four million euros.
The number of product variants is constantly growing. Every modified detail means changes in the design and production processes of all the partners involved. When it comes to developing complex products, incomplete and distributed data and knowledge bases, media discontinuities in the information flows, a lack of supplier integration and the large number of variants make engineering change management (ECM) a time-consuming and error-prone process. In the joint ImPaKT project, the consortium partners intend to develop a solution that makes it possible to efficiently analyze the impact of changes on the basis of a comprehensive data and knowledge base, while at the same time making the complexity of variant management more manageable using function-oriented impact analyses.
The integration of mechanical, electronic and software components in a single product requires an interdisciplinary development process. A key objective of the project is the development of a reference architecture for end-to-end model-based system development that links the partial models in the existing data repositories created during the development of mechanical, electrical and software system components and creates a common parameter space for changes. The project partners will develop and implement methods for a fully integrated impact analysis using model-based systems engineering (MBSE) and artificial intelligence (AI) algorithms on the basis of this integration platform. Standards for integrating impact analysis in process management and cross-enterprise collaboration are also to be defined.
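As a rough illustration of what such a common parameter space could look like, here is a minimal Python sketch. The domain, parameter and artifact names are invented for illustration, since the actual ImPaKT reference architecture is still being defined by the consortium:

```python
# Minimal sketch of a "common parameter space" linking partial models from
# different domains. Domain, parameter and artifact names are invented for
# illustration; the actual ImPaKT reference architecture is still evolving.
from collections import defaultdict

class ParameterSpace:
    """Maps shared parameters to the domain artifacts that depend on them."""

    def __init__(self):
        self._users = defaultdict(list)  # parameter -> [(domain, artifact)]

    def register(self, parameter, domain, artifact):
        self._users[parameter].append((domain, artifact))

    def impact_of_change(self, parameter):
        """Artifacts in every domain that a change to the parameter touches."""
        return self._users.get(parameter, [])

space = ParameterSpace()
space.register("shaft_diameter", "mechanical", "CAD model shaft_v3")
space.register("shaft_diameter", "electrical", "motor sizing sheet")
space.register("shaft_diameter", "software", "speed controller limits")

for domain, artifact in space.impact_of_change("shaft_diameter"):
    print(f"{domain}: re-validate {artifact}")
```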
In addition to the HNI, the Institute for Machine Elements and Systems Engineering at RWTH Aachen University, the software companies CONTACT Software, Itemis and PROSTEP, as well as the user companies Eisengießerei Baumgarte, Hadi-Plast Kunststoff-Verarbeitung, Hofmann Mess- und Auswuchttechnik, CLAAS Industrietechnik, Knapheide Hydraulik-Systeme and Schaeffler are involved in ImPaKT.
The software partners will be implementing a demonstrator based on the ImPaKT reference architecture. The industry partners’ primary task will be to validate the suitability of the project results for supporting impact analysis on the basis of three case studies.
PROSTEP is contributing its many years of expertise in system modeling and the development of reference architectures to the consortium project. Building on this architecture, we will be developing certain basic services for performing cross-system impact analyses using artificial intelligence (AI). We will be using our integration platform OpenPDM, which is in use at over 200 customers worldwide, as the basis for implementing the demonstrator. OpenPDM is the world’s leading solution for synchronizing and migrating PLM data and processes in a wide variety of application scenarios and domains.
We intend to expand the software to include essential ALM and ECM aspects within the framework of ImPaKT. Once the project has come to an end, it is intended that the solution, which is designed as a demonstrator, be turned into a commercial product and marketed under the name OpenCLM. Maintenance of the solution is a prerequisite for long-term commercial use of the project results and provides a benefit outside the circle of consortium partners.
By participating in the consortium project, we expect not only to establish interesting contacts with customers and universities and expand our AI expertise but also to gain important impetus for the further development of our OpenCLM solution in the direction of cross-system and cross-domain impact analysis. This is a prerequisite for being able to coordinate changes to complex products with an acceptable level of effort.
By Martin Holland
Saturday, April 3rd, 2021
A growing number of companies are relying on agile approaches when developing their PLM systems to enable faster reactions to new market requirements. At the same time, they often want to outsource development activities to offshore partners for financial reasons. Two new white papers explain how PROSTEP supports customers in using agile methods and in introducing them in the context of nearshoring and offshoring.
Companies in the manufacturing industry must be ready to quickly respond to changing market and customer requirements. Therefore, they need PLM solutions that support this, for example by making the growing dependencies between software and electronics in connected systems more transparent or ensuring traceability for safety-critical functions. New approaches such as model-based systems engineering (MBSE) or virtual validation of system functionalities by means of co-simulations are needed. The entire PLM architecture must be geared towards change.
IT organizations must also adapt in order to reduce the time between the emergence of new requirements and the implementation of working functionality in the PLM systems. Waterfall or V-model approaches are typically not suited to the dynamics required here. Too much time passes between the definition of requirements and their implementation; time during which the developers do not receive any feedback. They run the risk of developing software that fails to meet the needs of the users. Specifications are often cluttered with requirements and are difficult to change. Their implementation is then driven by the contracts rather than by the actual benefits. These and other factors lead to extremely long project runtimes, which can delay the introduction of innovations into productive PLM operations by months and sometimes even years.
A growing number of companies have identified the weaknesses in their existing software development processes and have started introducing agile approaches or are planning to do so. When implementing agile methods, they not only have to decide on a suitable agile model but also find development partners who are able to go along with their agile approach. Furthermore, they have to challenge existing contract models, because in agile approaches, project scope is typically only fuzzily defined at the start of the project.
PROSTEP has been using agile approaches to develop its own software solutions for many years, and as a partner and supplier also brings this experience to bear on customer projects. We are currently involved in agile projects with numerous major customers in the automotive, shipbuilding, and other industries. In many cases, we assume overall responsibility for these projects as general contractor and coordinate subcontractors, be it on site at the customer’s premises or at an offshore partner.
“Our teams combine PLM expertise and hands-on experience with using agile methods. They know the strengths and weaknesses of Scrum, SAFe and other process models from experience gained in the field and can therefore actively help to shape agile transformation at the customer’s site and drive it forward,” says PLM manager Frank Brandstetter. He is the author of PROSTEP’s new white paper, which provides more detailed information about the challenges posed by agile PLM development. (English version available soon.)
The white paper on agile PLM development is complemented by a second white paper in which Rainer Zeifang, Chief Technology Officer Daimler Projects at PROSTEP, reports on his experience with the use of agile methods in nearshoring and offshoring projects. The main driver for outsourcing development activities is the increasing cost pressure to which we and our customers are subjected.
PROSTEP has been working together with selected nearshore and offshore partners on both the development of its own software products and on customer projects for some time now. We also make use of nearshoring internally. For the past year, we have been maintaining a subsidiary in Wrocław, Poland, which uses agile Scrum teams to provide the development team in PROSTEP’s Berlin office with support in the context of software development projects for major automotive customers.
Agile approaches are compatible with nearshoring and offshoring, but they also amplify some of the challenges involved. The partners have to create a common understanding of the customer project and exchange know-how that is generally in the heads of the developers. They need to establish a uniform approach to ensure that the software being developed is consistent and enables a coherent user experience despite distributed teams and long distances. And they must break down obstacles to communication or find new forms of communication that are compatible with agile approaches.
As Zeifang explains, personal contact and interaction are crucial for project success. “At the start of the project in particular, it is important that the key players get to know each other personally in order to exchange know-how but also to understand what makes their counterparts tick, what is important to them, and how they work.” In the new white paper, he answers questions like: What advantages and disadvantages do time differences offer when it comes to agile software development? How should the distributed agile teams be structured? Does nearshoring and offshoring work with all agile process models?
Download the white paper here.
By Joachim Christ