Archive for February, 2019
Monday, February 4th, 2019
PROSTEP is launching Version 9.0 of PDF Generator 3D earlier than planned. The major innovation in this version is support for data exchange and coordination processes in the architecture and construction industries. PDF Generator 3D now also recognizes BIM data in the Revit and IFC 4 formats, which can be converted automatically into 3D PDF documents or made available on the web in HTML5 format.
A large number of companies and disciplines are involved in the planning, execution and management of buildings, and they generally use different software tools. As they work, they are faced with the challenge of reliably collecting a wide range of information and changes over the course of a building’s life and communicating them to all the parties involved. Building Information Modeling (BIM) is a method for collecting and combining the relevant building data using a digital building model.
The new version of PDF Generator 3D supports this method by making it possible to merge building data from different authoring systems, in Autodesk Revit or IFC 4 (Industry Foundation Classes) format, into 3D PDF containers or to convert it into HTML5 format so that all the parties involved can view it using the standard Adobe Reader or a common web browser.
Thanks to the lightweight neutral formats, even extremely large building structures can be converted and displayed reliably. This facilitates end-to-end utilization of 3D BIM information for the purpose of coordinating with property developers, inspection agencies, construction companies and future building operators.
PDF Generator 3D 9.0 also offers new functions for hollowing out geometry, which on the one hand improve performance when displaying model data and on the other protect the design know-how inherent in the models. The software supports not only the new CAD and BIM formats but also the latest versions of all widely used CAD formats, such as SolidWorks 2019 and Solid Edge 2019, as well as the import of JT 10.2 data. The new version of PDF Generator 3D includes more than 130 detail improvements for CAD data conversion. It is now available.
Sunday, February 3rd, 2019
Digital twins make it possible to perform material flow simulations for plant layout and bottleneck analyses. Building digital twins for existing production systems is, however, extremely complicated. As part of the DigiTwin joint project, PROSTEP and three partners are developing a procedure for creating digital simulation models largely automatically from 3D scan data of production systems.
Material flow simulations for bottleneck analyses, plant layout and inventory analyses help improve operational workflows. Up until now, developing corresponding simulation models was extremely complicated, making it difficult for small and medium-sized companies to use them. Digitalization, however, offers new possibilities for simulating and optimizing the real-life situation in production with the help of a digital twin. In the DigiTwin project, the Institute of Production Engineering and Machine Tools (IFW) at the University of Hanover, together with PROSTEP, isb – innovative software businesses and Bornemann Gewindetechnik, are examining how digital twins for existing production systems can be created more easily.
The research project, the full name of which is “DigiTwin – Effiziente Erstellung eines digitalen Zwillings der Fertigung” (Efficient Creation of a Digital Twin for Production), is being funded by the “SME innovation: Service research” initiative of the German Federal Ministry of Education and Research. Within the framework of the project, the partners are developing a service concept for deriving simulation models from scans of the factory floors largely automatically. The idea is to use object recognition to convert, with a maximum of automation, the 3D scan data from production into digital models that can be mapped one-to-one in the simulation software. The aim is to make both the layout of the production facilities and the logic of the production processes transparent.
In the project, PROSTEP is responsible for transforming dumb point clouds of machines, robots, transport equipment, etc. into intelligent CAD models that can then be used to simulate the manufacturing processes. With the help of methods from artificial intelligence and machine learning, the solution uses the point cloud, or the mesh geometry derived from it, to identify similar system components, which are stored in a library together with their CAD models.
It is intended that system components for which there is no equivalent in the library be converted into CAD models and parameterized with the help of feature recognition so that they can be prepared for simulation. This means that the simulation models can easily be adapted to take account of company-specific characteristics. PROSTEP’s data management team will make the services for object recognition, object harmonization and conversion available via the data logistics portal www.OpenDESC.com.
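The library-matching step described above can be illustrated with a deliberately simplified sketch: a crude shape descriptor is derived from a scanned point cloud and compared against a library of known components. The descriptor scheme, the component names and the data are invented for illustration only; a production system such as the one in the DigiTwin project would use far richer, learned 3D features.

```python
import math

def descriptor(points):
    """Crude shape descriptor: sorted bounding-box extents plus point count.
    (Illustrative only; real systems use learned 3D features.)"""
    xs, ys, zs = zip(*points)
    extents = sorted((max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs)))
    return (*extents, float(len(points)))

def match_component(scan_points, library):
    """Return the library entry whose stored descriptor is closest
    (Euclidean distance) to the descriptor of the scanned points."""
    d = descriptor(scan_points)
    return min(library, key=lambda entry: math.dist(d, entry["descriptor"]))

# Hypothetical library of known system components with associated CAD models
library = [
    {"name": "conveyor_segment", "cad_model": "conveyor.step",
     "descriptor": (0.5, 0.8, 2.0, 400.0)},
    {"name": "robot_base", "cad_model": "robot.step",
     "descriptor": (0.6, 0.6, 1.2, 900.0)},
]
```

A matched entry directly yields the stored CAD model, which is what makes the recognized component usable in the simulation software.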
Production systems generally vary from company to company. Company-specific machine configurations and special interlinking logic cannot, of course, be derived directly from the scan data, which is why the scientists at IFW query this information using standardized forms. This minimizes the amount of time and effort needed to adapt the simulation models, thus also ensuring that the concept remains attractive to small and medium-sized companies. The project partners need only a few days to create a digital twin, which can also be adapted quickly in the event of changes to production. As this is a service concept, no programming knowledge is required on the part of the customer.
Friday, February 1st, 2019
In today’s world, agile methods are state of the art when it comes to IT project management. Corporations also use them in large-scale IT projects that they implement together with service companies. The most common agile process model is Scrum, which has to a great extent replaced the conventional waterfall model and its variants. It does, however, require a redefinition of customer/supplier relationships.
In the past, large-scale IT projects involving external service providers were usually executed on the basis of service contracts and billed at a fixed rate. This model is incompatible with agile methods in that there is no longer a clearly defined scope of delivery that could be the subject of formal acceptance and billing at the start of the project. Instead there is a rough target (product vision), a timeframe and a team. This means that the customer’s management must make an investment decision without a detailed cost estimate, rather like a declaration of intent: We will invest the sum of x euros within a defined period of time in order to achieve a specific objective without knowing the detailed results in advance and without having a contractual basis for demanding the result should this become necessary.
This not only requires a change in thinking when it comes to project controlling but also has a direct impact on the remuneration arrangement, as measurable criteria for evaluating supplier performance need to be identified. There are basically two variants: billing for the number of hours worked (time and materials) or for deliverables (agile fixed price). The time-and-materials model is easier to implement from an organizational perspective, but it shifts all the project risk to the client. The deliverables-based model is in principle a fixed-price model with very small billing units (user stories), whose constraints are regulated by a framework agreement. It requires significantly more organizational effort when it comes to acceptance and billing but results in greater transparency and a more even distribution of risks.
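The difference between the two remuneration variants can be made concrete with a small, purely illustrative calculation; the rates, hours and story prices below are invented and do not come from any real project:

```python
def time_and_materials(hours_worked, hourly_rate):
    """Time and materials: the client pays for all hours worked,
    regardless of which deliverables were accepted."""
    return hours_worked * hourly_rate

def agile_fixed_price(stories):
    """Agile fixed price: the client pays only for user stories that
    passed acceptance, each at a price fixed in the framework agreement.
    `stories` is a list of (price, accepted) pairs."""
    return sum(price for price, accepted in stories if accepted)

# Invented figures: one release with three user stories, one rejected
tm_invoice = time_and_materials(hours_worked=480, hourly_rate=120)
afp_invoice = agile_fixed_price([(15000, True), (22000, True), (18000, False)])
```

In the time-and-materials case the invoice is independent of acceptance, which is exactly the risk shift to the client described above; in the agile fixed-price case the rejected story simply does not appear on the invoice.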
Conflicts can become problematic when estimating the effort of user stories because the parties involved are more or less at an impasse. If the supplier estimates the effort for a user story to be significantly higher than the client is willing to accept, neither side can force the other to accept their view of the situation. This means that estimation conflicts must be resolved constructively within the framework of the collaboration. A general feature of agile collaboration models becomes particularly apparent here: They require a high level of mutual trust and great willingness to resolve conflicts constructively. Proponents of agile principles will counter that this applies to the successful implementation of IT projects in general and that agile process models such as Scrum merely demonstrate a methodical way of dealing with them professionally and in a results-oriented fashion.
Scrum is in itself only designed for teams of up to nine people. Industrially relevant software such as PLM systems, however, requires much larger development teams, thus giving rise to the question of how collaboration between a more or less large number of Scrum teams can be organized efficiently.
The approaches proposed in the literature, such as LeSS and SAFe, are typically based on a multi-level model, with operational work taking place on the lower level and coordination on the upper levels. LeSS, for example, aims to minimize overhead. Here, the operational teams send representatives to the upper level teams and the Scrum process is only modified slightly. SAFe, which is currently the most widely used approach, introduces a total of three levels and numerous new roles and events.
There are differing views on how well Scrum can actually be scaled in practice. There is no simple solution to the main problems that arise when coordinating creative work in large organizations. It is, however, becoming evident that, as the size of the project teams increases, binding content-related and technical definitions that are valid for all actors and a deep, shared understanding of the overall objective become critical success factors. Conveying them is more important than finding the supposedly perfect scaling method.
Using an agile process model poses certain risks if more comprehensive digitalization measures are to be implemented in a company’s extensive IT landscape. It could be tempting to launch larger projects without sufficiently clarifying the objectives and constraints and hope that it will be possible to clarify these points during the course of the project. This often leads to expensive dead ends. Unfortunately, the methodology does not provide any clear recommendations as to what should be clarified and defined in advance and with what level of detail. The project participants must therefore make this decision based on their own experience and strike a balance between creative chaos and over-specification à la the waterfall model.
Experience shows that a lack of process knowledge and inadequate analysis of the data model in particular can result in expensive abortive developments, which could be avoided or at least reduced with a little more conceptual preparatory work. It can therefore be concluded that sufficient attention also has to be paid to the process and data model description when using an agile approach in order to ensure that a solid conceptual basis for implementing the desired functionality is available at an early stage. This information should then be communicated to all project participants so that the creative scope that an agile approach creates can also be exploited in the context of the overall objective.
By Norbert Lotter, PROSTEP
Friday, February 1st, 2019
This year, PROSTEP is sponsoring the COMPIT Conference in Tullamore, Ireland, where it will give a presentation on the integration of shipbuilding design tools in the early development phase. Specifically, the new NAPA-AVEVA interface will be presented and the associated improved possibilities it offers for a continuous integration of data and processes in shipbuilding.
Companies in the marine and offshore industries are spoilt for choice when it comes to digitizing their business processes: either they opt for a Best of Suite approach, i.e. an integrated ship design solution from a single vendor that may not cover all the functional requirements of the various disciplines and areas, or they use the best solutions for the various tasks and are then faced with the question of how to integrate the digital tool chain in such a way that data flows as consistently as possible.
Both approaches have their advantages and disadvantages, but in most cases shipbuilders opt for the Best of Breed approach because of their historically evolved IT landscapes. However, this will only be successful if the cost of integrating the tool landscape is lower than the additional effort resulting from the functional limitations of a Best of Suite solution. As a specialist in CAD and PLM integration, PROSTEP will explain to COMPIT participants which requirements they should consider when integrating the tool chain from a business perspective and with a view to the IT organization.
Implementing an interface alone is not enough. First of all, it must be clarified whether the data should flow in one direction or in both, whether all the information required by the target system is available in the source system, whether the two systems' data models are compatible, and whether native data or only geometric representations are required. Also important are aspects such as the use of catalog part libraries, which may have to be standardized. In addition, the questions arise as to how often data synchronization must take place, which data volumes must be synchronized, and whether the exchange process should be permanently monitored. Only then can the search for standard interfaces for importing and exporting data begin.
Using the new NAPA Steel / AVEVA Marine interface as an example, PROSTEP will then explain the challenges of integrating two shipbuilding-specific CAD applications in more detail. PROSTEP presented the interface concept for the first time last autumn at the NAPA User Conference in Helsinki, where it met with great interest.
NAPA Steel is mainly used in the early design phase, e.g. for the calculation of buoyancy, drag and maneuverability. However, most shipyards use AVEVA Marine software to design their vessels’ steel structures and outfitting. Although both are intent-driven systems that do not primarily generate explicit geometry but instead describe how it is created parametrically and via topological relationships to other elements, the data from NAPA Steel cannot be used one-to-one in AVEVA Marine. The different semantics of the two systems must be mapped to each other in such a way that topology definitions, parametric penetrations and other features can be transferred in the best possible quality or rebuilt in the target system.
The special feature of the import strategy developed by PROSTEP is its high error tolerance: the interface is able to transfer even imprecisely defined topology information, albeit at lower quality, e.g. as pure geometry. Users do not have to read through long error reports to understand the quality of individual data items; they can recognize it by means of different color coding. The interface is being developed on behalf of NAPA and, at its current stage of development, already supports the transfer of 80 percent of the components.
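The fallback behavior described above, i.e. transferring intent-based information where possible and degrading to plain geometry where not, can be sketched roughly as follows. The function names, the element representation and the three quality levels are illustrative assumptions, not the actual design of the NAPA-AVEVA interface:

```python
from enum import Enum

class Quality(Enum):
    """Illustrative quality levels, mapped to color codes for the user."""
    PARAMETRIC = "green"   # full intent: parametric definition rebuilt
    TOPOLOGY = "yellow"    # topological relations kept, parameters lost
    GEOMETRY = "red"       # fallback: explicit geometry only

# Hypothetical stub converters; a real interface would call the target
# system's API here and fail on incompatible semantics.
def rebuild_parametric(element):
    if not element.get("parametric"):
        raise ValueError("no compatible parametric definition")
    return f"parametric:{element['id']}"

def rebuild_topology(element):
    if not element.get("topology"):
        raise ValueError("topology incomplete")
    return f"topology:{element['id']}"

def tessellate(element):
    return f"mesh:{element['id']}"

def import_element(element):
    """Try the richest representation first, then degrade gracefully
    instead of rejecting the element with an error report."""
    try:
        return rebuild_parametric(element), Quality.PARAMETRIC
    except ValueError:
        pass
    try:
        return rebuild_topology(element), Quality.TOPOLOGY
    except ValueError:
        return tessellate(element), Quality.GEOMETRY
```

The point of the pattern is that every element is transferred in some form, with the attached quality level telling the user at a glance how much intent survived the conversion.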