Jeff's MCAD Blogging
Jeffrey Rowe has almost 40 years of experience in all aspects of industrial design, mechanical engineering, and manufacturing. On the publishing side, he has written well over 1,000 articles for CAD, CAM, CAE, and other technical publications, as well as consulting in many capacities in the design …
Forget Just Data Interoperability; Remember Data Obsolescence
November 19th, 2015 by Jeff Rowe
For as long as I can remember, CAD/CAM/CAE data (I’ll just refer to it as engineering data) has been saddled with a perpetual problematic issue – interoperability. That is, the ability (or inability) of a product or system whose interfaces are completely understood to work with other current or future products or systems without any restricted access or utility.
The term, interoperability, was originally defined for information technology or systems engineering services to allow for information exchange. A broader definition takes into account organizational factors that impact system-to-system performance. In other words, the tough task of building coherent services for users when the individual components are technically different and managed by different organizations.
Like many things, data translation and true interoperability can be viewed with degrees of success, with no guarantees or absolutes.
Again, for software, the term interoperability is used to describe the capability of different programs to exchange data via a common set of exchange formats, to read and write the same file formats, and to use the same protocols. The ability to execute the same binary code on different processor platforms is not considered by the definition of interoperability. The lack of interoperability can be a consequence of a lack of attention to standardization during the design of software programs. Ironically, interoperability is not taken for granted in the non-standards-based portion of the computing world.
According to ISO/IEC 2382-01, Information Technology Vocabulary, Fundamental Terms, interoperability is defined as follows: “The capability to communicate, execute programs, or transfer data among various functional units in a manner that requires the user to have little or no knowledge of the unique characteristics of those units.”
Note that the interoperability definition above is somewhat ambiguous because the user of a program can be another program and, if the latter is a portion of the set of programs that are required to be interoperable, it might well be that it does need to have knowledge of the characteristics of other units.
Something Beyond Interoperability: Obsolescence
OK, I get it, engineering data interoperability is a huge deal. However, I’ve been amazed that another issue that is at least as big as interoperability is seldom discussed: digital data obsolescence.
Although engineering data interoperability and obsolescence do share some similarities, I’ll focus on the latter – software and some hardware obsolescence.
Digital obsolescence occurs when a digital resource is no longer readable because of its archaic format. Obsolescence can occur because the physical media, the reader required to read the media, the hardware, or the software that runs on it is no longer available.
A prime example of this is the BBC Domesday Project from the 1980s, although its data was eventually recovered after a significant amount of effort. Cornell University Library’s digital preservation tutorial (now hosted by ICPSR) has a timeline of obsolete media formats, called the “Chamber of Horrors”, that shows how rapidly new technologies are created and cast aside.
The rapid evolution and proliferation of different kinds of computer hardware, modes of digital encoding, operating systems, and general or specialized software virtually ensures that digital obsolescence will continue to be a problem in the future.
For example, many versions of office and engineering programs, data-storage media, and standards for encoding images are considered “standards” for some time, but in the end are always replaced by new versions of the software or completely new hardware.
Files meant to be read or edited with a certain program (for example Microsoft Word) will be unreadable in other programs, and as operating systems and hardware move on, even old versions of programs developed by the same company become impossible to use on the new platform. Case in point, older versions of Microsoft Works, before Works 4.5, cannot be run under Windows 2000 or later.
Attention was brought early to the challenges of preserving machine-readable data by Charles Dollar in the 1970s, but it was only during the 1990s that libraries and archives came to appreciate the significance of the problem. The challenges have been discussed, although so far without any obvious solutions other than continual forward-migration of files and information to the latest data-storage standards.
Ideally, file formats should be widespread, backward compatible, upgraded often, and an open format. The National Initiative for a Networked Cultural Heritage cites the following as “de facto” formats that are unlikely to be rendered obsolete in the foreseeable future: uncompressed TIFF, ASCII, and RTF (for text).
To prevent this from happening, it is important to regularly explore and evaluate current technologies and to assess long-term business models.
Any organization that has digital data should assess its data to identify any potential risks for file format obsolescence. The Library of Congress maintains Sustainability of Digital Formats, which includes technical details about many different format types. The UK National Archives maintains an online registry of file formats called PRONOM.
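Such an assessment can begin with something as simple as a survey of the file formats actually sitting in an organization’s archives. As a minimal sketch (the watch-list of “at-risk” extensions here is a hypothetical example, not drawn from PRONOM or the Library of Congress registry), a short script can tally formats and flag candidates for review:

```python
from pathlib import Path
from collections import Counter

# Hypothetical watch-list of extensions an organization might flag as
# at risk of format obsolescence. Illustrative only; a real plan would
# draw on a registry such as PRONOM.
AT_RISK_EXTENSIONS = {".dwg", ".prt", ".dgn", ".wps", ".wk1"}

def survey_formats(root):
    """Tally file extensions under `root` and flag any on the watch-list."""
    counts = Counter(
        p.suffix.lower() for p in Path(root).rglob("*") if p.is_file()
    )
    flagged = {ext: n for ext, n in counts.items() if ext in AT_RISK_EXTENSIONS}
    return counts, flagged
```

The output of such a survey is what feeds the risk assessment: a format held in thousands of files deserves more attention than one held in a handful.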
In its 2014 agenda, the National Digital Stewardship Alliance recommended developing File Format Action Plans: “it is important to shift from more abstract considerations about file format obsolescence to develop actionable strategies for monitoring and mining information about the heterogeneous digital files the organizations are managing.”
File Format Action Plans are documents internal to an organization that list the type of digital files in its holdings and assess what actions should be taken to ensure its ongoing accessibility.
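In practice, each entry in such a plan can be captured as a simple structured record. The field names below are assumptions for illustration, not an NDSA-prescribed schema:

```python
from dataclasses import dataclass, asdict

# A minimal sketch of one File Format Action Plan entry. The fields are
# hypothetical; a real plan would follow the organization's own template.
@dataclass
class FormatActionPlanEntry:
    format_name: str    # e.g. "Microsoft Works 3.0 document"
    extension: str
    holdings_count: int # how many files of this type are held
    risk: str           # e.g. "high" when no current software reads it
    action: str         # e.g. "migrate to RTF", "monitor", "no action"

entry = FormatActionPlanEntry(
    format_name="Microsoft Works 3.0 document",
    extension=".wps",
    holdings_count=142,
    risk="high",
    action="migrate to RTF",
)
```

Keeping entries in a structured form rather than free text makes it straightforward to re-run the risk review whenever holdings or available software change.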
For engineering data, obsolescence describes programming methods and languages that have become outdated. With the rate at which technology is advancing, obsolescence is becoming an increasing problem. Over ever-shorter periods of time, software, engineering data, and hardware grow incompatible with newer technology and lose their ability to be used productively.
For engineering data, obsolescence can refer not only to outdated technology, but data that is no longer accurate because of its age.
There is a widely held perception that information recorded digitally is secure forever. This idea has been reinforced by marketing messages for products like audio CDs, personal computers and digital cameras. This idea is appealing, but nothing could be further from the truth.
The rapid pace of development in computer hardware, operating systems and application software, coupled with the short effective life of most storage media, means that decades of digital data are at risk or already lost. The long-term accessibility and authenticity of digital records can only be assured through proactive preservation measures.
Digital data are subject to three potential forms of obsolescence: the physical media can degrade or become unreadable, the hardware needed to read the media can disappear, and the software or file formats can become unsupported.
Keep in mind that there is nothing wrong with today’s hardware and software, and a new system will probably last for many years. But it absolutely won’t last forever, and when the time comes to repair or replace it, you may be in for a big surprise.
All technologies are evolving rapidly and continuously. That’s why a new system is more capable than anything you could buy just a few years ago. Tolerances are tighter, flexibility requirements are greater, and skilled operators are harder to find, so you need more capability in the software and a simplified interface – all of these evolving requirements are reflected in the components of any system.
Also, needs will be different in a few years, and so will the systems available to meet them. The parts you need to keep the old one running, however, may not be available. For example, some of the basic building blocks of some of today’s still widely-used systems, such as Intel 486 processors, component-mount transistors, and some specialized chips, can be tough to come by. Other things, such as gas-plasma displays, are now listed as environmental hazards that can’t be used at all, even in a like-for-like replacement.
Barring crashes or catastrophic damage, just about any piece of hardware or software from a reputable supplier should deliver 5–10 years of service with routine maintenance and updates (admittedly shorter for software, but not always). That’s a reasonable number to use as a base for prioritizing ongoing requirements and a replacement schedule.
It should be obvious that the older a software system or component is, the less likely parts or direct replacements are to be available. In other words, you should prioritize your plan to deal with your oldest components first because that’s where failure is both most likely to happen, and most likely to cause the greatest disruption.
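That oldest-first prioritization can be sketched in a few lines. The inventory below is hypothetical, used only to show the sorting logic:

```python
# Hypothetical component inventory: (name, year deployed).
inventory = [
    ("PLM server", 2012),
    ("CNC controller PC", 1998),
    ("CAM workstation", 2003),
]

def replacement_priority(items):
    """Return components oldest-first: the oldest are both the most
    likely to fail and the hardest to source replacement parts for."""
    return sorted(items, key=lambda item: item[1])
```

A real plan would weigh more than age alone (criticality, vendor support status), but age is a sensible first sort key.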
So, as vital as interoperability has proven to be, data obsolescence is just as big a concern, because it’s not only important to have the ability to read data, you still need the ability to actually use it. Combined, engineering data interoperability and obsolescence are two of the major challenges of our increasing digital world.
Like it or not, obsolescence is a fact of life, so you need to plan carefully for it now.