Information interoperability and exchange

The shared economy we live in is fueled by collaboration – the exchange of ideas and data. Not all data, though, only good data. And good data starts with the concept of information interoperability. Interoperability is the ability of information to be read and understood in the same way by disparate information consumers – engineering services providers, equipment manufacturers, operation and maintenance companies, consultants, subcontractors, etc. Their effective collaboration in megaprojects is impossible without continuous data exchange.

Data produced in megaprojects is an asset for companies engaged in analyzing megaproject execution and forecasting market trends. Incomplete or ambiguous data may steer development in the wrong direction.

Operation and maintenance of mega-plants generates data streams that are shared with companies offering AI-driven tools for machine health diagnostics and process optimization. None of this is possible unless the data is designed with interoperability in mind.

It would not be an exaggeration to say that information interoperability is the pivot of any engineering software development today.

To be interoperable, engineering information must have a well-defined, open, standard format describing its structure and payload. Unlike the popular doc, xls, and pdf formats, such a format is far more sophisticated, as it describes tangible things like valves, pumps, foundations, insulation, etc. Although these things are described by different formats, they have something in common – they are all structured in a similar way.
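As a minimal sketch of that similar structure (the class and field names here are hypothetical, not taken from any published standard), each tangible item carries an identity, a payload of engineering properties, and explicit relationships to other items:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a structured engineering record: every tangible
# item (valve, pump, foundation, ...) is described the same way - an
# identity, a payload of properties, and relationships to other items.
@dataclass
class EngineeringItem:
    item_id: str                    # unique identity within the exchange
    item_type: str                  # e.g. "valve", "pump", "foundation"
    properties: dict[str, object] = field(default_factory=dict)  # payload
    related_to: dict[str, str] = field(default_factory=dict)     # role -> item_id

valve = EngineeringItem(
    item_id="V-1001",
    item_type="valve",
    properties={"size_dn": 100, "rating_pn": 16, "body_material": "A216 WCB"},
    related_to={"installed_in": "LINE-42", "actuated_by": "ACT-77"},
)
```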

Industry Foundation Classes (IFC) for Building Information Modeling (BIM) are probably the first attempt to capture these similarities and offer a practical way of creating new formats. Today BIM is synonymous with digital information management in the construction industry, which, in my opinion, is more advanced than in other heavy industries.

Today, after more than 20 years of IFC development, one may say that this attempt has failed in every respect: IFC have become neither an industry foundation nor a foundation for any engineering software development. The IFC schema is a collection of obsolete coding practices.

This failure only highlights the complexity of the task. Most of it stems from neglecting data modelling science and practical experience in relational database development, and from the eternal conflict between object-oriented programming and the way objects are represented in databases.

Both sides use the term "object" in different ways: the data object retrieved from a database is in most cases a proxy of a real object that stores data and relationships (!) with other objects. The real object may not be retrievable at all; it may have multiple proxies or partial views serving different purposes. A proxy may even contain some data from another real object.
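To make the distinction concrete, here is a hedged sketch (the record and field names are invented for illustration): the real objects live in the database with their relationships, while the application receives a flattened proxy that borrows data from two of them:

```python
from dataclasses import dataclass

# The "real" objects as stored: data plus relationships to other objects.
@dataclass
class PumpRecord:
    pump_id: str
    design_flow_m3h: float
    motor_id: str          # relationship to another real object

@dataclass
class MotorRecord:
    motor_id: str
    rated_power_kw: float

# A proxy / partial view handed to the application: flattened for one
# purpose (say, a datasheet printout) and carrying data from TWO records.
@dataclass
class PumpDatasheetView:
    pump_id: str
    design_flow_m3h: float
    rated_power_kw: float  # borrowed from the motor record

def build_view(pump: PumpRecord, motor: MotorRecord) -> PumpDatasheetView:
    return PumpDatasheetView(pump.pump_id, pump.design_flow_m3h,
                             motor.rated_power_kw)
```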

Given the above, one may conclude that IFC describe partial, skewed, and detached views which, by definition, can neither be fundamental nor point to a real object. In other words, IFC are just a convention without a prototype implementation, and a convention will hardly be a catalyst for industry digitization.

(The same level of database design proficiency is demonstrated by the authors of ISO 14224 – Collection and exchange of reliability and maintenance data for equipment – one of the critical standards for the oil and gas industry. Despite its impracticality, this standard does a good job, throughout all of its 280 pages, of persuading engineers to go digital.)

The next point in information interoperability is the BIM Product Data Sheet (PDS). It opens the door to automated tendering. Unlike IFC, the PDS is widespread in the process industries (oil and gas, pharma, chemical, water treatment) under a different name – the Equipment Specification Datasheet. As the main document behind a purchase order, it will stay regardless of whether we succeed with interoperability or not. It is the purchase order that defines the priorities in PDS development, not data interoperability.

Can we use the PDS for building an engineering database? Only as a starting point, as a PDS may be neither complete nor accurate. And that is OK.

Incompleteness is what differentiates a novice from a seasoned engineer. The latter always strives to make the PDS incomplete. Surprised?

The reason is rooted in the Fuzzy Identities Problem (FIP) and the Cascading Standards Principle. I mentioned the former in the Design Thinking article with regard to the standardization problem, and the latter in Anatomy of Project Bidding as part of good plant design.

The FIP describes the gap between what a PDS asks for and what is available on the market, a gap created by two phenomena. The first is the perpetual evolution of products, which cannot be captured by a stable-by-definition PDS. The second is the difference between the customer's and the manufacturer's views of a product.

The customer assumes that all product properties can be changed in infinitely small increments. It is a "rubber world" perspective. The manufacturer views the world through the lens of modularization and standardization, which allows only a limited number of property combinations. The rubber world perspective approaches reality only when the number of manufacturers is large.
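A toy illustration of the mismatch (the catalog values below are invented): the customer's request is continuous, while the manufacturer can only offer the nearest standard module:

```python
# Toy illustration of the Fuzzy Identities Problem: the customer's "rubber
# world" request is continuous, the manufacturer's catalog is discrete.
STANDARD_FLOWS_M3H = [50, 80, 125, 200, 315, 500]  # invented modular sizes

def nearest_standard(requested_flow: float) -> float:
    """Return the smallest catalog size that still meets the request."""
    candidates = [f for f in STANDARD_FLOWS_M3H if f >= requested_flow]
    return min(candidates) if candidates else max(STANDARD_FLOWS_M3H)

print(nearest_standard(137.0))  # -> 200: the PDS asked for 137, the market sells 200
```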

The Cascading Standards Principle (CSP) governs the sourcing of data missing from the PDS. If the data cannot be found in the project specification defined by the client, it is searched for in the contractor's corporate standards and guidelines valid for the current project. Next come the corporate standards valid for any project. Then the search moves to local and international standards, followed by industry best practices. If still not found, the data defaults to the Original Equipment Manufacturer (OEM) offering, as the sketch below shows.
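The principle maps naturally onto a prioritized chain of lookups. A minimal sketch (the source names mirror the cascade above; the sample properties are invented):

```python
# A minimal sketch of the Cascading Standards Principle: walk an ordered
# chain of data sources and take the answer from the first one that knows it.
CASCADE = [
    ("project specification", {"flange_rating": "PN16"}),
    ("corporate standards (this project)", {}),
    ("corporate standards (any project)", {"paint_system": "C4"}),
    ("local / international standards", {"noise_limit_dba": 85}),
    ("industry best practices", {}),
    ("OEM offering", {"seal_type": "mechanical"}),  # final default
]

def resolve(prop):
    """Return (source, value) from the first source that defines prop."""
    for source_name, data in CASCADE:
        if prop in data:
            return source_name, data[prop]
    return None  # genuinely missing data

print(resolve("noise_limit_dba"))  # ('local / international standards', 85)
```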

The final question is how to treat PDS nonconformity. It is a twin of the FIP. The straightforward answer is to disqualify the bidder. Unfortunately, it may be the wrong answer. Decision-making theory suggests that, with high probability, nonconformity may indicate an advanced product that the bidder is wrestling to squeeze into the box of old beliefs.

© 2024 crenger.com