Digitalisation refers to the use of technologies to convert information and data into a digital format in order to improve business processes, writes Stephen Webb from Metegrity.
For owner-operators of pipelines, the single largest benefit of this process is the newfound ability to gain actionable intelligence from their data, straight from the right of way. Of course, that ability is contingent upon finding a technology solution that can deliver it.
Operators could be missing out on significant profit potential if they are too reliant on outdated methods for collecting and analysing their documents and data. This includes data issued during all stages of pre-construction, such as front-end engineering design, materials procurement, welding specifications, land permitting and route crossings, in addition to the construction and inspection records generated during pipeline construction.
In fact, the Pipeline and Hazardous Materials Safety Administration (PHMSA) has identified “material, weld and construction quality as a major source of leaks during pre-commissioning hydrostatic pressure tests, the first years of operations and later in life of a pipeline. PHMSA’s findings indicate a need for better quality assurance in the pipeline construction industry.”(1)
Relying on paper-based processes, such as spreadsheets and physical reports that are slowly filtered up the chain of command, operators are unable to receive pertinent information in a timely manner, or to gain any actionable intelligence from the data they do receive. When data is collected from a variety of disparate sources in inconsistent formats, it becomes difficult to properly assess and analyse project data or asset health. In this climate, inspectors wait until the end of their shifts to fill out their physical reports (or else leave the work face, which comes with its own inherent risks), and then hand those off to be filtered up to the decision makers. This directly impedes project efficiency and opens the door to quality issues, as key information from the job is not immediately known. The result is a greater likelihood of lost profitability, safety or quality problems, and even asset failure.
In an effort to reduce the risk posed by poor construction quality, PHMSA created guidelines for a pipeline construction quality management system (QMS), as detailed in API 1177 – Recommended Practice for Steel Pipeline Construction Quality Management Systems.
The key to solving these issues lies in modernising the entire pipeline construction process via digitalisation, utilising technology and software compatible with API 1177.
Rather than wasting time tracking down data from multiple contractors, owner-operators are leveraging software platforms that consolidate all data – across all of the pipeline construction disciplines and stages – into one secure, robust database. Project information can be uploaded to the database in real-time, straight from the right of way.
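The idea of consolidating records from many disciplines into one queryable database can be illustrated with a minimal sketch. The table layout, column names and segment identifiers below are illustrative assumptions, not any specific product's schema:

```python
import sqlite3

# Hypothetical schema: every record, whatever its discipline
# (welding, materials, inspection, ...), lands in one table keyed
# by pipeline segment and tagged with its source discipline.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE construction_records (
        id INTEGER PRIMARY KEY,
        segment TEXT NOT NULL,      -- e.g. a kilometre-post identifier
        discipline TEXT NOT NULL,   -- 'welding', 'materials', 'inspection', ...
        recorded_at TEXT NOT NULL,  -- ISO-8601 timestamp from the field device
        inspector TEXT,
        details TEXT
    )
""")

# Records arriving from different contractors are normalised into one shape.
field_uploads = [
    ("KP-104", "welding", "2024-05-01T09:12:00", "J. Doe", "Weld W-1042 passed visual"),
    ("KP-104", "materials", "2024-05-01T10:30:00", "A. Smith", "Heat no. H7731 verified"),
    ("KP-105", "inspection", "2024-05-01T11:05:00", "J. Doe", "Coating holiday detected"),
]
conn.executemany(
    "INSERT INTO construction_records (segment, discipline, recorded_at, inspector, details) "
    "VALUES (?, ?, ?, ?, ?)",
    field_uploads,
)

# One query now spans every discipline for a given stretch of pipe.
rows = conn.execute(
    "SELECT discipline, details FROM construction_records WHERE segment = ?",
    ("KP-104",),
).fetchall()
```

Once welding, materials and inspection records share one store, a single query answers questions that previously meant chasing several contractors' filing systems.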
Field personnel can access key data right at the job site, enabling immediate decision making for construction-related issues.
Inspectors can update the database with their findings as they go. Key decision makers can access and analyse the data to garner real, actionable intelligence in real-time, empowering them to make and pass down decisions immediately, and to implement preventative measures before asset failure occurs.
With immediate access to information, documenting incidents for regulatory compliance becomes straightforward. For example, if a pipeline leaks, operators can go back through the data, easily trace every step that was taken, who was working on it and when, and then generate a report for regulatory bodies.
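The traceability step described above amounts to filtering the record store by location and assembling a chronological trail. A minimal sketch, with record fields and segment names assumed for illustration only:

```python
from datetime import datetime

# Illustrative records as they might sit in the project database
# (field names are assumptions, not any specific product's schema).
records = [
    {"segment": "KP-210", "step": "Fit-up", "by": "Crew 3", "at": "2023-08-14T08:20"},
    {"segment": "KP-210", "step": "Weld W-3310", "by": "Welder 17", "at": "2023-08-14T09:45"},
    {"segment": "KP-210", "step": "Radiographic test", "by": "NDT Sub A", "at": "2023-08-15T07:10"},
    {"segment": "KP-305", "step": "Hydrotest", "by": "Crew 5", "at": "2023-09-02T13:00"},
]

def incident_report(segment: str) -> str:
    """Assemble a chronological trail of every recorded step on a segment."""
    trail = sorted(
        (r for r in records if r["segment"] == segment),
        key=lambda r: datetime.fromisoformat(r["at"]),
    )
    lines = [f"Incident traceability report for segment {segment}"]
    for r in trail:
        lines.append(f"  {r['at']}  {r['step']}  ({r['by']})")
    return "\n".join(lines)
```

Because every entry already carries who, what and when, the report for a regulator is a query and a format step rather than a weeks-long paper chase.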
By removing the need to gather data from files or paper-based record systems, owner-operators can substantially reduce overhead hours. Predictive capabilities are greatly improved when key information about project status and asset health is immediately known and readily accessible. As the data continues to build within the database, a clear audit trail is formed, and with the right technology, any report or trend analysis can be generated at the click of a button. Once the pipeline is in operation, this data can be utilised for ongoing pipeline integrity management during the asset’s operational lifecycle. In the future, this opens the door to machine learning and, eventually, to artificial intelligence.
Ultimately, digitalisation equates to the ability to instantly gain and access actionable intelligence from pipeline construction data – which helps to improve understanding, drastically reduce response times, enable preventative measures where required, and ultimately reduce catastrophic incidents from the pipeline’s inception through its operational life.
How to leverage digitalisation
With so many different forms of data across so many disciplines in the pipeline construction process, owner-operators might assume that the digitalisation process would be costly and daunting. Previously, that might have been the case. Not anymore. In the past few years, technological innovation in this sector has advanced in leaps and bounds. With an abundance of digital technology available, and experienced consultants ready to perform the heavy lifting, the process of modernising construction projects is more affordable than ever.
The first step will be to collect, consolidate and convert existing data from its current disarray of disparate sources into a single, robust database. To facilitate this, look for a company that provides professional consulting services. They will send in experts to assess the company’s current status, filter through current data, and digitalise all key information into the platform.
From there, the goal will be to implement a comprehensive pipeline enterprise system designed to align business processes with the work being performed in the field. The right system will align and connect data from the right of way to the office in real-time, allowing operators to constantly be aware of the project and garner actionable intelligence from their data. Automating the pipeline construction data with the right tool helps your organisation transition to a simpler, low cost business model. With all project data stored and built up in one database, this also provides the future benefit of helping companies save time researching project issues after the pipeline is in the ground. For companies that are trying to move toward AI and machine learning, it is crucial to implement enterprise software capable of advanced data analytics right from the pipeline’s inception. This will serve as the foundation for AI later on.
Look for a product that facilitates near real-time data capture straight from the right of way onto secure cloud servers for maximised efficiency and security. You should be able to access all information as soon as you sync your device, and then be able to push all project, engineering, material and welding specification revisions out to the inspector. The product should allow for reliability, accountability, and traceability – and it should support management review, management of change, and document and records control.
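The sync-and-push behaviour described above, in which an inspector's device pulls the latest specification revisions each time it connects, can be sketched in a few lines. The document names and the revision-number scheme are assumptions for illustration, not a real product's API:

```python
# Server-side copy of the controlled documents, each with a revision number.
# (Hypothetical names and revisions.)
server_specs = {
    "welding_procedure": {"rev": 4, "body": "WPS-101 rev 4"},
    "coating_spec": {"rev": 2, "body": "CS-07 rev 2"},
}

def sync_device(device_specs: dict) -> list[str]:
    """Pull any document whose server revision is newer than the device copy.

    Returns the names of the documents that were updated, so the
    inspector can see exactly what changed on this sync.
    """
    updated = []
    for name, spec in server_specs.items():
        local = device_specs.get(name)
        if local is None or local["rev"] < spec["rev"]:
            device_specs[name] = dict(spec)  # push the newer revision out
            updated.append(name)
    return updated

# An inspector's device holding an outdated welding procedure and no coating spec:
device = {"welding_procedure": {"rev": 3, "body": "WPS-101 rev 3"}}
changed = sync_device(device)
```

After the sync, the device holds revision 4 of the welding procedure and a copy of the coating spec, and the returned list gives the inspector a visible record of what changed, supporting the traceability and management-of-change requirements mentioned above.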
Consider the most advanced technology platforms to help maximise your investments. Look for innovative technology that delivers advanced analytics. This is what will enable you to run business intelligence on the collected data.
By aligning with a service provider that offers these technologies, you can quickly and affordably modernise your pipeline construction and digitalise all processes. As more owner-operators adopt this approach, it will become increasingly prudent to do so just to remain competitive – never mind the proven return on investment that has already been realised by companies who have taken the leap. The ROI will be realised right at construction, and then even more so over the continued operational lifecycle of the asset.
(1) Niesen, V. and Gould, M. (2017, November). Detecting Pipeline Leaks. ASME Mechanical Engineering, 139(11), pp. 35-39.