Managing data for compliance and a competitive advantage

Paul Boughton

Peter Copley discusses best practice in exploration and production data management

Exploration and Production (E&P) data is retained for many reasons. Companies may retain data to meet increasing regulatory demands, to protect against events that can lead to data loss, or as an aid to future oil exploration decisions.

Intellectual property locked up in data archives may become valuable again over time due to changes in politics, technology or the economic environment. For example, as oilfield licensing rounds are announced and the oil price fluctuates, older reservoir characterisation projects can become attractive once more.

However, for data managers, keeping huge volumes of data available 'online' for E&P departments increases data storage and management costs and introduces impractically long back-up cycles. Moreover, for end users, manoeuvring through huge volumes of data can reduce the efficiency of their work.

Regulations can force oil exploration companies to retain data and auditable records of decision-making processes, but they can also compel companies to relinquish data (for example, under the UK's Data Protection Act).

For the purpose of this article, we can call an E&P asset a 'project', but in practice a single asset could comprise many constituent projects. Many different disciplines are involved in the processing and interpretation of data, and each project comprises results from both geotechnical and office applications.

To effectively archive a project, the results from all applications must be comprehensively captured, while maintaining internal references. The archive must also include descriptive information so that it can be easily found at a later date.

Data managers must consult domain experts to understand what data should be archived, because they are in the best position to understand the relative importance of the component data. Experts can help determine what data is valuable intellectual property and must be included, and what can be safely ignored.

Project data that has no value to a long-term archive should be removed with rules that effectively 'clean up' projects, saving time spent archiving, storing and restoring data, and also reducing storage costs.
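Such clean-up rules can be expressed as simple file-name patterns that mark scratch or regenerable data for exclusion before archiving. The sketch below is illustrative only; the patterns and function names are hypothetical examples, not part of any particular archiving product.

```python
import fnmatch

# Hypothetical clean-up rules: glob patterns for files with no
# long-term value (scratch files, caches, backups).
CLEANUP_PATTERNS = ["*.tmp", "*.scratch", "cache/*", "*.bak"]

def should_exclude(path, patterns=CLEANUP_PATTERNS):
    """Return True if a project file matches any clean-up rule."""
    return any(fnmatch.fnmatch(path, p) for p in patterns)

def clean_project(file_list, patterns=CLEANUP_PATTERNS):
    """Split a project's files into (keep, discard) before archiving."""
    keep = [f for f in file_list if not should_exclude(f, patterns)]
    discard = [f for f in file_list if should_exclude(f, patterns)]
    return keep, discard
```

In practice the pattern list would come from the domain experts mentioned above, so that only data they have judged disposable is ever discarded.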

When to archive

Projects can be scheduled for archiving at completion, or at important milestones. The facility to 'roll-back' a project to an intermediate milestone can be extremely useful in the event of data corruption, or perhaps if a poor processing decision is made.

It is common for oil exploration companies to store digital archives for years or decades, but compliance regulations may specify the minimum retention period required by law.

E&P projects typically reside on multiple file systems and may reference numerous application databases. There could also be data from multiple operating systems: for example, technical data on Unix and Linux machines, as well as reports and spreadsheets from Windows. This diverse environment must be considered so that project storage paths and intra-project references are maintained for future restores.
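One way to preserve intra-project references across operating systems is to record each file's location against a logical mount label, so that a restore can remap paths on the target platform. A minimal sketch, assuming a shared coordinate between systems; the mount labels and paths are hypothetical:

```python
# Map logical mount labels to their locations on different platforms.
# Labels and paths here are hypothetical examples.
MOUNTS = {
    "seismic": {"unix": "/data/seismic", "windows": r"D:\seismic"},
}

def to_logical(path, platform="unix"):
    """Replace a platform-specific prefix with a portable label."""
    for label, roots in MOUNTS.items():
        root = roots[platform]
        if path.startswith(root):
            rest = path[len(root):].lstrip("/\\").replace("\\", "/")
            return label + "://" + rest
    return path  # no known mount: store the path as-is

def to_physical(logical, platform):
    """Resolve a logical path back to a path on the target platform."""
    label, _, rest = logical.partition("://")
    root = MOUNTS[label][platform]
    sep = "\\" if platform == "windows" else "/"
    return root + sep + rest.replace("/", sep)
```

Storing the logical form in the archive means a project written from a Unix host can later be restored onto a Windows file server without breaking its internal references.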

Dependable project archiving can only be achieved by understanding each application's unique requirements. First, project data structures must be understood, and the process for archiving these researched. The most appropriate information to describe a project dataset (the Metadata) must be identified so that archives can be properly located in the future.

Many mainstream geotechnical applications provide utilities that are designed to capture their project data in a manner that maintains consistency and ease of restore. Where possible, an archiving solution should harness these utilities to store all project information and index all Metadata from each application in a single, central archive location.

Central location

There are a number of advantages to having a central archiving solution for all application-based and unstructured data (individual files).

At this central location, a detailed search index of all archives can be made available containing each archive's associated Metadata. When Metadata is properly captured, this central database can be so powerful that in some cases the index description may provide all of the required information without the need to restore any of the actual project data.
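In its simplest form, such an index is a table of archives keyed by descriptive Metadata fields that can be queried without touching the archive media at all. A minimal sketch using SQLite; the field names and sample records are illustrative, not any product's actual schema:

```python
import sqlite3

# Build an in-memory Metadata index; field names are illustrative only.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE archives (
    archive_id TEXT, project TEXT, asset TEXT,
    application TEXT, archived_on TEXT, location TEXT)""")
records = [
    ("A001", "North Field 2004", "North Field",
     "Landmark", "2004-11-02", "tape:T17"),
    ("A002", "North Field 2007", "North Field",
     "Schlumberger", "2007-05-19", "disk:/arch/A002"),
]
db.executemany("INSERT INTO archives VALUES (?,?,?,?,?,?)", records)

def find_archives(asset):
    """Answer 'what do we hold for this asset?' from the index alone."""
    rows = db.execute(
        "SELECT archive_id, application, archived_on FROM archives "
        "WHERE asset = ? ORDER BY archived_on", (asset,))
    return rows.fetchall()
```

Because the query runs against the index rather than the archive media, a data manager can answer many questions without restoring a single byte of project data.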

This also has important benefits for regulatory compliance, because the same index can be used to identify all components that need to be deleted should ownership of data need to be relinquished.

Metadata capture should be standardised and automated to avoid storing inconsistent information that would make searching the archives impossible. By capturing the geospatial characteristics of a project, it is even possible to visually correlate archives against live data to identify all the data that might be held for a specific area.
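Capturing a project's geospatial extent reduces the "what do we hold for this area?" question to a rectangle-overlap test. A sketch under the assumption that each archive's extent is stored as a (min_x, min_y, max_x, max_y) bounding box in a shared coordinate system:

```python
def overlaps(a, b):
    """True if two (min_x, min_y, max_x, max_y) boxes intersect."""
    return (a[0] <= b[2] and b[0] <= a[2] and
            a[1] <= b[3] and b[1] <= a[3])

def archives_for_area(archives, area):
    """Return ids of archives whose extent touches the area of interest."""
    return [aid for aid, bbox in archives if overlaps(bbox, area)]
```

The same test underpins the visual correlation described above: drawing archive extents and live-data extents on one map is simply this overlap check made graphical.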

Maintaining a central repository for archive data also makes storage easier to manage and maintain and consolidates a company's intellectual property in a single location.

Invisible storage

Complex storage solutions should be invisible to the user, so that the user only needs to interact with the archiving solution. In this way, the user need be made aware only of storage failures and can take successes for granted.

Verifying data written to archives is essential to ensure that an archive is an exact copy of the data on disk. While security is seen as a top priority for live projects, it is equally essential that all objects within an archive have proper access restrictions in place. It is also advisable, and often a regulatory requirement, to maintain auditable records of all transactions on the archive.
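Verification typically reduces to comparing a checksum of the source data with one recomputed from the archived copy. A minimal sketch using SHA-256 (the chunked hashing simply bounds memory use on large seismic volumes):

```python
import hashlib

def checksum(data, chunk_size=1 << 20):
    """SHA-256 of a byte string, hashed in chunks to bound memory use."""
    h = hashlib.sha256()
    for i in range(0, len(data), chunk_size):
        h.update(data[i:i + chunk_size])
    return h.hexdigest()

def verify_archive(source_bytes, archived_bytes):
    """An archive is valid only if it is a bit-for-bit copy of the source."""
    return checksum(source_bytes) == checksum(archived_bytes)
```

Recording the source checksum in the archive's Metadata also allows the copy to be re-verified years later, guarding against silent media degradation.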

Archiving solutions

Enigma Data Solutions has applied more than 20 years' experience delivering data management solutions to the world's oil companies. This knowledge is embodied in its PARS (Project Archive and Retrieval System) software suite, which gives data managers a direct link into the mainstream E&P applications from Landmark, Schlumberger and Paradigm, as well as numerous other geotechnical tools used by its customers during reservoir characterisation, so that the important project data and Metadata can be specified directly from each application.

This archiving software automates the archiving of complex oil exploration projects, whilst preserving all of the structured database information, unstructured data files and intra-project references required for a reliable and consistent archive.

Archiving jobs are configured using a step-by-step wizard tailored to the application, so that all of the required data and decision information for each project is preserved. Without automation, it is common for archiving schedules to become sporadic and for the scope of each archive to vary through human error.

Using an archiving solution such as PARS, the process is fully automated from a central administration point, using the archiving settings that have been configured for each application.

Because the capture of application Metadata is both standardised and automated through the step-by-step wizard, a very powerful search index is updated automatically as new archives are added. This search can be used directly by end users in E&P departments to quickly and accurately locate the correct project to restore, without relying on the IT department. This eliminates costly operational delays and allows geoscientists to resume work on an asset in a fraction of traditional restore times.

Platform-independent technology ensures that the software operates across Solaris, Linux and Windows operating systems. The archives can be written to disk or to a broad range of tape and virtual tape storage solutions, and are automatically verified to ensure that the data is successfully stored.


Peter Copley is Managing Director, Enigma Data Solutions, Uckfield, Sussex, UK. www.enigmadata.com
