
Borehole Data Management

Operators are inundated with vast and growing volumes of digital borehole data as the number of logs, cores, surveys, and petrophysical analyses per well increases. Unconventional shale plays generate data from hundreds or thousands of wells, and advanced downhole tools capture ever more information, including real-time data. In addition, oil and gas companies have thousands of historical wells and legacy data files scattered throughout the organization that must be managed.

Borehole data enters a company from a variety of sources and is a key data type used throughout the reservoir characterization process. Raw data increases in value as it is analyzed and corrected for accuracy, but that value is often not captured for later reuse. Automating the movement of borehole data through the company, from the time it is acquired to the time it is effectively used by technologists, is a key challenge in reducing costs and potential errors.



Built to integrate with your unique environment.

Landmark solves customer challenges with the most comprehensive Information Management portfolio in the industry.  Our solutions are designed to integrate with a company’s diverse application portfolio through a common, open platform.

Managing Borehole Data

Landmark’s borehole data management software efficiently stores, manages, and publishes raw and edited borehole data in one integrated system. New automated workflows built on the DecisionSpace® Platform make this solution unique and enforce data governance roles and practices. The workflow starts when operators and data acquisition companies place the data files to be loaded in a designated location on the company network. New wellbore data is detected at that location, matched to the well master data, quality checked against the operator’s company standards and data rules, then automatically loaded to the database. Quality assessment indicators are associated with each data type in the database to give users confidence that the data is complete and correct.
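The detect, match, quality-check, and load sequence described above can be sketched in a few lines. This is a purely illustrative outline, not Landmark's implementation: the well identifiers, master-data lookup, and data rules here are hypothetical placeholders.

```python
# Illustrative sketch of an automated borehole data loading workflow:
# incoming files are matched to the well master, quality checked, and loaded.
# All identifiers and rules are hypothetical examples.

WELL_MASTER = {"42-501-20130": "SMITH #1", "42-501-20131": "SMITH #2"}

def match_to_master(uwi):
    """Match an incoming file's well identifier against the well master data."""
    return WELL_MASTER.get(uwi)

def quality_check(curves):
    """Apply simple company data rules; return a pass/fail result per rule."""
    return {
        "has_depth_curve": "DEPT" in curves,
        "has_gamma_ray": "GR" in curves,
    }

def load_incoming(files):
    """Process (uwi, curve-list) pairs detected in the drop location."""
    loaded = []
    for uwi, curves in files:
        well = match_to_master(uwi)
        if well is None:
            continue  # unmatched files would be routed to manual review
        loaded.append({"well": well, "curves": curves, "qc": quality_check(curves)})
    return loaded
```

The quality-check results travel with each loaded record, mirroring the quality assessment indicators the text describes.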

Once the data is loaded into the database, it can be selected and quickly downloaded to user applications leveraging the same open platform. In a challenging financial environment, automating repetitive but important tasks allows critical staff to focus on higher-value activities.

Recall™ Borehole »


Data Quality

Landmark delivers a robust, automated data validation tool for borehole workflows. It performs extensive checks on data loaded into the databases and rule-based evaluations of data quality, with associated metrics.

A conventions verification module tests the values of attributes populated in the borehole database. Among the tests performed by this module are standardization of naming conventions for logs and curves, checks that mandatory values are populated and valid, and a review of the completeness of your data sets.
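A conventions check of this kind can be sketched as two small functions: one that maps curve mnemonics onto a standard name, and one that reports missing mandatory attributes. The alias table and attribute list below are invented for illustration; a real deployment would use the operator's own standards.

```python
# Hypothetical conventions-verification sketch: standardize curve mnemonics
# and verify mandatory attributes. Aliases and attribute names are illustrative.

CURVE_ALIASES = {"GAMMA": "GR", "GRC": "GR", "DEPTH": "DEPT"}
MANDATORY_ATTRS = ("uwi", "well_name", "log_date")

def standardize_mnemonic(name):
    """Map a raw curve name onto the company-standard mnemonic."""
    name = name.strip().upper()
    return CURVE_ALIASES.get(name, name)

def check_mandatory(record):
    """Return the mandatory attributes that are missing or empty."""
    return [a for a in MANDATORY_ATTRS if not record.get(a)]
```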

A bulk data verification module thoroughly searches your borehole data for anomalies, such as variances introduced by a data acquisition system or poorly formatted data from an outside source.
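As a sketch of what such a bulk scan might look for, the snippet below flags null-value sentinels and out-of-range samples in a log curve. The sentinel value and range are illustrative assumptions (−999.25 is a common null marker in LAS well-log files), not the module's actual logic.

```python
# Illustrative bulk-data scan: flag null sentinels and out-of-range samples
# in a log curve. Threshold values are assumptions for this example.

NULL_SENTINEL = -999.25  # a common null-value marker in LAS log files

def scan_curve(values, valid_range):
    """Return (index, reason) pairs for anomalous samples."""
    lo, hi = valid_range
    anomalies = []
    for i, v in enumerate(values):
        if v == NULL_SENTINEL:
            anomalies.append((i, "null sentinel"))
        elif not (lo <= v <= hi):
            anomalies.append((i, "out of range"))
    return anomalies
```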

Quality control data rules from petrophysicists and other data experts can be captured in the quality control tool and reused throughout the organization. As data is checked against these rules, the quality scores or results of the checks are stored in the database alongside the data. This provides end users and data managers with the information they need to accurately evaluate the data and reduce risk and uncertainty before using it.
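The idea of capturing expert rules once and storing the resulting scores with the data can be sketched as a dictionary of reusable predicates. The two rules below are invented examples of the kind of check a petrophysicist might define.

```python
# Hedged sketch: expert data rules captured as reusable predicates, with an
# aggregate quality score stored alongside the record. Rules are illustrative.

RULES = {
    "gr_in_range": lambda rec: all(0 <= v <= 300 for v in rec["GR"]),
    "depth_increasing": lambda rec: rec["DEPT"] == sorted(rec["DEPT"]),
}

def score_record(rec):
    """Evaluate every rule and attach the results and an aggregate score."""
    results = {name: rule(rec) for name, rule in RULES.items()}
    score = sum(results.values()) / len(results)
    return {"results": results, "quality_score": score}
```

Because the score travels with the record, downstream users can judge fitness for use without rerunning the checks.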

Recall™ Raven »
DecisionSpace® Data Quality »

 


Data and Application Integration

The differentiating technology for this solution is the DecisionSpace® Platform, which provides the tools needed to connect databases, visualize data, and automate workflows. Users can view data from multiple sources, select the data they need, and move it from the master database to the project databases where they do their work. The DecisionSpace Platform is the foundation for technology used from Exploration to Production and provides consistent data access and tooling across all domains.

DecisionSpace® Integration Server »


Information Management Services

Successful master data management requires more than the right software. It requires a combination of practices, governance, and technology to define trusted data, the rules and relationships around that data, and the processes to keep it current through time. Using a combination of Landmark technology and domain expertise, we apply business rules derived from best practices to build a master or "gold" database using only the trusted parts of all your data sources. The result is a single view of the truest data and the methods you need to keep it current, regardless of where it's stored.
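Building a "gold" record from the trusted parts of several sources is a standard survivorship pattern, which can be sketched as follows. The source names and trust ranking are hypothetical; a real engagement would derive them from the business rules mentioned above.

```python
# Illustrative "gold record" construction: for each attribute, keep the value
# from the most-trusted source that has it populated. Source names are made up.

SOURCE_PRIORITY = ["field_acquisition", "corporate_db", "legacy_archive"]

def build_gold_record(records_by_source):
    """Merge per-source records into one gold record by trust ranking."""
    gold = {}
    attrs = set().union(*(r.keys() for r in records_by_source.values()))
    for attr in attrs:
        for src in SOURCE_PRIORITY:
            val = records_by_source.get(src, {}).get(attr)
            if val not in (None, ""):
                gold[attr] = val
                break
    return gold
```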

Information Management Services »

 
