Geological Context – The Missing Element in the Interpretation Domain
Frans van Buchem, Principal Advisor Geoscience, Halliburton
The identification of time-specific global exploration concepts can help reduce geological subsurface uncertainty and guide strategy. This is demonstrated for the Cretaceous by establishing global sedimentary patterns that can be linked to specific controlling parameters. Within the global temporal framework of third-order sequences, we focus on the frequency and magnitude of relative sea-level change, global sediment volume trends, and climate modeling. The hydrocarbon-rich and well-studied deposits of Cenomanian age are used as a case example to illustrate the global impact on siliciclastic and carbonate reservoir architecture in North America and on the Arabian Plate.
Velocities Are Geology – Interpretation-Based Velocity Analysis
Einar Magerøy, Advisor Reservoir Geophysics and Tone Wadel, Specialist, Depth Conversion, Equinor
A geologically based velocity model is required to estimate surface depths for in-place volume calculations and well placement prognoses – a process that requires expanding velocity modeling capabilities to efficiently treat velocities as geology. Seismic-based PSDM velocity models built for imaging, and traditional mathematical approaches to depth conversion, often lack a proper link between geology and velocities. Traditional velocity modeling mistie analysis often lacks the lateral and vertical well coverage needed for a reasonable statistical analysis of uncertainties. The proposed workflow allows consistency analysis between seismic interpretation and well zonation while interpretation is ongoing. This new and thorough velocity analysis leads to the mapping of previously ignored unconformities, thus improving the geological understanding of structural development. A velocity model was developed in a workflow with these elements, beyond the initial scope of the cooperation contract, after subject matter experts at Equinor met and discussed the workflows to be implemented. During the development phase, an agile approach was used to adjust scope as opportunities presented themselves. In addition to improved conventional geological velocity analysis, the DecisionSpace® Velocity Modeling tool was upgraded to support other valuable workflows, such as multi-linear regression analysis, gross rock volume comparisons, and mistie statistics between different velocity models.
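To illustrate the flavor of the regression-based velocity analysis mentioned above, the sketch below fits the classic V0-k model, v(z) = V0 + k·z, to hypothetical well-derived interval velocities and reports per-well misties. It is a minimal stand-in under invented values, not the Equinor/Landmark workflow itself.

```python
import numpy as np

# Hypothetical well data for one layer: midpoint depth (m) and interval
# velocity (m/s) derived from well markers and checkshots across six wells.
z_mid = np.array([850.0, 910.0, 1005.0, 1120.0, 1230.0, 1310.0])
v_int = np.array([2480.0, 2525.0, 2600.0, 2710.0, 2770.0, 2845.0])

# Fit the classic V0-k model, v(z) = V0 + k*z, by least squares.
A = np.column_stack([np.ones_like(z_mid), z_mid])
(v0, k), *_ = np.linalg.lstsq(A, v_int, rcond=None)

# Residuals at the wells act as a per-well mistie statistic for the layer.
residuals = v_int - (v0 + k * z_mid)
print(f"V0 = {v0:.1f} m/s, k = {k:.4f} 1/s")
print("well misties (m/s):", np.round(residuals, 1))
```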
Reservoir Characterization Improvement Through an Integrated Approach to Capillary Pressure Curve Handling. Case Field: Llanos Orientales, Colombia
Jose Espinosa, Petrophysicist, Ecopetrol S.A.
The capillary pressure setup and correction module in DecisionSpace® Petrophysics software can be used to manage capillary pressure data as input for saturation vs. height modeling; additionally, the corrected curves are used for model construction, associating each curve with a rock type. This module allows the user to process a large number of curves, regardless of the measurement procedure (porous plate, centrifuge, or mercury injection). Thus, it is possible to assign a unique J-function to each of the six rock types, describing capillary behavior as a function of porosity and permeability. By applying this, it is possible to overcome difficulties in water saturation interpretation due to heterogeneity, thin-bed sands, low resistivity contrast, and Rw variations.
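A minimal sketch of the J-function normalization underlying this workflow is shown below, using the standard Leverett form in field units (Pc in psi, permeability in mD, with the usual 0.2166 conversion factor). The sample values and power-law fit are illustrative assumptions, not the module's actual implementation.

```python
import numpy as np

def leverett_j(pc_psi, k_md, phi, sigma_cos=367.0):
    """Leverett J-function in field units (Pc in psi, k in mD); 367 dyn/cm
    approximates sigma*|cos(theta)| for a mercury-air lab system."""
    return 0.2166 * pc_psi / sigma_cos * np.sqrt(k_md / phi)

# Hypothetical corrected MICP curve for one sample of a given rock type.
pc = np.array([2.0, 5.0, 10.0, 25.0, 60.0, 150.0])    # psi
sw = np.array([0.95, 0.72, 0.55, 0.42, 0.33, 0.28])   # water saturation

j = leverett_j(pc, k_md=120.0, phi=0.21)

# Fit a power law J = a * Sw**b (b negative), one fit per rock type.
b, log_a = np.polyfit(np.log(sw), np.log(j), 1)
a = np.exp(log_a)
print(f"J(Sw) ~= {a:.3f} * Sw^{b:.2f}")
```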
Global Patterns in the Geographic and Stratigraphic Distribution of Hydrocarbons
Michael Treloar, Product Owner for Screening Applications, Halliburton
Based on our studies, along with updates in velocity model building using improved technologies and advancements in seismic survey design for better acquisition and enhanced imaging, challenges and risks can be mitigated for successful exploration of the Triassic and Paleozoic reservoirs. Data were gathered from various sources (including well data, petrophysical logs, and seismic) to understand reservoir complexity and heterogeneity. Greater depth of occurrence and poor seismic imaging of the Paleozoic sequences are the main deterrents in pursuing potential leads. Recently, 3D seismic data were analyzed by integrating them with drilled well data, using a seismic sequence stratigraphic approach, to identify Kuwait's paleobasin architecture. Evaluation of 3D seismic data in West Kuwait showed the possible presence of carbonate reefs, talus, and grain flow deposits within the Permian Khuff formation, identifying potential features such as buried structures, alluvial fans, and channel-levee systems. Well data from deep wells drilled in Kuwait indicate that the Permian reservoirs may have gaseous hydrocarbon potential. Well logs from nine deep exploratory and development wells have been used to study petrophysical characteristics and their effect on the reservoir quality of the Paleozoic Khuff and Unayzah formations. Petrophysical log data have been calibrated with core analysis available at some intervals. Data indicate that the Triassic and Paleozoic formations show potential for gas reserves, and work is being done to mitigate challenges and unlock the formations' potential. Improved acquisition, imaging, and velocity model building are underway, some of it using improved DecisionSpace® tools. Acquisition of an advanced suite of PP logs suitable for high-pressure/high-temperature (HP/HT) wells is being planned. Improved drilling and completion programs are also being designed to enable successful drilling, completion, and testing in the future.
Landmark Earth® Engineered Appliance – On-Premise Private Cloud Platform for Geoscience Application Delivery
Mark Daus, Energy Field Director, Dell EMC
Many upstream oil and gas operators have been evaluating and adopting cloud-first strategies as a basis for their broader digital transformation initiatives. Deploying and managing petrotechnical applications in the public cloud can present additional challenges associated with end-user experience, data management, and cost. To improve exploration and production (E&P) application performance and maintain data sovereignty for upstream E&P application workloads, operators have found success in deploying on-premise private clouds. The Landmark Earth® engineered appliance is designed to help streamline private cloud deployment with modern hyper-converged infrastructure, comprising optimized and preconfigured compute and storage hardware tightly integrated with a full scope of E&P software. Upstream operators are successfully adopting this technology to simplify and speed up the deployment of on-premise private clouds in exploration geographies around the world. The Landmark Earth engineered appliance is a successful use case in improving time to production with a modern, optimized solution for E&P application delivery.
Seismic Multi-Attribute Realistic Co-Visualization Techniques – Tools for Multidimensional Analysis and Delineation of Complex Structures
Ignacio Rovira, Geophysicist, Pan American Energy
Fault delineation is one of the first challenges that seismic interpreters must face. When dealing with zones where subsurface characteristics have subtle structural features resulting from complex tectonics, the use of traditional geometrical attributes (e.g., discontinuity) has shown several methodological limitations. Going one step further, the calculation of meta-attributes (attributes derived from other attributes), the application of oriented structural filters, and the use of automatic fault interpretation tools contribute substantially to the accurate positioning of geological structures. In areas with complex geological features, this challenge requires not only general knowledge of the area and its context, but also sound judgment in the selection and combination of seismic attributes. The Multiple Realistic Covisualization (MRC) technique has proved to be a valuable and effective tool for making key decisions in the interpretation stage. With this tool, it has been possible to enhance seismic resolution and, therefore, to identify seismic geometries associated with geological events in areas of prospective interest. Based on a process that combines the power of the AVTHF-attribute routine and MRC techniques, it is possible to define faults of different orders and the relationship between architectural elements and other components of the interpretive model of the subsurface. Finally, the application of these techniques adds value to the seismic data and its subsequent interpretation. These visual improvements are of great interest for understanding subsurface characteristics in order to delineate prospects, or simply to reinterpret an already known area in search of new vertical and/or horizontal limits.
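For readers unfamiliar with the geometrical attributes discussed here, the sketch below computes a simple two-trace, windowed semblance-based discontinuity on a 2D section. It is a didactic stand-in for the proprietary attribute and filter chain (e.g., the AVTHF routine) used in the study.

```python
import numpy as np

def discontinuity(section, win=5):
    """Simple discontinuity attribute on a 2D section (samples x traces):
    1 minus the windowed semblance between each trace and its right
    neighbor. Identical traces give 0; a sharp break approaches 1."""
    ns, nt = section.shape
    out = np.zeros((ns, nt - 1))
    half = win // 2
    for j in range(nt - 1):
        a, b = section[:, j], section[:, j + 1]
        for i in range(half, ns - half):
            wa = a[i - half:i + half + 1]
            wb = b[i - half:i + half + 1]
            num = np.sum((wa + wb) ** 2)               # stacked energy
            den = 2.0 * np.sum(wa ** 2 + wb ** 2) + 1e-12
            out[i, j] = 1.0 - num / den
    return out
```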
AVO Overview and Seismic Screening for a Quick Sweet Spot Evaluation – An Interpreter’s Perspective
Jose Foucault, Senior Geophysical Advisor, Ecopetrol America
For quick prospectivity estimations, geoscientists need a direct hydrocarbon indicator (DHI) evaluation and an overview of the seismic response. It is essential to calibrate and determine amplitude anomalies at a well location as an initial step toward predicting and detecting anomalies throughout a seismic volume. The focus is to identify 3D seismic anomalies matching the characteristics of anomalies calibrated at wells, and to quantify their size and distribution. The workflow tested in this presentation gives interpreters a tool for a quick look at seismic data and for identifying areas of interest during exploration stages. DecisionSpace® Seismic Analysis software has proved to be a strong solution for interpreters to carry out seismic amplitude calibration and to determine the AVO class. Interpreters can execute a fluid substitution process using seismic analysis to define possible responses. With the definition of an expected seismic AVO response at a well, anomaly detection in the reservoir can be achieved. The cross-plot between grid amplitudes and different AVO products allows interpreters to detect anomalies with responses similar to the hydrocarbons calibrated at the well. The cross-plot tool is also useful when two AVO cubes are plotted in an interval of interest. The results of the workflow, with detected anomalies, can be viewed in 3D Viewer or Map View to check the distribution and extent of the anomalies. Visualization of the distribution of the detected anomalies can further suggest, in the highlighted areas, a more active petroleum system or the possible distribution of the reservoir at a specific level. Additionally, the locations of the anomalies can suggest the kinds of traps possible in the area – determining, for example, whether the anomalies terminate against salt or faults, or are just isolated bodies and patterns. Interpreters can then focus on the more significant anomalies and review their relation to a structural map, providing a quick prospectivity evaluation of the reservoir.
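As a simple illustration of the amplitude calibration step, the sketch below fits the two-term Shuey approximation R(θ) ≈ A + B·sin²θ to a hypothetical angle gather to obtain the intercept and gradient used in AVO cross-plotting. The amplitudes are invented; a real workflow would use the DecisionSpace tooling described above.

```python
import numpy as np

def avo_intercept_gradient(angles_deg, amplitudes):
    """Fit the two-term Shuey approximation R(theta) ~= A + B*sin^2(theta)
    at one time sample, returning intercept A and gradient B."""
    x = np.sin(np.radians(angles_deg)) ** 2
    G = np.column_stack([np.ones_like(x), x])
    (A, B), *_ = np.linalg.lstsq(G, amplitudes, rcond=None)
    return A, B

# Hypothetical near-to-far amplitudes at a calibrated well location.
angles = np.array([5.0, 12.0, 20.0, 28.0, 35.0])
amps = np.array([-0.02, -0.035, -0.06, -0.09, -0.12])

A, B = avo_intercept_gradient(angles, amps)
print(f"Intercept A = {A:.3f}, Gradient B = {B:.3f}")
# A < 0 and B < 0 (quadrant III of the cross-plot) is the signature
# commonly associated with a Class III gas sand.
```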
Generating a Complex Fault Framework in a Strike-Slip Area and Its Impact on Prospect Evaluation, Kutei Basin, Indonesia
Haryono Haryanto, Senior Geophysicist, Saka Energi Indonesia
The Sesulu area lies within the strike-slip zone of the Adang and Sepinggan faults, and a complex fault framework is clearly observed in the 3D seismic dataset. In general, fault interpretation is a key step in seismic structural interpretation: it lays the basis for proper quantitative interpretation, inversion, and modeling; it underpins the geological model; and it contributes to subsequent exploration strategies. Yet fault interpretation is a tedious, manual, subjective, and time-consuming process. Current interpretive tools rely entirely on the interpreter, while only utilizing the data qualitatively as a backdrop or an indirect guide. Thus, it becomes a very challenging process. This presentation compares two main fault interpretation methods based on two different seismic attributes: (1) discontinuity, and (2) fault likelihood. The discontinuity attribute measures the similarity between seismic traces in 2D or 3D seismic volumes, so it enhances discontinuous events (e.g., faults). Sharpened discontinuity cubes, run on restricted-azimuth 3D seismic volumes, provide good visualization of fault patterns in selected orientations. The fault likelihood attribute (Hale 2013), which combines several attributes, has been used in recent years; it increases clarity and accuracy so much that automatic fault extraction from the attribute is now possible. The fault likelihood attribute has succeeded in providing the detail of fault framework complexity in the study area and has also helped define the faulted prospects.
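The sketch below shows the core of the fault likelihood definition from Hale (2013), 1 − s⁸ for a smoothed semblance s, as a minimal illustration of why the attribute sharpens fault responses. A production implementation also includes structure-oriented smoothing and scanning over fault dip and strike, which are omitted here.

```python
import numpy as np

def fault_likelihood(semblance):
    """Fault likelihood as defined by Hale (2013): 1 - s**8, where s is a
    structure-oriented smoothed semblance in [0, 1]. Raising s to the 8th
    power suppresses moderate semblance lows, so only sharp discontinuities
    stand out as likely faults."""
    return 1.0 - np.clip(semblance, 0.0, 1.0) ** 8

# s = 0.99 (continuous reflector) -> ~0.08; s = 0.9 -> ~0.57;
# s = 0.6 (sharp break) -> ~0.98, hence the attribute's strong contrast.
print(fault_likelihood(np.array([0.99, 0.9, 0.6])))
```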
Integrated Geoscience Analysis in Mature Fields to Improve Reservoir Management and Production
Giovanny Yepez, Geologist, IGAPO
The Igapo consortium has developed several mature oil fields in Ecuador's Oriente Basin. To fulfill the contractual commitment with Petroamazonas (Ecuador's national oil company), it was necessary to integrate all geoscience disciplines to start drilling the different producing reservoirs at the Pucuna, Palo Azul, Lago Agrio, and VHR oil fields in September 2018, and to increase the production of the assigned areas. These producing reservoirs correspond to siliciclastic sandstones deposited in a tidal environment associated with the Upper Hollin formation, and to the U/T siliciclastic sandstones of the Napo formation. The consulting team carried out a detailed, integrated analysis across geoscience disciplines to identify areas with the best reservoir (petrophysical) characteristics within estuarine facies in the tidal sandstones of the Hollin and Napo formations, which hold commercial remaining reserves. The applied methodology combines structural, stratigraphic, sedimentological, petrophysical, and production studies with reservoir analysis to identify geological uncertainties and risks. Once uncertainties were identified, estimates of the reserves to be drained were made, leading to an increase in production within the analyzed oil fields. The drilling campaign began in September 2018 and ended in June 2019. Nine wells were drilled in four fields, increasing production to around 9,300 BOPD. The success of the drilling campaign was driven by multidisciplinary analysis from the geoscience team, using real-time technology to monitor and correlate wells. The analysis identified uncertainties and geological risks to detect the best areas with remaining reserves, thus maximizing production with adequate field management.
Well Design Concepts Accelerated with the Power Engine from Aker BP: Digital Well Program™
Alexander Brekke, Drilling Engineer – Digitalization, Aker BP
Aker BP and Halliburton have laid the groundwork with their first release of the Digital Well Program™ (DWP) solution, which is designed to enable operators to build a well in a day. DWP allows reservoir engineers, geologists, drilling engineers, geophysicists, completion engineers, and operational geologists to come together in the creation of well construction programs and digital reports with elevated workflow productivity. Typical pain points, such as searching for valid data and copy-pasting data between departments and applications, are solved by integrated dataflow across platforms. With DWP workflow delivery, the well planning team is able to design wells and create a well program through an automated, auto-populated process that complies with any current D&W governing system and technical requirements. The DWP solution offers time savings and great value, featuring the display of auto-populated well designs, with consideration of specific rules according to Aker BP's technical requirements, and integration with third-party applications.
Drilling Software Evaluation and Directional Survey Simulation Project to Determine Accurate Survey Tool Code
Himawan Kartaatmadja, Drilling Engineer, Pertamina Hulu Mahakam
The main objectives of this project are to transfer directional drilling data from 2,024 wells from TOTAL E&P Indonesia's (TEPI's) T-DESK software to the Halliburton Landmark EDM™ database, to evaluate Engineer's Desktop™ (EDT™) software from Landmark (including the COMPASS™, CasingSeat™, StressCheck™, and WellPlan® software solutions), and to determine accurate survey tool codes. Based on the data assessment done by the Landmark team in July 2017, this migration will be done by manual entry into Landmark's Computerized Planning and Analysis Survey System (COMPASS) software, a comprehensive software tool designed for use in directional well design. The data entry will be based on the T-DESK directional drilling data provided by TEPI in Excel format. In line with this, comprehensive QA/QC of the entered data will be performed to ensure the integrity of the migrated data. The drilling data to be QA/QC-reviewed will be determined, and data comparison will be made by VBA script programming. In addition, a Survey Tool Code Bridging study between the T-DESK and COMPASS software systems will be conducted.
A total of 2,426 wellbores have been entered into the EDM database, comprising 1,897 onshore wellbores and 529 offshore wellbores. In addition, a Survey Tool Code Bridging study (ISCWSA and OWSG standards) between the T-DESK and COMPASS systems has been conducted and documented.
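For context, the sketch below shows the standard minimum curvature calculation that directional survey software such as COMPASS applies between survey stations. An independent implementation like this is one way to spot-check migrated station data, though the project's QA/QC was performed via VBA scripts as described above.

```python
import numpy as np

def min_curvature_step(md1, inc1, azi1, md2, inc2, azi2):
    """Minimum curvature method for one survey interval (angles in degrees).
    Returns the delta-north, delta-east, and delta-TVD contributions."""
    i1, a1, i2, a2 = np.radians([inc1, azi1, inc2, azi2])
    dm = md2 - md1
    # Dogleg angle between the two survey stations.
    cos_dl = np.cos(i2 - i1) - np.sin(i1) * np.sin(i2) * (1 - np.cos(a2 - a1))
    dl = np.arccos(np.clip(cos_dl, -1.0, 1.0))
    rf = 1.0 if dl < 1e-12 else (2.0 / dl) * np.tan(dl / 2.0)  # ratio factor
    d_north = dm / 2 * (np.sin(i1) * np.cos(a1) + np.sin(i2) * np.cos(a2)) * rf
    d_east = dm / 2 * (np.sin(i1) * np.sin(a1) + np.sin(i2) * np.sin(a2)) * rf
    d_tvd = dm / 2 * (np.cos(i1) + np.cos(i2)) * rf
    return d_north, d_east, d_tvd

# Example: a build from 10 to 20 deg inclination over 30 m of measured depth.
print(min_curvature_step(1000.0, 10.0, 45.0, 1030.0, 20.0, 45.0))
```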
RTOC Transformation with iEnergy® Tenant, Using SmartDigital® Co-innovation Service
Ricardo Bustos Acosta, Drilling Optimization Team Leader, Ecopetrol S.A.
Ecopetrol is planning to implement a real-time operations center (RTOC) that uses the iEnergy® cloud tenant to provide a single real-time database integration as a strategic vision. The roadmap for this strategy uses visualization, a data warehouse, quality key performance indicators (KPIs) in real time, and dashboards, as well as instructor-led training (ILT), engineering applications, and microservices for daily operational challenges. Using real-time tools and other applications that integrate and synchronize the private cloud with the iEnergy cloud, Ecopetrol expects to implement SmartPlatoon software to deliver descriptive and predictive analytics for its operations, supporting operational decisions and the digital twin vision as an input for Ecopetrol's digital well program. Several challenges, such as implementing RTOC support, overcoming IT restrictions and restrictions on synchronizing data from rigs, and limited user experience in controlling data gathering and visualization, were the key points to overcome in this transition. The digital and operational strategy to transform the RTOC was defined by co-innovation and technology. The focus was to work together to solve these challenges and develop a strategy centered on a roadmap for innovation and operational challenges with real field cases. Workshops were conducted to gather requirements, increase innovation, and collect ideas to solve operational problems. The major areas of focus were to solve the problems with the user interface, user experience, and well monitoring; provide an end-to-end support solution that can be fully operated within the iEnergy tenant; normalize and standardize WITSML data; and remain open to enhancements and evolution.
EDM™ Insights – Improving User Proficiency and Performance in Real Time
Amir Bar, Global Advisor – Knowledge & Processes Analytics, Halliburton
Traditionally, our perspective on drilling design and operation data is very well-centric. We often do not have insight into how the data was created and by whom. The price of not following best practices, for example entering bad data through a typo, can be significant in both cost and safety. Therefore, knowing when something goes wrong is critically important, particularly if it can be done in real time. But the challenge does not end there. If the root cause of the bad data is a lack of knowledge or a failure to follow best practices, then bad data will continue pouring in unless that knowledge gap is closed. In this session, we will present a new method that allows: 1) viewing EDM™ usage data in a user-centric way, 2) analyzing it to identify user knowledge gaps in real time, and 3) acting on it by sending users bite-size learning content so they can apply best practices at the right place and at the right time. It is invaluable for new engineers to become proficient EDM users, and for experienced engineers to become familiar with best practices in a new region or play.
Living on the Edge – Digital Challenges for Extreme Environments
Christopher McMillon, Senior Technical Advisor, Digital Solutions Architect, Halliburton
As real-time systems move from data acquisition into smarter, learning systems, the drilling industry faces new challenges that shape our system designs. The focus of this presentation will be the technology challenges and the milestones to get us from legacy systems, through the digital transformation, to advanced automation systems. Using examples from McLaren Racing, we will also explore systems and designs that face similar problems, and draw lessons that can be applied to our industry.
Well-Architecture-Driven Transverse Shear Stresses – Engineering Challenges in Horizontal Extended-Reach Wells
Sandeep Dhawan, Principal Well Engineer, WellPerform ApS
Well engineering modeling work is presented for a horizontal extended-reach well located offshore Denmark, aiming to resolve the unique engineering challenge of downhole transverse shear stresses on a jointed work string planned for a specific completion application. The in-depth engineering analysis was performed using the DecisionSpace® Well Planning software's torque and drag model with varied well engineering parameters, such as well doglegs and tortuosities, cased-hole friction, well architecture, string configuration and sizing, mechanical properties of string components, well fluids, operating parameters, and the stiff-string torque and drag model. The string loads were analyzed by examining drag forces, side forces, buckling and lockup, string stretch, triaxial stresses, fatigue, and torsional and transverse shear stresses. Issues with transverse shear loading were diagnosed that required further analysis. Large transverse shear stresses were generated by contact-point loading on the jointed work string, with the string traversing the low side of the well at the depth where the casing crossed over from 10¾ inches to 5½ inches. These stresses were primarily generated by high-magnitude normal point loading, over a specific string depth range, due to the combined work string configuration and well architecture. Transverse shear failure of a downhole string is unusual in the industry, with scarce experimental evidence of transverse shear stress limits and no clear guidance from API and other standards. Therefore, safe operating limits for the jointed work string were developed. Further mitigation measures were identified, including changes to the well architecture, to manage these complex well engineering issues, highlighting downhole string shear capabilities that may be uncommon in the industry. This study also highlights the need to determine transverse shear failure criteria, along with safe operating limits as a function of tubular yield strength.
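As a rough illustration of the kind of screening check such a study might start from (not the actual stiff-string analysis), the sketch below estimates peak transverse shear from a side force using the thin-walled-tube approximation τ_max ≈ 2V/A and compares it with a von Mises shear allowable of 0.577 times yield. All loads, dimensions, and the safety factor are assumptions.

```python
import math

def transverse_shear_check(side_force_lbf, od_in, id_in, yield_psi, sf=1.25):
    """Screening check of transverse shear in a jointed work string.
    Assumes the thin-walled-tube approximation tau_max ~= 2V/A and a
    von Mises shear allowable of 0.577 * yield / safety factor; actual
    limits should come from detailed analysis, since API standards give
    no direct guidance for this failure mode."""
    area = math.pi / 4.0 * (od_in**2 - id_in**2)   # steel cross-section, in^2
    tau_max = 2.0 * side_force_lbf / area          # peak transverse shear, psi
    allowable = 0.577 * yield_psi / sf
    return tau_max, allowable, tau_max <= allowable

# Hypothetical contact load near the 10-3/4 in. to 5-1/2 in. casing crossover.
print(transverse_shear_check(side_force_lbf=35000.0, od_in=2.875,
                             id_in=2.259, yield_psi=105000.0))
```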
Applying DecisionSpace® Gun Barrel View for Easy Visualization and Interpretation of Complex Unconventional Well Planning Scenarios
Kane Nile, Geologist, Occidental Petroleum and Stephen Williams, Technical Advisor, Halliburton
Subsurface geological complexities and uncertainties of unconventional plays result in complex well planning scenarios that can be difficult to illustrate for management and investors. Using manual methods to create and annotate these well planning scenarios takes hours, and any updates to the subsurface model require the entire view to be recalculated and rebuilt. Thus, an automatically updatable visualization and interpretation tool for multiple well planning scenarios was required by the operators working in North American unconventional plays. Within DecisionSpace® Geosciences software, a tool for visualization and interpretation of complex well planning scenarios was created to overcome these challenges. The collaborative environment of the DecisionSpace platform has played an important role in the development of this new feature. A Gun Barrel view of complex well planning scenarios automatically measures and displays targets in a section view. This Gun Barrel view dynamically displays the spacing between new wells, existing wells, subsurface hazards, leases, and target zones to compare various development scenarios. The Gun Barrel view has been implemented in Anadarko’s Permian unconventional activities, accelerated by the SmartDigital® process from Halliburton Landmark. With this view, multi-zone targets and well spacing can be easily visualized and interpreted in terms of subsurface asset complexities, thus minimizing preparation time and uncertainty. The Gun Barrel view can also be converted into presentations to show complex well planning scenarios to management and investors to develop multi-year drilling campaigns.
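A minimal sketch of what a gun barrel section conveys is shown below, using matplotlib with invented target coordinates; the actual DecisionSpace Gun Barrel view computes and annotates these spacings automatically from the subsurface model.

```python
import matplotlib.pyplot as plt

# Hypothetical targets: (easting offset ft, TVDSS ft, zone) per well.
planned = [(-1320, 9050, "A"), (-440, 9060, "A"), (440, 9055, "A"),
           (-880, 9240, "B"), (0, 9250, "B"), (880, 9245, "B")]
existing = [(1320, 9070, "A")]

fig, ax = plt.subplots()
for x, z, zone in planned:
    ax.plot(x, z, "go")                       # planned targets
for x, z, zone in existing:
    ax.plot(x, z, "ks")                       # existing producers
# Annotate horizontal spacing along the upper zone row.
row_a = sorted(w for w in planned if w[2] == "A")
for (x1, z1, _), (x2, z2, _) in zip(row_a, row_a[1:]):
    ax.annotate(f"{x2 - x1:.0f} ft", ((x1 + x2) / 2, z1 - 25), ha="center")
ax.invert_yaxis()                             # depth increases downward
ax.set_xlabel("Offset from lease line (ft)")
ax.set_ylabel("TVDSS (ft)")
ax.set_title("Gun barrel view (sketch)")
plt.show()
```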
Cuttings Image Classification in Real Time, Using Machine Learning Applications
Robbert Schuurmans, Senior Drilling Engineer, Wintershall Dea
Cuttings monitoring is an important tool for detecting and preventing problems during the well drilling process. Different concentrations of cuttings, and the shapes of individual cuttings, can indicate issues in the drilling operation, such as wellbore collapse, hole-cleaning needs, and stuck-pipe incidents. Computer vision can help analyze real-time images provided by cameras located on the shakers to identify or prevent drilling issues. Using cameras located on the shakers, and utilizing applicable Landmark technology (such as the DecisionSpace® Integration Server and DecisionSpace Data Quality software platforms), we were able to analyze and classify images based on computer vision. Several models were tested with public data available for machine learning training and testing. Image analytics using machine learning and artificial intelligence (AI) algorithms showed strong correlations, validating the recognition approach. Identification of cuttings images on shale shakers can be a great enabler for real-time classification of wellbore-related problems, and can support assisted lithology interpretation in real time. This can also help provide a drilling roadmap based on formation lithology characterization with a robust and reliable database.
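The sketch below shows one plausible shape for such a classifier: a small PyTorch convolutional network mapping a shaker-camera frame to cuttings classes. The architecture, class labels, and input size are illustrative assumptions, not the model actually deployed.

```python
import torch
import torch.nn as nn

class CuttingsNet(nn.Module):
    """Small CNN mapping a shaker-camera image to cuttings classes
    (e.g., normal, cavings, packing-off risk). Architecture is illustrative."""
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = CuttingsNet()
frame = torch.randn(1, 3, 224, 224)         # one RGB frame from the camera
probs = torch.softmax(model(frame), dim=1)  # class probabilities in real time
print(probs)
```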
Empowering Data-Driven Upstream on Azure
Kadri Umay, Principal Program Manager, Microsoft
In this session, we will discuss how better usage of data can drive higher productivity, what Azure provides as a platform, and examples of the work we are doing with Halliburton Landmark. Oil and gas customers are leveraging cutting-edge technologies, such as deep learning, elastic on-demand hyper-scale compute, and augmented reality, to fuel the next generation of subsurface data management. In this presentation, we will demonstrate the outcomes of our collaboration with the Open Subsurface Data Universe (OSDU) Forum, and take a deep dive into cloud computing innovations from Microsoft and the architectures that make these technologies possible. We will also discuss fundamental cloud technologies that enable on-demand hyper-scale computing and can deliver tremendous benefits for geologists, geophysicists, scientists, and engineers. This session will help participants significantly improve productivity by automating mundane tasks through optimized data usage.
A 4D Small Data Solution in a Deepwater Gulf of Mexico Seismic-Driven History Matching Workflow
Travis Ramsay, Scientific Advisor, Halliburton on behalf of Talos Energy
The challenge was to realize the full value of the Amberjack field, in the deepwater Gulf of Mexico, through the identification, confirmation, and exploitation of bypassed pay: using time-lapse seismic in an integrated multidisciplinary workflow; improving the forecasting capability of simulation models while ensuring consistency with production data and time-lapse seismic; and optimizing existing data, while dealing with missing or limited data, in order to make short- and long-term business decisions. An integrated multidisciplinary 4D closed-loop solution (identification, confirmation, and exploitation) was executed, and a combination of Landmark DecisionSpace® Geoscience, DecisionSpace® Petrophysics, and Nexus® software was leveraged to evaluate asset potential. Data discovery was enabled using a small data engine that analyzes the existing data to determine what additional data are needed. Additionally, time-lapse seismic was linked to reservoir flow simulation through seismic-driven history matching. Geophysics, petrophysics, earth modeling, and reservoir simulation were reconciled in a multidisciplinary workflow. Infill wells were also planned to exploit the identified bypassed pay. We also determined how to increase production through closed-loop workflow integration. As a result, the value of identified bypassed reserves in the shallow reservoirs reached an estimated USD 128 million.
History Matching of Multiple Wells with Actual Downhole ICD Configuration in a Five-Spot Pattern Reservoir Development
Basit Altaf, Reservoir Engineer, ADNOC Offshore
Downhole control devices are widely implemented in fields globally and, given the costs involved in their implementation, a robust reservoir performance forecast is necessary. A prerequisite to a sound reservoir development plan is a robust history-matched reservoir simulation model. This study uses the actual downhole inflow control device (ICD) well configuration in the reservoir simulation model to perform history matching of a greenfield development offshore Abu Dhabi, UAE, and compares the results of this approach with those from traditional approaches. The scope of this study is to examine the differences between the two history match approaches. The conventional approach to history matching well performance is to apply a positive skin factor across the well completions to mimic the effect of the ICDs installed in the well, thus increasing the pressure drop (ΔP) between the formation and the well tubing. In this study, the actual downhole configuration was prepared using well-completion analysis software, followed by the use of a next-generation reservoir simulator to run the full-field reservoir model for the history matching period. As the field is being developed on the principles of digital concepts, continuous high-frequency downhole pressure data are available in both flowing and shut-in conditions. The use of these data, coupled with direct modeling of the ICDs in the simulation model, resulted in a significant improvement in the reliability of the history match, as compared with traditional approaches.
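For intuition on what direct ICD modeling adds over a lumped skin factor, the sketch below evaluates the orifice-type pressure drop across a nozzle ICD joint, ΔP = ρv²/(2C_d²). The nozzle count, diameter, discharge coefficient, and rate are invented; the study itself builds the completion in dedicated well-completion analysis software.

```python
import math

def icd_nozzle_dp(q_m3_per_day, rho_kg_m3, nozzle_d_mm, n_nozzles=4, cd=0.85):
    """Pressure drop across a nozzle-type ICD joint from the orifice
    equation dP = rho * v^2 / (2 * Cd^2); a simplified stand-in for the
    completion model built in well-completion analysis software."""
    area = n_nozzles * math.pi / 4.0 * (nozzle_d_mm / 1000.0) ** 2  # m^2
    v = (q_m3_per_day / 86400.0) / area                             # m/s
    dp_pa = rho_kg_m3 * v * v / (2.0 * cd * cd)
    return dp_pa / 1e5                                              # bar

# 300 m3/d of oil through one ICD joint with four 6-mm nozzles.
print(f"dP ~= {icd_nozzle_dp(300.0, 850.0, 6.0):.2f} bar")
```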
Multiple-Cloud, Multidisciplinary Workflows Are Possible with Standards-Based Dataset Transfer Packaging
Jay Hollingsworth, Energistics
Subsurface workflows for engineering and reservoir modeling involve an increasing scope of technologies, including third-party applications complementing major platform systems. In a cloud environment, this can lead to workflows where data needs to transfer not only between applications but also from one cloud to another. The transfer of large volumes of complex subsurface datasets has been addressed with the industry-developed RESQML standard, which consists of both a comprehensive data and metadata format and a packaging process that bundles all data objects into a single EPC (Energistics Packaging Conventions) file, making the dataset easier to keep track of as it moves from one application to another. RESQML's use of HDF5 addresses cross-platform compatibility. For multi-cloud workflows, the data packaging process produces a file pair for ingestion that can be managed manually, through a scripting process, or within a more sophisticated management process. The file pair is far less likely to be lost or misplaced during a data transfer than a reservoir project consisting of thousands of loose data objects and configuration files. The operational feasibility of a multi-cloud workflow was demonstrated during a pilot project that resulted in a live demonstration at a trade event in 2018. Five applications were operating on one cloud provider's infrastructure, while a sixth required another cloud vendor's environment. As this was a demonstration, the scripting approach was used. A shared storage area in Cloud A was monitored for new content every 5 seconds by a process in Cloud B. Any new file was copied to a shared area in Cloud B. A mirror process was set up in Cloud A, looking for new content in the shared area of Cloud B. For users, this was transparent: the user in Cloud B read the new file, applied processes to the data, and wrote the result. The next user in the workflow, in Cloud A, saw the modified file appear and continued the workflow.
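A minimal sketch of the scripted transfer described above might look like the following, with hypothetical mount points standing in for the two clouds' shared storage areas. The pilot's actual scripts and cloud APIs are not public, so this only mirrors the polling behavior.

```python
import shutil
import time
from pathlib import Path

# Process in Cloud B: poll the shared area mounted from Cloud A every
# 5 seconds and copy any new RESQML file pair (.epc + .h5) across.
SHARED_A = Path("/mnt/cloud_a_shared")   # hypothetical mount points
SHARED_B = Path("/mnt/cloud_b_shared")

seen = set()
while True:
    for src in SHARED_A.glob("*.epc"):
        h5 = src.with_suffix(".h5")
        if src.name not in seen and h5.exists():
            shutil.copy2(src, SHARED_B / src.name)   # copy the pair together
            shutil.copy2(h5, SHARED_B / h5.name)
            seen.add(src.name)
    time.sleep(5)                                    # poll interval from the pilot
```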
Rethinking Earth and Reservoir Modeling: Is the Path Forward White, Black, or Gray?
Jeffrey Yarus, Technology Fellow, Halliburton
Today, subsurface characterization or earth modeling is the construction of a digital twin representing a reservoir or a stack of reservoirs. These digital models have been significant in the development of conventional hydrocarbon resources. In unconventional reservoirs, earth modeling has been challenged due to a lack of data acquisition during horizontal drilling operations. Advanced cloud computing coupled with traditional modeling techniques can enable reliable earth models to be constructed using “black” data-driven models and “white” physics-driven models to create “gray” or integrated models. This presentation discusses emerging technologies for earth modeling, along with their downstream applications, benefits and pitfalls.
Managing Large Amounts of Capillary Pressure Data, Using DecisionSpace® Petrophysics Software to Generate a Comprehensive Saturation Height Model
Jose Espinosa, Petrophysicist, Ecopetrol S.A.
In characterizing Colombian basins, it is challenging to generate a reliable log-based saturation model, due to the basins' heterogeneity, hydrodynamics, high content of feldspar and volcanic rocks, and the presence of fresh water, all of which have important impacts on resistivity measurements. More than 120 mercury injection capillary pressure (MICP) tests and 48 porous plate tests were collected from two fields in different basins, the MMV and Llanos basins. Since time was limited, it was necessary to select a software tool for rapid visualization, management, comparison of different saturation model outputs against previous models, and rock typing. Using the Capillary Pressure module in DecisionSpace® Petrophysics software, it was possible to develop a comprehensive saturation height model, including rock type analysis and calibration of the results against a log-derived saturation model. The first step was to perform data corrections, choosing which correction to apply to each sample. Normally, this process would take about 4 hours or more per sample, but bulk analysis reduced the time involved by more than half. After corrections were made, the visualization process helped associate results with rock type analysis and generate independent models for each rock type. The Saturation vs. Height Curves module in DecisionSpace® Petrophysics software allows users to carry out a sensitivity analysis on the free water level (FWL), comparing it with the log-derived water saturation and thus obtaining the FWL that best physically represents the reservoir. The final result was a comprehensive saturation height model. This model was later used as the water saturation input for 3D modeling of the field, resulting in a better approximation of original oil in place (OOIP). By carrying out a well-to-well FWL sensitivity process, it was possible to identify a tilted water contact caused by hydrodynamic processes acting in the Llanos Basin.
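To make the saturation height logic concrete, the sketch below converts height above the FWL to reservoir capillary pressure, maps it through a per-rock-type J-function, and shows how shifting the FWL changes modeled Sw at a fixed depth. The densities, J-function coefficients, and rock properties are illustrative assumptions, not the field's calibrated values.

```python
import numpy as np

def sw_vs_height(depth_ft, fwl_ft, rho_w=1.05, rho_o=0.80, a=0.25, b=-1.8,
                 sigma_cos=26.0, k_md=150.0, phi=0.20):
    """Height above the FWL -> reservoir Pc -> Leverett J -> Sw via the
    inverted power-law J-function Sw = (J/a)**(1/b). Densities in g/cc;
    0.433 psi/ft is the pressure gradient of water of unit density."""
    h = np.maximum(fwl_ft - depth_ft, 0.0)             # height above FWL, ft
    pc = 0.433 * (rho_w - rho_o) * h                   # reservoir Pc, psi
    j = 0.2166 * pc / sigma_cos * np.sqrt(k_md / phi)  # Leverett J
    sw = np.clip((np.maximum(j, 1e-9) / a) ** (1.0 / b), 0.0, 1.0)
    return np.where(h > 0.0, sw, 1.0)                  # Sw = 1 below the FWL

# FWL sensitivity: a 25-ft shift in the contact changes modeled Sw at 7,350 ft.
for fwl in (7450.0, 7475.0):
    print(f"FWL {fwl:.0f} ft -> Sw = {sw_vs_height(np.array([7350.0]), fwl)[0]:.3f}")
```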
Revitalization of Oil Production in Nezzazat Reservoir, Using Low-Cost Water Dump-Flood in Offshore Environment: Case Study from Gulf of Suez Petroleum
Ahmed Gamal, Senior Reservoir Engineer, Gulf of Suez Petroleum Company – BP JV
The successful natural dump-flood in the Gulf of Suez (GOS) offshore environment provided the operator with numerous lessons learned. Implementing the dump-flood in the subject reservoir delivered benefit without new spending, enhancing offshore practice and maximizing the gain from the abandoned reservoir at minimal capital cost. The only existing Nezzazat oil producer is the A1 well. It originally produced commingled from the Nubia and Nezzazat reservoirs; however, the Nezzazat did not compete well with the Nubia reservoir because of the permeability contrast between the two. Consequently, the well was re-completed as a Nezzazat-only producer in 2007 and flowed at 1,400 BOPD, but reservoir pressure showed a sharp decline. The water dump-flood was initiated by commingling the target and water source reservoirs through the A6 well, allowing water to cross-flow naturally into the pressure-depleted target reservoir. Cross-flow was confirmed, and static pressure data recorded in the producer showed increasing pressure. A Nexus® 3D simulation model was constructed and history matched to evaluate and forecast the impact of dump-flooding in the reservoir after EDFU A6 was shut in due to mechanical problems during the dump-flood process. One MMBO was successfully recovered from the A1 well after the dump-flood of A6 Nubia water began, which led to defining different forecast cases in Nexus software to estimate the value of restarting the flood in the Nezzazat. The dump-flood technique was successfully proven in the GOS and can be applied in other GOS fields that require water injection in areas lacking injection facilities.
DecisionSpace® 365 Full-Scale Reservoir Simulation
Zainub Noor, Senior Product Manager Reservoir Management and Qinghua Wang, Product Owner Reservoir Cloud Solutions, Halliburton
Cloud is changing the customer value life cycle and how solutions are consumed. Investments have risen exponentially as companies have clearly realized the benefits of a lower cost of ownership for infrastructure and have gained more confidence in the security of today's solutions. DecisionSpace® 365 full-scale reservoir simulation is an integrated, cloud-native reservoir simulation service that provides users with more than the capability of running simulation models on the cloud. It is an extensive solution that offers an enhanced user experience through automation and artificial intelligence, allowing engineers to focus on assessing the models, thus creating more value. It integrates the reservoir simulation service with data storage, database management, and other customer-defined libraries for an integrated workflow. This cloud-native reservoir simulation adds multiple benefits, including: 1) uncapped scalability for running models faster or running multiple models in parallel; 2) interactive reservoir simulation, allowing modifications while the simulation is running; 3) integration with Jupyter Notebook, an open-source web application, for model edits; 4) data analytics and intelligence for resource allocation and performance optimization; and 5) mobility and collaboration, so teams can work on models from multiple locations on any device without needing to transfer gigabytes of data or track changes made by multiple users. DecisionSpace 365 full-scale reservoir simulation empowers users to perform advanced reservoir simulation on the cloud, and enables true integrated asset management by modeling subsurface reservoir systems simultaneously with surface facilities in a tightly coupled solution environment. Extensive scenario analysis can be performed using cloud resources on demand to optimize field development and maximize recovery.
Field Gas Lift Distributed Optimization with Automated Control for Holonic Systems
Srinath Madasu, Senior Technical Advisor, and Shashi Dande, Principal Architect, Halliburton
Gas lift is one of the most popular artificial lift methods used in offshore, unconventional, and mature fields. By reducing the density of the fluid column, gas lift lowers the flowing bottomhole pressure, resulting in higher drawdown and inflow from the reservoir to the well. Gas lift is well known to be a very forgiving form of artificial lift. As with any form of artificial lift, surveillance and optimization are required to maximize production and lower operating expenses. This work utilizes a Bayesian optimization approach for gas lift optimization. Bayesian optimization is a powerful framework for finding the optimum value of an objective function that is computationally expensive to evaluate in holonic systems. A holonic system is a hierarchical, distributed multi-agent system. Field-level optimization is a hierarchical system, since the field is composed of reservoirs that, in turn, are composed of wells interacting with each other. The holonic, distributed multi-agent framework that we present uses reservoir models to support multi-well, distributed, multi-level optimization, as well as automated control for field-level production. Bayesian optimization is a stochastic approach to optimizing black-box functions while supporting multi-level optimization without the need for an implicit formulation. This case shows a single objective function to optimize profit in a well cluster with a single gas lift supply. The injected gas is optimally distributed between the wells, using a multi-agent framework to perform the optimization.
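A compact sketch of the single-supply allocation case is shown below, using scikit-optimize's gp_minimize as the Bayesian optimizer over the gas-split fractions. The gas lift performance curves, prices, and well parameters are toy assumptions standing in for the reservoir models the framework actually calls.

```python
import numpy as np
from skopt import gp_minimize

TOTAL_GAS = 6.0   # MMscf/d of lift gas shared by the well cluster

def well_rate(gas, qmax, g_opt):
    """Toy gas lift performance curve: production peaks at g_opt and
    declines beyond it (over-injection). Stands in for a reservoir model."""
    return qmax * (gas / g_opt) * np.exp(1.0 - gas / g_opt)

def negative_profit(x):
    """Split TOTAL_GAS between three wells via two fractions;
    gp_minimize minimizes, so return negative profit in $/d."""
    f1, f2 = x
    g1 = TOTAL_GAS * f1
    g2 = TOTAL_GAS * (1 - f1) * f2
    g3 = TOTAL_GAS * (1 - f1) * (1 - f2)
    oil = (well_rate(g1, 900, 1.8) + well_rate(g2, 650, 2.4)
           + well_rate(g3, 500, 1.2))
    return -(oil * 60.0 - TOTAL_GAS * 1500.0)   # oil revenue minus gas cost

res = gp_minimize(negative_profit, [(0.05, 0.9), (0.05, 0.9)],
                  n_calls=40, random_state=7)
print("best split fractions:", np.round(res.x, 3), "profit $/d:", -res.fun)
```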
Large-Scale Subsurface and Surface Integrated Asset Modeling – Lessons Learned
Shaika Al Jenaibi, Senior Reservoir Engineer, ADNOC
Implementing large-scale projects within a company can be challenging, but such projects provide valuable learning about the complexity of the work involved. ADNOC implemented a large-scale subsurface and surface Integrated Asset Modeling project in order to optimize production planning and operations-related work processes. This presentation describes the experience and lessons learned from the utilization of this solution, including the organizational approaches that ensured success during implementation and use. The developed system was supported by several structured business processes, and the orchestration of the analytical work processes was coupled with a corresponding approval system. Data management was greatly improved by automated workflows that included more than 150 configurable rules. In addition, the developed solution leveraged the rigor of physical and data-driven models to provide stable, desired outcomes across potential evaluation, quota definition, capacity management, business plan validation, and several other business processes. The system had the capability to isolate wells, sectors, reservoirs, and/or fields for further evaluation. The information classification and management (ICM) system in discussion is already being utilized for capacity and deliverability in short-term production plans. The system has proved effective in evaluating various growth scenarios and identifying which assets, fields, and/or reservoirs can be utilized to achieve those targets. Developing and implementing the solution on such a large scale raised various challenges at the organizational, infrastructure, and solution/workflow levels. This presentation will discuss those challenges and the lessons learned during the implementation of this solution.
Simplified Physics-Based Reservoir Models for Country-Wide Integrated Capacity Planning
Shaika Al Jenaibi, Senior Reservoir Engineer, ADNOC
The industry currently uses multimillion-cell models for individual reservoirs, with run times on the order of hours to days. Coupling traditional full-scale reservoir simulation models with a country-scale integrated asset model spanning over 100 reservoirs is computationally challenging and time consuming. Each individual model is required to provide reservoir pressure, gas/oil ratio (GOR), and water parameters in order to forecast production performance at the reservoir, sector, and well levels. The developed "simplified physics" reservoir models replicate the behavior of each reservoir simulation model, and they take a few seconds to run. Net flux regions are assumed to represent reservoir drainage pressure and saturation performance, and are matched to the reservoir pressure behavior from full-physics reservoir simulation results. The same methodology is utilized to create simplified models for each sector in the reservoir. Mapping procedures are developed to mimic behavior at the well level. Artificial intelligence (AI) reservoir proxies (such as neural networks) are also developed and used to compare results against the performance of the simplified models. The developed simplified reservoir models mimic the behavior of the reservoir simulations within a 2 percent margin. In addition, the developed neural-network-based proxy models match over 90 percent of the cases across simplified reservoir models and full-scale reservoir simulation models. Using these models has reduced computation time from days to minutes. Simplified reservoir models derived from numerical simulation scenarios using this approach have provided effective results. Their ease of use makes them attractive for implementation in large integrated, multi-asset models, such as those currently used in the country-wide integrated capacity model.
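The proxy idea can be sketched as follows: train a small neural network regressor on a table of (controls, time) to pressure pairs exported from full-physics runs, then use its near-instant predictions inside the integrated asset model. Everything below, including the synthetic "simulator" response, is an illustrative assumption.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical training table exported from full-physics simulation runs:
# inputs are offtake rate, injection rate, and time; target is sector pressure.
rng = np.random.default_rng(0)
X = rng.uniform([50.0, 0.0, 0.0], [400.0, 300.0, 20.0], size=(500, 3))
p = 4500.0 - 2.2 * X[:, 0] * np.log1p(X[:, 2]) + 1.1 * X[:, 1] * np.log1p(X[:, 2])
y = p + rng.normal(0.0, 10.0, size=500)          # psi, with simulator "noise"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
proxy = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=2000, random_state=0))
proxy.fit(X_tr, y_tr)
print(f"proxy R^2 on held-out runs: {proxy.score(X_te, y_te):.3f}")
# proxy.predict() runs in microseconds versus hours for the full-physics
# model, which is what makes country-scale coupling feasible.
```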
Initiating a Digital Oilfield Application at a Mature Offshore Oil Field in South East Sumatera, Indonesia
Lucky Bagus Waskito, Reservoir Engineer, Pertamina Hulu Energi Offshore South East Sumatra
The SES Block is a mature offshore oil field that has been producing for over 50 years. One of the major challenges in this asset is that over 380 wells use electric submersible pumps (ESPs), generating high operating costs and large volumes of high-frequency data. Data management, searching for relevant information, and data analysis are complex and time-consuming tasks, thereby complicating and slowing decision making. The objective of our digital field journey was to address the process and technical challenges in building digital oilfield software platforms to improve operational efficiency, reduce operating costs, and optimize company profitability. The designs of the technical workflows and business processes are tailored to accommodate any kind of input across disciplines, from reservoir, artificial lift, production, drilling, and completion, to produce the desired output. The early-stage process starts with installing SCADA at all wells to extract real-time data and transform it into viable information, and then to integrate, analyze, visualize, and augment this data via the analytical dashboard. Dynamic predictions using machine learning provide forecasts of ESP performance and failure prediction, while artificial intelligence insights suggest potential remedial actions. These seamless workflows will allow production enhancement in the SES Block. Digital oilfield technology can be a game changer for future development of the SES Block, but applying digital technology is easier said than done, as companies in our industry are finding out. We faced several difficulties that required time and complex algorithms to solve. However, this early stage of digital oilfield application will answer the challenges of managing offshore mature assets efficiently, enabling effective and rapid monitoring of large numbers of wells with fewer resources. It will also help prioritize intervention analysis, detect ESP problems early, prolong ESP run life, and reduce downtime. This will further lead to cost reduction, thus optimizing company profitability.
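One plausible shape for the ESP failure prediction step is sketched below: a random forest classifier trained on daily SCADA-derived features with a 30-day failure label. Features, thresholds, and data are synthetic stand-ins for the field's actual high-frequency dataset.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical daily ESP features aggregated from SCADA: motor temperature,
# vibration, intake pressure, current drift; label = failure within 30 days.
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "motor_temp_f": rng.normal(240.0, 15.0, n),
    "vibration_g": rng.gamma(2.0, 0.08, n),
    "intake_psi": rng.normal(900.0, 120.0, n),
    "amp_drift": rng.normal(0.0, 1.0, n),
})
risk = (0.03 * (df.motor_temp_f - 240) + 8.0 * df.vibration_g
        - 0.004 * (df.intake_psi - 900) + 0.5 * df.amp_drift)
df["fails_30d"] = (risk + rng.normal(0.0, 1.0, n) > 2.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(df.drop(columns="fails_30d"),
                                          df["fails_30d"], random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```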
Reaching Asset Potential – A Multiple Time Horizon Optimization Problem
Gerardo Mijares, Global Director Landmark Services, Halliburton
Every decision in the operation and management of production over the life cycle of a field is, in some way, an optimization problem in which something is being maximized or minimized, or both, subject to a number of constraints. Optimization takes place at each decision point. The frequency at which this optimization process occurs and the type of optimization depend on the particular decision being made, and optimization algorithms must be applied to obtain the desired outcomes. The interaction of the different optimization processes, the varying time horizons, and the disparity of data sources make integrating these life-cycle optimization decision processes difficult, and such integration is rarely achieved. This presentation defines a common framework and structure for the production management and operation activities in the life cycle of an asset, within the context of the work processes executed and the desired business outcomes. The framework allows for easy understanding of the interactions between the different production-related activities, and defines the mechanisms to achieve the desired total optimization in a structured manner. Within this framework, the presentation will explain the different optimization problems and the technologies applicable to achieving the desired business objectives. Different optimization technologies and approaches are required to address the different dimensions and characteristics of the problem. Across all of them, however, a common sequence of steps to update and validate the models applies, and must be followed before the models are used for optimization purposes. The presented approach spans the boundaries of the traditional operations, production, reservoir, and planning disciplines by looking at the problem holistically, and, as such, attempts to provide the appropriate multidisciplinary integration to properly address the problem.
Production Surveillance Digital Solution for Supporting a Mature Field Redeveloping Business Model in Eastern Ecuador Basins: IGAPO Consortium Case
Juan Carlos Sandoval, Reservoir Engineer, IGAPO
In 2013, Petroamazonas, Ecuador's national oil company, decided to redevelop mature fields in the country's eastern basins. Halliburton, through the IGAPO Consortium, was awarded the challenge of redeveloping four mature fields. This alliance contemplated investments in redesigned exploitation plans, along with drilling and workover campaigns, with the objective of achieving incremental oil production. This required exhaustive surveillance of field operations, production, and assets, combined with challenges in data integration; the transfer, storage, and management of data sources; globally distributed decision makers; and shared decision-making processes. Halliburton Landmark and IGAPO co-designed a solution that has evolved over the years in response to emerging customer, operational, and technological challenges. This solution has a robust and reliable integrated architecture based on the DecisionSpace® platform and the DecisionSpace Production Monitoring application, where historical data, daily records, real-time data, production, and measurements are collected, managed, and integrated in the right context for different user profiles and decision-making processes. Moreover, the solution is implemented on the Landmark Earth® cloud environment, which enables quick and secure access from anywhere in the world and ensures that solution databases are structured, safe, and backed up. This solution has been very successful as the key tool for decision making, event tracking, and operational control processes. The dashboards required by different users and teams have been created and integrated within the solution. The solution's interaction capabilities have improved integration between fields, offices, and global management teams. The business benefits include reduced non-productive time, faster problem identification, and better shared decision-making processes. Further, the data repository has become a new asset that is constantly shared and used for future planning, efficient surveillance, modeling and simulation, and data science initiatives.
A Collaborative Solution for Integrated Water Management and Quality Surveillance in Production and Injection Operations – Ecopetrol Case
Sergio Velasquez, Reservoir and Data Analytics Consultant, Halliburton on behalf of Ecopetrol
Ecopetrol produces oil and gas from several mature fields, in most cases with high water cut and using enhanced oil recovery (EOR) processes. This intense activity around water has generated the need for an effective monitoring and surveillance tool covering volume management (the production, injection, and disposal of water), monitoring of quality and fluid specifications, problem diagnosis, and regulatory permissions and planning. The data sources related to this purpose were disconnected and not easily accessible; in most cases, spreadsheets were the most common source. The overall goal was to transform static, isolated data into actionable information. Landmark and Ecopetrol applied a co-innovation methodology for defining and designing the solution. A multidisciplinary team was engaged, developing a customized data model to address the company's needs, creating an integrated data flow that consumes data on demand to enable volume management and water quality monitoring, and providing a unified, dynamic visualization environment for making decisions. The solution was powered by a subject-oriented data repository (data mart) in an online customer-defined architecture, and it delivered four contextual dashboards: flow management, water quality management, EOR projects, and permissions management. The company has experienced benefits from the Integrated Water Management solution even while implementation is still in progress. Engineers have cut the effort of data collection and analysis of upcoming bottlenecks in EOR planning from weeks to one day, since the unified, dynamic visualization environment enables faster detection of water quality issues, monitors the usage of official repositories, and improves data governance. The result is the ability to make important decisions based on data rather than intuition. The solution provided a solid platform that is extensible to other business areas.
Intelligent Water-Alternating-Gas Process Using Downhole Control Valve (WAG-CV): Concepts, Tools and Simulations
Steve Knabe, Global Director, Evaluation and Production, Halliburton
Water-alternating-gas (WAG) injection is an accepted enhanced oil recovery (EOR) process commonly applied to oil reservoirs to improve oil recovery beyond conventional water or gas injection. The objective of the WAG process is to reduce residual oil saturation after conventional water or gas injection, and to control early water or gas breakthrough to the producers. Depending upon reservoir conditions (such as fluid and rock types, viscosity, and rock wettability), water is injected into the reservoir for 2 to 6 months, followed by a slug of gas, and the cycle is repeated. A variation on this technique is simultaneous water and gas injection (SWAG). In this work, the authors propose continuous injection of water and gas, injecting water through the production casing and gas through the tubing, while selectively choosing the optimum injection points for water and gas along the lateral section of the well. This injection selectivity is achieved by using several interval control valves (ICVs) and a new mechanical well configuration that enables continuous injection. A numerical simulation model determines the optimum intervals in which to inject water or gas at any given time. The final purpose of this study is to generate a new WAG-type process, called WAG-CV, that improves oil recovery significantly compared to traditional WAG injection. This new, innovative WAG approach maximizes the oil recovery factor by using intelligent downhole control valves. A unique mechanical configuration enables selectively injecting gas or water in any given region. The new approach uses an advanced optimization technique that proactively simulates where and when to inject the required slug for a specific region, using 3D numerical simulation. The results demonstrate that this approach can increase oil recovery by 5% over the traditional WAG process and 15% over classic water injection. The simulations also showed that the new process can reduce water cut significantly and hold the gas/oil ratio (GOR) low over a long period.
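To convey the per-interval selectivity (without the 3D simulation that actually drives the decisions in the study), the toy sketch below assigns each ICV interval an injection fluid from simple heuristics on simulated local properties. The thresholds and logic are illustrative only.

```python
# Hypothetical per-interval properties pulled from a simulation snapshot.
intervals = [
    {"icv": 1, "so": 0.62, "watercut": 0.15, "gor_scf_stb": 900.0},
    {"icv": 2, "so": 0.48, "watercut": 0.55, "gor_scf_stb": 1200.0},
    {"icv": 3, "so": 0.35, "watercut": 0.30, "gor_scf_stb": 3500.0},
]

def choose_fluid(iv, wc_max=0.5, gor_max=3000.0):
    """Toy WAG-CV rule: inject water where gas is cycling (high GOR),
    inject gas where water has already swept (high watercut), and favor
    gas in oil-rich intervals to mobilize residual oil."""
    if iv["gor_scf_stb"] > gor_max:
        return "water"
    if iv["watercut"] > wc_max:
        return "gas"
    return "gas" if iv["so"] > 0.5 else "water"

# Re-evaluated each cycle as the simulated saturations evolve.
for iv in intervals:
    print(f"ICV {iv['icv']}: inject {choose_fluid(iv)}")
```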
Hybrid Metamorphosis: New Data, Science, and Capabilities in Existing Systems Blur Technology Lines Between Fossils, Renewables, Cloud, and the Edge
Keshava Rangarajan, Chief Architect Landmark R&D, Halliburton
Oil and gas companies have now realized the need to create the building blocks of emerging operating models from the ground up in order to tackle the challenges thrown up by the oncoming and increasingly volatile, hyper-connected digital reality. Computational platforms, real-time connectivity and mobility, streaming analytics, unmanned aerial vehicles, unmanned ground vehicles, remote monitoring and surveillance, predictive asset maintenance, post-modern enterprise resource planning (ERP), ERP extended to the Internet of Things, blockchain, cognitive and machine learning technologies, open-source software – these are all part of the lingua franca of this seemingly boundless new world.
The Cloud Experience at Eni, Using Geoscience Workflows
Mario Fiorani, IT Manager, Eni and Achille Miele, Information Management Platform Technology Principal Consultant, Halliburton
In the last few years, the oil and gas industry has been paying more and more attention to cloud technology, and it has become more involved in discussions on whether and how to make the cloud platform part of its ecosystem. Eni has been on this journey for years: IT and business professionals have been discussing the opportunity and value of including the cloud in their operational model of application delivery. As an outcome, at the end of 2018, the exploration and technical IT organizations decided to experiment with implementing typical geology and geophysics (G&G) workflows in a cloud environment. A real G&G project, set up in DecisionSpace® Geosciences software, was chosen from an exploration projects portfolio and transferred onto the Microsoft Azure cloud platform. For the next six months, the G&G interpreters ran G&G workflows with DecisionSpace® Geosciences software – moving data between the cloud and on-premise infrastructure, and saving results in the corporate repository. This presentation will highlight the key points and success factors of this experience, with attention to the constraints and issues encountered and how the team was able to make the project work with remarkably high uptime and availability. Lessons learned will feed future adoption of the cloud platform for the benefit of upstream workflows and the operational model of technical computing infrastructure.
Petrobras Implements Web Applications to Meet Regulatory Changes in Well Integrity
Fabio Sawada Cutrim, Petroleum Engineer, Petrobras
The National Petroleum Agency of Brazil launched a new regulation in 2016 related to well integrity, establishing requirements and guidelines to be applied throughout the well life cycle. With this change in regulations, Petrobras reconsidered its methods of information management and control for ensuring well integrity. Because the volume of information is huge, it became necessary to manage it in an automated, structured way. To solve this challenge, Petrobras developed different web applications to generate, control, and monitor all information related to well integrity, according to the new requirements. All the applications were developed under the guidance of Petrobras, and were implemented by external companies over the last three years. The applications were implemented on two integrated platforms called “PoçoWEB” and “PortalCTPS.” The Landmark team developed the solutions for the second platform, using DecisionSpace® Integration Server software and taking advantage of easy integration with Engineer’s Data Model™ (EDM™) software. Additionally, Petrobras has been working on an R&D project with real-time applications to develop digital twins with data analysis solutions to monitor the main safety factors of wells. Since the launch of the new regulation, Petrobras has implemented and connected six applications on the DecisionSpace Integration Server platform, along with four applications on the “PoçoWEB” platform and one application in the production area. The main functionalities of these applications are to control and monitor personnel on board and their certificates to work on drilling rigs; control and monitor equipment certificates; and consolidate all of a well’s information in anticipation of well handover, among other tasks.
Holistic Data Management Strategy for Large NOCs
Sacha Abinader, Managing Director, Accenture, Salvador Almazan, Senior Manager, Accenture and Dale Blue, Senior Manager for Global Digital Services, Halliburton
Seizing the potential of big data is essential in today’s energy environment. Analytics can unlock the promise of big data, bringing to light new insights at all stages of the industry value chain. And while analytics powers performance for better business outcomes in the energy industry, challenges – such as poor data quality and integration, fragmented use of analytics, and patchy ownership of data across processes – have created a “missing middle” for energy companies. Analytics can be a core catalyst to achieving high performance in the energy industry, but it takes pragmatic action to become analytically empowered. This presentation will travel through the data management and analytics journey to ROI in energy. It’s a path to seeing value in massive and diverse data, adopting an end-to-end process view to yield better insights, and making the cultural shift to become an insight-driven enterprise.
How Gyrodata’s Guide Center Has Increased Efficiency, Productivity, and Deliverables for Well Engineering Services
Erik de la Fe, Well Planning & Well Engineering Supervisor, Gyrodata
Collecting, interpreting, and managing data are critical elements in Gyrodata’s analytical processes, comprising an important phase in delivering our services to our partners. However, the amount of data coming into Gyrodata’s Guide Center from drilling rigs, customers’ offices, and companies’ archives can be overwhelming and difficult to analyze. We had to find a way to organize all this data and make it accessible on a 24/7 basis. By using Landmark cloud-based solutions to access all the applications, we now have the ability to easily connect from anywhere, and to collect, categorize, and graphically display the data received – making it easier for our Well Engineering team to analyze it and understand it. The combination and automated interaction of these cloud-based planning and engineering programs allow us to optimize the overall operations of a well in the planning phase, provide real-time modifications while drilling, and apply learnings (post well analysis) to future wells. Gyrodata’s Guide Center has been successful in meeting and exceeding our partners’ expectations and requirements. Landmark cloud-based solutions have given us the ability to do more with less, to focus more on integrating new practices, and to provide groundbreaking new technologies to improve drilling and post-drilling phase operations. The automation of pad well planning in all environments has led to the reduction of man-hours, thereby increasing the productivity of drilling specialists by allowing them to spend more time analyzing real-time drilling data. In West Texas, for example, our Guide Center Analysis Report saved one of our partners over five days of drilling. Additionally, to date, we have had several challenging laterals of more than 2 miles (3.2 kilometers) in length, with conventional assemblies, that are setting new drilling records with new performance goals.
Fueling Digital Innovation with AWS Cloud-Native Services
Carlos Castro, Global Solutions Architect, Amazon Web Services, Alex Page, Global Director of SmartDigital® Co-Innovation and Amanda Smith, Technology Development Manager IMPT, Halliburton
In this presentation, Amazon Web Services (AWS) and Halliburton Landmark will share examples of customer innovation that have been enabled by AWS cloud-native services, the OpenEarth® Community (OEC), and SmartDigital® co-innovation across the exploration and production (E&P) industry. AWS, a leading provider of cloud computing, offers cloud-native services – such as high-performance compute, the Internet of Things (IoT), machine learning, storage, and more – that fuel oil and gas companies’ digital innovations.
Agile Database Approach to Non-Operated Assets
Paulo Gomes, Petroleum Engineer, Galp Exploração e Produção Petrolífera
Galp’s operational information for non-operated wells was maintained in Excel. Any changes in Excel induced process errors, data inaccuracies, and analysis inconsistencies. To solve this challenge, Galp needed a robust and up-to-date well database for non-operated assets. A structured database would further allow Galp to benchmark well activities and create more accurate, detailed operational metrics in order to influence operators’ decisions and optimize well operations. To standardize the non-operated well data in a structured database, Galp decided to use EDM™ software. The solution involved a one-time mapping and migration of all relevant information from Excel to the EDM database, using automated scripts. It also allowed Galp to conduct benchmarking and detailed analysis using the system’s Data Analyzer, which helps operators make critical engineering decisions during operations. With this innovative workflow, Galp was able to identify operational improvements and analyze rig performance status, both in a single field and across multiple regions and operators. Using a more robust and reliable database enabled Galp to improve its QA/QC benchmark results by increasing the consistency and automation of analysis.
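A minimal sketch of the kind of one-time, scripted Excel-to-database migration described above, using pandas and SQLite as a generic stand-in. The real EDM schema, table names, and column mappings are proprietary, so everything below is hypothetical.

```python
# Sketch: migrate spreadsheet rows into a structured table, apply basic
# QA/QC, then run a benchmark-style query. Table/column names are made up.
import sqlite3
import pandas as pd

# In production this would be pd.read_excel("non_operated_wells.xlsx");
# inline rows keep the sketch self-contained and runnable.
df = pd.DataFrame([
    {"well_name": "NO-1", "operator": "OpA", "td_m": 3450.0},
    {"well_name": "no-2 ", "operator": "OpB", "td_m": 4120.0},
])
df["well_name"] = df["well_name"].str.strip().str.upper()  # basic cleansing

with sqlite3.connect(":memory:") as conn:
    df.to_sql("well_operations", conn, index=False)
    # Benchmarking becomes a query instead of a spreadsheet exercise.
    print(pd.read_sql("SELECT operator, AVG(td_m) AS avg_td_m "
                      "FROM well_operations GROUP BY operator", conn))
```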
Digital Solution for Drilling and Well Services – An Example of How to Take Advantage of Your Data
Ricardo Bustos Acosta, Drilling Optimization Team Leader, Ecopetrol S.A.
Oil companies have long registered their operations in spreadsheets, isolated applications, or disconnected repositories (silos). This does not allow them to share or benefit from the data collected, which is normally presented for decision making only after a manual reporting process. Additionally, the data cannot be accessed or updated online, and it cannot be integrated with the rest of the data that exists for the same field, area, or country. This way of working raises these questions: What is the smart way to take advantage of our data? How should we integrate all the data? How can we best use this data to make real-time decisions? This digital solution was recently implemented at Ecopetrol S.A. It was designed to apply high-performance data models and big data analysis strategies in order to increase the analytical power of historical drilling and well services data, collect the data in a centralized and automatic way, and execute the analysis process directly from the data sources with a standard workflow – thus facilitating the decision-making process. The digital solution for drilling and well services began operating in January 2019. In the first four months, there was a 20 percent increase in the total number of users accessing the analysis results. New users included project managers, regional evaluation leaders, and data management coordinators. These users did not previously have access to this type of analysis using real-time information, and, with this solution, their search time for analytical results was reduced by 25 percent.
BHP’s Next Generation Work Station Ready (WSR) Well Logs
Karl Brand, Principal Master Petrophysical Data Management, BHP
Well logs provide critical data on rock and reservoir properties, and it is important to get the right well log data into the right hands at the right time. Traditionally, however, it takes hours to search for the right data in multiple locations, and then to manually convert raw logs into workstation-ready (WSR) data. BHP wanted to automate the processes of generating standard WSR log sets and delivering them to the appropriate interpretation applications. BHP utilized the Landmark borehole data management (BHDM) system to create a single source of truth for well log data from BHP’s conventional and unconventional assets. After loading the well log data into a centralized repository, the next step was to build a WSR process that would create the WSR log set, and then automatically deliver it to interpretation applications. This project involved three stages: 1) developing the process, 2) implementing business unit refinements, and 3) improving related standards and the curve selection process. In 2018, BHP collaborated with Landmark to create the next-generation WSR automated process, which creates a WSR data set within minutes instead of hours. It also delivers WSR data to the appropriate application, and enables geology and geophysics (G&G) staff to focus on exploration work instead of data preparation. This next-generation WSR process reduces the time required for daily data loading and interpretation, allowing for quick and efficient decision making.
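The heart of such an automated WSR process is curve selection: mapping many raw vendor mnemonics to one standard curve per type. A minimal sketch follows, with hypothetical priority lists that stand in for BHP's actual standards and for the BHDM loading and delivery steps.

```python
# Sketch of automated WSR curve selection: for each standard curve type,
# pick the highest-priority mnemonic available in the well.
PRIORITIES = {
    "GR":   ["GR_EDTC", "GR", "SGR"],   # gamma ray candidates, best first
    "RHOB": ["RHOZ", "RHOB"],           # bulk density candidates
    "DT":   ["DTCO", "DT", "AC"],       # compressional sonic candidates
}

def build_wsr_set(available_curves):
    """Map each standard curve type to the best available raw mnemonic."""
    wsr = {}
    for std, candidates in PRIORITIES.items():
        match = next((c for c in candidates if c in available_curves), None)
        if match:
            wsr[std] = match  # alias the raw curve to the standard name
    return wsr

print(build_wsr_set({"GR", "RHOZ", "AC", "CALI"}))
# -> {'GR': 'GR', 'RHOB': 'RHOZ', 'DT': 'AC'}
```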
Best Practices in Drilling Data Quality Management and Automation
Vividh Paliwal, Technical Sales Consultant – IMPT, Halliburton on behalf of ADNOC Onshore
ADNOC Onshore had challenges with the quality of its drilling data – inconsistencies and errors introduced while entering daily drilling reports; duplications, redundancies, and inconsistencies in data maintained in different databases and platforms; and a lack of integration and data exchange. These challenges led to unreliability and a lack of trust in the data. The existing practice involved manual cleansing, which was time consuming with limited or no results. ADNOC Onshore needed a consistent and sustainable solution that could automate quality management workflows and address all the challenges in one place, so that the time spent on decision making could be better utilized. In response, Halliburton Landmark proposed a consistent data management strategy that takes advantage of the functionality available in DecisionSpace® Data Quality software. Deployment of Data Quality Rules in this software, and its integration with EDM™ software, helped to monitor and improve the quality of the large amounts of data entered into the EDM system through OpenWells® software. This solution gave ADNOC the ability to automate a complete suite of rules-based tools in order to diagnose and correct bad data at the source, before the data could affect daily work and strategic decisions. The ability to configure customizable Data Quality Rules is helping ADNOC maintain consistency in the data, reduce duplications, and avoid redundancies. The Data Quality Management and Automation Solution was successful in its pilot deployment phase, and ADNOC witnessed a significant improvement in data quality throughout the process. This puts the company on track to achieve its ultimate goals: improve data quality by 50% in one year and to 90% by the end of the second year; fully automate its data quality management; and provide data consistency and quality feedback to users in order to improve data quality throughout the data management life cycle.
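Conceptually, a rules-based quality workflow evaluates each record against a configurable rule set and flags failures at the source. A minimal sketch with hypothetical rules follows; DecisionSpace Data Quality's actual rules are configured in the product, not hand-coded like this.

```python
# Sketch of a rules-based data quality check on daily drilling report rows.
# Records and rules are illustrative stand-ins.
daily_reports = [
    {"well": "W-101", "depth_m": 2540.0, "mud_weight_sg": 1.25},
    {"well": "W-102", "depth_m": -10.0,  "mud_weight_sg": None},
]

RULES = {
    "depth must be positive":
        lambda r: r["depth_m"] is not None and r["depth_m"] > 0,
    "mud weight must be present":
        lambda r: r["mud_weight_sg"] is not None,
}

for rec in daily_reports:
    failures = [name for name, rule in RULES.items() if not rule(rec)]
    status = "OK" if not failures else f"FAIL: {failures}"
    print(rec["well"], status)   # flag bad data before it reaches decisions
```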
Insights Into Hydraulic Fracture Geometries, Using Fracture Modeling Honoring Field Data Measurements and Post-Fracture Production
Ron Dusterhoft, Technology Fellow, Production Enhancement, Halliburton
In unconventional resource plays, fracture models are used to determine the fracture geometries being generated. While the development of complex fracture models continues to evolve, the industry can still gain insights into fracture geometry by using planar fracture modeling. This presentation will demonstrate the results of honoring data measurements from a multitude of sources, and discuss the resulting observations and conclusions. Examples will illustrate how much information becomes available when engineers include more diagnostics in routine operations, providing additional insight and, ultimately, better models and completion designs.
Reducing Uncertainty in Unconventional Reservoir Characterization
Seth Brazell, Geologist, Occidental Petroleum
Unconventional reservoirs of the Permian Basin hold tremendous potential for resource opportunities. The capacity to safely and economically produce oil and gas from these reservoirs hinges on understanding and predicting production drivers. A critical component of those drivers is reservoir quality. Modern and legacy development methods have generated a wealth of subsurface data that can be harnessed to reduce the uncertainty of unconventional reservoir characterization. Subsurface characterization workflows are often manual and time consuming, limiting the amount of data that can be analyzed. New tools are needed to better address the challenges associated with unconventional resources. Advanced analytics and machine learning (ML) are emerging technologies in the geoscience domain. These techniques are incorporated into the development of novel tools and workflows that efficiently leverage significantly more data, reducing uncertainty and generating more accurate probabilities at scale. Coupling ML products with powerful geoscience platforms, like DecisionSpace® software, generates robust subsurface models for visualization and analysis. A high-fidelity view of the subsurface enables petrotechnical professionals to more accurately quantify subsurface heterogeneity, reduce uncertainty, and understand key production drivers, enabling timely and informed decision making. Anadarko is actively developing and deploying ML-based tools for unconventional reservoir characterization. These tools generate significantly larger datasets than have traditionally been incorporated into subsurface modeling. The datasets produce high-fidelity maps and models that are further used as inputs for advanced multivariate analysis (MVA) models. By applying ML algorithms, Anadarko has generated many thousands of corrected and synthetic petrophysical logs, stratigraphic correlations, and geomechanics properties. These high-density datasets are generating high-fidelity, basin-wide models that can be used for sweet spot identification in campaign planning.
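As an illustration of the log-synthesis step, the sketch below trains a regression model to predict a sonic log from other common logs. The data are synthetic and the feature choice is hypothetical, not Occidental's actual workflow.

```python
# Sketch: synthesize a missing log (DT) from GR, NPHI, RHOB with a
# supervised model, then check accuracy on held-out samples.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 5000
GR = rng.uniform(20, 150, n)          # gamma ray, API
NPHI = rng.uniform(0.05, 0.35, n)     # neutron porosity, v/v
RHOB = rng.uniform(2.2, 2.7, n)       # bulk density, g/cc
# Synthetic "truth" for the sonic log, with noise.
DT = 300 - 60 * RHOB + 100 * NPHI + 0.05 * GR + rng.normal(0, 2, n)

X, y = np.column_stack([GR, NPHI, RHOB]), DT
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:4000], y[:4000])                       # "training wells"
print("R^2 on held-out samples:", round(model.score(X[4000:], y[4000:]), 3))
```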
Unconventional Drilling in the New Mexico Delaware Basin
Garrett Granier, Senior Drilling Engineer, Occidental Petroleum
Many things can negatively impact drilling operations for unconventional horizontal wells, such as lateral lengths increasing over time, geological conditions (high-pressure flows, lost circulation, salt and anhydrite), and various other drilling effects. How do you continue to minimize well cost while maximizing performance? To solve these challenges, we implemented changes to wellbore architecture and casing setting depths. We also switched to drilling with rotary steerable systems, and used optimized pad layouts and well trajectories. The company also continued improvements in rig specifications and drilling fluid technology. We implemented techniques to predict, prevent, and mitigate the effect of downhole vibrations, and also optimized bit and bottomhole assembly (BHA) designs, as well as operating practices. Through these practices, overall drilling speed increased by 149 percent (measured in feet per day) from 2012 to 2017. Average lateral lengths also grew, from approximately 3,000 feet (914 meters) in 2016 to approximately 7,000 feet (2,133 meters) in 2017, with 10,000-foot (3,048-meter) laterals now the most common. The economic success of New Mexico unconventional developments has depended heavily on minimizing well cost, as it is a difficult area requiring complex optimization.
Applying DecisionSpace® Gun Barrel View for Easy Visualization and Interpretation of Complex Unconventional Well Planning Scenarios
Kane Nile, Geologist, Occidental Petroleum and Stephen Williams, Technical Advisor, Halliburton
Subsurface geological complexities and uncertainties of unconventional plays result in complex well planning scenarios that can be difficult to illustrate for management and investors. Using manual methods to create and annotate these well planning scenarios takes hours, and any updates to the subsurface model require the entire view to be recalculated and rebuilt. Thus, an automatically updatable visualization and interpretation tool for multiple well planning scenarios was required by the operators working in North American unconventional plays. Within DecisionSpace® Geosciences software, a tool for visualization and interpretation of complex well planning scenarios was created to overcome these challenges. The collaborative environment of the DecisionSpace platform has played an important role in the development of this new feature. A Gun Barrel view of complex well planning scenarios automatically measures and displays targets in a section view. This Gun Barrel view dynamically displays the spacing between new wells, existing wells, subsurface hazards, leases, and target zones to compare various development scenarios. The Gun Barrel view has been implemented in Anadarko’s Permian unconventional activities, accelerated by the SmartDigital® process from Halliburton Landmark. With this view, multi-zone targets and well spacing can be easily visualized and interpreted in terms of subsurface asset complexities, thus minimizing preparation time and uncertainty. The Gun Barrel view can also be converted into presentations to show complex well planning scenarios to management and investors to develop multi-year drilling campaigns.
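For readers unfamiliar with the display, a gun barrel view is simply a cross-section perpendicular to the laterals, with each wellbore reduced to a point and spacings annotated. Below is a minimal matplotlib sketch with hypothetical coordinates; the DecisionSpace feature automates and dynamically updates this from live interpretation data.

```python
# Sketch of a "gun barrel" cross-section: wells as points in the plane
# perpendicular to the laterals, with one spacing annotation. All numbers
# are illustrative.
import matplotlib.pyplot as plt

# (horizontal offset ft, TVD ft, status) for a hypothetical development
wells = [(0, 9500, "existing"), (660, 9500, "planned"),
         (330, 9200, "planned"), (990, 9200, "existing")]

fig, ax = plt.subplots()
for x, tvd, kind in wells:
    ax.plot(x, tvd, "o", color="k" if kind == "existing" else "r")
    ax.annotate(kind, (x, tvd), textcoords="offset points", xytext=(5, 5))
ax.annotate("660 ft spacing", (330, 9490), ha="center")  # upper-zone spacing
ax.set_xlabel("Offset along section (ft)")
ax.set_ylabel("TVD (ft)")
ax.invert_yaxis()   # depth increases downward
plt.show()
```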
Elastic Mechanical Properties from SEM and X-ray CT Imaging in Tight Formations
Joel Walls, Director Unconventional Resources Technology, Halliburton
Elastic properties of unconventional rock, including gas/oil shale and tight gas sand, are crucial in hydraulic fracture modeling. The two most important rock elastic properties are Young’s modulus and Poisson’s ratio, traditionally obtained from wireline logs or laboratory tests on core. An alternative is to use rock physics modeling, applied to mineralogy and porosity computed from ion-milled scanning electron microscope (SEM) images, to compute elastic constants from small rock fragments. This method can also be applied to data from whole-core computed tomography (CT) scans. This approach was used to develop a digital rock workflow to compute elastic properties from rotary sidewall cores, drill cuttings, and core CT data. The new approach combines quantitative information obtained from 2D ion-milled SEM images with rock physics effective-medium models that relate volumetric properties to elastic properties. These models can be calibrated with wireline and/or laboratory measurements of bulk rock volumetrics together with elastic rock properties; the process of finding such a rock physics model is called rock physics diagnostics. The SEM images provide porosity, organic matter volume, and pore structure; mineralogy of the sample, obtained through quantitative X-ray diffraction, is added to those inputs. Well log data relevant to the local area are then used to establish a rock physics model linking the elastic properties to porosity, organic matter content, and mineralogy. These models are established for each basin and formation, based on available wireline log data; high-quality wireline data is key to successful rock physics diagnostics. In this study, wireline logs and core samples were obtained from a well in the Wolfcamp A formation in Texas, where computed elastic properties closely matched wireline log data. High-resolution CT scan data were used to compute a thin-bed layer frequency curve, which improved GOHFER® hydraulic fracture growth models. Because this approach does not require whole core, it can be especially valuable in quantifying elastic and mechanical properties along the lateral wellbore, where wireline logs are seldom available.
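The end products of such a workflow are the dynamic moduli. For reference, the standard isotropic relations that convert compressional velocity, shear velocity, and bulk density into Young's modulus and Poisson's ratio are sketched below; the input values are illustrative, not data from the study.

```python
# Standard isotropic elasticity: dynamic Young's modulus and Poisson's
# ratio from Vp, Vs, and bulk density.
def dynamic_moduli(vp, vs, rho):
    """vp, vs in m/s; rho in kg/m^3. Returns (E in GPa, Poisson's ratio)."""
    nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))
    e = rho * vs**2 * (3 * vp**2 - 4 * vs**2) / (vp**2 - vs**2)
    return e / 1e9, nu

# Illustrative tight-rock values.
E, nu = dynamic_moduli(vp=4200.0, vs=2500.0, rho=2550.0)
print(f"E = {E:.1f} GPa, nu = {nu:.3f}")   # ~39 GPa, ~0.23
```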
Integrating a Drilling Events Database, DecisionSpace® Geosciences Software, and Seismic Imaging to Predict Subsurface Drilling Events
Kelsey Lewis, Geologist and Forbes McIntosh, Geophysics Technical Lead, BPX Energy
Unconventional shale exploration and development in North America continues to push the limits of technology as development expands into deep, fault-prone, high-pressure, and high-temperature environments. In areas lacking well log data, predicting subsurface fluid movement and potential cross flow in and out of a wellbore during drilling is challenging. Pressure differentials in an overpressured, stacked, alternating clastic-carbonate overburden section can create cross-flow conditions in which mud weight cannot be increased to the density required to safely and successfully execute the well plan. The aim of this study is to predict geohazards in the overburden section with limited well log data. The prediction process begins with the creation of a drilling events database, implemented using the suite of tools in DecisionSpace® Geosciences software. Combining the drilling events database with seismic imaging yields an innovative approach to predicting overburden drilling hazards: a library of event analogs that provides a better understanding of the depositional environments, the local structural fabric, and the lateral probability of encountering drilling events of a similar nature. The workflow presented here integrates drilling events with seismic imaging, leading to optimal well placement and improved geohazard identification and prediction throughout the thick overburden section. Failure to predict the occurrence of these losses and gains in the overburden section can negatively impact development program economics. The workflow also provides insight into variations in depositional environments, which has the potential to identify future prospectivity. The overall impact is a safer and more economic development program, with fewer unmitigated events and fewer junked wells.
Getting the Most Out of Your COMPASS™ Software
Jonathan Lightfoot, Drilling Engineering Consultant, Occidental Petroleum
Quite often, it is difficult to find the key well records needed for collision avoidance work. Our company needed a simplified offset wellbore research tool to aid collision avoidance workflows, and we wanted to make it available directly within the EDT™, COMPASS™, and OpenWells® applications. Innovative hyperlink research tools, workflows, and collision avoidance methods were employed directly within the COMPASS application, creating useful links to critical well records. Important offset well details – such as scanned records, digital files, directional surveys, wellbore logs, GIS maps, and other useful offset information – were added directly to the user interface. Engineers and well planning team members now have efficient access to helpful well research tools directly within the COMPASS and OpenWells applications. This allows for efficient research when evaluating the status of offset wells in close proximity to future planned wells. These new research methods improve the quality of work and streamline direct access to critical well attributes and information, thus saving time and helping to ensure that the most accurate information is used in the planning and wellbore construction process.
Pore Pressure Estimation from Velocity Model in Tight-Gas Sandstones in Argentina’s Neuquén Basin
Mailen Fontana, Geoscience Technical Consultant, Halliburton on behalf of YPF
Knowledge of pore pressure behavior in a reservoir is important for optimizing drilling and production. The Jurassic tight-gas reservoir sequence in the Neuquén Basin consists of fluvial sandstones interbedded with shales and exhibits a low relative impedance contrast. It typically shows a Class I amplitude-vs.-offset (AVO) response, with permeability between 0.002 and 0.1 mD and porosity of 5–9 percent. This method of pore pressure prediction focuses on knowing the geological setting and compartmentalization, rather than relying solely on the premise that pore pressure affects compaction-dependent geophysical properties. In this presentation, we will discuss the abnormal pore pressure distribution found in the reservoir. A 3D depth hybrid velocity model case study – using horizons, faults, well picks, and interval PSDM migration velocities inside a structural framework – led to a robust understanding of the pore pressure signature at the Punta Rosada formation. We use Dynamic Frameworks to Fill® software to build a velocity model that honors the structure of the area. The hybrid velocity model is then used as input to the Geopressure Analysis tool to calculate density, overburden gradient, pore pressure, and fracture gradient volumes. Finally, the volumes were blocked using the Earth Modeling tool, along with the 1D data from each well provided by the geomechanics team. New models and attribute extractions at the horizons of interest show that the pore pressure is driven primarily by structural patterns – the key faults – and their effect on subsurface pressure uncertainty causes recurrent challenges in reaching the objective targets. Additionally, a velocity reversal – a zone below the top of the Vaca Muerta formation, in the Upper Punta Rosada formation, where velocities drop – is an indicator of possibly high overpressure. After analyzing the pore pressure behavior obtained from the calculations, we find a strong relationship between the geological structure of the study area and the pore pressure distribution.
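Although the presentation emphasizes structural control, the velocity-to-pressure step itself is conventionally done with a transform such as Eaton's method, sketched below with illustrative numbers. The exponent and normal compaction trend must be calibrated locally; this is a standard-technique sketch, not necessarily YPF's calibration.

```python
# Eaton's velocity method: pore pressure gradient from the ratio of
# observed velocity to the normal compaction trend. Inputs illustrative.
import numpy as np

def eaton_pore_pressure(obg, p_hyd, v, v_normal, n=3.0):
    """obg, p_hyd as gradients (psi/ft); v, v_normal in m/s; n is Eaton's exponent."""
    return obg - (obg - p_hyd) * (v / v_normal) ** n

obg, p_hyd = 1.0, 0.465                    # typical overburden / hydrostatic
v = np.array([3400.0, 3100.0, 2800.0])     # observed interval velocities
v_n = np.array([3300.0, 3400.0, 3500.0])   # normal compaction trend
print(eaton_pore_pressure(obg, p_hyd, v, v_n))
# A velocity reversal (v << v_n) drives the predicted gradient well above
# hydrostatic -- the overpressure signature discussed above.
```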
Automated Unconventional Production Forecasting Based on SPEE Monograph 4 Guidelines
Sergio Sousa, Managing Consultant – Reservoir and Production, Halliburton
Decline curve analysis (DCA) is an important tool for managing any asset, but its appropriate use requires much more effort on unconventional assets because boundary-dominated flow cannot be assumed. Flow regime identification, and the use of DCA methods that do not require boundary-dominated flow to be in place, are essential for more accurate forecasting results. The solution defines a data warehouse to support all calculations, with the main input data source being an ARIES® database. Full automation is supported by AssetConnect™ software, while visualization of all results is provided in DecisionSpace® Analytics software. Main forecast results are stored back to the ARIES database. This solution is a configurable automated system that does all the heavy lifting automatically – from quality assurance and checking, through data normalization, flow regime identification, and hindcasting, to forecasting – providing improved production profiles for existing wells and traceability of all the underlying analyses, assumptions, and results.
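As a flavor of the automated fitting step, here is a minimal sketch of a hyperbolic (Arps) fit on synthetic data. The actual solution wraps QA, normalization, flow regime identification, and hindcasting around this core, and Monograph 4-style logic would switch to a bounded terminal decline at late time.

```python
# Sketch: fit an Arps hyperbolic decline q(t) = qi / (1 + b*Di*t)^(1/b)
# to noisy synthetic monthly rates.
import numpy as np
from scipy.optimize import curve_fit

def arps_hyperbolic(t, qi, di, b):
    return qi / (1.0 + b * di * t) ** (1.0 / b)

t = np.arange(1, 37)   # months on production
q = arps_hyperbolic(t, 900.0, 0.25, 1.4) \
    * np.random.default_rng(2).normal(1, 0.03, t.size)

# b is allowed above 1 because transient flow dominates early unconventional
# production; boundary-dominated flow (b < 1) cannot be assumed here.
(qi, di, b), _ = curve_fit(arps_hyperbolic, t, q, p0=(1000.0, 0.3, 1.0),
                           bounds=([1.0, 1e-4, 0.01], [1e5, 5.0, 2.0]))
print(f"qi={qi:.0f} bbl/month, Di={di:.3f}/month, b={b:.2f}")
```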
Rapid Basin-Scale Geologic Reconnaissance Using Unsupervised Learning
Didi Ooi, Geologist, Occidental Petroleum
Understanding the combined effects of multiple wireline logs, along with depositional environments and lithologies, is a key step in developing a comprehensive understanding of unconventional reservoirs. However, to characterize reservoirs on a basin scale, it is necessary to efficiently analyze and integrate large wireline log data sets. Current software packages offer only limited, basic unsupervised learning algorithms for quickly understanding the natural separation of data sets. This time-consuming challenge is compounded by significant heterogeneities that exist within a basin, along with issues of data integrity. This presentation will highlight the possibility of analyzing tens of thousands of well logs, using an automated workflow that involves quality control (QC), statistical analyses, and unsupervised learning, with the intention of rapidly identifying log-scale rock types and core-scale lithofacies. To further capture the varying degrees of heterogeneity observed within each target formation, clustering algorithms for each formation are chosen based on key performance indicators (KPIs). This integrated workflow allows us to rapidly perform high-resolution reconnaissance of the geology on a basin scale, and to assist in geological mapping, modeling, and petrophysical analyses. The results of this automated workflow highlight laterally continuous clusters that reflect the different paleoenvironments of the Delaware Basin – from the margins of the carbonate platform to the proximal part of the basin. In contrast to existing unsupervised learning workflows, results can be quantified using KPIs (e.g., gap statistics). Using a similar workflow on the core data, a lithofacies scheme for the Permian Basin was rapidly established for each of the key formations. Benchmarking results against historical regional studies indicated that this workflow was able to quickly establish high-quality reservoir characteristics on a basin scale.
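A minimal sketch of the clustering-with-KPI idea on synthetic log data, using the silhouette score as a stand-in for the gap statistic used in the talk.

```python
# Sketch: scale multi-well log samples, cluster at several k, and pick the
# cluster count with the best KPI. Data are synthetic (three "rock types").
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# 3 log curves per sample, three well-separated synthetic rock types.
logs = np.vstack([rng.normal(m, 0.4, (300, 3)) for m in (0.0, 2.0, 4.0)])
X = StandardScaler().fit_transform(logs)

for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(k, round(silhouette_score(X, labels), 3))  # best KPI -> chosen k
```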
ESP Failure Prediction
Nasser Al Mahrooqi, Subsurface Data Innovation Manager, Petroleum Development Oman
Smarter Well Engineering Concepts Aid in Reducing Planning Time and Increasing ROP
Nasikul Islam, Industry Solutions Advisor, Halliburton on behalf of Wintershall Dea
Modern well engineers struggle with digital confusion: they have either too much data or not enough, and the quality of the data is often questionable. This presentation will illustrate how the rate of penetration (ROP) can be optimized in any given field with an automated, time-saving process for designing wells, using machine learning (ML) techniques. We will discuss applying ML algorithms to predict optimized ROP for a prospect well. The model was trained with historical datasets from offset wells for two primary fields in Wintershall Dea North Sea operations, and the detailed workflow included data analysis for quality and completeness, statistical regression, and optimization. The ML results were compared to an actual dataset from a drilled well, and a high correlation was observed. The optimized results showed a 20% to 30% improvement in ROP over the benchmark well. This concept can be developed further to recommend optimum drilling parameters in real time while drilling, in the absence of drilling engineers’ interpretations. A real-time drill-off test was validated on these datasets for optimizing ROP based on weight on bit (WOB), flow rate, and total downhole RPM. Using a simple ML model and a real-time feedback loop, drillers can continuously optimize based on real-time and historical data feeds without the need to stop and perform a drill-off test. By prescribing optimized ROPs through automated ML of offset well attributes, engineers can push technical limits. Automated analysis, regression, and visualization of high-volume data can significantly reduce planning time and help establish optimized operational parameters to reduce drilling time and costs. The next step is to build a real-time downhole advisory system to help achieve the predicted ROPs by predicting and prescribing drilling parameters ahead of the bit. With this process, drilling engineers can do more value-added engineering and less data mining and management, ultimately driving efficiencies throughout the well construction process.
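A minimal sketch of the underlying idea, on synthetic data: train a regression model on offset-well drilling parameters, then run a "virtual drill-off test" by searching the parameter space for the highest predicted ROP. The model choice and grid search are illustrative, not the authors' implementation.

```python
# Sketch: ROP model from offset-well parameters, then a virtual drill-off
# test over a parameter grid instead of stopping the rig.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(4)
n = 3000
wob = rng.uniform(5, 35, n)       # weight on bit, klbf
rpm = rng.uniform(60, 220, n)     # total downhole RPM
flow = rng.uniform(400, 900, n)   # flow rate, gpm
# Synthetic "truth": ROP peaks at intermediate WOB (founder point).
rop = 2.0 * wob + 0.3 * rpm + 0.05 * flow - 0.04 * wob**2 \
      + rng.normal(0, 5, n)

model = GradientBoostingRegressor()
model.fit(np.column_stack([wob, rpm, flow]), rop)

# Virtual drill-off test: evaluate a WOB/RPM grid at a fixed flow rate.
W, R, F = np.meshgrid(np.linspace(5, 35, 30), np.linspace(60, 220, 30), [700.0])
grid = np.column_stack([W.ravel(), R.ravel(), F.ravel()])
best = grid[model.predict(grid).argmax()]
print(f"recommended WOB={best[0]:.1f} klbf, RPM={best[1]:.0f}, flow={best[2]:.0f} gpm")
```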
Benchmarking – Clustering Wells in the Construction of Offshore Wells
Fabio Sawada Cutrim, Petroleum Engineer, Petrobras
The implementation of benchmarking, as an instrument of business management, aims to verify the processes adopted, comparing them to each other, in order to increase the efficiency of operations. In applying it to the oil industry, the main challenge is to group the wells properly, so that similar wells can be compared fairly. Usually, the groups are divided based on the experience of the technicians, but this manual approach is error prone – triggering the need for an algorithm that can create the grouping objectively, considering all the available parameters. According to the method developed jointly by Petrobras and Halliburton, the wells were first divided into two large groups: pre-salt and post-salt. Then, within each of these groups, artificial intelligence techniques were used to group wells with greater similarity, applying clustering algorithms to variables obtained from historical data, such as drilled interval, water depth, fluid weight, and start and end true vertical depth (TVD), among other variables. Given these clusters, new wells can be allocated through the application of classification algorithms. Our experiments indicated a difference of up to 60 percent in construction duration between wells belonging to different clusters. In this way, clustering can be applied directly to decision-making processes, providing support for monitoring all the operations involved in well construction. Automatic clustering and allocation of wells can also mitigate the distortions of the prevailing manual process, since it reduces subjectivity in grouping. Finally, we plan to extend this approach to completion and workover interventions, covering the entire life cycle of well construction.
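A minimal sketch of the two-step method on synthetic data: cluster historical wells, then allocate a new well to its nearest cluster. The feature list is illustrative, not Petrobras' actual variable set.

```python
# Sketch: cluster historical wells on scaled features, then allocate a
# planned well to a cluster for like-for-like benchmarking.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# columns: drilled interval (m), water depth (m), fluid weight (sg), end TVD (m)
hist = np.column_stack([rng.uniform(2000, 6000, 200),
                        rng.uniform(100, 2200, 200),
                        rng.uniform(1.0, 1.8, 200),
                        rng.uniform(3000, 7000, 200)])

scaler = StandardScaler().fit(hist)
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scaler.transform(hist))

new_well = np.array([[4500.0, 1900.0, 1.4, 5800.0]])   # planned well
print("allocated to cluster", km.predict(scaler.transform(new_well))[0])
```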
Building Human Superpowers With Machine Intelligence & Software UX
Robert Tuttle, Executive Technology Director, frog design
Human actors working inside digital systems during both exploration and production (E&P) cycles require unique forms of situational awareness and contextual interaction across the spectrum of data, information, and insights to take on the appropriate amount of cognitive load for decision and action. When bridging physical and digital workflows across production operations, human intelligence and muscle memory still provide critical checks and balances at scale. Machine learning cannot (yet) replace human intuition and experience. The explosive proliferation of business data and its supporting technologies can easily swamp traditional methods of processing and analysis, resulting in enormous expenditures in time and effort with potentially little gain. Methods of artificial intelligence, machine learning, natural language processing, and computer vision have developed at pace with the data explosion, and offer an exceptional set of tools for extracting actionable intelligence and business value in a more efficient and automated manner. However, like any complex set of tools, their proper use and potential misuse can be highly nuanced. Embracing this new role of data and this new toolset as enablers for highly intelligent user experiences provides tremendous opportunities. Valuable feedback loops for software user experience are created in the product and service development life cycle when data is understood and managed as both an output and an input. Workflows become more contextualized and operations are made more efficient when human intelligence becomes empowered and augmented with artificial intelligence, rather than completely replaced by it in full-automation scenarios.
A Multidisciplinary Approach for Dynamic Rock Typing Characterization Using Artificial Neural Network Methods in Hamra Quartzites Reservoir
Abdallah Sokhal, Head of Wireline Logging Operations and Data Management/Senior Petrophysicist, Sonatrach
The previous geological model of the Quartzites Hamra (QH) reservoir failed to accurately classify and estimate the reservoir’s petrophysical properties. The QH reservoir comprises five electrofacies (EFs): EF1, EF2, and EF3 are similar to clean sandstones with different rock qualities (porous to compact sandstone), while EF4 is shaly sandstone and EF5 is shale. In this work, we propose a multidisciplinary workflow to recharacterize the QH formation using artificial neural network (ANN) methods. The main purpose was to delineate the EFs present in the QH reservoir by using an unsupervised ANN self-organizing map (SOM) clustering algorithm, and to determine the hydraulic flow units (HFUs) and permeability in uncored wells by using a supervised neural network (NN) algorithm and a flow zone indicator (FZI) approach. Our results for the HFUs were in complete agreement with the geological and sedimentary controls. The FZI approach suggested the presence of eight HFUs in the QH reservoir; the best ones (HFU1, HFU2, and HFU3) were mainly located in zones QH2 and QH4. These results also agreed fully with the production testing results. Finally, permeability determined by the NN correlated with core permeability with a good correlation coefficient (0.6), supporting the validity of our proposed approach.
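The FZI approach referenced above follows the standard Amaefule et al. formulation, sketched here with illustrative core values; samples with similar FZI are assigned to the same HFU.

```python
# Standard flow zone indicator (FZI) calculation from core porosity and
# permeability (Amaefule et al.). Values are illustrative.
import numpy as np

def fzi(k_md, phi):
    """k in mD, phi as a fraction. Returns FZI in microns."""
    rqi = 0.0314 * np.sqrt(k_md / phi)   # reservoir quality index
    phi_z = phi / (1.0 - phi)            # normalized porosity
    return rqi / phi_z

k = np.array([0.5, 12.0, 150.0])     # core permeabilities, mD
phi = np.array([0.06, 0.10, 0.14])   # core porosities, fraction
print(fzi(k, phi))                   # similar FZI -> same HFU
```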
Water Coning From Simulation to Neural Network – Comprehensive Study of Coning Prediction and Factors Affecting It
Islam Zewain, Senior Reservoir Engineer, Gulf of Suez Petroleum Company – BP JV
Producing undesirable phases, like water and free gas, in oil wells is a challenging problem in the oil industry. The major cause of this problem is water coning, a rate-sensitive phenomenon generally associated with high producing rates. Strictly a near-wellbore phenomenon, it develops when the pressure forces that draw fluids toward the wellbore overcome the natural buoyancy forces that segregate gas and water from oil. This work uses Nexus® simulation software to build mechanistic models with different parameters that affect water coning formation in oil reservoirs. Simulating water coning is very challenging, due to solver instabilities caused by the severe saturation changes around the wellbore, unless small time steps and small grid sizes are used. Local grid refinement (LGR) is used to accurately follow water coning formation and minimize solver convergence errors. Simulations were used to quantify the effect of each parameter on water progressing to form a cone around the wellbore. A neural network was then built, using input and output parameters from the simulation runs, to devise a simple approach for evaluating the critical production rate and how the uncertainty in each parameter affects recovery and coning formation.
Improving Machine Learning Workflow and Business Value for Oil and Gas Applications
Patrick Brody, Avanade and Fahad Ahmad, Bo Liang, Meng Zhang, Chao Yang, Halliburton
Machine learning (supervised and unsupervised) and deep-learning techniques have been widely used to solve a variety of oil and gas problems. However, several factors limit their success: data availability and quality, heavy requirements for data preprocessing and feature engineering, the lack of correct labels, and business value that is not captured by pre-existing mathematical optimization objectives. We applied two deep-learning approaches, one supervised and one unsupervised, to oil and gas business cases. The first is a hybrid approach combining a fully convolutional network and a long short-term memory (LSTM) network with an attention mechanism. Temporal convolutions capture the local variations in the data, whereas LSTMs capture the long-term variations. Fine-tuning of the model involves developing customized metrics based on business needs. The second approach utilizes an autoencoder, an unsupervised deep-learning method, for anomaly detection. The key idea is to train a model that encodes high-dimensional data into a low-dimensional space (a latent-space representation) and then decodes it back to the original input. In comparison to a supervised machine learning technique for anomaly detection, which can only detect a fixed set of features, an autoencoder is unsupervised, and useful properties can be learned in the latent-space representation. The model parameters are optimized to reduce reconstruction error. If the features within the input data are independent of each other, decoding is difficult, leading to high reconstruction error; however, if there is a norm within the input data, the autoencoder can learn it automatically and leverage it when decoding the data back to the original input. When this approach was tested on equipment, the result was positive, showing a clear difference in reconstruction error between data from the beginning and the end of the equipment’s life span. These methods were applied to different projects, resulting in substantially reduced turnaround time and improved business value.
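A minimal autoencoder sketch of the second approach, using scikit-learn as a compact stand-in for the deep-learning framework actually used: fit inputs to themselves through a narrow bottleneck, then flag samples whose reconstruction error is high.

```python
# Sketch: autoencoder-style anomaly detection. An MLP is trained to
# reconstruct healthy sensor data through a 2-unit bottleneck; worn-
# equipment data reconstructs poorly, raising the error. Data synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
healthy = rng.normal(0, 1, (2000, 8))    # early-life sensor readings
worn = rng.normal(0.8, 1.6, (50, 8))     # end-of-life sensor readings

scaler = StandardScaler().fit(healthy)
ae = MLPRegressor(hidden_layer_sizes=(6, 2, 6), max_iter=2000,
                  random_state=0)
ae.fit(scaler.transform(healthy), scaler.transform(healthy))  # X -> X

def recon_error(X):
    Xs = scaler.transform(X)
    return np.mean((ae.predict(Xs) - Xs) ** 2, axis=1)

print("healthy:", recon_error(healthy).mean(),
      "worn:", recon_error(worn).mean())   # worn >> healthy -> anomaly
```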
Nature (Sometimes) Knows Best
Richard Burgess, Instructor, Texas Tech University
The natural world has served as a rich source of design inspiration throughout history. Design aesthetics and functionality have both been enhanced by imitating nature – a practice referred to as biomimicry. The appeal of biomimicry is not difficult to appreciate. Some proponents maintain that nature serves as an excellent laboratory and incubator where designs are tested and refined over many years. The durability and commercial success of products as diverse as wind turbine blades and Velcro bolster this argument. Moreover, the promise of biomimicry is not limited to the materials domain. Scientists, engineers, programmers, economists, and more are becoming increasingly interested in how human processes/systems can be improved by imitating nature. Given the complexity of some of the system-level challenges that we as a species are facing, the optimism underwriting this trend is understandable. However, natural processes have often evolved as the result of different pressures than we humans are facing, and, consequently, they serve different purposes. Imitating nature without understanding contextually significant details risks not only failing to solve the problem at hand, but also exacerbating it. In this talk, two examples of process biomimicry are discussed with an eye toward identifying general design heuristics that can be used to increase the likelihood of success. It is appropriate to continue to look to nature for design inspiration, so long as this is done from an informed, critical perspective.
Artificial Intelligence (AI) or Intelligence Augmentation (IA)? How to Produce Not Only Better Engineering Designs, But Also Better Designers
Mike Kirby, Associate Director, University of Utah
Artificial intelligence (AI) systems – and, in particular, machine learning (ML) algorithms – seem to be taking roles we thought were reserved for specialists, spanning medicine to engineering. Maybe Warren Bennis, the famed University of Southern California Marshall School of Business professor was right about the “factory of the future.” He claimed, “The factory of the future will have only two employees, a man and a dog. The man will be there to feed the dog. The dog will be there to keep the man from touching the equipment.” However, now there is a push in the other direction – namely, that AI and ML systems do not replace the need for specialists, but rather augment their skills, especially when dealing with uncertainty. Researchers, analysts, and decision makers are not only interested in understanding their data, but also in understanding the uncertainty present in the data. Quantification, communication, and interpretation of uncertainty are all necessary for the understanding and control of the impact of variability. These three things – quantification, communication, and interpretation of uncertainty – help add both understanding and robustness to the design process. In this talk, we will present recent work done at the University of Utah on the topic of “materials by design” – our attempts to engineer for specific purposes and operating conditions. We will focus on how AI/ML combined with visualization can aid in not only better designs, but also in producing better designers.
Enabling Creativity and Agility in the Workplace
Jeff Sharpe, Global Lead & Principal Director – Architecture & Places, frog design
Organizations in every industry are scrambling to respond to the rapid technological advancements and changing consumer expectations that are challenging their relevance and ability to deliver innovative solutions. From startups reinventing or creating new markets seemingly overnight, to increasing expectations to deliver responsible and relevant products and experiences, enterprises are looking both inward and outward for a path forward. It is essential across industries to unlock new ways of working so that organizations can bring innovative products and services to market and stay relevant with their customers and employees. We have helped many of our clients reinvent the way they work by embracing creativity and collaboration as the foundation of their future. This takes a commitment to an aligned purpose, and to understanding your staff and empowering them to work in the way they know best. It means giving them the right tools and the right space. It also means a shift in language, behavior, and the way we measure success. We must think holistically and design for a future that we are creating. We will share our work with companies that are leaders in their industries – in particular, by putting the focus on creativity, not innovation. Participants will come away with the perspective necessary to champion and lead these efforts within their own organizations.
Robots as a Service: See How Automated Drones Can Monitor Operations in Real Time
Rick Baker, Chief Revenue Officer, Airobotics
The emerging technology of the Robots as a Service business model has the potential to revolutionize the way companies operate. Airobotics has selected Houston to launch this world-first initiative with its “robot” (the Optimus drone), Airbase docking station, and artificial intelligence (AI)-enabled analytics. Unlike traditional drone programs, which rely on drone pilots, Airobotics’ fully automated system has the unique capability of swapping its own batteries and payloads using a robotic arm. By combining automated aerial data collection with in-house data processing algorithms, Airobotics’ fully automated drone system enables the delivery of customer insights within hours. It no longer takes days or weeks between data collection, processing, and final delivery of insights. Insights derived from an automated platform can offer ways to optimize the operations and maintenance of physical assets, systems, and processes in a site’s day-to-day operations – from inspecting key assets and infrastructure to increasing safety, security, and compliance through real-time video data. One key example has been with a major operator in Australia, where our solution has become an integral part of the daily mining process. In fact, an entire Mining Playbook has been created that outlines key data to be captured and monitored, as well as safety and training exercises that are monitored through live video feed from the drone. Efficiencies, safety protocols, and improved processes are resulting from this being incorporated into the “business as usual” model for Area C. For all the talk of automation and robots replacing human jobs, that is hardly the case with Airobotics’ technology. Rather, an automated, pilotless system can take over dangerous, time-consuming tasks and move workers upstream. As a result, companies can cut costs while dramatically reducing risk, increasing efficiency, and even saving lives.
Counting Alligators and Catching Crooks: Stories from a Life Misspent in the Swamp of Data
R. Mohan Srivastava, President, Srivastava Consulting
Data surrounds us. We count, we measure, and we create databases to help describe, analyze, and interpret our world, and to predict what we can’t see: the future, and things hidden by design or obscured by nature. Though data science tools are amenable to automated recipes, many data science problems still require the spice of creativity and imagination. This talk looks at the magic of data science through four very different projects from the files of Mo Srivastava, a geostatistician who has spent a career pondering data, mostly from earth sciences, but sometimes from things found in a corner grocery store. In 2011, Mo was written up in Wired for breaking an instant scratch lottery. Pattern recognition and a data transformation common in geostatistics led him to a trick for separating winners from losers without scratching anything off. After teaching the trick to his young daughter, Mo reported the problem to the lottery corporation, which pulled the game off the market. In the 1990s, Canada’s nuclear waste program recognized the need for earth modeling tools that supported studies of flow and contaminant transport in a way that was not only technically sound but that would also inspire confidence and comfort in public hearings. Mo developed a new method for modeling fracture networks that honored all available data with a high degree of visual realism. When the State of Florida wanted consulting advice on its census of Everglades alligators, Mo signed on – partly because he had developed methods for leveraging secondary information to model things that are hard to see (like healthy alligators) … but also because he wanted to blast around in a giant fan boat. Mo’s final example is based on the work of Kim Rossmo, the first beat cop in Vancouver to earn a PhD in criminology. Rossmo’s “geographic profiling” toolkit assists in investigations of serial crimes. Mo adapted a data declustering method from geostatistics to enhance a tool that aims to identify something that serial criminals want to keep secret: where they live.
How Accenture ‘Wise Pivoted’ to the Future
Sacha Abinader, Managing Director, Accenture
The oil and gas industry has survived the largest downturn in history and faces an uncertain future going forward. There are lessons from other industries and companies that can provide a roadmap for energy companies as they plan for the future. Accenture faced a series of existential threats in the first decade of the millennium. We live in an age that demands companies stay in a permanent state of change. The digital age calls for a new approach to organizational change that enables companies to make a wise pivot successfully. This approach requires companies to:
- TRANSFORM THE CORE BUSINESS … to drive up investment capacity
- GROW THE CORE BUSINESS … to sustain the fuel for growth
- SCALE NEW BUSINESS … to identify and scale new growth areas at pace
This session will focus on Accenture’s “wise pivot” transformation, some of the key initiatives and imperatives to support the transformation, and what others can learn from our own experience.
SmartSuit for Extravehicular Activity: Current Challenges and Spacesuit Technology Development to Enable Future Planetary Exploration Missions
Ana Diaz Artiles, Assistant Professor, Texas A&M University
Extravehicular activity (EVA) is one of the most challenging activities that astronauts perform in space. NASA’s current gas-pressurized spacesuit, the Extravehicular Mobility Unit (EMU), is used in a microgravity environment and has not been designed to operate in different conditions, such as planetary surfaces. Additionally, gas-pressurized spacesuits require astronauts to use their strength to move the suit, which can be fatiguing and can significantly affect the metabolic cost of human movement and locomotion in space. In particular, the current EMU spacesuit causes musculoskeletal injuries and discomfort in many astronauts, which could lead to suboptimal EVA performance and negatively impact mission success. In this talk, we will review the challenges of current spacesuits, and introduce future concepts and technologies for upcoming planetary missions. In particular, we will introduce the SmartSuit, a new spacesuit concept that our lab is investigating in collaboration with Prof. Robert Shepherd at Cornell University. The SmartSuit concept incorporates soft robotics technology to provide better mobility, and a soft, stretchable, self-healing skin with optoelectronic sensors for enhanced safety and interaction with the astronauts’ surroundings. We expect this novel, intelligent spacesuit architecture for EVA operations to increase human performance by an order of magnitude on several quantifiable fronts for exploration missions on Mars and in other planetary environments.
Automated Artificial Intelligence – The Next Era of Data Science
Rajiv Shah, Data Scientist, DataRobot
The value of artificial intelligence (AI) is widely recognized, but most organizations struggle with finding data scientists and getting data science models deployed. DataRobot addresses these issues through automation. Automation allows organizations to improve their data science productivity. Automation also allows for a wider group of people to build data science models. To demonstrate these ideas, this presentation will cover how organizations can leverage automation to build models on a larger scale (thousands of models), while also achieving improved accuracy and rapid deployment. Automation also means less of a reliance on hand-coding, which opens analytics to more than just data scientists. This talk will share how DataRobot’s automation is enabling organizations like the Global Water Challenge to address needs such as clean water in Liberia and Sierra Leone.
From the Edge to the Cloud – A Single Data Platform
Robert Oberhofer, Head of Products, MongoDB
Cruising on the edge makes data management a challenge. In this talk, we will look at the challenges the cruise industry faces in collecting data in remote locations and synchronizing it across an extremely distributed system. Learn how a unified data platform and a Data as a Service (DaaS) strategy solve these challenges to enable digital transformation and drive business value.