Webinar #67: Rapid Forecasting of CO2 Plume Migration in Varied Geological and Engineering Scenarios utilizing Transfer Learning

Speakers: Dr. Siddharth Misra, Texas A&M University
Date: 3/2/2023

This webinar will present a method to significantly reduce the computational time required for 3D forecasting of CO2 plume migration using the Fourier Neural Operator (FNO) network and Transfer Learning (TL). The approach allows for reliable investigation of CO2 saturation and pressure evolution in large storage reservoirs, even under varying geological and engineering conditions.
The FNO network was trained on 72 scenarios, validated on 9, and tested on 9 using a 3D SACROC geomodel in which CO2 injection was fixed at specific locations and rates. Transfer learning was then applied to the FNO model to enable rapid CO2 forecasting under diverse and uncertain conditions. With 25 training scenarios and 10 validation scenarios, the transfer-learning implementation required only one-fourth of the training time, half of the data-generation time, and half of the data storage. The method achieved a forecasting time of 7 seconds with 2.5% error, and could handle additional geomodel realizations, diverse injection locations, and various injection rates.

In comparison, FNO alone took 12 seconds for one scenario, while the CMG simulator required 40 minutes. The proposed method thus provides an efficient and reliable means of predicting CO2 plume migration in large storage reservoirs under varying conditions.
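The transfer-learning step can be sketched in miniature (a toy linear model stands in for the FNO; the sizes, learning rate, and data here are illustrative, not the webinar's setup): weights fit on the base task initialize a short fine-tuning run on a small dataset from a shifted task, rather than training from scratch.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretraining": fit weights on the base task (72 scenarios in the talk;
# here a toy linear regression stands in for the FNO).
X_base = rng.normal(size=(72, 4))
w_true_base = np.array([1.0, -2.0, 0.5, 3.0])
y_base = X_base @ w_true_base
w_pre, *_ = np.linalg.lstsq(X_base, y_base, rcond=None)

# Transfer learning: start from the pretrained weights and take a few
# gradient steps on a small dataset from the new task (25 scenarios
# in the talk), instead of training a fresh model.
X_new = rng.normal(size=(25, 4))
w_true_new = w_true_base + np.array([0.2, 0.0, -0.1, 0.1])  # shifted task
y_new = X_new @ w_true_new

w = w_pre.copy()
lr = 0.05
for _ in range(200):
    grad = 2 * X_new.T @ (X_new @ w - y_new) / len(y_new)
    w -= lr * grad

err = np.linalg.norm(w - w_true_new)  # fine-tuned weights match the new task
```

The saving in the talk comes from the same idea at scale: the fine-tuning run needs far fewer scenarios and iterations than the original training.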

Webinar #65: Addressing the Computational Challenges for Modeling CO2 Storage with GPU

Speakers: Dr. Vincent Natoli, Dr. Leonardo Patacchini
Date: 12/15/2022

The process of CO2 storage, one of the key technologies put forward to address climate change, presents numerous challenges to traditional subsurface flow simulation. High cell-count grids are needed to accurately model localized phenomena, along with complex new physics for thermodynamic mixing, geomechanics, and geochemistry. This challenging combination stretches the capability of legacy simulators to the breaking point, and an alternative approach is required. Over the last decade, a new technology curve for high-performance computing (HPC) has emerged with the use of GPUs as general-purpose processors. In a chip-to-chip comparison with CPUs, the performance advantage of GPUs is nearly 10x. Stone Ridge Technology has harnessed this computational power in the development of ECHELON, its high-performance subsurface flow simulator. ECHELON demonstrates exceptional performance, scalability, and ability to treat high cell-count models on a diverse set of industrial petroleum reservoir assets, with a broad and rapidly expanding feature set. This talk will discuss trends in HPC and the energy industry that illustrate continuing strong demand for high computational intensity. It will highlight the capabilities of ECHELON that make it the ideal platform on which to build new computationally intense modeling capabilities for emergent technologies like the sequestration of greenhouse gases.

Webinar #62: Differentiable Programming: Bridging the Gap between Numerical Models and Machine Learning Models

Speaker: Dan O’Malley, Los Alamos National Laboratory
Date: 9/22/2022

Many subsurface flow applications involve components where physical laws are well understood and other components where the physical laws are either poorly understood or not applicable. Numerical modeling excels at the former, whereas interpolating data with machine learning (ML) excels at the latter, but neither approach can tackle both kinds of components simultaneously. Existing ML approaches to handling such components (often called physics-informed ML, or PIML) are minor tweaks to standard ML methods; e.g., PIML might use physics data for training, or a loss function that encourages the ML model to obey an equation without any accuracy guarantees. Tweaking black-box ML models is fundamentally limited because "big data does not interpret itself": meaningful, interpretable structure in models is a necessity to improve predictability, enable human understanding, and maximize the impact of small data. We show how Differentiable Programming (DP) enables us to meld trustworthy numerical modeling with trainable ML to enhance workflows for physical model development, inverse analysis, and machine learning.

Webinar #59: Uncertainty Quantification for Transport in Porous Media Using Parameterized Physics-Informed Neural Networks

Speaker: Cedric Gasmi, Stanford University
Date: 4/21/2022

We present a Parameterized Physics-Informed Neural Network (P-PINN) approach to tackle the problem of uncertainty quantification in reservoir engineering. We demonstrate the approach on immiscible two-phase flow displacement (the Buckley-Leverett problem) in heterogeneous porous media. The reservoir properties (porosity, permeability) are treated as random variables, whose distributions can affect dynamic quantities such as fluid saturation, front propagation speed, and breakthrough time. We explore and use to our advantage the ability of neural networks to interpolate complex high-dimensional functions. We observe that the additional dimensions resulting from a stochastic treatment of the partial differential equations tend to produce smoother solutions for the quantities of interest (distribution parameters), which is shown to improve the performance of PINNs. We show that, provided a proper parameterization of the uncertainty space, PINNs can produce solutions that closely match both the ensemble realizations and the stochastic moments. We demonstrate applications to both homogeneous and heterogeneous property fields and are able to solve problems that can be challenging for classical methods. This approach gives rise to trained models that are more robust to variations in the input space and can compete in performance with traditional stochastic sampling methods.
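The core PINN idea, minimizing a PDE residual plus boundary terms at collocation points, can be illustrated with a toy problem (u' = u, u(0) = 1, whose exact solution is exp(x)). A polynomial with an analytic derivative stands in for the neural network and autodiff; nothing here reproduces the P-PINN parameterization of the uncertainty space.

```python
import numpy as np

def model(c, x):
    # u(x) = sum_k c[k] * x^k  (polynomial stand-in for the network)
    return sum(ck * x**k for k, ck in enumerate(c))

def model_dx(c, x):
    # analytic derivative u'(x)
    return sum(k * ck * x**(k - 1) for k, ck in enumerate(c) if k > 0)

def pinn_loss(c, x_colloc):
    residual = model_dx(c, x_colloc) - model(c, x_colloc)  # PDE residual of u' = u
    bc = model(c, 0.0) - 1.0                               # boundary condition u(0) = 1
    return np.mean(residual**2) + bc**2

x = np.linspace(0.0, 1.0, 32)                  # collocation points
c_taylor = np.array([1.0, 1.0, 0.5, 1/6, 1/24])  # Taylor coefficients of exp(x)
c_bad = np.array([1.0, 0.0, 0.0, 0.0, 0.0])      # constant function u = 1
```

A near-solution (the truncated Taylor series) yields a loss orders of magnitude smaller than a function that satisfies only the boundary condition; training a PINN means driving this composite loss down over the network's parameters.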

Webinar #58: Deep Learning-Accelerated 3D Carbon Storage Reservoir Pressure Forecasting Based on Data Assimilation Using Surface Displacement from InSAR: LLNL Task4 Prototype

Speaker: Dr. Hewei Tang, LLNL
Date: 3/10/2022

In this talk, we will present the LLNL team's work on SMART Task 4: real-time history matching and forecasting for geologic carbon storage. Due to high drilling costs, GCS projects usually have spatially sparse measurements from a few wells, leading to high uncertainty in reservoir pressure prediction. To address this challenge, we use low-cost Interferometric Synthetic-Aperture Radar (InSAR) data as monitoring data to infer reservoir pressure build-up. We develop a 3D ES-MDA workflow to assimilate surface displacement maps interpreted from InSAR and to forecast the evolution of the reservoir pressure distribution. The computational efficiency of the workflow is boosted by deep learning-based surrogate models for surface displacement and reservoir pressure prediction. The workflow can complete data assimilation and reservoir pressure forecasting in half an hour on a personal computer.
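A minimal ES-MDA (Ensemble Smoother with Multiple Data Assimilation) update might look as follows for a toy linear forward model. The sizes, operator, and noise level are illustrative assumptions; the actual workflow assimilates InSAR-derived displacement maps through deep-learning surrogates.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ens, n_m, n_d = 100, 3, 5
G = rng.normal(size=(n_d, n_m))          # stand-in forward model d = G m
m_true = np.array([1.0, -0.5, 2.0])
sigma = 0.05                             # observation noise level
d_obs = G @ m_true + rng.normal(scale=sigma, size=n_d)

alphas = [4.0, 4.0, 4.0, 4.0]            # inflation factors; sum of 1/alpha = 1
M = rng.normal(size=(n_ens, n_m))        # prior ensemble of parameter vectors
for alpha in alphas:
    D = M @ G.T                          # forward-model predictions per member
    dM = M - M.mean(axis=0)
    dD = D - D.mean(axis=0)
    C_md = dM.T @ dD / (n_ens - 1)       # cross-covariance (params, data)
    C_dd = dD.T @ dD / (n_ens - 1)       # data covariance
    C_e = alpha * sigma**2 * np.eye(n_d) # inflated observation-error covariance
    K = C_md @ np.linalg.inv(C_dd + C_e)
    # perturb observations with inflated noise, then update each member
    d_pert = d_obs + rng.normal(scale=np.sqrt(alpha) * sigma, size=(n_ens, n_d))
    M = M + (d_pert - D) @ K.T

err = np.linalg.norm(M.mean(axis=0) - m_true)
```

Each pass nudges the ensemble toward the observations while the inflated noise keeps the spread from collapsing prematurely; after the final pass the ensemble approximates the posterior over the parameters.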

Webinar #56: Virtual Digital Twin for Real-Time Integrated Power Plant Control Room and Field Operations: Research, Training, and Education

Speaker: Dr. Stephen E. Zitney, Process Systems Engineering Research, National Energy Technology Laboratory (NETL), U.S. Department of Energy (DOE)
Date: 1/27/2022

The virtual digital twin is an important digitalization technology for meeting the challenge of reducing carbon emissions. Digital twins are virtualizations of physical plant assets that represent the structure and behavior of those assets in real life. NETL and its R&D partners have developed a virtual digital twin for a power plant with carbon capture and deployed it at NETL and the Advanced Virtual Energy Simulation Training And Research (AVESTAR®) Center at West Virginia University. The digital twin combines a real-time high-fidelity dynamic simulator for control room operations with an immersive VR/AR-based plant walkthrough environment for field operations. The virtual digital twin serves as a test bed to help address the critical R&D, workforce training, and education challenges facing the power industry on its drive toward cleaner, more flexible plant operations.

Webinar #51: Reservoir Analysis of a CO2 Sequestration Site: Experiment-Guided Field Scale Modeling 

Speaker: Similoluwa Oduwole, Master’s Candidate in Petroleum Engineering at Colorado School of Mines
Date: 10/14/2021

Geosequestration of CO2 requires guided transport to geologic formations, reliable quantification of their storage capacity, and safety assessment to ensure permanent retention of the injected fluid. We present a workflow based on the analysis of well logs, experimental insights, and reservoir simulation to better map the CO2 saturation distribution from time-lapse seismic data. CO2 saturation maps are typically derived from 4D seismic data using rock physics models to assess fluid saturation and pressure effects. Here we combine our rock physics understanding with reservoir flow simulations. We find that, in addition to fluid saturation changes, fluid reactions with the injected CO2 might better explain 4D seismic amplitude change maps.

Webinar #49: Developing CO2 Storage in Depleted Gas Fields

Speaker: Dr Filip Neele, CCS Applied Geoscientist, TNO
Date: 8/26/2021 

Depleted oil or gas fields offer interesting opportunities for the storage of CO2. They have a proven seal and trap, and, if the timing is right, production facilities could be re-used. In the Netherlands, three consortia are preparing plans for large-scale storage in offshore depleted gas fields. In this presentation, the challenges of developing safe injection scenarios are discussed. Because reservoir pressure is often low after production, the CO2 has to be brought from the high pressure in the transport pipeline to the low pressure in the deep subsurface. This process needs to be managed to avoid excessively low temperatures in the well or the reservoir. Additional challenges are encountered in monitoring such projects; examples are shown related to flow distribution over injection wells.

Webinar #47: National Risk Assessment Partnership Phase II Research: Tools and methods to quantify subsurface environmental risks at geologic carbon storage sites

Speaker: Dr. Robert Dilmore, National Energy Technology Laboratory
Date: 7/1/2021

A key to enabling commercial-scale deployment of geologic carbon storage (GCS) is ensuring that GCS sites will safely and effectively sequester large volumes of CO2 out of the atmosphere for hundreds of years or more. To address that need, the US DOE's Office of Fossil Energy and Carbon Management sponsored the National Risk Assessment Partnership (NRAP), a multi-year, multi-national-laboratory collaborative research project focused on developing and demonstrating methods and tools for quantitative assessment and management of subsurface environmental risks associated with GCS. Through NRAP's second phase of research, researchers have developed a computational toolset to support stakeholder decision making related to evaluating long-term containment effectiveness, assessing the risk of unwanted CO2 and brine migration, quantifying and managing potential induced-seismicity risks, and designing effective and efficient monitoring networks to detect potential leakage. NRAP has also released a set of (draft) recommended practices for assessing and managing leakage and induced-seismicity risks at GCS sites, and built a catalog of use cases that demonstrate those tools and methods. The NRAP research team promotes the testing, use, and refinement of these products by the international carbon capture, utilization, and storage research, development, and deployment community.

Webinar #40: CarbonSAFE Initiative

Speaker: Mary Sullivan, National Energy Technology Laboratory
Date: 5/6/2021

The Carbon Storage Program implemented by the U.S. Department of Energy’s (DOE) Office of Fossil Energy and managed by the National Energy Technology Laboratory (NETL) is helping to develop technologies that safely and permanently store carbon dioxide (CO2) without adversely impacting natural resources or hindering economic growth.


Since its inception in 1997, the Carbon Storage Program has been developing and advancing carbon capture and storage (CCS) technologies, both onshore and offshore, that will significantly improve the effectiveness of the technologies, reduce the cost of implementation, and be ready for widespread commercial deployment. The portfolio includes industry development projects, university research projects, national laboratory research (including research conducted at NETL), and international collaborations that leverage global expertise, test facilities, and field sites. The Program approaches these challenges through the integration of two components: (1) individual technologies developed through R&D, and (2) field lab testing sites.

Building upon almost two decades of knowledge and experience gained from the field lab testing projects and the Regional Carbon Sequestration Partnership (RCSP) Initiative, the Program initiated the Carbon Storage Assurance Facility Enterprise (CarbonSAFE) Initiative. CarbonSAFE addresses key gaps on the critical path toward CCS deployment by reducing the technical risk, uncertainty, and cost of a geologic storage complex for 50+ million metric tons of CO2 from industrial sources over a 30-year timeframe. The CarbonSAFE Initiative is taking a phased approach: (1) the Integrated CCS Pre-Feasibility phase, (2) the Storage Complex Feasibility phase, and (3) Site Characterization and CO2 Capture Assessment. Subject to the availability of funding, a fourth phase would include permitting and construction. A total of 13 projects were completed under Phase I, which provided high-level evaluations of potential CCS scenarios in regions throughout the United States. Six projects, now nearing completion, were awarded under Phase II to confirm the adequacy of storage complexes through initial site characterization. Phase III projects involve the entire integrated process, including a capture assessment and detailed site characterization, and begin the permitting process. These efforts are designed to lead to commercial-scale operations that demonstrate that long-term capture and storage can be performed safely and securely. This presentation will include the latest updates from the CarbonSAFE Initiative.


Webinar #35: HPC and AI/ML for Subsurface Applications

Speaker: Masashi Ikuta, NEC Corporation
Date: 3/18/2021

The presentation will begin by introducing the unique vector computer we have today. The speaker will explain what a vector processor is and how it can address subsurface applications, and will refer to benchmarking efforts with performance numbers.

Webinar #33: History Matching High Resolution Geologic Models: Some Recent Advances

Speaker: Dr. Akhil Datta-Gupta
Date: 3/4/2021

With the advances in subsurface characterization and imaging, petroleum reservoir models nowadays routinely consist of multimillion cells representing geologic heterogeneity. Reconciling such high-resolution geologic models with dynamic data such as transient pressure, tracer, multiphase production history, and time-lapse seismic data is by far the most time-consuming aspect of the workflow for geoscientists and engineers.


Although significant advancements have been made in this area, current industry practice still involves iterative trial-and-error methods and often utilizes arbitrary property multipliers that can lead to loss of geologic realism and poor performance predictions. In this talk I will briefly touch upon three aspects of the inverse problem related to history matching: (1) recasting flow and transport equations as tomographic equations using a high-frequency asymptotic approach and efficient solution of the forward problem, (2) parsimonious representation of geologic models via re-parameterization using basis functions, and (3) solution of the inverse problem via multi-scale, multi-objective optimization. Several field applications, including the post-combustion CO2 EOR project at the West Ranch Field, TX, will be presented to examine the power and utility of the approach.


Webinar #31: Fluid Monitoring Using Electromagnetics and Cloud/Artificial Intelligence: The devil is in the details

Speaker: Kurt Strack
Date: 1/21/2021

Reservoir monitoring to image fluid movement has long been very challenging, not only because of the technological hurdles but also because of the business model. To realize its highest value, it is important to be able to provide results as close as possible to real time to influence operational decisions.


Since the transition to green energy requires renewables (like geothermal), improved hydrocarbon production with a reduced carbon footprint (like EOR+), and carbon capture and sequestration (CCS), fluid monitoring is of great strategic value.

Over the last 20 years, we have developed a system of technologies to address the challenges in methodology, instruments, 3D modeling, and data delivery. After numerous 3D feasibility studies and several pilots, we are now able to receive data from our instruments in near real time from anywhere in the world while keeping all data synchronized.

Electromagnetics is especially suited to directly imaging fluid movements in reservoirs. When we started our first 3D feasibility studies over 20 years ago, we realized that neither the available equipment nor the software was sufficient. Today, to carry out a full survey, we start with a 3D modeling feasibility study for either EOR or CCS applications and combine it with local noise measurements to design the optimum electromagnetic array, which is then deployed in the field. Since electromagnetic sensors are sensitive to radio and cell-phone signals, sending data in real time to the Cloud is tricky but achievable. Next, the 3D modeling algorithm is replaced with Artificial Intelligence (AI) to provide 3D models in near real time. Our ultimate vision is to do all of the acquisition, processing, and 3D imaging in real time, directly from a cell phone.


Webinar #19: CASERM – an Industry/Agency/Academic Research Center for Subsurface Resource Science

Speakers: Dr. Thomas Monecke and Dr. Erik Westman
Date: 7/16/2020

CASERM is an NSF-funded I/UCRC with the purpose of developing fundamental knowledge that transforms the way geoscience data is used to locate and characterize subsurface earth resources. Work within the center leverages expertise in geology, data science, and mining engineering to integrate diverse geoscience data, inform decision making, and minimize geological risk. Although less than two years old, the center is supported by seven members including Rio Tinto, AngloGold Ashanti, and the USGS. This presentation will introduce the center, its objectives, and the expertise available.

Webinar #18: A Case Study for Optimizing Shut-In Strategies During Hydraulic Fracturing Operations in the Bakken

Speaker: Dr. Ali Shahkarami
Date: 7/9/2020

Data science techniques have proven useful with the high volume of data collected in unconventional reservoir development workflows. In this paper, we present an analytics and machine learning use case for operations to minimize deferred production and quantify long-term production impacts due to frac hits in the Bakken and Three Forks formation during infill development.


Shutting in too many wells can be the largest expense incurred by a new completion, as operators not only work over the offset wells but also defer production for the entirety of the completion job. The outcomes of this study used significant amounts of data to provide the operator with a more efficient shut-in strategy that can help save capital expenses and optimize the rate of return on investment.
The use case applied a workflow to a large field dataset. We underscore that historical data can be used to quantify zonal communication and to provide recommendations for a shut-in radius in future operations. With this novel approach, we analyzed several well pads in the Bakken basin, all in close proximity. The analysis included the following datasets: static geological/formation data, completion data, one-second pressure data, and production history. The method used in this study can be defined as a three-step process: 1) employing analytics to assess and evaluate fracture-driven interferences during the completion of new infill wells; 2) quantifying the long-term production impact that may occur after shutting in an offset well; and 3) applying machine learning techniques to determine the optimal offset distance and degree of communication.
Pressure data from the offset monitoring wells were used to determine the presence of fracture-driven communication between wells during a completion operation. Production data were also utilized to quantify the long-term impact of shut-ins and fracture-driven interferences. Machine learning techniques were then applied to measure the influence of offset distance (and other parameters such as completion design, depletion history, and zonal variance) on communication. The results of the analysis indicated the distance at which communication occurs most often in offset wells during the hydraulic completion of new infill wells. Considering this information, an optimized shut-in distance was proposed for offset wells in the area, reducing the previous radius by 250-550 ft and thus improving production metrics.


Webinar #17: Harnessing the Power of DOE Data Computing for End-user Analytics

Speakers: Kelly Rose, Aaron Barkhurst, MacKenzie Mark-Moser, Lucy Romeo, and Patrick Wingo
Date: 6/25/2020

The Offshore Risk Modeling (ORM) suite comprises innovative science- and data-driven computational tools and models designed to predict, prevent, and prepare for future oil spills. This R&D 100 award-winning suite of models, tools, and associated big datasets spans the full engineered and natural offshore system: from the subsurface, through the water column, to the coast.


Progressing beyond current individual tools and disparate data, the ORM suite can be used to holistically assess offshore systems to mitigate key knowledge and technology gaps, and it has been validated and adopted for use by domestic and international regulators, researchers, and industry. This presentation will briefly discuss how this comprehensive suite, including several key tools, has been optimized for predictions, analyses, and visualizations through online common operating platforms hosted on EDX. In addition, we will discuss how ORM was designed to incorporate data with system-wide, science-driven models and tools in a secure, coordinated system, both internally for NETL and for inter-agency assessment and evaluation. ORM is an expanding "one-stop shop" for data, tools, and models to predict hazards and mitigate challenges, helping solve many of the offshore energy industry's most demanding and complex issues. To learn more about this effort and related work, please see https://edx.netl.doe.gov/offshore/portfolio-items/risk-modeling-suite/


Webinar #15: IOT, Real-Time Analytics and Machine Learning: An Operator’s Perspective

Speaker: Dingzhou Cao, WPX Energy
Date: 5/14/2020

Digital transformation has been a buzzword in upstream oil and gas for a while. In this webinar, the author will discuss the real-time analytics system he built at Anadarko and the similar system he is now building at WPX, covering the technical details, the pros and cons of each system, and the analytics models (including machine learning models) involved.

Webinar #14: Fusion of Physics with Analytics for Oil and Gas

Speaker: Dr. Sathish Sankaran, Xecta Digital Labs
Date: 5/7/2020

With the ongoing digital transformation in the oil and gas industry, recent years have seen a significant surge in data collection and, subsequently, in the use of data-driven models for fast computations. However, several of these applications spanning drilling, completions, reservoir, and production engineering share a few common traits: limited extrapolation, limited explainability, and the need for a lot of (often expensive) training data.


While we have used physics-based methods to derive white-box equations to understand and analyze hydrocarbon systems for over a century, they often come at the price of computational complexity and expensive data collection. Are there viable alternatives that can effectively blend the domain knowledge of reservoir physics with data-driven models to improve generalization and interpretability? From a practical standpoint, the limited availability of reliable tools and poor understanding of the strengths and weaknesses of these methods are hurdles for any meaningful adoption by practitioners. In this talk, we will cover some recent approaches of hybrid models that combine physics and analytics, with applications to multiphase flow through reservoir porous media and pipelines. We will also identify some strategic areas from the industry perspective where future research work could be directed to improve oil and gas operations in a sustainable manner.


Webinar #13: SEAM – SEG Collaborative Research – History and Way Forward

Speakers: Dr. Josef Paffenholz and Dr. Michael Oristaglio
Date: 4/30/2020

Former chair of the SEAM board of directors Josef Paffenholz and project manager Mike Oristaglio will talk about the history, administrative structure, technical accomplishments to date, and future plans of the SEG Advanced Modeling (SEAM) Corporation.

Webinar #12: What Hath Reverend Bayes Wrought: Powerful Probabilistic Inference on Your Laptop

Speakers: Dennis Buede, Ph.D. and Joseph Tatman, Ph.D.
Date: 4/23/2020

Bayesian networks are an algorithmic implementation for (1) defining a joint probability distribution and (2) engaging in both probabilistic and Bayesian inference as new information becomes available. This webinar introduces an exemplary Bayesian network, touches on the underlying mathematics, and provides some real-world examples. The real-world examples are diagnosing equipment malfunctions and classifying tunnels.
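A two-node example of the kind described, a hypothetical fault/alarm diagnosis (the probabilities are illustrative, not from the webinar), shows how a posterior is obtained by enumerating the joint distribution:

```python
# Two-node Bayesian network: Fault -> Alarm.
# Prior on the fault, and the alarm's conditional probability table.
p_fault = {True: 0.01, False: 0.99}
p_alarm_given_fault = {True: 0.95, False: 0.05}  # P(alarm=True | fault)

def joint(fault, alarm):
    # Joint probability factorizes along the network: P(fault) * P(alarm | fault)
    pa = p_alarm_given_fault[fault]
    return p_fault[fault] * (pa if alarm else 1 - pa)

def posterior_fault(alarm):
    # Bayesian inference by exact enumeration:
    # P(fault=True | alarm) = joint(True, alarm) / sum_f joint(f, alarm)
    num = joint(True, alarm)
    den = sum(joint(f, alarm) for f in (True, False))
    return num / den

post = posterior_fault(True)  # probability of a fault once the alarm sounds
```

Even with a loud alarm, the low prior keeps the posterior modest (about 0.16 here); real diagnostic networks chain many such nodes, and the same enumeration idea generalizes, with smarter algorithms handling the combinatorics.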

Webinar #11: Evaluating Variable Importance in Black Box Models: A Comparison of Strategies

Speaker: Dr. Jared Schuetter, Battelle
Date: 4/16/2020

Data-driven models built using machine learning techniques are increasingly being used for many oil and gas applications, e.g., geologic characterization, drilling optimization, production data analysis, reservoir management, predictive maintenance, etc. Because of the “black box” nature of these models, an explicit relationship cannot generally be extracted between the input variables (predictors) and output variables (responses).


As such, ascertaining the relative importance of different predictors or “features” in terms of their impact on the response variable(s) of interest is not straightforward. To that end, we conducted a comparative assessment of different model agnostic variable/feature importance strategies for black-box models suitable for oil and gas problems. These methods include two variants of R2 loss, partial dependence and accumulated local effects (ALE) plots, local interpretable model-agnostic explanations (LIME), and Shapley additive explanations (Shapley Values). The different approaches were tested on a closed reservoir simulation and compared across time to understand which reservoir characteristics mattered the most as the simulation unfolded.
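Among the surveyed strategies, permutation importance is the simplest to sketch: shuffle one predictor at a time and measure the drop in R2. The data and the least-squares "black box" below are toy stand-ins, not the models from the study.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
X = rng.normal(size=(n, 3))
# Feature 0 dominates, feature 1 is weak, feature 2 is irrelevant.
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=n)

# "Black-box" model: here an ordinary least-squares fit.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(A):
    return A @ w

def r2(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1 - ss_res / ss_tot

base = r2(y, predict(X))
importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])   # break the feature-target link
    importance.append(base - r2(y, predict(Xp)))
```

The ranking recovers the construction: permuting the dominant feature collapses R2, the weak feature costs a little, and the irrelevant one costs essentially nothing. Model-agnostic methods like this need only a `predict` function, which is what makes them applicable to any black box.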


Webinar #10: Physics-informed Machine Learning for Real-time Unconventional Reservoir Management

Speaker: Dr. Hari Viswanathan
Date: 4/9/2020

We present a physics-informed machine learning (PIML) workflow for real-time unconventional reservoir management. Reduced-order physics and high-fidelity physics model simulations, lab-scale and sparse field-scale data, and machine learning (ML) models are developed and combined for real-time forecasting through this PIML workflow.


These forecasts include total cumulative production (e.g., gas, water), production rate, stage-specific production, and the spatial evolution of quantities of interest (e.g., residual gas, reservoir pressure, temperature, stress fields). The proposed PIML workflow consists of three key ingredients: (1) site behavior libraries based on fast and accurate physics, (2) ML-based inverse models to refine key site parameters, and (3) a fast forward model that combines physical models and ML to forecast production and reservoir conditions. First, synthetic production data from multi-fidelity physics models are integrated to develop the site behavior library. Second, ML-based inverse models are developed to infer site conditions and enable the forecasting of production behavior. Our preliminary results show that the ML models developed through the PIML workflow give good quantitative predictions (R2 scores above 0.9). In terms of computational cost, the proposed ML models are O(10^4) to O(10^7) times faster than running a high-fidelity physics model simulation to evaluate the quantities of interest (e.g., gas production). This low computational cost makes the proposed ML models attractive for real-time history matching and forecasting at shale-gas sites (e.g., MSEEL, the Marcellus Shale Energy and Environmental Laboratory), as they are significantly faster yet provide accurate predictions.
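The three ingredients can be caricatured with a toy physics model (an exponential production decline). The nearest-neighbor "inverse model," the parameter names, and all sizes below are illustrative assumptions, not the actual PIML implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 5.0, 20)

def physics(q0, k):
    # Toy physics model: production decline q(t) = q0 * exp(-k t)
    return q0 * np.exp(-k * t)

# (1) Site behavior library built from the (reduced-order) physics model.
ks = np.linspace(0.1, 2.0, 200)
library = np.array([physics(1.0, k) for k in ks])

# (2) Inverse model: infer the decline parameter k from observed production
# (here a simple nearest-neighbor lookup into the library).
def infer_k(q_obs):
    i = np.argmin(np.sum((library - q_obs) ** 2, axis=1))
    return ks[i]

# (3) Fast forward model: forecast using the inferred parameter.
k_true = 0.73
q_obs = physics(1.0, k_true) + rng.normal(scale=0.005, size=t.size)
k_hat = infer_k(q_obs)
forecast = physics(1.0, k_hat)
```

The speedup claimed in the talk comes from step (3): once the library and inverse model exist, forecasting is a table lookup plus a cheap model evaluation rather than a high-fidelity simulation.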


Webinar #8: Machine Learning Reveals Surprising Findings

Speaker: Dr. Paul Johnson, Los Alamos National Lab
Date: 3/12/2020

Our recent work applying machine learning tools to geophysical data sets is leading to remarkable results. Seismic signals we once regarded as noise turn out to be the most important signal in the system for probing fault physics, revealing far more information about the instantaneous and future behavior of faults.


Parallel work applying the same approach to fluid-induced fractures shows similar results: we can probe the physics of an evolving hydraulic fracture at all times using continuous seismic waves. But there's more. We find that by applying convolutional neural networks (CNNs) to denoise InSAR data, we can obtain an order of magnitude better resolution than an InSAR expert. The approach can be applied to stick-slip, slow slip, and surface inflation and deflation. We are including InSAR-measured surface displacements as a constraint on the stress model we are developing through a joint inversion of seismic and gravity data. For all of these problems, we are beginning a visualization effort in which we imagine entering an evolving Earth model, such as a stress model containing fluid forcing, fracture, displacement, etc. We further envision the ability to hear and see seismic events, observe fluid flow through fractures, and synthesize useful characteristics from high-dimensional space. That information could be local stress-displacement, fluid volume over time, etc. I will present a brief overview of these topics.


Webinar #4: Top-Down Modeling – Data-Driven Reservoir Simulation & Modeling using AI and Machine Learning

Speaker: Dr. Shahab D. Mohaghegh, WVU
Date: 2/13/2020

To efficiently develop and operate a petroleum reservoir, it is important to have a model. Currently, numerical reservoir simulation is the accepted and widely used technology for this purpose. Data-Driven Reservoir Modeling (also known as Top-Down Modeling, or TDM) is an alternative to numerical simulation.


TDM uses AI and machine learning to develop full-field reservoir models based on facts (field measurements) rather than a mathematical formulation of our current understanding of the physics of fluid flow through porous media. Unlike other empirical technologies that forecast production based on curve fitting (decline curve analysis), or on the combination of production/injection data alone, TDM integrates all available field measurements, such as well locations and trajectories, details of completions and stimulations, well logs, core data, well tests, seismic, operational constraints (wellhead pressure and choke setting), and production/injection history, into a cohesive, comprehensive, full-field reservoir model. TDM simultaneously history matches dynamic variables such as oil production, gas-oil ratio, water cut, reservoir pressure, and water saturation through a completely automated process using only wellhead (not FBHP) data as input. TDM's automated history matching is based on all field measurements. Upon completion of its history matching, and prior to its use for forecasting and field development planning, TDM is blind-validated in time and space. This data-driven reservoir simulation and modeling technology has been field validated in multiple onshore and offshore fields in the Middle East, Southeast Asia, the North Sea, the Caspian Sea, and Mexico. The novelty of Data-Driven Reservoir Modeling stems from the fact that it is a complete departure from traditional approaches to reservoir modeling. Fact-based, data-driven reservoir modeling manifests a paradigm shift in how reservoir engineers and geoscientists model fluid flow through porous media. In this new paradigm, the current understanding of physics and geology is substituted with facts (data/field measurements) as the foundation of the model.
This characteristic of TDM makes it a viable modeling technology for unconventional (shale) assets, where the physics of hydrocarbon production (in the presence of massive hydraulic fractures) is not yet well understood.