
Example of better sweep efficiency due to polymer flooding.
Maps of water saturation changes (ΔSw), synthetic example. (a) 4D seismic. Average of the results of the simulation models before (b) and after (c) a probabilistic seismic history matching.

4D seismic is a tool for reservoir monitoring. By repeating seismic surveys during the production of a field, it is possible to identify changes caused by production, such as pressurized regions, areas flooded by water injection, etc.

The spatial information (maps of changes) provided by 4D seismic data is extremely important for updating simulation models, as it is complementary to the well data (fluid rates and pressure) traditionally used in the history matching process. While well data are rich temporally (usually monthly measurements) and poor spatially (restricted to well locations), 4D seismic data are usually poor temporally (a few surveys over the life of the field) but rich spatially (maps covering the extent of the field). Thus, the great value of 4D seismic lies in its ability to provide information between the wells. For instance, this information may help locate bypassed hydrocarbons, optimize infill drilling and manage injection, with considerable impact on field management.

4D seismic data can be integrated with simulation models qualitatively and quantitatively. In both cases the aim is to adjust the simulation model so that it generates responses similar to those observed in 4D seismic (pressure and saturation changes). In the qualitative approach, the simulation models are updated manually and 4D seismic data are used as a guide for adjusting model attributes such as fault transmissibility and permeability. In the quantitative approach, the seismic data are included in the objective function of an assisted (or automatic) history matching, and the matching procedure is performed by optimization algorithms.
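As a minimal sketch of the quantitative approach, an objective function can combine the temporal well-data misfit with the spatial 4D-seismic map misfit. All names, weights and data below are illustrative, not the actual UNISIM formulation:

```python
import numpy as np

def history_matching_misfit(obs_rates, sim_rates, obs_dsw_map, sim_dsw_map,
                            w_well=1.0, w_seis=1.0):
    """Illustrative combined objective function: well-data misfit term
    plus a 4D-seismic saturation-change (dSw) map misfit term."""
    # Temporal misfit at the wells (e.g., monthly rate measurements)
    well_term = np.mean((obs_rates - sim_rates) ** 2)
    # Spatial misfit between observed and simulated dSw maps
    seis_term = np.mean((obs_dsw_map - sim_dsw_map) ** 2)
    return w_well * well_term + w_seis * seis_term

# Toy check: a model whose simulated map is close to the observed one
# scores a lower misfit than one that misses the flooded region entirely.
obs_map = np.zeros((10, 10))
obs_map[2:5, 2:5] = 0.3            # water-flooded patch seen on 4D seismic
good_map = obs_map + 0.01          # nearly matching simulated map
bad_map = np.zeros((10, 10))       # simulated map with no flooding
rates = np.ones(12)                # identical well responses in both cases
print(history_matching_misfit(rates, rates, obs_map, good_map) <
      history_matching_misfit(rates, rates, obs_map, bad_map))  # True
```

An optimization algorithm would then perturb the model attributes (permeability, fault transmissibility, etc.) to minimize this combined misfit.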

Although the capacity of 4D seismic to improve reservoir models is well known, there are still many challenges, especially concerning quantitative data integration. In this sense, the research developed in UNISIM is focused on the quantitative use of 4D seismic to update simulation models, aiming at reducing reservoir uncertainties. We also propose different forms of integration between these two types of data to get the most out of them, including unconventional approaches such as the use of engineering constraints to assist the interpretation of geophysical data.

UNISIM-I-D Benchmark Production System (UNISIM, 2015)

Most of the oil reserves in Brazil are offshore, including heavy oil in the Campos Basin and carbonate Pre-salt fields in the Santos Basin. One of the main challenges in developing these reservoirs is integrating reservoir simulation with production facilities (Integrated Asset Modeling, IAM) in order to develop and manage petroleum fields more accurately.

In this area, several methodologies to model the integration of reservoirs and production systems have been applied in the oil and gas industry in recent years, driven by the need to properly model, as one integrated solution, the flow of fluids from the reservoir up to the surface.

The objectives of the research project are: (1) to evaluate and apply numerical coupling between reservoirs and production facilities, and (2) to evaluate optimization techniques applied to field management and production strategy definition, including operational restrictions and multiple reservoirs producing to common surface facilities, especially in Pre-Salt and mature heavy oil field production scenarios.

A validation study of these coupling methodologies is needed, in which the production system will be tested under common operating conditions during production and injection of fluids, verifying the benefits and limitations of each coupling methodology.

In application cases, the research will investigate improvements to field development projects through production rate management, using decoupled, explicit and implicit approaches to couple the reservoir simulator with adequate surface network models. This covers production system modeling, subsea and surface networks, separation and treatment facilities and flow assurance controls within the integrated modeling context, and their impact on production optimization.
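The explicit approach can be illustrated with a toy model: at each timestep the surface network is evaluated with the rate from the previous step (the lagged, "explicit" value), and the reservoir is then advanced with the resulting rate. All physics and numbers below are illustrative stand-ins, not a real simulator coupling:

```python
# Toy explicit coupling of a tank-model reservoir and a surface network.
# All coefficients (ct_v, pi, friction, p_sep) are illustrative constants.

def reservoir_step(p_res, rate, dt, ct_v=50.0):
    """Tank material balance: pressure declines with produced volume."""
    return p_res - rate * dt / ct_v

def well_rate(p_res, p_bhp, pi=0.5):
    """Inflow performance: rate proportional to drawdown, never negative."""
    return max(pi * (p_res - p_bhp), 0.0)

def network_bhp(rate, p_sep=20.0, friction=0.1):
    """Surface network: required bottom-hole pressure grows with rate."""
    return p_sep + friction * rate

def explicit_coupling(p_res=200.0, dt=1.0, steps=100):
    """Explicit scheme: the network sees the previous timestep's rate."""
    rate = 0.0
    history = []
    for _ in range(steps):
        p_bhp = network_bhp(rate)       # network solved with the lagged rate
        rate = well_rate(p_res, p_bhp)  # reservoir inflow at that BHP
        p_res = reservoir_step(p_res, rate, dt)
        history.append((p_res, rate))
    return history

hist = explicit_coupling()
print(hist[-1])  # pressure and rate decline toward the separator constraint
```

An implicit coupling would instead iterate the network and inflow equations to convergence within each timestep, which is more stable but more expensive; the decoupled approach would run the two models independently and exchange boundary conditions only at coarse intervals.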

This project involves the whole decision-making process linked to the choice of the production strategy of oil fields, considering uncertainties of various types, risk mitigation and integration with production systems.

Integrated models will be developed based on reservoir study cases from the UNISIM benchmarks, to support decision analysis regarding production strategy selection in Brazilian Pre-Salt and heavy oil field scenarios.

The success of this project can foster the creation of a new research area at Unicamp, including new projects and new perspectives in the graduate program, allowing the University to train students with better knowledge of the integration of reservoir simulation and production engineering.

This new area is extremely important for the Pre-Salt fields, which are offshore and include plans to share production in common surface facilities. It is equally important for the mature heavy oil fields of the Campos Basin, with their high water production, where integration will help to increase oil production.

Example of a proxy model application, constructed from a predefined number of simulations (run before the optimization) to cover the entire solution space during production optimization, allowing the decision maker to choose between maximizing the net present value (NPV) and minimizing cumulative water production (Wp).

Proxy Models

The steps of petroleum field management require high effort and computational time, especially for complex, highly heterogeneous models and for probabilistic problems. Processes such as upscaling, uncertainty quantification, risk analysis, history matching, optimization, production forecasting and decision making may require many simulation runs to find quantified and matched models based on the available data, in order to provide a reliable production forecast for a petroleum field.

Another important point is that production forecasts from history-matched numerical reservoir models are almost invariably "wrong", in the sense that the predictions usually stop matching the observed production after a relatively short period of time. What is not clear is how wrong they might be in any given scenario, and this fact has led to increased research on uncertainty quantification in oil reservoirs using mathematical and statistical approaches. In addition, understanding this uncertainty is crucial for the decision-making process at all levels of oil field management.

In recent years, fast models (proxy models), which are low-fidelity models built to "emulate" the numerical reservoir simulator, have been developed to speed up stages of the reservoir management process and consequently reduce decision-making time. In contrast to techniques based on representative solutions, proxy models built from mathematical and statistical approaches can, in theory, be fast and consistent enough to substitute for the simulator (the high-fidelity model) in the steps of a petroleum reservoir study. Another important point is that the quality of the generated proxy model strongly depends on the mathematical approach (algorithm) and on the observed (history) data used to build it.

The proxy models most used in the oil industry are: the multivariate kriging model (KG); artificial neural networks (ANN); polynomial regression models (PRM); response surface methodology (RSM); and emulators (EM).
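A minimal sketch of one of these, a polynomial regression proxy: run the expensive simulator on a small design of experiments, fit a low-order polynomial by least squares, and use the cheap polynomial in its place. The "simulator" below is a stand-in toy function of two hypothetical uncertain attributes:

```python
import numpy as np

# Stand-in for an expensive simulator response (e.g., cumulative production)
# as a function of two uncertain attributes; purely illustrative.
def simulator(x):
    return 3.0 + 2.0 * x[0] - 1.5 * x[1] + 0.8 * x[0] * x[1]

# Small design of experiments: 30 "simulation runs"
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))
y = np.array([simulator(x) for x in X])

# Second-order polynomial features for the regression proxy
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

# Fit the proxy coefficients by least squares
coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

def proxy(x):
    """Cheap surrogate evaluation in place of a simulation run."""
    return features(np.atleast_2d(x)) @ coef

# The proxy reproduces the simulator at a point not in the training design
print(proxy([0.3, -0.2])[0], simulator([0.3, -0.2]))
```

Because the toy response lies inside the polynomial feature space, the fit is essentially exact here; for a real simulator the proxy quality must be checked by cross-validation, as noted above for the observed data used to build it.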

Example of applying the emulation technique to build a risk curve for field cumulative water production (Wp) at 3,652 days, comparing the curves generated by the simulator and by the emulator. The risk curve was constructed after cross-validating the emulator against the simulator response and field history data.

Emulator

Despite the advances in proxy-model generation techniques over the past decade, there are still many barriers to their effective use in the stages of oil reservoir management. These include the high computational cost of quantifying probabilistic problems; effective ways to parameterize a geological model so that it is realistic and its models are easily matched to historical data; effective methods to calibrate numerical models from the available field data; analysis of the measurement errors of the various types of data to which the model is calibrated; and effective techniques for forecasting and decision making. In this context, the emulation technique arises.

This technique builds a statistical model of the errors occurring during the simulation of numerical reservoir models, and predicts both a mean error and a variance that are used to amend the misfit (or match quality) expression. By including these corrections in the misfit term, the responses estimated by the emulator tend to stay close to those that would be obtained using the numerical reservoir simulator. In addition to applying the emulator to the past (history) period, one can also use the emulation technique to quantify uncertainty around the production forecast.
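A minimal sketch of such a variance-amended misfit (names and numbers are illustrative): each residual between observation and emulator prediction is normalized by the combined observation and emulator variances, so that a prediction the emulator is uncertain about is penalized less than a confident one:

```python
import numpy as np

def emulator_misfit(obs, emu_mean, emu_var, obs_var):
    """Misfit amended by the emulator's predicted variance: residuals are
    scaled by the total (observation + emulator) variance per data point."""
    return np.sum((obs - emu_mean) ** 2 / (obs_var + emu_var))

# Toy data: same residuals, different emulator confidence
obs = np.array([1.0, 2.0, 3.0])
pred = np.array([1.2, 1.8, 3.5])
loose = emulator_misfit(obs, pred, emu_var=np.full(3, 1.0),
                        obs_var=np.full(3, 0.1))
tight = emulator_misfit(obs, pred, emu_var=np.full(3, 0.01),
                        obs_var=np.full(3, 0.1))
print(loose < tight)  # True: high emulator variance discounts the mismatch
```

This is why the emulator's responses tend to stay close to the simulator's: where its predicted error variance is small, any disagreement with the data dominates the misfit.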

The main challenge of this research is the speed and effectiveness with which the models can be calibrated by combining the Bayes linear emulation approach, developed at Durham University's Department of Mathematical Sciences by the Statistics and Probability group of Professor Michael Goldstein, Dr. Ian Vernon and Dr. Camila Caiado, with statistically accurate stochastic sampling algorithms. The key concept is to make every expensive simulation count: as we learn about the response surface through the Bayes linear methodology, we can target our expensive function evaluations in regions where the probability of finding a good-fitting model is high. Once a number of matched models are obtained from the emulation technique, production forecasts will be generated using a valid statistical methodology, which is another object of study along this line of research.
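The standard device for targeting evaluations in this kind of history matching is an implausibility measure: the standardized distance between an observation and the emulator's prediction for a candidate model, with candidates above a cutoff ruled out so that expensive simulations concentrate on the remaining region. A minimal sketch, with illustrative numbers:

```python
import numpy as np

def implausibility(obs, emu_mean, emu_var, obs_var, disc_var=0.0):
    """Standardized distance between an observation and the emulator's
    prediction; large values rule the candidate model out."""
    return np.abs(obs - emu_mean) / np.sqrt(emu_var + obs_var + disc_var)

# Three candidate models with emulator predictions for one observed quantity
cand_mean = np.array([9.5, 4.0, 10.2])   # emulator means per candidate
cand_var = np.array([0.2, 0.2, 0.2])     # emulator variances per candidate
obs, obs_var = 10.0, 0.1                 # observation and its error variance

# Keep only candidates below a cutoff (a value of 3 is commonly used)
keep = implausibility(obs, cand_mean, cand_var, obs_var) < 3.0
print(keep)  # → [ True False  True]
```

The surviving region is then re-sampled and re-emulated in successive waves, so each new batch of expensive simulations lands where good-fitting models are most likely.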

Figure – Critical locus of a pure hypothetical component. Source: Whitson and Brulé (2000)

Several of the decision analysis stages and activities that rely on reservoir simulation are traditionally based on a simpler reservoir formulation, the black-oil model, with three components and three phases. A black-oil model usually meets adequately the needs of forecasting and modeling for more common reservoirs, such as dry gas reservoirs, non-volatile oil reservoirs under primary depletion, reservoirs under water injection or immiscible gas injection, among others.

When the reservoir fluid is a retrograde condensate (with phase behavior close to the mixture critical point), or when the oil reservoir is subject to miscible gas enhanced oil recovery (EOR) or reactive EOR, the process physics makes black-oil models considerably less reliable. Therefore, it becomes important to assess the impact on reservoir simulation of more advanced phase-equilibria models based on equations of state, detailed experimental data, mass transport phenomena, chemical kinetics and rigorous thermodynamics.
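At the core of such EOS-based compositional simulation is the two-phase flash calculation. A minimal sketch of its Rachford-Rice step is shown below; the K-values are fixed illustrative constants, whereas in practice they come from the equation of state and are updated iteratively:

```python
# Rachford-Rice step of a two-phase flash: solve for the vapor fraction V in
#   sum_i z_i (K_i - 1) / (1 + V (K_i - 1)) = 0
# by bisection. z_i are overall mole fractions, K_i = y_i / x_i equilibrium
# ratios (here fixed toy values; an EOS would supply and update them).

def rachford_rice(z, K, tol=1e-10, iters=100):
    f = lambda V: sum(zi * (Ki - 1.0) / (1.0 + V * (Ki - 1.0))
                      for zi, Ki in zip(z, K))
    lo, hi = 0.0, 1.0          # f is monotone decreasing on this interval
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

z = [0.5, 0.3, 0.2]            # toy 3-component overall composition
K = [3.0, 1.2, 0.3]            # illustrative equilibrium ratios
V = rachford_rice(z, K)
x = [zi / (1.0 + V * (Ki - 1.0)) for zi, Ki in zip(z, K)]  # liquid phase
y = [Ki * xi for Ki, xi in zip(K, x)]                      # vapor phase
print(round(V, 4), round(sum(x), 4), round(sum(y), 4))
```

In a full compositional simulator this solve, together with EOS fugacity updates of the K-values, is repeated in every grid block at every timestep, which is one reason the cost grows so quickly with the number of components.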

Since it requires chemical and thermodynamic knowledge, compositional simulation is more demanding than its counterpart, black-oil simulation. It also has a much greater computational cost, since the number of equations to be solved per grid block grows with the number of components employed. The number of components typically varies from 5 to 12 or more, some of which are lumped pseudo-components that represent petroleum fractions.

Advances in the experimental, reservoir characterization, modeling and computing (hardware and software) areas have greatly improved the speed and reliability of compositional reservoir simulation, allowing its recent widespread adoption.

Everything that can be done for traditional reservoirs in the decision analysis steps with a black-oil simulator can and should be done with compositional simulators for other reservoirs, when the process physics requires it and enough computational power and numerical efficiency are available.

One important aspect to emphasize is that the UNISIM group's research has the main objective of developing methodologies for application to practical cases. In general, the main aspects that guide the group's compositional research are: (1) the physical and chemical understanding of water-alternating-gas (WAG) enhanced oil recovery (EOR) from the reservoir engineering point of view; (2) the integration of experimental phenomenological data into compositional (mechanistic) simulation models; (3) the study of real or near-real cases for compositional subcases; and (4) the modeling and validation of phase equilibria in software for several applications.

One important concept in compositional research is integration, since it is a highly multidisciplinary activity. Accordingly, the research seeks ways to promote a more effective integration with reservoir characterization, fluid uncertainty analysis, and the research line focused on coupled reservoir and production systems.