Published on: 06/13/2017
Reservoir simulation involves the mathematical manipulation of a significant amount of information, most of which is subject to uncertainties arising from factors such as insufficient measurements, lack of accuracy, and spatial heterogeneities, as previously discussed. Consequently, it is imperative to quantify the impact of variations in these parameters on the simulation workflows (modeling, analysis, and interpretation) to improve asset management and maximize return on investment (ROI). This effectively supports the design of field strategies and the parameterization of history matching.
In decision-making, a decision variable is one that the decision-maker has control over, and an optimal strategy is one that optimizes the value of an objective function. Since sensitivity analysis generates new information about alternative strategies, it allows the quality of the process outputs to be improved.
During the history matching process, the simulated parameters are constantly reviewed and justifiable changes are made, when necessary, to produce the best possible match between the simulated model and the actual field performance extracted from historical field data. As previously mentioned, this has commonly been done by minimizing a known objective function. Although this function quantifies the quality of the history match, it does not directly describe how each parameter introduces errors into the simulation, nor how these parameters should be modified. In this context, sensitivity analysis is widely employed to provide a better understanding of the degree of impact such parameters can have on the accuracy of history-matched models. This understanding, coupled with the experience of the scientists and engineers involved in the process, can lead to significant solutions to these issues.
Fanci argues that any method that quantifies the uncertainties and risks associated with a particular prediction may be viewed as a sensitivity analysis. From the standpoint of Fiacco, a methodology for conducting sensitivity analysis is a well-established requirement of any scientific discipline; it should be an integral part of any solution methodology, whose results cannot be fully understood without such information.
In summary, sensitivity analysis aims to determine the parameters that have the greatest impact on the objective function. The process runs a limited number of simulation cases to determine which parameters should be adjusted and the ranges of those adjustments. The information obtained is then used to plan history matching and production optimization tasks, which, in turn, require a larger number of realizations. Schiozer et al. explain that the most efficient way to reduce computational costs in a risk-analysis process is to decrease the number of variables, which can be accomplished by selecting the most critical parameters through sensitivity analysis.
Sensitivity analysis is widely applied in the oil industry for decision-making and model development. According to Pannell, the process supports key tasks such as:
Sensitivity analysis is usually conducted as a sequence of steps. Initially, a “base case” model is run and a baseline optimal strategy is devised. This initial strategy maximizes the value of the objective function given the initial assessments of the probability distributions. Following a sensitivity analysis, the subjective beliefs are revised and, depending on how the perceptions change, the optimal strategy may or may not be altered. As the distributions become less uncertain, confidence in the optimal strategy increases.
A tornado chart is a common way of presenting basic sensitivity results, mostly used in project risk management and Net Present Value (NPV) estimation. For each uncertain parameter, the model output is calculated at its lower and upper bounds while all other uncertain variables are held at their best estimates. The chart thus shows the relative importance of each parameter in the overall variation, identifying the high-impact variables.
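The tornado-chart procedure described above can be sketched in a few lines of code. The NPV model, parameter names, and ranges below are purely illustrative, not taken from any actual study:

```python
# Illustrative sketch of building tornado-chart data: vary one parameter
# between its bounds while holding the others at their best estimates.

def npv(oil_price, opex, recovery_factor):
    """Toy NPV model in dollars (hypothetical, for demonstration only)."""
    revenue = oil_price * recovery_factor * 1_000_000
    cost = opex * 1_000_000
    return revenue - cost

base = {"oil_price": 60.0, "opex": 20.0, "recovery_factor": 0.35}
ranges = {
    "oil_price": (40.0, 80.0),
    "opex": (15.0, 30.0),
    "recovery_factor": (0.25, 0.45),
}

base_npv = npv(**base)
swings = {}
for name, (lo, hi) in ranges.items():
    inputs = dict(base)
    inputs[name] = lo
    low_out = npv(**inputs)    # output at the parameter's lower bound
    inputs[name] = hi
    high_out = npv(**inputs)   # output at the parameter's upper bound
    swings[name] = (low_out, high_out)

# Sort by swing width: the widest bar sits at the top of the tornado chart.
for name, (lo_out, hi_out) in sorted(
    swings.items(), key=lambda kv: abs(kv[1][1] - kv[1][0]), reverse=True
):
    print(f"{name:16s} swing = {abs(hi_out - lo_out):,.0f}")
```

Plotting the sorted swings as horizontal bars centered on the base-case NPV yields the familiar tornado shape.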
Dai et al. define two categories of sensitivity analysis study:
Local Sensitivity Analysis: the local approach calculates the gradient of the output with respect to one uncertain parameter while keeping all other parameters fixed, quantifying the impact of that variable on the simulation outputs. However, it has some limitations: the variation interval of the parameters must be small, and the effects of interactions between input variables cannot be characterized.
Global Sensitivity Analysis: the global method considers the entire variation range of each uncertain parameter. This makes it possible to quantify the contribution of both individual parameters and their interactions to the output variability. Compared to local sensitivity analysis, this approach provides more complete information, but it is far more demanding in terms of computational effort.
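The local, one-at-a-time approach can be sketched as a finite-difference gradient computed around a base point. The model function here is a toy polynomial standing in for a simulator run:

```python
# Minimal sketch of local (one-at-a-time) sensitivity analysis using
# central differences; the model function is illustrative only.

def model(x):
    # Toy simulator output depending on three inputs.
    return x[0] ** 2 + 2.0 * x[1] + 0.5 * x[0] * x[2]

def local_sensitivities(f, x0, h=1e-6):
    """Estimate df/dx_i at x0, perturbing one parameter at a time."""
    grads = []
    for i in range(len(x0)):
        up = list(x0); up[i] += h
        dn = list(x0); dn[i] -= h
        grads.append((f(up) - f(dn)) / (2.0 * h))
    return grads

x0 = [1.0, 2.0, 3.0]          # base case around which we perturb
grads = local_sensitivities(model, x0)
print(grads)
```

Note that this captures only the slope at the base point; the interaction term `x[0] * x[2]` contributes to the gradient but its joint effect across the full input ranges would only be resolved by a global method.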
Among the several techniques for performing a sensitivity analysis, two widely used ones are the Monte Carlo method and Response Surface Methodology (RSM). The Monte Carlo method consists of randomly sampling the inputs to build a picture of the output values and their probabilities: each iteration draws a random value for every stochastic input, the model is run with these inputs, and the outputs are recorded. The process is repeated until enough samples have been collected to generate a probability distribution of the outputs.
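The sampling loop just described can be sketched as follows. The production model and the input distributions are hypothetical placeholders for a real simulator and real uncertainty assessments:

```python
import random

# Sketch of the Monte Carlo procedure: each iteration samples every
# stochastic input, runs the model, and records the output.

def model(porosity, permeability):
    """Toy production model (illustrative only)."""
    return 1000.0 * porosity * permeability ** 0.5

random.seed(42)  # fixed seed so the run is reproducible
outputs = []
for _ in range(10_000):
    porosity = random.uniform(0.1, 0.3)             # sampled input 1
    permeability = random.lognormvariate(4.0, 0.5)  # sampled input 2
    outputs.append(model(porosity, permeability))

# With enough samples, the sorted outputs approximate the distribution,
# from which percentiles such as P10/P50/P90 can be read off.
outputs.sort()
n = len(outputs)
p10 = outputs[int(0.10 * n)]
p50 = outputs[int(0.50 * n)]
p90 = outputs[int(0.90 * n)]
print("P10:", p10, "P50:", p50, "P90:", p90)
```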
RSM is a statistical technique that provides a systematic approach for minimizing the number of simulation runs. Through careful design of experiments, the objective is to optimize a response (output) that is influenced by several independent variables (inputs). Changes are made to the input variables in order to identify the causes of changes in the response. The response can be represented graphically, either in 3-D or as contour plots, which help to visualize the shape of the response surface.
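A minimal building block of RSM is a two-level factorial design, where each input is run only at a coded low (−1) and high (+1) setting and a simple surface is fitted to the few resulting runs. The "simulator" below is a toy function chosen so the fit is exact; in practice each design point would be one reservoir simulation:

```python
# Sketch of a 2^2 factorial design, the simplest design-of-experiments
# element behind RSM; simulator and coefficients are illustrative.

def simulator(x1, x2):
    """Toy response in coded units (hypothetical)."""
    return 50.0 + 8.0 * x1 - 3.0 * x2 + 2.0 * x1 * x2

# Full 2^2 factorial design: every combination of low/high levels.
design = [(-1, -1), (+1, -1), (-1, +1), (+1, +1)]
responses = [simulator(x1, x2) for x1, x2 in design]

n = len(design)
# For an orthogonal two-level design, the coefficient estimates of the
# model y = b0 + b1*x1 + b2*x2 + b12*x1*x2 reduce to simple averages.
b0 = sum(responses) / n
b1 = sum(y * x1 for (x1, _), y in zip(design, responses)) / n
b2 = sum(y * x2 for (_, x2), y in zip(design, responses)) / n
b12 = sum(y * x1 * x2 for (x1, x2), y in zip(design, responses)) / n

print(f"y ~ {b0:.1f} + {b1:.1f}*x1 + {b2:.1f}*x2 + {b12:.1f}*x1*x2")
```

Only four runs recover the main effects and the interaction; adding center and axial points would extend this to the quadratic surfaces usually plotted as 3-D or contour views.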
Several applications of sensitivity analysis in the oil industry are available in the literature. For example, Khosravi et al. used a response surface method to identify the most influential parameters on pressure drop during the development of a fractured reservoir model; their sensitivity charts indicate that aquifer size is the most impactful parameter, followed by matrix block size and fracture permeability. Victorino et al., in turn, performed a sensitivity analysis to verify the parameters with the greatest impact on production systems, using a commercial simulator and identifying the boundaries of the production system; the parameters with the greatest impact were a combination of diameters and gas flow.
After carrying out a series of simulation runs and adjusting the parameters through sensitivity analysis, the outputs can be properly managed to find the best match between historical production and simulation data. This improves the quality of a development plan while exposing errors in the model. In this light, Kraken allows importing both the simulated cases and the historical data, and then creating comparisons between them (i.e., simulated cases vs. historical data). The following example shows a comparison between three simulated cases and the historical oil production trend for a single well (NA1A).
Thus, as the engineer performs reservoir management tasks, Kraken provides an efficient, user-friendly, and powerful platform for evaluating simulation results, thereby improving data analysis. In addition to the example above, it is also possible to create comparisons among various wells within Kraken, as shown below:
Additionally, Kraken has a “Case Comparison” process that sets up a comparison of property distributions between two reservoir grids. In the following example, “Water Saturation” is displayed for two grids (Case 1 and Case 2), and on the left side a Case Comparison was created to assess the contrast in water distribution between the two grids.
Since this comparison is modeled as the difference between the cases, the legend (shown in Figure 5 above) indicates that the bluish regions show no significant discrepancy. The red regions in the resulting grid, on the other hand, indicate that Case 1 has considerably greater water saturation there than Case 2. In terms of sensitivity analysis and optimization, this tool is highly useful since it allows assessing the variations across different realizations and production schemes.
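Conceptually, such a difference map is a cell-wise subtraction of the two grids' property arrays. The tiny arrays below are toy water-saturation values, not Kraken's actual data model or API:

```python
# Illustrative cell-wise difference between two property grids, the
# operation underlying a difference-based case comparison.
sw_case1 = [[0.20, 0.55], [0.80, 0.30]]   # toy water saturation, Case 1
sw_case2 = [[0.22, 0.50], [0.40, 0.31]]   # toy water saturation, Case 2

diff = [
    [c1 - c2 for c1, c2 in zip(row1, row2)]
    for row1, row2 in zip(sw_case1, sw_case2)
]

# Values near zero mean no significant discrepancy; large positive values
# flag cells where Case 1 has higher water saturation than Case 2.
print(diff)
```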
Essentially, the development of advanced mathematical models and numerical solutions has considerably enhanced the evaluation of the impact of uncertain data within reservoir modeling and future prediction workflows. The uncertainties arising from geological data represent the main limitation for reservoir characterization. Sensitivity studies play a key role in mitigating uncertainty propagation: they make it possible to assess the most influential parameters and to minimize the global error introduced into the process. Research aimed at mitigating uncertainties while preserving as much of the geological model’s fidelity as possible continues to be widely conducted, and this remains a constant challenge for those who work with numerical modeling in both science and engineering.