
Sequential Simulations of Mixed Discrete-Continuous Properties: Sequential Gaussian Mixture Simulation

Dario Grana, Tapan Mukerji, Laura Dovera, and Ernesto Della Rossa

Abstract We present here a method for generating realizations of the posterior probability density function of a Gaussian Mixture linear inverse problem in the combined discrete-continuous case. This task is achieved by extending the sequential simulations method to the mixed discrete-continuous problem. The sequential approach allows us to generate a Gaussian Mixture random field that honors the covariance functions of the continuous property and the available observed data. The traditional inverse theory results, well known for the Gaussian case, are first summarized for Gaussian Mixture models: in particular, the analytical expressions for the means, covariance matrices, and weights of the conditional probability density function are derived. However, the computation of the weights of the conditional distribution requires the evaluation of the probability density function values of a multivariate Gaussian distribution at each conditioning point. As an alternative solution of the Bayesian inverse Gaussian Mixture problem, we then introduce the sequential approach to inverse problems and extend it to the Gaussian Mixture case. The Sequential Gaussian Mixture Simulation (SGMixSim) approach is presented as a particular case of the linear inverse Gaussian Mixture problem, where the linear operator is the identity. Similar to the Gaussian case, in Sequential Gaussian Mixture Simulation the means and the covariance matrices of the conditional distribution at a given point correspond to the kriging estimate, component by component, of the mixture. Furthermore, Sequential Gaussian Mixture Simulation can be conditioned by secondary information to account for non-stationarity. Examples of applications

D. Grana (✉) · T. Mukerji
Stanford University, 397 Panama Mall, Stanford, CA 94305, USA
e-mail: [email protected]

T. Mukerji
e-mail: [email protected]

L. Dovera · E. Della Rossa
Eni E&P, Via Emilia 1, Milan 20097, Italy

L. Dovera
e-mail: [email protected]

E. Della Rossa
e-mail: [email protected]

P. Abrahamsen et al. (eds.), Geostatistics Oslo 2012, Quantitative Geology and Geostatistics 17, DOI 10.1007/978-94-007-4153-9_19, © Springer Science+Business Media Dordrecht 2012



with synthetic and real data are presented in the reservoir modeling domain, where realizations of facies distribution and reservoir properties, such as porosity or net-to-gross, are obtained using the Sequential Gaussian Mixture Simulation approach. In these examples, reservoir properties are assumed to be distributed as a Gaussian Mixture model. In particular, reservoir properties are Gaussian within each facies, and the weights of the mixture are identified with the point-wise probability of the facies.

1 Introduction

Inverse problems are common in many different domains such as physics, engineering, and earth sciences. In general, solving an inverse problem consists of estimating the model parameters given a set of observed data. The operator that links the model and the data can be linear or nonlinear.

In the linear case, estimation techniques generally provide smoothed solutions. Kriging, for example, provides the best estimate of the model in the least-squares sense. Simple kriging is in fact identical to a linear Gaussian inverse problem in which the linear operator is the identity and the posterior mean and covariance are estimated from direct observations of the model space. Monte Carlo methods can be applied as well to solve inverse problems [12] in a Bayesian framework to sample from the posterior, but standard sampling methodologies can be inefficient in practical applications. Sequential simulations have been introduced in geostatistics to generate high-resolution models and provide a number of realizations of the posterior probability function honoring both prior information and the observed values. References [3] and [6] give detailed descriptions of kriging and sequential simulation methods. Reference [8] proposes a methodology that applies sequential simulations to linear Gaussian inverse problems to incorporate the prior information on the model and honor the observed data.

We propose here to extend the approach of [8] to the Gaussian Mixture case. Gaussian Mixture models are convex combinations of Gaussian components that can be used to describe the multi-modal behavior of the model and the data. Reference [14], for instance, introduces Gaussian Mixture distributions in multivariate nonlinear regression modeling, while [10] proposes a mixture discriminant analysis as an extension of linear discriminant analysis using Gaussian Mixtures and the Expectation-Maximization algorithm [11]. Gaussian Mixture models are common in statistics (see, for example, [9] and [2]) and they have been used in different domains: digital signal processing [13] and [5], engineering [1], geophysics [7], and reservoir history matching [4].

In this paper we first present the extension of the traditional results, valid in the Gaussian case, to the Gaussian Mixture case; we then propose the sequential approach to linear inverse problems under the assumption of a Gaussian Mixture distribution; and we finally show some examples of applications in reservoir modeling. If the linear operator is the identity, the methodology provides an extension of the traditional Sequential Gaussian Simulation (SGSim, see [3] and [6]) to a new methodology that we call Sequential Gaussian Mixture Simulation (SGMixSim). The applications we propose refer to mixed discrete-continuous problems of reservoir modeling and they provide, as their main result, sets of models of reservoir facies and porosity. The key point of the application is that we identify the weights of the Gaussian Mixture describing the continuous random variable (porosity) with the probability of the reservoir facies (discrete variable).

2 Theory: Linearized Gaussian Mixture Inversion

In this section we provide the main propositions of linear inverse problems with Gaussian Mixtures (GMs). We first recap the well-known analytical result for posterior distributions of linear inverse problems with a Gaussian prior; then we extend the result to the Gaussian Mixture case.

In the Gaussian case, the solution of the linear inverse problem is well known [15]. If m is a Gaussian random vector, m ∼ N(μ_m, C_m), with mean μ_m and covariance C_m, and G is a linear operator that transforms the model m into the observable data d,

d = Gm + ε,   (1)

where ε is a random vector that represents an error with Gaussian distribution N(0, C_ε) independent of the model m, then the posterior conditional distribution of m|d is Gaussian with mean and covariance given by

μ_{m|d} = μ_m + C_m G^T (G C_m G^T + C_ε)^{-1} (d − G μ_m),   (2)

C_{m|d} = C_m − C_m G^T (G C_m G^T + C_ε)^{-1} G C_m.   (3)

This result is based on two well-known properties of Gaussian distributions: (A) the linear transform of a Gaussian distribution is again Gaussian; (B) if the joint distribution of (m, d) is Gaussian, then the conditional distribution m|d is again Gaussian.
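Equations (2)-(3) can be checked with a few lines of linear algebra. The following is a minimal numerical sketch (assuming NumPy; all numbers are purely illustrative, not from the paper):

```python
import numpy as np

# Prior model: m ~ N(mu_m, C_m); data: d = G m + eps, eps ~ N(0, C_eps)
mu_m = np.array([0.2, 0.25])
C_m = np.array([[0.01, 0.005],
                [0.005, 0.02]])
G = np.array([[1.0, 1.0]])            # linear forward operator
C_eps = np.array([[0.001]])
d = np.array([0.5])                   # observed data

S = G @ C_m @ G.T + C_eps             # data covariance, G C_m G^T + C_eps
K = C_m @ G.T @ np.linalg.inv(S)      # gain matrix
mu_post = mu_m + K @ (d - G @ mu_m)   # Eq. (2)
C_post = C_m - K @ G @ C_m            # Eq. (3)
```

Note that C_post does not depend on the observed value d, only on the operator and the covariances, exactly as Eq. (3) states.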

These two properties can be extended to the Gaussian Mixture case. We assume that x is a random vector distributed according to a Gaussian Mixture with Nc components,

f(x) = ∑_{k=1}^{Nc} π_k N(x; μ_x^k, C_x^k),

where π_k are the weights and the distributions N(x; μ_x^k, C_x^k) represent the Gaussian components with means μ_x^k and covariances C_x^k evaluated at x. By applying property (A) to the Gaussian components of the mixture, we can conclude that, if L is a linear operator, then y = Lx is distributed according to a Gaussian Mixture. Moreover, the pdf of y is given by f(y) = ∑_{k=1}^{Nc} π_k N(y; L μ_x^k, L C_x^k L^T).

Similarly we can extend property (B) to conditional Gaussian Mixture distributions. The well-known result of the conditional multivariate Gaussian distribution has already been extended to multivariate Gaussian Mixture models (see, for example, [1]). In particular, if (x1, x2) is a random vector whose joint distribution is a Gaussian Mixture

f(x1, x2) = ∑_{k=1}^{Nc} π_k f_k(x1, x2),   (4)

where f_k are the Gaussian densities, then the conditional distribution of x2|x1 is again a Gaussian Mixture

f(x2|x1) = ∑_{k=1}^{Nc} λ_k f_k(x2|x1),   (5)

and its parameters (weights, means, and covariance matrices) can be analytically derived. The coefficients λ_k are given by

λ_k = π_k f_k(x1) / ∑_{ℓ=1}^{Nc} π_ℓ f_ℓ(x1),   (6)

where f_k(x1) = N(x1; μ_{x1}^k, C_{x1}^k); and the means and the covariance matrices are

μ_{x2|x1}^k = μ_{x2}^k + C_{x2,x1}^k (C_{x1}^k)^{-1} (x1 − μ_{x1}^k),   (7)

C_{x2|x1}^k = C_{x2}^k − C_{x2,x1}^k (C_{x1}^k)^{-1} (C_{x2,x1}^k)^T,   (8)

where C_{x2,x1}^k is the cross-covariance matrix. By combining these propositions, the main result of linear inverse problems with Gaussian Mixtures can be derived.
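Equations (5)-(8) can be sketched numerically for a two-component bivariate mixture (assuming NumPy; all numbers are illustrative choices, not from the paper):

```python
import numpy as np

def norm_pdf(x, mu, var):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * (x - mu)**2 / var) / np.sqrt(2 * np.pi * var)

# Two-component mixture over (x1, x2)
pi = np.array([0.4, 0.6])                        # prior weights pi_k
mu = np.array([[0.0, 1.0],                       # [mu_x1, mu_x2], component 1
               [2.0, 3.0]])                      # component 2
C = np.array([[[1.0, 0.5], [0.5, 1.0]],          # C^1
              [[0.5, 0.2], [0.2, 0.8]]])         # C^2

x1 = 1.0                                         # conditioning value

# Eq. (6): posterior weights lambda_k proportional to pi_k * f_k(x1)
fk = norm_pdf(x1, mu[:, 0], C[:, 0, 0])
lam = pi * fk / np.sum(pi * fk)

# Eqs. (7)-(8): conditional means and variances, component by component
mu_cond = mu[:, 1] + C[:, 1, 0] / C[:, 0, 0] * (x1 - mu[:, 0])
var_cond = C[:, 1, 1] - C[:, 1, 0]**2 / C[:, 0, 0]
```

Each component is updated exactly as in the single-Gaussian case; only the weights λ_k mix the components according to how well each one explains the conditioning value.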

Theorem 1 Let m be a random vector distributed according to a Gaussian Mixture, m ∼ ∑_{k=1}^{Nc} π_k N(μ_m^k, C_m^k), with Nc components and with means μ_m^k, covariances C_m^k, and weights π_k, for k = 1, …, Nc. Let G : ℝ^M → ℝ^N be a linear operator, and ε a Gaussian random vector independent of m with zero mean and covariance C_ε, such that d = Gm + ε, with d ∈ ℝ^N, m ∈ ℝ^M, ε ∈ ℝ^N. Then the posterior conditional distribution m|d is a Gaussian Mixture.

Moreover, the posterior means and covariances of the components are given by

μ_{m|d}^k = μ_m^k + C_m^k G^T (G C_m^k G^T + C_ε)^{-1} (d − G μ_m^k),   (9)

C_{m|d}^k = C_m^k − C_m^k G^T (G C_m^k G^T + C_ε)^{-1} G C_m^k,   (10)

where μ_m^k and C_m^k are respectively the prior mean and covariance of the kth Gaussian component of m. The posterior coefficients λ_k of the mixture are given by

λ_k = π_k f_k(d) / ∑_{ℓ=1}^{Nc} π_ℓ f_ℓ(d),   (11)

where the Gaussian densities f_k(d) have means μ_d^k = G μ_m^k and covariances C_d^k = G C_m^k G^T + C_ε.
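Theorem 1 translates directly into code: each component is updated by Eqs. (9)-(10) as in the Gaussian case, and the weights are re-evaluated by Eq. (11). A hedged sketch (assuming NumPy; the function and variable names are ours, not the authors'):

```python
import numpy as np

def mvn_pdf(x, mu, C):
    """Multivariate Gaussian density evaluated at x."""
    r = x - mu
    q = r @ np.linalg.solve(C, r)
    return np.exp(-0.5 * q) / np.sqrt((2 * np.pi)**len(x) * np.linalg.det(C))

def gm_linear_posterior(d, G, pi, mus, Cs, C_eps):
    """Posterior weights, means, covariances of m|d (Eqs. (9)-(11))."""
    lam, mus_post, Cs_post = [], [], []
    for pk, mu_k, C_k in zip(pi, mus, Cs):
        S = G @ C_k @ G.T + C_eps           # covariance of d under component k
        K = C_k @ G.T @ np.linalg.inv(S)
        mus_post.append(mu_k + K @ (d - G @ mu_k))      # Eq. (9)
        Cs_post.append(C_k - K @ G @ C_k)               # Eq. (10)
        lam.append(pk * mvn_pdf(d, G @ mu_k, S))        # numerator of Eq. (11)
    lam = np.array(lam)
    return lam / lam.sum(), mus_post, Cs_post

# Toy check: two 1D components, identity forward operator
lam, mus_post, Cs_post = gm_linear_posterior(
    d=np.array([1.9]), G=np.eye(1),
    pi=[0.5, 0.5], mus=[np.array([0.0]), np.array([2.0])],
    Cs=[0.1 * np.eye(1)] * 2, C_eps=0.01 * np.eye(1))
```

When the observed d lies close to the image of one component's mean, Eq. (11) shifts the posterior weight toward that component, which is what drives the discrete classification in the applications below.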


3 Theory: Sequential Approach

Based on the results presented in the previous section, we introduce here the sequential approach to linearized inversion in the Gaussian Mixture case. We first recap the main result for the Gaussian case [8].

The solution of the linear inverse problem with the sequential approach requires some additional notation. Let m_i represent the ith element of the random vector m, and let m_s represent a known sub-vector of m. This notation will generally be used to describe the neighborhood of m_i in the context of sequential simulations. Finally, we assume that the measured data d are known, having been obtained as a linear transformation of m according to some linear operator G.

Theorem 2 Let m be a Gaussian random vector, m ∼ N(μ_m, C_m), with mean μ_m and covariance C_m. Let G be a linear operator between the model m and the random data vector d such that d = Gm + ε, with ε a random error vector independent of m with zero mean and covariance C_ε. Let m_s be the sub-vector with direct observations of the model m, and m_i the ith element of m. Then the conditional distribution of m_i|(m_s, d) is again Gaussian.

Moreover, if the sub-vector m_s is extracted from the full random vector m with the linear operator A such that m_s = Am, and the ith element is m_i = A_i m, with A_i again linear, then the mean and variance of the posterior conditional distribution are:

μ_{m_i|(m_s,d)} = μ_{m_i} + [A_i C_m A^T  A_i C_m G^T] (C_{(m_s,d)})^{-1} [m_s − A μ_m; d − G μ_m],   (12)

σ²_{m_i|(m_s,d)} = σ²_{m_i} − [A_i C_m A^T  A_i C_m G^T] (C_{(m_s,d)})^{-1} [A C_m A_i^T; G C_m A_i^T],   (13)

where μ_{m_i} = A_i μ_m, σ²_{m_i} = A_i C_m A_i^T, and

C_{(m_s,d)} = [ A C_m A^T    A C_m G^T
                G C_m A^T    G C_m G^T + C_ε ].   (14)

To clarify the statement we give the explicit form of the operators A_i and A. In particular, A_i is written as

A_i = [0 0 … 1 … 0],   (15)

with the one in the ith column. If the sub-vector m_s has size n, m_s = {m_{i1}, m_{i2}, …, m_{in}}^T, and m has size M, then the operator A is given by

A = [ 0 0 … 1 … 0
      0 … 1 0 … 0
      ⋮
      0 1 … 0 0 0 ],   (16)

Page 6: Sequential Simulations of Mixed Discrete-Continuous Properties: Sequential Gaussian ... · 2020-01-28 · Sequential Simulations of Mixed Discrete-Continuous Properties: Sequential

244 D. Grana et al.

where A has dimensions n × M and the ones are in the i1, i2, …, in columns. Theorem 2 can be proved using the properties (A) and (B) described in Sect. 2 (see [8]). Then, by using Theorem 1, we extend the result to the Gaussian Mixture case.
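The operators A_i and A are simply rows of the identity matrix; a small sketch of Eqs. (15)-(16) (assuming NumPy; function name is ours):

```python
import numpy as np

def extraction_operator(indices, M):
    """Build A as in Eq. (16): A @ m returns m[indices], one row per index."""
    A = np.zeros((len(indices), M))
    A[np.arange(len(indices)), indices] = 1.0
    return A

m = np.array([10.0, 11.0, 12.0, 13.0, 14.0])   # model vector, M = 5
A = extraction_operator([3, 1], 5)              # extracts m_3 then m_1
ms = A @ m                                      # -> array([13., 11.])
Ai = extraction_operator([2], 5)                # single-element operator A_i
```

In a sequential simulation, the index list simply enumerates the previously simulated or observed cells in the neighborhood of the current location.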

Theorem 3 Let m be a random vector distributed according to a Gaussian Mixture, m ∼ ∑_{k=1}^{Nc} π_k N(μ_m^k, C_m^k), with Nc components and with means μ_m^k, covariances C_m^k, and weights π_k, for k = 1, …, Nc. Let G be a linear operator such that d = Gm + ε, with ε a random error vector independent of m with zero mean and covariance C_ε. Let m_s be the sub-vector with direct observations of the model m, and m_i the ith element of m. Then the conditional distribution of m_i|(m_s, d) is again a Gaussian Mixture.

Moreover, the means and variances of the components of the posterior conditional distribution are:

μ^k_{m_i|(m_s,d)} = μ^k_{m_i} + [A_i C_m^k A^T  A_i C_m^k G^T] (C^k_{(m_s,d)})^{-1} [m_s − A μ_m^k; d − G μ_m^k],   (17)

σ²^{(k)}_{m_i|(m_s,d)} = σ²^{(k)}_{m_i} − [A_i C_m^k A^T  A_i C_m^k G^T] (C^k_{(m_s,d)})^{-1} [A C_m^k A_i^T; G C_m^k A_i^T],   (18)

where μ^k_{m_i} = A_i μ_m^k, σ²^{(k)}_{m_i} = A_i C_m^k A_i^T, and

C^k_{(m_s,d)} = [ A C_m^k A^T    A C_m^k G^T
                  G C_m^k A^T    G C_m^k G^T + C_ε ].   (19)

The posterior coefficients of the mixture are given by

λ_k = π_k f_k(m_s, d) / ∑_{ℓ=1}^{Nc} π_ℓ f_ℓ(m_s, d),   (20)

where the Gaussian components f_k(m_s, d) have means

μ^k_{(m_s,d)} = [A μ_m^k; G μ_m^k],   (21)

and covariances C^k_{(m_s,d)}.

In the case where the linear operator is the identity, the associated inverse problem reduces to the estimation of a Gaussian Mixture model with direct observations of the model space at given locations. In other words, if the linear operator is the identity, the theorem provides an extension of the traditional Sequential Gaussian Simulation (SGSim) to the Gaussian Mixture case. We call this methodology Sequential Gaussian Mixture Simulation (SGMixSim), and we show some applications in the next section.
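One step of SGMixSim at an unsampled 1D location can be sketched as follows, under the simplifying assumption (used later in the examples) that all components share the same correlation model. This is a hedged sketch, not the authors' implementation: the exponential correlation, the function names, and all numbers are our illustrative choices.

```python
import numpy as np

def corr(h, r=4.0):
    """Exponential correlation model with range parameter r (our choice)."""
    return np.exp(-np.abs(h) / r)

def sgmixsim_step(loc, obs_locs, obs_vals, pi, mus, sigs, rng):
    """Draw (facies, value) at loc conditioned on hard data (Theorem 3, G = I)."""
    R = corr(obs_locs[:, None] - obs_locs[None, :])   # data-data correlation
    r0 = corr(loc - obs_locs)                         # data-target correlation
    lam, params = [], []
    for pk, mu_k, s_k in zip(pi, mus, sigs):
        Ck, ck = s_k**2 * R, s_k**2 * r0
        w = np.linalg.solve(Ck, ck)                   # simple kriging weights
        mu_c = mu_k + w @ (obs_vals - mu_k)           # kriging mean, comp. k
        var_c = s_k**2 - w @ ck                       # kriging variance
        resid = obs_vals - mu_k                       # weight update, Eq. (20)
        dens = np.exp(-0.5 * resid @ np.linalg.solve(Ck, resid)) / np.sqrt(
            (2 * np.pi)**len(resid) * np.linalg.det(Ck))
        lam.append(pk * dens)
        params.append((mu_c, var_c))
    lam = np.array(lam) / np.sum(lam)
    k = rng.choice(len(pi), p=lam)                    # draw the discrete facies
    mu_c, var_c = params[k]
    return k, rng.normal(mu_c, np.sqrt(var_c))        # draw the porosity value

rng = np.random.default_rng(0)
facies, value = sgmixsim_step(
    loc=0.5, obs_locs=np.array([0.0, 1.0]), obs_vals=np.array([0.20, 0.22]),
    pi=np.array([0.4, 0.6]), mus=np.array([0.10, 0.25]),
    sigs=np.array([0.02, 0.03]), rng=rng)
```

The simulated pair is then added to the conditioning set, and the random path moves on to the next cell, exactly as in standard sequential simulation.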


4 Application

We describe here some examples of applications with synthetic and real data in the context of reservoir modeling. First, we present the results of the estimation of a Gaussian Mixture model with direct observations of the model space as a special case of Theorem 3 (SGMixSim). In our example, the continuous property is the porosity of a reservoir, and the discrete variable represents the corresponding reservoir facies, namely shale and sand. This means that we identify the weights of the mixture components with the facies probabilities. The input parameters are then the prior distribution of porosity and a variogram model for each component of the mixture. The prior is a Gaussian Mixture model with two components and its parameters are the weights, the means, and the covariance matrices of the Gaussian components. We assume facies prior probabilities equal to 0.4 and 0.6 respectively, and for simplicity we assume the same variogram model (spherical and isotropic) with the same parameters for both. We then simulate a 2D map of facies and porosity according to the proposed methodology (Fig. 1). The simulation grid is 70 × 70 and the variogram range of porosity is 4 grid blocks in both directions. The simulation can be performed with or without conditioning hard data; in the example of Fig. 1, we introduced four porosity values at four locations that are used to condition the simulations, and we generated a set of 100 conditional realizations (Fig. 1). When hard data are assigned, the weights of the mixture components are determined by evaluating the prior Gaussian components at the hard data location, and discrete property values are determined by selecting the most likely component.

Fig. 1 Conditional realizations of porosity and reservoir facies obtained by SGMixSim. The prior distribution of porosity and the hard data values are shown on top. The second and third rows show three realizations of porosity and facies (gray is shale, yellow is sand). The fourth row shows the posterior distribution of facies and the ensemble average of 100 realizations of facies and porosity. The last row shows the comparison of SGMixSim results with and without post-processing

As we previously mentioned, the methodology is similar to [8], but the use of Gaussian Mixture models allows us to describe the multi-modality of the data and to simulate both the continuous and the discrete variable at the same time. SGMixSim requires a spatial model of the continuous variable, but not a spatial model of the underlying discrete variable: the spatial distribution of the discrete variable only depends on the conditional weights of the mixture (20). However, if the mixture components have very different probabilities and very different variances (i.e., when components with relatively low probability have relatively high variances), the simulations may not accurately reproduce the global statistics. If we assume, for instance, two components with prior probabilities equal to 0.2 and 0.8, and we assume at the same time that the variance of the first component is much bigger than the variance of the second one, then the prior proportions may not be honored. This problem is intrinsic to the sequential simulation approach, but it is emphasized in the case of multi-modal data. For large datasets or for reasons of stationarity, we often use a moving searching neighborhood to take into account only the points closest to the location being simulated [6]. If we use a global searching neighborhood (i.e., the whole grid), the computational time could significantly increase for large datasets. In the localized sequential algorithm, the neighborhood is selected according to a fixed geometry (for example, ellipsoids centered on the location to be estimated) and the conditioning data are extracted by the linear operator (Theorem 3) within the neighborhood. When no hard data are present in the searching neighborhood and the sample value is drawn from the prior distribution, the algorithm could generate isolated points within the simulation grid. For example, a point drawn from the first component could be surrounded by data, subsequently simulated, belonging to the second component, or vice versa. This problem is particularly relevant in the case of multi-modal data, especially in the initial steps of the sequential simulation (in other words, when only few values have been previously simulated) and when the searching neighborhood is small.

To avoid isolated points in the simulated grid, a post-processing step has been included (Fig. 1). The simulation path is first revisited, and the local conditional probabilities are re-evaluated at all the grid cells where the sample value was drawn from the prior distribution. Then we draw again the component from the weights of the re-evaluated conditional probability. Finally, we introduce a kriging correction of the continuous property values that had low probabilities in the neighborhood.

Fig. 2 Linearized sequential inversion with Gaussian Mixture models for the estimation of a porosity map from acoustic impedance values. On top we show the true porosity map and the acoustic impedance map; on the bottom we show the inverted porosity and the estimated facies map

Next, we show two applications of linearized sequential inversion with Gaussian Mixture models obtained by applying Theorem 3. The first example is a rock-physics inverse problem dealing with the inversion of acoustic impedance in terms of porosity. The methodology is illustrated using a 2D grid representing a synthetic system of reservoir channels (Fig. 2). In this example we made the same assumptions about the prior distribution as in the previous example. As in traditional sequential simulation approaches, the spatial continuity of the inverted data depends on the range of the variogram and the size of the searching neighborhood; however, Fig. 2 clearly shows the multi-modality of the inverted data. Gaussian Mixture models can describe not only the multi-modality of the data, but they can also better honor the data correlation within each facies.

The second example is the acoustic inversion of seismic amplitudes in terms of acoustic impedance. In this case, in addition to the usual input parameters (prior distribution and variogram models), we have to specify a low-frequency model of impedance, since seismic amplitudes only provide relative information about elastic contrasts, and the absolute value of impedance must be computed by combining the estimated relative changes with the low-frequency model (often called the prior model in seismic modeling). Once again, the discrete variable is identified with the reservoir facies classification; in this case shales are characterized by high impedance values, and sands by low impedances. The results are shown in Fig. 3. We observe that even though we used a very smooth low-frequency model, the inverted impedance log has a good match with the actual data (Fig. 3), and the prediction of the discrete variable is satisfactory compared to the actual facies classification performed at the well. In particular, if we perform 50 realizations and compute the maximum a posteriori of the ensemble of inverted facies profiles, we perfectly match the actual classification (Fig. 3). However, the quality of the results depends on the separability of the Gaussian components in the continuous property domain.

Fig. 3 Sequential Gaussian Mixture inversion of seismic data (ensemble of 50 realizations). From left to right: acoustic impedance logs and seismograms (actual model in red, realization 1 in blue, inverted realizations in gray, dashed line represents the low-frequency model), inverted facies profile corresponding to realization 1, maximum a posteriori of 50 inverted facies profiles, and actual facies classification (sand in yellow, shale in gray)

Finally, we applied the Gaussian Mixture linearized sequential inversion to a layer map extracted from a 3D geophysical model of a clastic reservoir located in the North Sea (Fig. 4). The application has been performed on a map of P-wave velocity corresponding to the top horizon of the reservoir. The parameters of the variogram models have been assumed from existing reservoir studies in the same area. In Fig. 4 we show the map of the conditioning velocity and the corresponding histogram, two realizations of porosity and facies, and the histogram of the posterior distribution of porosity derived from the second realization. The two realizations have been performed using different prior proportions: 30 % of sand in the first realization and 40 % in the second one. Both realizations honor the expected proportions, the multi-modality of the data, and the correlations with the conditioning data within each facies.


Fig. 4 Application of linearized sequential inversion with Gaussian Mixture models to a reservoir layer. The conditioning data is P-wave velocity (top left). Two realizations of porosity and facies are shown: realization 1 corresponds to a prior proportion of 30 % of sand, realization 2 corresponds to 40 % of sand. The histograms of the conditioning data and the posterior distribution of porosity (realization 2) are shown for comparison

5 Conclusion

In this paper, we proposed a methodology to simultaneously simulate both continuous and discrete properties using Gaussian Mixture models. The method is based on the sequential approach to the Gaussian Mixture linear inverse problem, and it can be seen as an extension of sequential simulations to multi-modal data. Thanks to the sequential approach used for the inversion, the method is generally quite efficient from the computational point of view for solving multi-modal linear inverse problems, and it is applied here to reservoir modeling and seismic reservoir characterization. We presented four different applications: conditional simulations of porosity and facies, porosity-impedance inversion, acoustic inversion of seismic data, and inversion of seismic velocities in terms of porosity. The proposed examples show that we can generate actual samples from the posterior distribution, consistent with the prior information and the assigned data observations. Using the sequential approach, we can generate a large number of samples from the posterior distribution, which in fact are all solutions to the Gaussian Mixture linear problem.

Acknowledgements We acknowledge the Stanford Rock Physics and Borehole Geophysics Project and the Stanford Center for Reservoir Forecasting for their support, and Eni E&P for permission to publish this paper.

References

1. Alspach DL, Sorenson HW (1972) Nonlinear Bayesian estimation using Gaussian sum approximation. IEEE Trans Autom Control 17:439–448. doi:10.1109/TAC.1972.1100034

2. Dempster AP, Laird NM, Rubin DB (1977) Maximum likelihood from incomplete data via the EM algorithm. J R Stat Soc, Ser B, Methodol 39(1):1–38. doi:10.2307/2984875

3. Deutsch C, Journel AG (1992) GSLIB: geostatistical software library and user's guide. Oxford University Press, London

4. Dovera L, Della Rossa E (2011) Multimodal ensemble Kalman filtering using Gaussian mixture models. Comput Geosci 15(2):307–323. doi:10.1007/s10596-010-9205-3

5. Gilardi N, Bengio S, Kanevski M (2002) Conditional Gaussian mixture models for environmental risk mapping. In: Proc. of IEEE workshop on neural networks for signal processing, pp 777–786. doi:10.1109/NNSP.2002.1030100

6. Goovaerts P (1997) Geostatistics for natural resources evaluation. Oxford University Press, London

7. Grana D, Della Rossa E (2010) Probabilistic petrophysical-properties estimation integrating statistical rock physics with seismic inversion. Geophysics 75(3):O21–O37. doi:10.1190/1.3386676

8. Hansen TM, Journel AG, Tarantola A, Mosegaard K (2006) Linear inverse Gaussian theory and geostatistics. Geophysics 71:R101–R111. doi:10.1190/1.2345195

9. Hasselblad V (1966) Estimation of parameters for a mixture of normal distributions. Technometrics 8(3):431–444. doi:10.2307/1266689

10. Hastie T, Tibshirani R (1996) Discriminant analysis by Gaussian mixtures. J R Stat Soc B 58(1):155–176. doi:10.2307/2346171

11. Hastie T, Tibshirani R, Friedman J (2009) The elements of statistical learning. Springer, Berlin

12. Mosegaard K, Tarantola A (1995) Monte Carlo sampling of solutions to inverse problems. J Geophys Res 100:12431–12447. doi:10.1029/94JB03097

13. Reynolds DA, Quatieri TF, Dunn RB (2000) Speaker verification using adapted Gaussian mixture models. Digit Signal Process 10(1–3):19–41. doi:10.1006/dspr.1999.0361

14. Sung HG (2004) Gaussian mixture regression and classification. PhD thesis, Rice University

15. Tarantola A (2005) Inverse problem theory. SIAM, Philadelphia