Here we post the top questions from our recent webinar with answers from our team.
On 22 and 27 April 2023, SRT and Eni presented webinars on ECHELON v2023.1. At the conclusion of each, time was left for live questions. Those questions, along with the answers from SRT and Eni staff, are presented below.
1. Is coupling with other software (such as Petrel) dynamic or static?
The link with geomodeling software such as Petrel is static: such software generates models, which we then run with ECHELON through a queuing system. Iterations between the geomodeler and ECHELON may be performed during history matching using software such as ResX.
ECHELON can also be coupled iteratively to other types of software such as Simulia Abaqus (geomechanics) or GAP (surface network modeler), in which case the connection is dynamic.
2. What is the maximum number of wells in a real case study where ECHELON is coupled with GAP?
There is no hardcoded limit. So far, we have been using the coupling with GAP through RESOLVE on models with some tens of wells.
3. What would be the limits of CCS simulation by ECHELON in this version?
The current ECHELON release supports two phases (gas and water) when aqueous-vapor equilibrium is accounted for; hence, it is not possible to simulate CO2 sequestration problems with gas condensate. Currently, this feature is available in the fully-implicit formulation.
4. What is the parallelization principle of ECHELON? Is it domain decomposition?
ECHELON uses two levels of parallelization. Within a single GPU, there is very fine-grained parallelism across the thousands of GPU cores; when the simulation is extended to multiple GPUs, the required communication between them is handled through a standard domain decomposition methodology.
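As a rough illustration of the multi-GPU level described above, the sketch below splits a 1D array of cells across MPI ranks and exchanges a single layer of ghost cells with each neighbour. It is a generic domain-decomposition pattern written with mpi4py and NumPy; the sizes and variable names are assumptions, and this is not ECHELON code.

```python
# Generic halo (ghost-cell) exchange for a 1D domain decomposition.
# Run with e.g. `mpiexec -n 4 python halo_exchange.py` (file name hypothetical).
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 1000                       # cells owned by this rank (made-up size)
u = np.zeros(n_local + 2)            # +2 ghost cells, one on each side
u[1:-1] = rank                       # dummy interior data

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Send my edge cells to the neighbours, receive theirs into my ghost cells.
comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
```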
5. I am impressed with the effects of CO2 solubility in water in terms of long term distribution in the reservoir - do you have any papers about this work available to share?
Please take a look at the technical papers that Eni and Stone Ridge Technology have written or contributed to, available here.
6. What was the impact of the number of GPUs on the 5.7 million cell model - especially on the field pressure?
We strive to use convergence (both linear and nonlinear) parameters that are tight enough to ensure minimal differences in simulation results when changing hardware, be it the type of GPU or the number of GPUs used in parallel runs. In this specific case, the impact of changing the number of GPUs on the field pressure was less than the line width in the plot.
7. ECHELON can be used to model CO2 sequestration; in addition, does it have geochemistry modeling capability?
At the moment ECHELON cannot model geochemistry.
8. How good is ECHELON on desktop GPUs; is there a performance difference between Windows and Linux?
ECHELON can run well on desktop GPUs under both Windows and Linux. However, under Windows, there are two NVIDIA GPU driver modes: WDDM (for graphics) and TCC (for compute only). ECHELON can run under either mode, but running under the WDDM mode imposes a small performance penalty.
9. How fast is FNS? Has it been tested on field assets?
FNS is highly optimized and typically solves networks with dozens of wells and hydraulic flow-lines in about 10 milliseconds. Especially with larger simulation models, it does not visibly impact the overall simulation time. FNS is being used on multiple large asset models, including those where wells from multiple reservoirs are coupled together.
10. What is the performance penalty of coupling to FNS if the coupling is performed at the bottom hole or at the wellhead?
Coupling FNS to ECHELON results in a small performance penalty that is negligible if we couple periodically, and that grows to a small but perceptible cost when solving the network at each Newton iteration. However, in practice, such a tight coupling is seldom necessary.
FNS is capable of solving large networks where almost all flow-lines require VLP table interpolation. Coupling at the bottom hole adds one VLP table per well, so if the ratio of wells to overall flow-lines is large, computational costs increase. However, it is often advantageous to perform the well hydraulics inside FNS rather than in the simulation engine, because the computations are based on IPR tables. More generally, the difference in computational time is minuscule and is not considered a factor in the coupling decision.
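To make the role of the VLP tables mentioned above concrete, here is a minimal lookup sketch: a bottomhole pressure is interpolated from a pre-tabulated lift curve over rate, tubing-head pressure and water cut. The table axes and values are invented for illustration and say nothing about FNS internals.

```python
# Illustrative VLP-table lookup with SciPy (made-up table, not FNS data).
import numpy as np
from scipy.interpolate import RegularGridInterpolator

rates = np.array([500.0, 1000.0, 2000.0, 4000.0])   # liquid rate, sm3/day
thps = np.array([10.0, 30.0, 60.0])                  # tubing-head pressure, bar
wcuts = np.array([0.0, 0.5, 0.9])                    # water cut, fraction

# Dummy bottomhole-pressure values on the (rate, THP, water cut) grid, in bar.
bhp = 80.0 + 0.01 * rates[:, None, None] \
           + 1.5 * thps[None, :, None] \
           + 40.0 * wcuts[None, None, :]

vlp = RegularGridInterpolator((rates, thps, wcuts), bhp)

# One such interpolation per well per network evaluation:
print(vlp([[1500.0, 25.0, 0.3]]))    # interpolated BHP in bar
```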
11. Does your latest vapor-aqueous equilibrium extension include non-isothermal treatment of solubility? Or is that functionality part of your near-term roadmap?
ECHELON is currently an isothermal simulator; we have included thermal capabilities in our roadmap, in particular in the context of CO2 injection.
12. Can ECHELON handle mixed-precision numerical problems? Also, does ECHELON support tensor cores? Can ECHELON be used with a Grace-Hopper configuration?
ECHELON takes advantage of mixed-precision for its stability and flash calculations, where faster single precision calculations are used to obtain an initial guess before switching to double precision. ECHELON can also use single precision preconditioning of the linear system of equations, but our observation is that while this brings performance benefits in some cases, it can seriously degrade convergence in others and can therefore not be recommended as the default option.
At the moment, ECHELON does not take advantage of tensor cores. We have not yet had access to Grace-Hopper chips, but we have some ideas about how to use them.
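The single-then-double precision idea mentioned for the stability and flash calculations can be sketched on a toy Rachford-Rice flash: a cheap float32 Newton loop supplies the initial guess that a float64 loop then polishes. The compositions, K-values and tolerances are made up; this only illustrates the concept and is not ECHELON source.

```python
# Mixed-precision Newton on a two-component Rachford-Rice equation (toy example).
import numpy as np

z = np.array([0.6, 0.4])    # feed composition (made up)
K = np.array([2.5, 0.3])    # K-values (made up)

def residual(V, dtype):
    z_, K_ = z.astype(dtype), K.astype(dtype)
    t = 1.0 + V * (K_ - 1.0)
    f = np.sum(z_ * (K_ - 1.0) / t)              # Rachford-Rice residual
    df = -np.sum(z_ * (K_ - 1.0) ** 2 / t ** 2)  # its derivative w.r.t. V
    return f, df

def newton(V, dtype, tol, max_iters=20):
    V = dtype(V)
    for _ in range(max_iters):
        f, df = residual(V, dtype)
        V -= f / df
        if abs(f) < tol:
            break
    return V

V0 = newton(0.5, np.float32, 1e-4)    # cheap single-precision initial guess
V = newton(V0, np.float64, 1e-12)     # double-precision polish
print(float(V0), float(V))
```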
13. Can ECHELON simulation runs be easily modified using the input data file (for example, in a text editor) instead of using a GUI?
ECHELON uses the industry-standard (ECLIPSE-like) syntax, which is well suited to be modified “by hand” in a text editor (possibly an enhanced text editor with syntax highlighting for more comfort).
14. How did you couple ECHELON with ESMDA using MPS?
At the moment, running ensemble models using MPS requires a customized setup, which we help our clients with. We are currently working towards a more user-friendly solution.
15. Do you plan to implement geomechanical simulation methods?
This is currently not in our workplan, but we do not exclude it.
16. Is there a better elliptical solver for the pressure equation than AMG?
In the context of reservoir simulation, AMG is currently the most robust elliptical solver for the pressure equation resulting from CPR decoupling. Note that there are different flavours of AMG with their own pros and cons (typically, a trade-off between set-up time and convergence properties). Due to the high set-up cost of AMG, it is possible that for smaller models a lighter pressure solver such as nested factorization (still within a CPR framework) may be faster, but we did not explore this option.
17. What was the best solver for the pressure and saturation equations? AMG for the pressure and GMRES +ILU(0) for the saturation solve?
ECHELON currently uses CPR with AMG for the first stage, and offers a choice of second stage preconditioners depending on the problem, one of them being ILU(0). Our Krylov solver is restarted GMRES.
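For readers who want to experiment with the solver stack described above, the toy sketch below assembles a CPR-like two-stage preconditioner from open libraries: an AMG correction on a "pressure-like" block followed by an ILU correction of the updated residual, wrapped around restarted GMRES. It assumes pyamg and SciPy are installed, uses a contrived matrix, and is not how ECHELON implements CPR on the GPU.

```python
# CPR-style two-stage preconditioning with pyamg + SciPy (toy problem).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla
import pyamg

n = 200
A_p = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)).tocsr()  # "pressure" block
A = (A_p + 0.1 * sp.eye(n)).tocsr()                                   # toy full system
b = np.ones(n)

amg = pyamg.ruge_stuben_solver(A_p)           # stage 1: AMG on the pressure part
ilu = spla.spilu(A.tocsc(), fill_factor=1)    # stage 2: ILU(0)-like factorization

def cpr(r):
    x = amg.solve(r, tol=1e-2)                # coarse pressure correction
    return x + ilu.solve(r - A @ x)           # second-stage correction of the residual

M = spla.LinearOperator(A.shape, matvec=cpr)
x, info = spla.gmres(A, b, M=M, restart=30)   # restarted GMRES as the Krylov solver
print("converged" if info == 0 else f"info={info}")
```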
18. When will pressure maintenance and multi-segment well capability be added?
Pressure maintenance (in fact, a flexible PID controller able to support pressure maintenance as a possible use case) is already available. Multi-segment wells are part of our medium- to long-term roadmap.
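As a purely illustrative sketch of how a PID controller can serve the pressure-maintenance use case, the loop below adjusts an injection rate to hold a target average pressure against a crude material-balance surrogate. The gains, target and toy "reservoir response" are all assumptions; this is not ECHELON's controller or its keywords.

```python
# Toy PID loop holding a target average reservoir pressure via injection rate.
TARGET_P = 250.0                  # bar, desired average pressure (assumed)
KP, KI, KD = 50.0, 5.0, 0.0       # made-up controller gains

def pid_step(p_now, state, dt):
    err = TARGET_P - p_now
    state["integral"] += err * dt
    deriv = (err - state["prev_err"]) / dt
    state["prev_err"] = err
    rate = KP * err + KI * state["integral"] + KD * deriv
    return max(rate, 0.0)         # injection rate cannot go negative

p = 240.0                          # initial average pressure, bar
state = {"integral": 0.0, "prev_err": 0.0}
for step in range(20):
    q_inj = pid_step(p, state, dt=1.0)
    p += 0.002 * q_inj - 1.0       # crude surrogate: offtake vs. injection support
    print(f"step {step:2d}  q_inj = {q_inj:8.1f}  p = {p:6.1f}")
```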
19. Can ECHELON handle CO2 convective mixing?
ECHELON supports aqueous-vapor phase equilibrium, including in particular CO2 dissolution in the aqueous phase, and the corresponding impact on density. It can therefore model convective mixing. Nevertheless, ECHELON does not yet support molecular diffusion or hydrodynamic dispersion, meaning that the gravity fingers will be governed by numerical dispersion only. Addition of hydrodynamic dispersion is part of our short-term roadmap.
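As background (standard theory rather than a statement about ECHELON's implementation), the strength of convective mixing is commonly characterized by a Rayleigh number of the form

$$ \mathrm{Ra} = \frac{k\,\Delta\rho\,g\,H}{\phi\,\mu\,D} $$

where $k$ is permeability, $\Delta\rho$ the density increase of CO2-saturated brine, $g$ gravity, $H$ the layer thickness, $\phi$ porosity, $\mu$ brine viscosity, and $D$ the diffusion/dispersion coefficient. In the absence of physical diffusion or dispersion in the model, the role of $D$ is effectively taken over by numerical dispersion, as noted above.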
20. Does ECHELON have the option of hysteresis for low salinity modeling?
In the current ECHELON release, it is possible to model hysteresis and low-salinity flooding simultaneously; however, only the high-salinity relative permeability curves will undergo hysteresis. Adding hysteresis support for the low-salinity curves would be trivial if a client or partner were to request it.
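For context, a common way low-salinity models are formulated (a hedged sketch, not necessarily the ECHELON formulation) is to interpolate between high- and low-salinity relative-permeability curves with a salinity-dependent weight; hysteresis, where active, is applied to the high-salinity curve only.

```python
# Salinity-weighted interpolation between two made-up water rel-perm curves.
import numpy as np

def krw_high(sw):                 # high-salinity curve (the one subject to hysteresis)
    return np.clip((sw - 0.20) / 0.60, 0.0, 1.0) ** 3

def krw_low(sw):                  # low-salinity curve
    return 0.6 * np.clip((sw - 0.25) / 0.55, 0.0, 1.0) ** 3

def salinity_weight(c, c_low=5.0, c_high=35.0):
    """0 = fully low-salinity behaviour, 1 = fully high-salinity (c in kg/m3)."""
    return float(np.clip((c - c_low) / (c_high - c_low), 0.0, 1.0))

def krw_effective(sw, c):
    w = salinity_weight(c)
    return w * krw_high(sw) + (1.0 - w) * krw_low(sw)

print(krw_effective(0.5, 20.0))
```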
21. Can you say a few words about geomechanical coupling?
ECHELON can currently be coupled to third-party geomechanical software through iterative coupling (either at the timestep level, or with a multi-rate method, that is to say periodically). Case studies have so far been performed with the Abaqus software, but the approach is, in principle, reproducible with other third-party software.
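Schematically, iterative coupling at the timestep level looks like the loop below: the flow and geomechanics solvers exchange fields until the exchanged quantity stops changing. The two solver functions are hypothetical stand-ins wired together only to show the control flow; they do not represent the ECHELON or Abaqus APIs.

```python
# Control-flow sketch of two-way iterative coupling within one timestep.
def flow_step(dt, poro_mult):
    """Stand-in for a reservoir-flow solve; returns an average pressure (bar)."""
    return 300.0 - 5.0 * dt * poro_mult

def geomech_step(pressure):
    """Stand-in for a geomechanics solve; returns an updated porosity multiplier."""
    return 1.0 - 1.0e-4 * (300.0 - pressure)

def coupled_timestep(dt, tol=1e-6, max_iters=10):
    mult = 1.0
    for it in range(1, max_iters + 1):
        p = flow_step(dt, mult)            # flow solve with current geomech update
        new_mult = geomech_step(p)         # geomech solve with the new pressures
        if abs(new_mult - mult) < tol:     # exchanged field has converged
            return p, new_mult, it
        mult = new_mult
    return p, mult, max_iters

print(coupled_timestep(dt=1.0))
```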
22. How was the flow of information handled for multi-GPU runs using CUDA-aware MPI? Device-to-host or host-to-device?
Within the same node, a CUDA-aware MPI will transfer from GPU to GPU without traversing host memory. Similarly, if the nodes contain properly configured InfiniBand adapters, the transfer between GPUs on different nodes can occur with direct communication between the InfiniBand adapters and the GPUs, also without traversing host memory. If the system is not configured for this, there is a fallback path through host memory.
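A minimal sketch of the GPU-direct path, assuming mpi4py (3.1 or later) built against a CUDA-aware MPI together with CuPy: the device array is handed straight to Send/Recv, so the data never has to be staged through host memory. Array sizes, tags and variable names are illustrative.

```python
# GPU-to-GPU transfer with CUDA-aware MPI: device buffers passed directly to MPI.
import cupy as cp
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

halo = cp.full(1024, float(rank))          # device-resident boundary data (dummy)

if rank == 0:
    comm.Send(halo, dest=1, tag=7)         # GPU buffer handed directly to MPI
elif rank == 1:
    comm.Recv(halo, source=0, tag=7)       # received straight into GPU memory
    cp.cuda.Stream.null.synchronize()
    print(halo[:4])                        # should now hold rank 0's values
```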
23. Which polymer degradation model are you using?
ECHELON currently supports a linear decay-type degradation model with two half-lives: one for polymer in solution and, possibly, a different one for adsorbed polymer.
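One common reading of a decay law specified through half-lives is first-order decay, C(t) = C0 · 0.5^(t/t_half); the sketch below applies it with one half-life for polymer in solution and another for adsorbed polymer. Whether ECHELON's decay model takes exactly this form is not specified here, and the half-lives and concentrations are invented values.

```python
# First-order (half-life) degradation applied with two separate half-lives.
import numpy as np

HALF_LIFE_SOLUTION = 200.0    # days, polymer in solution (assumed value)
HALF_LIFE_ADSORBED = 400.0    # days, adsorbed polymer (assumed value)

def degrade(c0, t, half_life):
    """Concentration after time t for decay with the given half-life."""
    return c0 * 0.5 ** (t / half_life)

t = np.linspace(0.0, 1000.0, 6)                     # days
print(degrade(1.0, t, HALF_LIFE_SOLUTION))          # polymer in solution
print(degrade(0.3, t, HALF_LIFE_ADSORBED))          # adsorbed polymer
```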
Emily Fox
Emily Fox is Stone Ridge Technology's Director of Communication.