Eni announced the industrial deployment of ECHELON software, the fully GPU-based reservoir simulator developed by our company, Stone Ridge Technology.
Eni, the Italian energy major headquartered in Milan, has made two recent announcements related to GPU computing and its digital transformation agenda. In the first, released on 31 October, Eni announced that it has ordered an addition to the supercomputing systems housed in its Green Data Center. The new Dell machine, called HPC5, comprises 1820 compute nodes, each with four NVIDIA V100 GPU accelerators, delivering a combined processing power of 34 PetaFLOPS. According to Eni, the new machine is a crucial component of its digital agenda. Claudio Descalzi, CEO of Eni, commented: "Our investment to strengthen our supercomputer infrastructure and to develop proprietary technologies are a crucial part of the digital transformation of Eni. Having great computing power and sophisticated algorithms at our disposal makes us leaders in the modern energy sector and projects us on to the future."
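For a sense of scale, the quoted node and GPU counts can be sanity-checked with simple arithmetic. Assuming the 34 PetaFLOPS is the machine-wide combined figure Eni quotes, it implies a throughput of roughly 4.7 TFLOPS per GPU; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the HPC5 figures quoted above.
nodes = 1820
gpus_per_node = 4
total_gpus = nodes * gpus_per_node            # 7,280 V100 GPUs in total

total_pflops = 34                             # combined figure quoted by Eni
tflops_per_gpu = total_pflops * 1000 / total_gpus

print(f"{total_gpus} GPUs, ~{tflops_per_gpu:.2f} TFLOPS each")
# -> 7280 GPUs, ~4.67 TFLOPS each
```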
The second announcement, released on November 12, 2019, concerns software, another crucial element of Eni's digital transformation agenda: the industrial deployment of ECHELON, Stone Ridge Technology's fully GPU-based reservoir simulator.
With HPC5, the Eni Green Data Center represents the largest industrially deployed computing capacity in the world. As pointed out in the excellent article by Timothy Prickett Morgan in The Next Platform, 98% of the flops and 94% of the bandwidth in HPC5 are delivered by the NVIDIA GPUs. What does a company like Eni do with this kind of massive computing power? Companies like Eni use supercomputers for two essential technical problems. The first is finding hydrocarbons (seismic imaging); the second is developing, optimizing, and planning strategies for extracting those hydrocarbons over the lifetime of the asset (reservoir simulation). Seismic imaging can be thought of as an ultrasound of the earth. Reservoir simulation models how oil, gas, and water flow under the ground in the presence of wells. Seismic imaging is a tool for finding a needle in a haystack: it involves processing enormous volumes of data recorded from the earth's subsurface, most of which does not contain trapped resources. Reservoir simulation is the tool needed to retrieve the needle most efficiently, once found.
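To give a flavor of what reservoir simulation computes, here is a deliberately minimal sketch: single-phase pressure diffusion on a 1D grid between an injector and a producer. Everything here (the grid size, the lumped diffusivity, the explicit time stepping) is illustrative only; a production simulator like ECHELON solves coupled multiphase flow equations, typically implicitly, on vastly larger grids.

```python
# Toy flavor of reservoir simulation: single-phase pressure diffusion
# on a 1D grid, stepped explicitly in time. Purely illustrative.
nx, nt = 50, 500
dx, dt = 1.0, 0.1
diffusivity = 1.0                 # lumped rock/fluid property (illustrative)

pressure = [1.0] * nx
pressure[0] = 2.0                 # injector well holds high pressure
pressure[-1] = 0.5                # producer well holds low pressure

for _ in range(nt):
    prev = pressure[:]
    for i in range(1, nx - 1):    # boundary cells (wells) stay fixed
        pressure[i] = prev[i] + diffusivity * dt / dx**2 * (
            prev[i + 1] - 2 * prev[i] + prev[i - 1]
        )

# Pressure rises near the injector and falls near the producer as the
# field relaxes toward a steady profile between the two wells.
print(round(pressure[1], 3), round(pressure[-2], 3))
```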
Historically, seismic imaging has taken the lion's share of computing power, with reservoir simulation a distant second. This is changing, however, for three essential reasons. The first is the emergence, over the last decade, of new parallel reservoir simulators like ECHELON software. Whereas the algorithms behind seismic imaging are naturally amenable to massive parallelism, such parallelism is more challenging to expose in reservoir simulation, so parallel reservoir simulators took more time and effort to create. ECHELON reservoir simulation software is unique in that it was designed from inception to run on NVIDIA GPUs. This gives it an advantage over CPU-based codes that have added GPU capability only as an afterthought.
Second, energy companies are interested in developing larger, more finely resolved models. They have been restricted to smaller models by the performance limitations of legacy simulators. Larger models capture greater detail and demonstrate more fidelity to the actual physical system, which is particularly important when using advanced methods of oil recovery.
Finally, energy companies are interested in running more models, many more in fact. Ensemble modeling is a technique in which thousands of different realizations of a model are simulated to provide an envelope of possible outcomes with statistical weighting. Reservoirs lie thousands of feet underground, where the exact structure and composition of the subsurface rock are not precisely known. The goal of reservoir modeling is to find parameter values that match the available data from seismic surveys, well logs, and historical production. In practice the model is under-constrained: an infinite number of models will fit the data. Ensemble modeling recognizes and embraces this uncertainty and provides statistical bounds on future production. Running ensembles yields increased fidelity between simulations and historical production data, hence more reliable production forecasts and ultimately better-informed business decisions. This history-matching process has previously involved many slow, tedious iterations and approximations to match the model simulation to the data. Now the workflow is greatly accelerated and ready for optimization by, for example, AI techniques that can automate and improve the process. This ambition to run ensembles of high-fidelity models with ever more advanced physics is driving the compute requirements of reservoir simulation.
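The ensemble workflow described above can be sketched in a few lines of Python. The toy decline-curve "simulator", the parameter names, and the sampling distributions below are all illustrative stand-ins, not anything drawn from ECHELON or Eni's workflows:

```python
import math
import random

random.seed(42)

def toy_simulator(permeability, porosity, years=10):
    """Stand-in for a reservoir simulator: a production-decline curve whose
    initial rate and decline depend on the sampled rock properties.
    (A real simulator solves multiphase flow equations instead.)"""
    initial_rate = 1000.0 * permeability * porosity
    decline = 0.15 / permeability
    return [initial_rate * math.exp(-decline * t) for t in range(years)]

# Each ensemble member is one plausible realization of the uncertain subsurface.
ensemble = []
for _ in range(1000):
    perm = random.lognormvariate(0.0, 0.3)   # uncertain permeability multiplier
    poro = random.uniform(0.15, 0.30)        # uncertain porosity fraction
    ensemble.append(toy_simulator(perm, poro))

# Envelope of outcomes: P10/P50/P90 of final-year production across members.
final_rates = sorted(run[-1] for run in ensemble)
p10 = final_rates[int(0.10 * len(final_rates))]
p50 = final_rates[int(0.50 * len(final_rates))]
p90 = final_rates[int(0.90 * len(final_rates))]
print(f"P10={p10:.0f}  P50={p50:.0f}  P90={p90:.0f}")
```

In a real history-matching loop, members whose simulated production disagrees with observed data would be down-weighted or resampled, so the envelope tightens as more data arrives.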
Periodically I'm asked by my colleagues in the hardware business to estimate the total addressable market (TAM) for hardware directed toward reservoir simulation. This is difficult to estimate with any precision, but I can confirm that it is growing. A supercomputing arms race is underway among the supermajors, only some of which is public. The one metric we do have is the TAM for seismic, a mature HPC application; I think the spend on seismic hardware is a reasonable bound for how much companies may be willing to commit to hardware for reservoir simulation, as long as they are getting value from the effort.
Recognizing the current and growing advantages of computing on NVIDIA GPUs, Eni partnered with SRT to enhance the capabilities of ECHELON software and to develop advanced features and innovative workflows beyond the capability of legacy simulators. With HPC5 packed with NVIDIA GPUs and ECHELON reservoir simulation software making maximal use of them, Eni is well poised to execute on its digital transformation agenda.
Vincent Natoli is the president and founder of Stone Ridge Technology. He is a computational physicist with 30 years of experience in the field of high-performance computing. He holds Bachelor's and Master's degrees from MIT, a PhD in Physics from the University of Illinois Urbana-Champaign, and a Master's in Technology Management from the Wharton School at the University of Pennsylvania.