EAGE UPDATE
What drives a high performance computing maestro

Dr Vincent Natoli is a member of the Technical Committee for the upcoming Third EAGE Workshop on High Performance Computing (HPC) for Upstream, being held in Athens on 1-4 October 2017. He obtained Bachelor's and Master's degrees from MIT and a PhD from the University of Illinois at Urbana-Champaign, followed mid-career by a Master's in Technology Management from the Wharton School at the University of Pennsylvania. He is president and founder of Stone Ridge Technology, which focuses on fast reservoir simulation and other applications of modern multi-core and GPU compute architectures. His previous experience includes serving as technical director at High Performance Technologies (HPTi) and as a senior physicist at ExxonMobil. Here he talks about the workshop, the future of HPC and recruiting young people into the discipline.
What attracted you to computational physics in the first place?
I was fascinated by computers from my first introduction. In high school I used to eat my lunch as quickly as possible so I could run down to our math office where, somehow, we had a DEC PDP-11 that I programmed in BASIC. The idea of simulating nature and the physical world on a computer has always been compelling to me. It combines my dual interests: understanding the workings of our physical world through physics, and the powerful calculation capability of modern computers.

How would you encourage young people to follow your example in exploring computer technology?
There are so many opportunities for learning and self-guided education on the Internet. For high school students I would say: put away Facebook, Instagram and Snapchat for a while and instead work through a tutorial on C and C++. Build a small Linux-based computer using a Raspberry Pi kit (https://www.raspberrypi.org), get accustomed to the Unix operating system, post your projects on Github and document them on a web page. In my experience the best way to learn computer technology is to do something, not just read about it.

Does the industry sufficiently understand the value and function of HPC applications?
In my career I have worked with three different industries. I have primarily contributed to the oil and gas industry, starting my career with ExxonMobil and later with my own company, Stone Ridge Technology, but I have also worked in bioinformatics and in finance. Of the three I believe the oil and gas industry understands the value and function of HPC best. Finance is a close second and bioinformatics a distant third. The two most prominent HPC applications in
O&G are seismic processing and reservoir simulation. I believe the industry understands the value of fast, robust HPC capability because there is so much at stake in the exploration and development of hydrocarbon assets. Even a small bump in productivity on fields worth many billions of dollars is a significant achievement that dwarfs the modest investment in hardware and software.

Photo: Dr Vincent Natoli with Ms Simanti Das (ExxonMobil) at the Second EAGE high performance computing workshop.

EAGE NEWSLETTER MIDDLE EAST ISSUE 1 2017

At the workshop what will be the outstanding issues?
Some of the salient topics I think will be discussed are the continuing transition to efficient parallel architectures, including the new Intel Knights Landing and NVIDIA GPUs. There is a large and growing gap between the capability of modern chip architectures, with their high level of parallelism, and the legacy software that dominates the industry. I believe there are massive performance and efficiency gains to be realized.

What interests you and your company most about the potential of HPC?
We want to change the way people do their work with our deep knowledge of HPC, numerical methods and computational science. We want reservoir engineers to think about how they might work if they could do simulations in one tenth or one one-hundredth the current time, if they could run thousands of realizations of their models on a small cluster, or if they could routinely run huge models when needed and still turn simulations around in a few hours. We want to bring new capability to the industry to help it become a more cost- and time-efficient steward of our hydrocarbon resource.

Where do you see HPC in 10 years' time?
The evolution of HPC depends on trends and developments in both hardware and software. Hardware evolves very rapidly while software changes much more slowly. The trend in hardware of concentrating more and more parallelism and capability on-chip will continue into 2027, aided by several more turns of feature shrink and clever design. Memory bandwidth into the chip will also continue to grow, riding on the new 3D memory technology found in this year's NVIDIA Pascal and Intel KNL architectures. These hardware advances will continue to challenge HPC software developers to design in massive fine-grained parallelism. The performance difference between those who do this successfully and those who continue with legacy software will become painfully apparent.

What in your opinion have been the landmarks in the contribution of HPC to the E&P industry?
In seismic processing HPC has enabled a progression of successively more complex reverse time migration (RTM) processing, from isotropic to TI to VTI to TTI, and now FWI, providing increasing fidelity and resolution to practitioners. In reservoir simulation HPC enables engineers to run a multitude of 'what-if' scenarios for field development using fast simulation of large, complex reservoir models. The ability to optimize asset development strategies with a computer before you take drill to earth saves enormous time and money for producers.
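The ensemble 'what-if' workflow described above — running many independent realizations of an uncertain model and comparing outcomes — can be sketched in a few lines. This is a minimal illustration, not real simulator code: the `run_realization` function and its toy porosity-to-recovery response are hypothetical stand-ins for a call into an actual reservoir simulator.

```python
import random
import statistics
from concurrent.futures import ThreadPoolExecutor

def run_realization(seed):
    """Hypothetical stand-in for one reservoir-model run: perturb an
    uncertain geological input (porosity) and return a toy 'recovery
    factor'. A real workflow would invoke a reservoir simulator here."""
    rng = random.Random(seed)            # independent, reproducible stream per realization
    porosity = rng.gauss(0.20, 0.03)     # uncertain input parameter
    recovery = porosity * 2.0            # toy response function
    return max(0.0, min(1.0, recovery))  # clamp to a physical range

def screen_scenarios(n_realizations):
    """Run n independent realizations concurrently and summarize the
    spread of outcomes, mimicking an ensemble dispatched to a small cluster."""
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(run_realization, range(n_realizations)))
    return statistics.mean(results), statistics.stdev(results)
```

Because each realization is independent, the ensemble is embarrassingly parallel: swapping the thread pool for processes or cluster job submission changes only the dispatch mechanism, not the pattern.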