First Break February 2024 - Digitalization / Machine Learning


VOLUME 42 I ISSUE 2 I FEBRUARY 2024

SPECIAL TOPIC

Digitalization / Machine Learning
EAGE NEWS Sustainable Energy Award for young professionals
CROSSTALK How coal mines are firing up renewable energy
TECHNICAL ARTICLE Environmental monitoring using multispectral UAV


Discover Sercel Land Services

SOLVING COMPLEX CHALLENGES WITH TECHNOLOGY-DRIVEN EXPERTISE

Sercel specializes in solving the most complex operational challenges for seismic contractors and operators, ensuring that projects are completed on time, within budget, and according to technical specifications. Trust us to streamline your operations, improve operational efficiency, and deliver unrivaled project success.

Explore our suite of land services:
• Train & launch: Start your project with confidence
• Asset optimization: Get the best from your equipment
• Operational excellence: Steer your crew performance

For more information:
Nantes, France: sales.nantes@sercel.com
Houston, USA: sales.houston@sercel.com
www.sercel.com


FIRST BREAK® An EAGE Publication

CHAIR EDITORIAL BOARD Gwenola Michaud (gmichaud@gm-consult.it)
EDITOR Damian Arnold (arnolddamian@googlemail.com)
MEMBERS, EDITORIAL BOARD
• Lodve Berre, Norwegian University of Science and Technology (lodve.berre@ntnu.no)
• Philippe Caprioli, SLB (caprioli0@slb.com)
• Satinder Chopra, SamiGeo (satinder.chopra@samigeo.com)
• Anthony Day, PGS (anthony.day@pgs.com)
• Peter Dromgoole, Retired Geophysicist (peterdromgoole@gmail.com)
• Kara English, University College Dublin (kara.english@ucd.ie)
• Stephen Hallinan, CGG (Stephen.Hallinan@CGG.com)
• Hamidreza Hamdi, University of Calgary (hhamdi@ucalgary.ca)
• Clément Kostov, Freelance Geophysicist (cvkostov@icloud.com)
• Martin Riviere, Retired Geophysicist (martinriviere@btinternet.com)
• Angelika-Maria Wulff, Consultant (gp.awulff@gmail.com)
EAGE EDITOR EMERITUS Andrew McBarnet (andrew@andrewmcbarnet.com)
MEDIA PRODUCTION Saskia Nota (firstbreakproduction@eage.org)

Contents

3 EAGE News
17 Personal Record Interview — Dmitry Bozhezha
18 Monthly Update
20 Crosstalk
23 Industry News

PRODUCTION ASSISTANT Ivana Geurts (firstbreakproduction@eage.org)
ADVERTISING INQUIRIES corporaterelations@eage.org
EAGE EUROPE OFFICE Kosterijland 48, 3981 AJ Bunnik, The Netherlands • +31 88 995 5055 • eage@eage.org • www.eage.org
EAGE MIDDLE EAST OFFICE EAGE Middle East FZ-LLC, Dubai Knowledge Village, Block 13 Office F-25, PO Box 501711, Dubai, United Arab Emirates • +971 4 369 3897 • middle_east@eage.org • www.eage.org
EAGE ASIA PACIFIC OFFICE UOA Centre Office Suite 19-15-3A, No. 19, Jalan Pinang, 50450 Kuala Lumpur, Malaysia • +60 3 272 201 40 • asiapacific@eage.org • www.eage.org
EAGE AMERICAS SAS Av. 19 #114-65 - Office 205, Bogotá, Colombia • +57 310 8610709 • +57 (601) 4232948 • americas@eage.org • www.eage.org
EAGE MEMBERS CHANGE OF ADDRESS NOTIFICATION Send to: EAGE Membership Dept at EAGE Office (address above)
FIRST BREAK ON THE WEB www.firstbreak.org
ISSN 0263-5046 (print) / ISSN 1365-2397 (online)

Technical Articles
33 DC resistivity inversion using conjugate gradient and maximum likelihood techniques with hydrogeological applications
   Cassiano Antonio Bortolozo, Jorge Luís Porsani, Fernando Acácio Monteiro dos Santos and Tristan Pryer
41 Innovative environmental monitoring methods using multispectral UAV and satellite data
   Benjamin Haske, Tobias Rudolph, Bodo Bernsdorf and Marcin Pawlik

Special Topic: Digitalization / Machine Learning
49 Lessons learnt for tuning a machine learning fault prediction model
   Hadyan Pratama, Matthew Oke, Wayne Mogg, David Markus, Arnaud Huck and Paul de Groot
57 Ensemble history-matching workflow using interpretable SPADE-GAN geomodel
   Kristian Fossum, Sergey Alyaev and Ahmed H. Elsheikh
65 Geomechanical parameter derivation while drilling in unconventional plays: a combination of surface drilling data, gamma ray data, and machine learning techniques
   Marvee Dela Resma and Ivo Colombo
69 Adopting technology to revolutionise and accelerate the flow of seismic data from sensor to customer
   Erik Ewig, John Brittan, Cerys James, John Oluf Brodersen and Sverre Olsen
75 Artificial intelligence and life on Mars
   Neil Hodgson and Sam Tyler
79 Large volume analysis of core and thin section images in the assessment of Brazil pre-salt reservoir distribution
   Edward Jarvis, Haoyi Wang, Jonathan Dietz and Thomas Van Der Looven
89 A framework for mineral geoscience data and model portability
   John McGaughey, Julien Brossoit, Kristofer Davis, Dominique Fournier and Sébastien Hensgen

94 Calendar

cover: Ikon Science is currently working with several super-majors and independents, bringing Generative AI and Machine Learning to subsurface data collaboration, interrogation and visualisation with their Curate: Data Management Platform.



European Association of Geoscientists & Engineers

Board 2023-2024

Edward Wiarda President
Laura Valentina Socco Vice-President
Pascal Breton Secretary-Treasurer
Maren Kleemeyer Education Officer
Caroline Le Turdu Membership and Cooperation Officer
Peter Rowbotham Publications Officer
Aart-Jan van Wijngaarden Technical Programme Officer
Esther Bloem Chair Near Surface Geoscience Circle
Yohaney Gomez Galarza Chair Oil & Gas Geoscience Circle
Carla Martín-Clavé Chair Sustainable Energy Circle

Near Surface Geoscience Circle
Esther Bloem Chair
Andreas Aspmo Pfaffhuber Vice-Chair
Micki Allen Contact Officer EEGS/North America
Adam Booth Committee Member
Hongzhu Cai Liaison China
Deyan Draganov Technical Programme Officer
Wolfram Gödde Liaison First Break
Hamdan Ali Hamdan Liaison Middle East
Vladimir Ignatev Liaison CIS / North America
Musa Manzi Liaison Africa
Myrto Papadopoulou Young Professional Liaison
Catherine Truffert Industry Liaison
Mark Vardy Editor in Chief Near Surface Geophysics
Florina Tuluca Committee Member

Oil & Gas Geoscience Circle
Yohaney Gomez Galarza Chair
Johannes Wendebourg Vice-Chair
Lucy Slater Immediate Past Chair
Erica Angerer Member
Wiebke Athmer Member
Tijmen Jan Moser Editor-in-Chief Geophysical Prospecting
Adeline Parent WGE & DET SIC Liaison
Matteo Ravasi YP Liaison
Jonathan Redfern Editor-in-Chief Petroleum Geoscience
Aart-Jan van Wijngaarden Technical Programme Officer

Sustainable Energy Circle
Carla Martín-Clavé Chair
Giovanni Sosio Vice-Chair

SUBSCRIPTIONS
First Break is published monthly. It is free to EAGE members. The membership fee of EAGE is € 80.00 a year including First Break, EarthDoc (EAGE's geoscience database), Learning Geoscience (EAGE's Education website) and online access to a scientific journal.
Companies can subscribe to First Break via an institutional subscription. Every subscription includes a monthly hard copy and online access to the full First Break archive for the requested number of online users. Orders for current subscriptions and back issues should be sent to First Break B.V., Journal Subscriptions, Kosterijland 48, 3981 AJ Bunnik, The Netherlands. Tel: +31 (0)88 9955055, E-mail: subscriptions@eage.org, www.firstbreak.org.
First Break is published by First Break B.V., The Netherlands. However, responsibility for the opinions given and the statements made rests with the authors.

COPYRIGHT & PHOTOCOPYING
© 2024 EAGE All rights reserved. First Break or any part thereof may not be reproduced, stored in a retrieval system, or transcribed in any form or by any means, electronically or mechanically, including photocopying and recording, without the prior written permission of the publisher.

PAPER
The publisher's policy is to use acid-free permanent paper (TCF), to the draft standard ISO/DIS/9706, made from sustainable forests using chlorine-free pulp (Nordic-Swan standard).


HIGHLIGHTS

04 Marie Tharp Award launched
11 Energy transition vision revealed at GET
16 Time to join Digital 2024 conversation

Energy transition topics will feature in course programme at 2024 Annual

In line with this year's 85th EAGE Annual Conference in Oslo — 'Technology and Talent for a Secure and Sustainable Energy Future' — we will be emphasising the importance of education, including energy transition-related courses on geologic hydrogen and geothermal energy.

Windpower presentation at previous event.

Maren Kleemeyer, EAGE education officer, says: 'Learning new skills is more critical during these times of rapid change to support the energy transition. Therefore, we would like to stimulate a faster distribution and sharing of knowledge between EAGE members by offering more and newer courses during the Annual conference.' In February, two courses have already been confirmed: one with Dariusz Strąpoć (SLB) on 'Exploration of subsurface natural geologic hydrogen and stimulation for its enhanced production', and the other with Denis Voskov (TU Delft) on 'Reservoir engineering of geothermal energy production', covering the latest updates in these topics. The exploration of subsurface natural geologic hydrogen and stimulation for its enhanced production will be discussed, along with a comparison of the carbon footprint versus the price of the full palette of different sources of hydrogen. Included will be a comparison of energy output per mass and per volume between H2 and all major fuels. All industrial and natural sources and generating mechanisms, their association with other gases, and consumption fluxes will be described. Global occurrences and seepages of natural H2 will be presented along with worldwide ongoing and planned exploration activity.


The geologic setting of the only H2 production field, in Mali, will be discussed. A series of challenges associated with natural H2 exploration and production will be detailed, including differences between natural H2 systems and petroleum-system analogues, drilling and logging challenges, downhole sampling and transportation, safety issues, and finally storage capacity challenges. Natural and stimulated H2 systems require novel reactive transport and geologic systems modelling efforts. The current status of such modelling adjustments for H2 will also be part of the course. Strategies for stimulating natural hydrogen (orange H2) in the subsurface will be featured, as well as physical and chemical approaches to enhancing natural H2 generation and production rates. The main challenges associated with subsurface stimulation will also be presented. The activity landscape and research initiatives in the orange H2 space will be reviewed, and helium exploration will be touched upon as a related topic, since in certain scenarios this gas can be associated with natural H2. Heating and cooling demand adds up to almost 50% of the EU's total gross energy consumption. A large portion of this energy could be delivered by direct-heat geothermal resources.




The hands-on course 'Reservoir engineering of geothermal energy production' starts with introductory lectures on the basics of geothermal energy production and of reservoir simulation, followed by two simulation exercises (using open-source software), starting from a simplified conceptual model and finishing with a full 3D model in realistic geological sediments. Participants should have prior knowledge of basic Python programming. Visit eageannual.org for more details. Delegates can take advantage of the all-access registration to participate in the courses, together with workshops, field trips and other activities — all at a reduced rate.
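For readers wondering what the 'simplified conceptual model' stage of such an exercise might look like, the sketch below is a minimal, illustrative Python calculation, not part of the course material: a back-of-envelope estimate of the thermal power delivered by a hypothetical direct-heat geothermal doublet. The flow rate, temperatures and fluid properties are assumed example values.

```python
# Minimal sketch (not course material): back-of-envelope thermal power of a
# geothermal doublet, P = rho * cp * Q * (T_production - T_injection).
# All numbers below are illustrative assumptions, not field data.

RHO_FLUID = 1000.0   # fluid density, kg/m3 (assumed fresh-water value)
CP_FLUID = 4186.0    # specific heat capacity, J/(kg*K) (assumed)

def doublet_thermal_power(flow_m3_per_h: float,
                          t_production_c: float,
                          t_injection_c: float) -> float:
    """Return delivered thermal power in megawatts for a doublet."""
    flow_m3_per_s = flow_m3_per_h / 3600.0
    delta_t = t_production_c - t_injection_c        # temperature drop, K
    power_w = RHO_FLUID * CP_FLUID * flow_m3_per_s * delta_t
    return power_w / 1e6                            # W -> MW

if __name__ == "__main__":
    # Hypothetical doublet: 150 m3/h produced at 75 degC, re-injected at 40 degC.
    print(f"{doublet_thermal_power(150.0, 75.0, 40.0):.1f} MW thermal")
    # ~6.1 MW thermal for these assumed numbers.
```

A full reservoir simulation of the kind covered in the course adds the subsurface flow and heat-transport physics that such a zero-dimensional estimate ignores.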

Learning by doing in our hands-on course.

EAGE launches Marie Tharp sustainable energy award for young professionals

Over 2023, the geoscience and engineering community has endeavoured to advance the energy transition more than ever. To anticipate further transformations and the need to innovate, we are launching the Marie Tharp Award, a sustainable energy young professionals award dedicated to promising and creative talents among the next generation of leaders committed to transforming energy systems and speeding up the global energy transition. Named after the renowned American geologist and oceanographic cartographer Marie Tharp, this recognition is a legacy between past and future generations to encourage students to explore beyond boundaries. The award is meant to acknowledge perseverance, willpower, and excellence in conserving our planet, in the spirit of Marie Tharp's imaginative thinking and discoveries. It is also intended to be a catalyst for the future, spotlighting young talents in the geoscience and engineering community and accelerating the change needed for a sustainable future. Eligible candidates must be MSc and PhD students pursuing a college curriculum directed towards a career in geoscience and engineering to support the clean energy transition. The submission procedure consists of a nomination package


that must be sent to awards@eage.org by 23:59 CET on 1 March 2024. The application process is straightforward yet comprehensive, requiring candidates to submit their CV, a list of publications (if applicable), and a nomination letter outlining their motivation, achievements, and ongoing projects. The letter should emphasise the student's plans to contribute to advancing practices in the field of energy transition and the commitment to the geoscience and engineering community, as well as EAGE, in this domain.

Recognising the multi-dimensional nature of the challenge, the award encourages applications from a diverse range of disciplines. The award winner will receive a grant to attend the EAGE 2024 GET Conference. Participation in the conference will not only recognise the awardee's contribution but also provide an invaluable opportunity to network with industry experts, gain insights into the latest advancements, and fuel a passion for driving positive change in the energy transition landscape.

We encourage direct nominations of creative talents among the next generation of energy transition leaders.



GET TO THE FINISH LINE FASTER

Conventional imaging workflow: Designature, Deghosting, Demultiple, Regularisation, Model building, Depth migration
DUG MP-FWI Imaging (data courtesy of Shell)

Why jump so many hurdles when there is a fast lane to superior results? DUG Multi-parameter FWI Imaging completely replaces the conventional processing and imaging workflow. It delivers high-resolution reflectivity images for both structural and quantitative interpretation, using field-data input—without the many time-consuming, subjective steps of the conventional method.

ISO 9001 / ISO 27001 (Compass Assurance Services)

Choose your lane.

Let’s talk—contact info@dug.com

dug.com/fwi



Coaching programme offers support for your career move

Efficient personal development is within reach with our coaching programme.

If you are a geoscientist looking to pivot your career and explore new opportunities, look no further… Whether you are a young or seasoned professional, let the EAGE coaching programme empower you to take your career to the next level. Choosing a career among several options or transitioning from one role to another can be both exciting and challenging. Our coaching programme is designed to equip you with the skills, knowledge, and strategies needed to navigate this change smoothly. Whether you're eyeing a role in data science, software engineering, or any other domain, our experienced coaches can help prepare you to take the next step.

The coaching programme can benefit you in a number of different ways, for example: building confidence by planning and prioritising your next moves while acquiring new habits and discipline to overcome the potential challenges of a new career role; creating a strategy to acquire any new technical skills; getting involved in group coaching for group and team guidance tailored to your unique strengths and weaknesses in a cross-learning environment; and of course the networking benefits of getting together with your peers to compare experiences. Check out the programme on the Learning Geoscience platform.

EAGE recognises the industry's present challenging economic circumstances and provides assistance to members who are currently unemployed and looking for possibilities to access educational programmes. If you are unemployed and unsure about your next steps, you should definitely consider the coaching programme. Contact EAGE and check the Economic Hardship Programme on eage.org/membership/hardship-programme. Also connect with our coaches on LinkedIn: Esther Bloem, Gwenola Michaud and Lucia Levato, who are building a supportive community of like-minded professionals who are passionate about carving out new career paths. They have created a LinkedIn page that at the moment serves as a hub introducing the programme. For inquiries and updates, visit their LinkedIn page: www.linkedin.com/company/navigatinggeoscience-careers.

EAGE Online Education Calendar

START AT ANY TIME | VELOCITIES, IMAGING, AND WAVEFORM INVERSION - THE EVOLUTION OF CHARACTERIZING THE EARTH'S SUBSURFACE, BY I.F. JONES (ONLINE EET) | SELF PACED COURSE | 6 CHAPTERS OF 1 HR
START AT ANY TIME | GEOSTATISTICAL RESERVOIR MODELING, BY D. GRANA | SELF PACED COURSE | 8 CHAPTERS OF 1 HR
START AT ANY TIME | CARBONATE RESERVOIR CHARACTERIZATION, BY L. GALLUCIO | SELF PACED COURSE | 8 CHAPTERS OF 1 HR
START AT ANY TIME | NEAR SURFACE MODELING FOR STATIC CORRECTIONS, BY R. BRIDLE | SELF PACED COURSE | 9 CHAPTERS OF 1 HR
6-9 FEB | THE INTERPRETER'S GUIDE TO DEPTH IMAGING, BY S. MACKAY | INTERACTIVE ONLINE SHORT COURSE | 4 HRS/DAY
19-22 FEB | PALYNOLOGY FOR GEOLOGISTS, BY M. STEPHENSON | INTERACTIVE ONLINE SHORT COURSE | 4 HRS/DAY, 6 PARTS
21-22 FEB | BEYOND CONVENTIONAL SEISMIC IMAGING, BY E. LANDA | INTERACTIVE ONLINE SHORT COURSE | 4 HRS/DAY, 5 PARTS
29 FEB - 11 APR | NAVIGATING CAREER CHALLENGES AND OPPORTUNITIES OF THE ENERGY TRANSITION, BY E. BLOEM, L. LEVATO & G. MICHAUD | EXTENSIVE ONLINE COURSE | 24 HRS (INCL. 6 WEBINARS OF 2.5 HRS EACH)

* EXTENSIVE SELF PACED MATERIALS AND INTERACTIVE SESSIONS WITH THE INSTRUCTORS: CHECK SCHEDULE OF EACH COURSE FOR DATES AND TIMES OF LIVE SESSIONS




Workshop to explore frontiers of marine acquisition

The 4th EAGE Marine Acquisition Workshop in Oslo will be a significant event for operators, contractors, manufacturers, and academia to discuss the latest geophysical and technical developments and innovations in marine acquisition. 'In response to the energy trilemma, our workshop will explore not only hydrocarbon exploration and production seismic solutions but also energy transition related applications such as CCS and offshore wind. We welcome contributions on marine seismic methods and other relevant marine geophysical methods, aiming to drive innovation in the industry forward,' says Martin Widmaier, chair of the Technical Committee. Aiming to provide a comprehensive overview of the latest advances in marine acquisition, the workshop will cover seismic source and sensor technologies, novel acquisition geometries, survey design solutions, and operational aspects. Attendees will share recent experiences and lessons learned from case studies, and explore future visions. The scope encompasses a broad spectrum of applications, from hydrocarbon exploration and reservoir monitoring to high-resolution near-surface methods for offshore wind, CCS development surveys, nuclear waste management, and marine mineral exploration. The workshop features a diverse technical programme encompassing a variety of topics:

Speaker and audience interaction at a vibrant workshop.

• Seismic source technology: innovative source configurations, blending technologies, and environmental considerations
• Acquisition and modelling: ocean bottom seismics, new deployment methods, and survey planning advancements
• Geophysical applications: e.g., distributed fibre optic sensing and high-energy geophysics such as muon geotomography
• New energy applications: offshore wind, marine minerals, CCS monitoring, and nuclear waste management

Abstract submissions are open until 31 March 2024, 23:59 CET. For more information and to submit your abstract, visit the event website.

EAGE and SUT to collaborate on offshore renewable energy workshop in Boston

EAGE and the Society for Underwater Technology (SUT) have launched the first joint event in a planned global workshop series to foster collaboration and knowledge exchange across the offshore energy sector. The inaugural EAGE/SUT Workshop on Integrated Site Characterization for Offshore Renewable Energy is to be held on 20-23 May 2024, in Boston, Massachusetts, with the aim of addressing the intricate challenges posed by diverse geological and metocean conditions in the region. Offshore renewable energy projects, ranging from near-shore shallow waters to deep-sea environments, face inherent complexities

arising from geological histories and varying metocean conditions. A comprehensive approach is required, integrating geological, geophysical, geotechnical, and environmental data within a robust geological and engineering framework. The workshop is tailored to bridge the gap and enhance understanding between the geotechnical and geoscience communities. It will provide insights into industry updates, including market conditions, supply chain dynamics, and perspectives on the project pipeline. Discussions will cover government permitting processes and licence data requirements, site assessment procedures, and environmental impact assessments.


The EAGE/SUT Workshop in Boston marks a pivotal moment for the offshore renewable energy sector in the United States. By concentrating on integrated site characterisation and tackling region-specific challenges, the workshop should foster collaboration and propel the industry towards a sustainable future. Industry experts and academics are encouraged to contribute to the workshop, ensuring a rich and diverse exchange of knowledge that will shape the trajectory of offshore renewable energy development in the US.

Join us!




Forum set to revisit Namibia's hydrocarbon prospects

EAGE's Sub-Saharan Africa Energy Forum takes place on 4-6 March 2024 in Windhoek, capital of Namibia, and provides a premier opportunity to discuss offshore prospects in one of the most underexplored oil and gas provinces in the world. The workshop will focus on the African petroleum system, providing a great space to engage with fellow professionals and thought leaders in essential discussions about integrative studies and future projects. Join us and share your expertise on relevant topics due for discussion, such as

regional petroleum geology, machine learning, hydrogeology, remote sensing, geothermal investigation, mineralogy and more. Register now for EAGE's Sub-Saharan Africa Energy Forum to benefit from a 50% discount on registration fees for all participants residing in Africa!

View the Technical Programme and register now!

Register now and bring your unique perspectives and insights to the forum.

Serbian mountain venue for 12th BGS Congress

The spectacular Kopaonik Mountain will frame the next edition of the BGS Congress.

This coming May, our Associated Society BGS (the Balkan Geophysical Society) will open its doors to its 12th Congress and Technical Exhibition. This year the event is scheduled for Kopaonik Mountain in Serbia, hosted by the Association of Geophysicists and Environmentalists of Serbia (AGES). As one of the six country societies in the BGS partnership, AGES hosts the meeting on a rotation basis and has selected the spectacular location of the Kopaonik National Park for the 2024 edition. The Kopaonik Mountain massif is one of the largest and longest mountains in the country, with its peak at 2017 m.


Occasionally the ski season there lasts until May, but it is also an ideal destination for hikers. Together with its sister societies, AGES operates as a not-for-profit organisation with the mission to promote collaboration and mutual assistance between geophysicists from the member countries, support university education in the field of geophysics, and foster networking and friendship among members. The BGS Congress is organised every third year, at the end of each country's presidency. The first edition was held in Athens in 1996 and the most recent – the 11th – was held in Bucharest in 2021, although it took place online. The last time AGES led the Congress was in 2009 – the 5th edition –


when almost 250 professionals and students contributed to a successful gathering in Belgrade under the theme 'Geophysics at the crossroads'. The forthcoming edition (27-31 May 2024) will be dedicated to 'Geophysics for a better world' and has the ambition to go beyond its traditional topics (earth physics, geodynamics and seismology, regional geophysics and tectonics, near-surface, engineering and environmental geophysics, coastal and marine geophysics, energy and resources, geo-hazards, climate change and risk assessment, remote sensing and UAV geophysics, computing technology in geophysics and geoscience for society, education and environment). Young professionals and students will be a focus, with a special programme dedicated to them to emphasise the importance of young leaders for a sustainable and inclusive future – working together for positive social change. The event will be supported by EAGE and the National Petroleum Committee of Serbia – World Petroleum Council (NNKS). Keep an eye on the websites of AGES and EAGE for updates.



WORKSHOP REPORT

Seabed seismic workshop features continuing research and innovation

The 85 geoscientists and engineers who attended the Second EAGE Seabed Seismic Today Workshop in Milan on 18-20 September were treated to the presentation of novel technologies and numerous successful case histories, some not previously presented. We report on some of the highlights. Images obtained with so-called reflectivity FWI are becoming increasingly accepted in the industry. Whether they are an accurate representation of the subsurface is unclear. In his keynote speech René-Édouard Plessix highlighted the importance of pre-processing to magnify FWI sensitivity to the events that we want to interpret. Preconditioning in the model space is important to speed up convergence. Carbon Capture and Sequestration (CCS) was featured in the New Energy session. Case histories from Malaysia, Norway and the UK showed some innovative methods such as the deployment of self-recovery nodes and the use of unmanned surface vehicles in very shallow water. Despite progress, further enhancement in resolution is needed. The desire to remove the effects of the water layer in OBN processing in complex geological scenarios calls for more advanced methods such as multidimensional deconvolution. Davide Calcagni (ENI), during his opening address, and Ahmad Riza Ghazali (Petronas), in his keynote presentation in the joint session with the 7th EAGE Borehole Geophysics workshop, highlighted that shear and converted waves recorded during seabed acquisitions are not used to their full potential. Dr Ghazali proposed a consortium with academic and industrial partners for improving PS processing. Several representatives of processing companies reported that their processing centres spend more time than planned processing auxiliary data, near-field hydrophones (NFH) in particular. It was proposed that chief geophysicists of the major acquisition and processing companies should get together to agree on a standard. However, several acquisition contractors were hesitant, probably due to the perceived cost of upgrading long-in-the-tooth gun controllers.

Seabed seismic acquisition
Carsten Udengaard and Nicolas Tellier presented and discussed field data from, respectively, an optimized airgun array and a large-volume, low-pressure pneumatic source, both commercially available to the seismic industry. Tim Bunting proposed a method based on temperature measurements for OCXO-type clocks to estimate and compensate for clock drift. This could lead to a commercial alternative to CSAC.

Seabed seismic committee and speakers group photo.

Chris Walker presented the logistic challenges and solutions of the world's largest seismic survey to date, in the Arabian Gulf in water depths from zero to 30 m. Simultaneous recording of onshore and shallow-water data when either onshore vibrators or offshore airguns were activated was one of the several distinctive features of this project. Hugo Ruiz presented a technology that, if incorporated in an OBN survey, enables the measurement of node depths with a relative accuracy of a few centimetres in deep water.

Processing and model building
Arash Jafar Gandomi presented a machine learning method for noise attenuation in the Z component based on the assumption that the P component is noise-free whilst the radial component contains the noise reference. Discussion raised the issue of conditions that may invalidate the assumption


that the radial component is the reference for the shear-on-z noise. Max Vassallo presented a method (spectral gap-based survey design with time dithers) to optimally design simultaneous source surveys in a way that facilitates deblending and model building. Denes Vigh presented elastic FWI applied to a sparse OBN dataset acquired in the Gulf of Mexico with two source vessels and triple source. The receiver spacing was 1200 x 1200 m. Tom Rayment presented multiparameter FWI, asserting that the reflectivity obtained has resolution superior to that obtained using pre-processing and migration. The modelling kernel is visco-acoustic with fixed Qp. Fang Wang presented the results of processing sparse nodes and streamers acquired over the Nordkaap basin (Barents Sea) as well as in Fram (Norwegian Sea). The Barents Sea example used sources above the streamers, which enabled the acquisition of offsets close to 0 m. However, the first 100 m of offsets were discarded because of clipping. In both examples, elastic FWI handled anisotropy well, as shown by the ties to the wells.

New energy and case studies
Koon Hong Ho showed a node survey design for CCS on a depleted reservoir at 140 m depth over 100 km2, in a survey area covered by a gas cloud. Eventually, a parallel shooting configuration was chosen. Sandrine David presented a case history to assess the value of OBN acquisition for CCS in the Sleipner field. Target depth was 800-1000 m, water depth 80 m. The objective was to demonstrate that mini-streamers can provide better imaging in the overburden and at plume level, whilst the OBN (500 x 525 m) was used for velocity model building but not imaging.




CONFERENCE REPORT

Geoenergy explored through geostatistical lenses

During a rainy week in the picturesque coastal city of Porto, over 120 international geostatisticians gathered for the 5th EAGE Petroleum Geostatistics Conference. This is the report.

There was plenty to talk about at the Porto meeting.

The scientific programme offered 45 oral presentations and 22 posters in an event attended by a mix of academics and industry professionals from Norway, Saudi Arabia, the UK, France, and Brazil, among others. In line with tradition, there were plenty of presentations on geomodelling and geostatistics, which are core elements in this community. Reliable modelling and uncertainty quantification of subsurface variables are important for capturing the right dependencies and physical realism. This in turn generates robust estimates of geological subsurface properties and volumetric predictions. Compared with earlier editions of the event, there was a growing interest in artificial intelligence (AI) and machine learning (ML). Several presentations focused on new approaches for bridging geomodelling and geostatistics with AI and ML, and described ideas on how the community can utilise developments in AI and ML for efficient modelling and prediction. Even though there are still substantial bottlenecks with computational costs, available training data, realistic uncertainty quantification, and interpretability of results created by ML and AI, this field is clearly thriving, and numerous promising research directions were presented at the event.


Moving beyond petroleum, the vision of the conference was 'Towards a sustainable era of geoenergy', and the community was keen to see new interests related to the energy transition. There were inspiring presentations covering H2, CO2, geothermal, and uranium mapping. Keynotes also highlighted elements related to the energy transition, including CO2 storage and sequestration, as well as decision-making under uncertainty. Sebastian Geiger from Delft, the Netherlands, gave a keynote presentation on rapid dynamic reservoir modelling for sensitivity analysis and decision support systems in the context of CO2 storage and monitoring. Fernanda Veloso from BRGM, France, gave a keynote presenting a pilot-scale project for CO2 sequestration and storage in the Paris basin. Reidar Bratvold from Stavanger, Norway, offered a keynote on experts being optimistically biased in predicting reservoir performance. On the last day of the conference, we had a brief panel debate on the energy transition and the role of geostatistics in this setting. The panel consisted of Colin Daly (SLB), Amilcar Soares (Lisbon University), Reidar Bratvold (University of Stavanger), and Ana Sousa (government representative, Lisbon), with interesting reflections on the current challenges and opportunities. At universities, the word 'petroleum' is no longer prominent in any study programme or course, and this change has taken place rather quickly.


In companies, there is now more focus on energy as a whole, and the trend is towards more multi-disciplinary activity, treating energy as a portfolio of contributions with smaller margins than before. There is a tricky balance between sustainable, low-cost energy for all and the ambitious goals of reducing CO2 emissions on our way to net zero. Geomodelling and geostatistics are in many ways enabling disciplines that are in a good position to contribute to the energy transition, but there will be other questions with new energy sources, and the type and amounts of data could be different. The community is endorsing AI and ML, but there is a gap between what we are doing currently to understand the properties of these methods for geostatistics and what we are after down the line. Tighter margins might not mean more AI than geostatistics. Reliable uncertainty quantification will continue to be important, and this also leads to improved decision-making processes. Apart from the scientific presentations, we ventured across the beautiful Douro River for the conference dinner, which took place at an old port wine storage site turned into a restaurant and show venue. In a setting of food and music, folklore dancing and port wine, we had the pleasure of digesting tasty codfish and delicious sweets.



GET2023 enhances the energy transition vision

The fourth edition of EAGE's Global Energy Transition Conference and Exhibition (GET2023) was a notable event in the energy and geoscience sectors. This is what went down. The strategic committee's focus was to deliver an engaging programme which placed the wide-ranging challenges, solutions, and future of our industry in a global perspective. The main themes of our plenary sessions included the role of governments and regulatory frameworks, our own company journeys and why a people-centred energy transition is key. This last point is linked to how we can achieve the UN's Sustainable Development Goal (SDG7) of ensuring access to affordable, reliable, and sustainable modern energy for all and having a just and inclusive energy transition. The inclusion of individuals from a range of professional backgrounds provided broad perspectives and facilitated thought-provoking discussion among the conference attendees. Over 300 delegates from various countries and organisations convened, featuring 92 presenters, 5 keynote speakers, and 40+ panellists. We welcomed a diverse array of professionals, with significant representation from disciplines such as CO2 storage, geophysics, geology, and geothermal, as well as substantial participation from fields like reservoir engineering, environmental science, petrophysics, petroleum engineering, mining, and geochemistry.

‘As conference chair I was really encouraged by the active participation and strong discussions in the strategic sessions. I believe that events like GET are essential to keep the dialogue between regulators, operators and service companies moving at pace to identify the most effective solutions for Net Zero.’ Ellie MacInnes GET2023 chair and new business development, Green Tech at CGG

‘Looking back on GET 2023 it is easy to highlight the impact and technical quality of the topics covered. However, I feel the true success of the event was the cross-domain discussions that were enabled between practitioners of geoscience who are all working towards the common goal of a successful energy transition. The EAGE has exciting plans for GET 2024 and I look forward to seeing you in Rotterdam.’ Mike Branston, GET2023 Technical Committee chair and new energy domain lead at SLB

Conference high on transition conversation.

Students, academic researchers, government agencies, technical and engineering service providers, and energy company leaders and practitioners were all among the audience. Adrian Robinson, new ventures origination manager at Chevron, highlighted the conference's cooperative spirit and potential for growth, saying: 'This enabled a rich dialogue and the sharing of multiple perspectives, as well as an effective communication and learning experience for delegates on a variety of new energies topics. I encourage expansion of the technical themes in future meetings to differentiate the GET and cement its reputation in Europe.' The exhibition space evolved into a dynamic area where attendees interacted, and companies showcased innovative services. The space became a key location fulfilling EAGE's goal to merge technical knowledge sharing with broader discussions on the essential skills and solutions within the geoscientific community to facilitate the energy transition. The technical programme, as usual, reflected the solid technical tradition of EAGE, with over 100 presentations capturing a wide audience, reflecting the multi-faceted nature of the energy transition. Mike Branston, GET2023 technical committee chair and new energy domain lead at SLB, commented: 'We saw a continued growth in the topic of carbon storage, with representation from new research through to case examples of the key projects that are in operation today. The topic of geothermal energy was also a key area of focus with talks on geological and geophysical characterization as well as discussions on risk mitigation and future solutions aimed at improving performance. Looking toward the challenges of the future, both hydrogen and energy storage were well represented as was public engagement. Each of these will have critical roles in the mid to long term and reflect the breadth and success of the technical programme.' GET2023's strategic programme took an expansive view, with energy sector leaders, policymakers, and financial experts in discussion together.




Meeting moment to remember.

Expert panellists from CGG, WoodMackenzie, Equinor, iCRAG, BRGM, S&P Global, TotalEnergies, IEA, Shell, and DNV Energy Systems, among others, had the opportunity to share their insights into sustainable financing, the complexities of government regulations, and the rise of new business models in the energy sector. A focal point of the programme was the transformation of traditional businesses into sustainable energy entities, aligning with global environmental goals. Sessions on 'Accelerating new energy technologies' and 'New energy business models' provided insights into progress in the sector. Additionally, 'Geoscience communication and societal engagement' highlighted the importance of effectively communicating scientific knowledge and engaging communities in the transition process. Overall, the discussions illuminated the complex and dynamic nature of the energy transition, showcasing a variety of sustainable energy solutions and strategies. A notable feature was LUSVAL's session on climate change strategies, which provided an engaging, interactive experience for delegates, using the En-ROADS tool to simulate the impact of proposed solutions on global climate factors. The dedicated technical discussions, a new addition this year, focused on key topics faced within the energy transition and complemented the wider strategic programme. The discussion on the shift from a fuel-intensive to a material-intensive energy system was a significant highlight and succeeded in bringing to the fore the role that geoscience must play in support of the supply network needed for a successful transition. In support of the core programme on carbon storage we hosted two dedicated sessions which targeted key areas of recent experience: the evolving regulatory framework for carbon storage and a review of recent successes achieved by leaders in the development of carbon storage. The latter closed the event with a very open and honest account of what has happened and what needs to happen as we scale up carbon storage, from the perspective of innovators, operators and regulators. José María González Muñoz (Repsol) noted the evolving nature of CCS regulation, remarking on three key takeaways - 'the necessity of collaboration, the importance of adapting to a dynamic regulatory environment, and the anticipation of future technical and financial risk-based improvements'. From meet-and-greet events with EAGE communities to the celebration of the Minus CO2 Challenge winners, participants were offered a space for continued conversation and connections beyond the technical sessions. The Icebreaker Reception and Conference Evening provided perfect settings for casual networking and stimulating conversations, enriching the overall experience. Before the main conference and exhibition, participants were offered pre-conference activities, including a short course


and an explorer tour. The Explorer Tour was particularly noted for its visits to leading R&D energy centres such as the Institut de Chimie Moléculaire et des Matériaux d’Orsay, SOLEIL Synchrotron, and EDF Lab, providing attendees with a first-hand look at energy innovation. The ‘Basics of carbon capture and storage’ course received high praise from delegates from diverse professional backgrounds, including geology, environmental science, policy, and investment. Vincent Prieux from CGG specifically commended the comprehensive talk by Mike Stephenson (director, Stephenson Geoscience Consulting) for its depth and relevance.

‘From the initial cohort of early adopters at the inaugural GET to its evolution into a mainstream event held in Paris, it has been a privilege for me, as a technical committee member, to witness the remarkable growth of the GET conference. This growth is evident not only in the diversity of subjects covered but also in the advancement of technical skills and the ambitious scope of the conference. Against the backdrop of the recent COP conference and the inevitable energy transition facing our industry, the GET conference stands as a guiding beacon, showcasing the vital role that geoscience can and must play — from wind farms to geothermal and, indeed, CCS/CCUS. I’ve had the pleasure of engaging in insightful talks, valuable networking, and, thanks to Paris and EAGE, enjoying delightful cuisine. In this light, sponsoring the GET conference was a straightforward decision for SpotLight.’ Habib al Khatib GET2023 Technical Committee member and CEO at SpotLight

As the event drew to a close, anticipation for GET2024 began to build. Scheduled for 4-7 November 2024 in Rotterdam, the next edition promises to continue the discussion on the shaping of our energy future. The conference will feature four sub-conferences dedicated to offshore wind energy, carbon capture and storage, geothermal energy, and hydrogen and energy storage. Stay tuned to eageget.org for more details on the upcoming conference, where the journey towards a sustainable energy future continues.



Synergy Day proves a first for Latin American students

In a landmark event, the EAGE Student Chapter Universidad de Los Andes, with the support of the EAGE Student Chapter Universidad Nacional de Colombia, Bogotá, recently hosted the inaugural Synergy Day, marking the first-ever geosciences congress tailored by and for students of the field. This vibrant event was held in Bogotá, Colombia, with a primary focus on showcasing diverse research endeavours from undergraduate students and student research groups. Featuring a lineup of 11 presentations, the day covered a spectrum of geoscience topics, ranging from paleontology to energy transition-related issues. Exploration geophysics and white hydrogen in Colombia emerged as focal points, highlighting the

inter-disciplinary nature of contemporary geoscience research. The integration of artificial intelligence in seismic signal processing and its implications for ecosystem services were also discussed. The poster session showcased research in petrology and seismology, all underpinned by methodologies for information processing. Beyond academic presentations, Synergy Day sought to foster more inclusive dialogues. The ‘Shaking the rocks’ panel discussed the importance of addressing gender roles and diversity in geosciences. Voices from Colombia’s network of female geologists, student bodies and industry professionals came together to explore perceptions of gender roles in

Representatives of ‘Shaking the rocks’ panel discussion.

geosciences and the challenges the sector faces in achieving equality. Synergy Day not only marked a significant event in the academic calendar but also served as a rallying call for unity and cooperation in the pursuit of advancing geoscience research and addressing the challenges of our changing world.

Nigerian event encourages empowerment of students

The 41st Nigerian Association of Petroleum Explorationists Annual International Conference and Exhibition (NAPE AICE 2023) unfolded as a dynamic showcase of the world of geoscience, energy transition, and technology innovation. With attendance made possible by the generous support of the EAGE Student Fund, the EAGE Student Chapter from the University of Port Harcourt embarked on a journey of discovery. A notable highlight of their experience was the Basin Evaluation Competition (BEC), a simulated petroleum exploration scenario akin to the Laurie Dake Challenge. Teams, armed with datasets specific to the Niger Delta Basin, engaged in prospect and lead identification, risk assessment, and field development planning. Although their university was absent from the competition this year, the Chapter's past victories in 2016 and 2018 (first position) and 2017 (third position) underscore their commitment to excellence.

Quiz time in an exhibition booth at the NAPE AICE 2023.

In the end, the University of Nigeria Nsukka (UNN) students were the winners. The lead technical paper session featured a comprehensive review by Clement Chukwuka, an industry subject matter expert and geologist at Chevron Lagos, Nigeria. He spoke on the regulatory framework and critical enablers for the viability of the oil and gas business in Nigeria, pointing to the need for regulatory flexibility to encourage the return of International Oil Companies to the Niger Delta Basin. At the Energy Odyssey Case Study session by Shell, students were immersed

in defining plays, prospects, and computing volumes. Armed with hard copies of data, formula sheets, seismic information, and field structural maps, the hands-on experience provided invaluable insights into real-world challenges and solutions. The Women in Geoscience & Engineering (WiWE) keynote speaker highlighted the importance of harnessing technological innovation for inclusion and sustainability, underlining the key role of diversity in geosciences. The students’ journey also included crucial meetings with faculty advisors from the University of Calabar, Niger Delta University, Rivers State University, and Akwa Ibom State University. Discussions centred on the need to establish active EAGE Student Chapters and facilitate the Port Harcourt EAGE Local Chapter, fostering a collaborative and supportive environment for aspiring geoscientists.

EAGE Student Calendar

15 FEB | SELECTION OF THE TEAMS FOR THE SECOND ROUND OF LAURIE DAKE CHALLENGE | ONLINE
28 FEB | DEADLINE EAGE STUDENT CHAPTERS RENEWAL | ONLINE

FOR MORE INFORMATION AND REGISTRATION PLEASE CHECK THE STUDENT SECTION AT WWW.EAGE.ORG.




OUR JOURNALS THIS MONTH

Basin Research (BR) is an international journal which aims to publish original, high-impact research papers on sedimentary basin systems. A new edition (Volume 36, Issue 1) will be published in February.

Geophysical Prospecting (GP) publishes primary research on the science of geophysics as it applies to the exploration, evaluation and extraction of earth resources. Drawing heavily on contributions from researchers in the oil and mineral exploration industries, the journal has a very practical slant. A new edition (Volume 72, Issue 2) will be published in February, featuring 33 articles. Editor's Choice articles:
• Temporal dispersion correction for wave-propagation modelling with a series approach – W. A. Mulder
• Elastic properties of unconsolidated sandstones of interest for carbon storage – C. M. Sayers et al.

Near Surface Geophysics (NSG) is an international journal for the publication of research and developments in geophysics applied to the near surface. The emphasis lies on shallow land and marine geophysical investigations addressing challenges in various geoscientific fields. A new edition (Volume 22, Issue 1) will be published in February, featuring 7 articles. Editor's Choice article:
• The utilization of ghost reflections retrieved by seismic interferometry for layer-specific characterization of the shallow subsurface – F. Shirmohammadi et al.

Petroleum Geoscience (PG) publishes a balanced mix of articles covering exploration, exploitation, appraisal, development and enhancement of sub-surface hydrocarbon resources and carbon repositories. A new edition (Volume 30, Issue 1) will be published in February, featuring 13 articles. Editor's Choice articles:
• Fault-seal analysis in the Greater Bay du Nord area, Flemish Pass Basin, offshore Newfoundland – Asdrúbal J. Bernal
• Fracture distribution along open folds in southern Tunisia: implications for naturally fractured reservoirs – Ruaridh Smith et al.

CHECK OUT THE LATEST JOURNALS: BR, GP, NSG, PG

Geoenergy celebrates its first year

In its first year of publication, the new EAGE/Geological Society journal Geoenergy has exceeded all our targets for submissions and published contributions. Attracting a wide range of papers, the journal launched in January 2023 sets a new benchmark for publications focused on the energy transition, perhaps the biggest global challenge of the 21st century. Building on the strong legacy of the long-running Petroleum Geoscience journal, also published jointly by EAGE and the Geological Society, this timely new publication meets the requirement for a technically robust journal with a focus on geoscience and engineering and the challenge of unlocking renewable resources from the subsurface. In the first year we have had 58 submissions and already published 19 papers, covering a range of topics from carbon capture and sequestration to critical minerals, nuclear waste storage and geothermal energy. An Editorial Board was established, led by Prof Jonathan Redfern (University of Manchester) as editor-in-chief, bringing together an international team of deputy editors with diverse expertise: Prof Sebastian Geiger (TU Delft, Netherlands), Dr Kathryn Moore (University of Exeter, UK), Prof Rosalind Archer (Griffith University, Australia), Prof Zuleima Karpyn (PennState, USA) and journal manager Lucy Bell. Their dedication is highly valued, as is the hard work of referees; without that commitment it would be impossible to edit, improve and ultimately publish these ground-breaking papers. Some of the most read papers in 2023 include: 3D reservoir simulation of CO2 injection in a deep saline aquifer of the Lower Paleozoic Potsdam Sandstone of the St Lawrence Platform, Gentilly Block, Quebec by Konstantinovskaya et al., Structural discontinuities and their control on hydrothermal systems in the Great Basin, USA by Siler, and Exploring natural hydrogen hotspots: a review and soil-gas survey design for identifying seepage by Langhi et al. We have several thematic collections in progress, including 'Digitally enabled geoscience workflows: unlocking the power of our data' and 'Sustainable geological disposal and containment of radioactive waste'. We look forward to announcing more in the coming year, focusing on the main themes driving our science. We have also initiated new viewpoint articles that highlight key themes or issues that affect this area of research and associated industries. Later in the year a selection of papers from Geoenergy will be showcased at the EAGE Annual meeting in Oslo, where authors have been invited to present their work at a dedicated session. We look forward to seeing you all there, and to your future submissions, as Geoenergy establishes itself as the leading journal in the subject.



Best chapters meet online with mineral exploration on the agenda

Presentation from Joeri Brackenhoff, vice-president of the EAGE Local Chapter Netherlands.

In a culmination of their outstanding achievements, the EAGE Local Chapter Netherlands and the IPN Mexico Student Chapter, recognised respectively as the Best Local Chapter and Best Student Chapter of 2023, joined forces for a collaborative webinar on mineral exploration. Last November more than 30 members, including students and industry professionals, met to share knowledge and discuss collaboration. The chosen topic, mineral exploration, reflected the critical role minerals play in the energy transition and the evolving landscape of geoscience technologies and practices. Feven Desta, assistant professor at TU Delft, Netherlands, spoke about the fundamental role minerals play in our daily lives and the importance

of environmentally friendly extraction. According to Desta, ‘Everything around us is built up out of minerals, the buildings we work in, our phones, our daily lives, all are founded on minerals. So, we need minerals, we need extraction, and we need to do it environmentally friendly.’ Chris Nind, vice-president of business development at Abitibi Geophysics, Canada, and a member of the EAGE Technical Committee on Mineral Exploration Geophysics, highlighted the need for integration across geology, geophysics, and geochemistry to improve significant discovery rates in mineral exploration. ‘Success will require integration making use of our entire mineral exploration toolkit to improve significant discovery rates,’ Nind said.

Beyond the technical discussions, the webinar served as an inspirational platform for two of EAGE's most active communities to connect, share best practices, and offer valuable advice. The dialogue extended from experienced professionals to the next generation of geoscientists and engineers, fostering a collaborative spirit that encapsulates the essence of EAGE. If you missed the live session, fear not, as EAGE continues to provide opportunities to connect, learn, and engage with geoscience professionals worldwide. Make sure to join or renew your EAGE membership to stay connected and benefit from a world of opportunities in 2024. The collaborative spirit exhibited by the Best Chapters of 2023 serves as a beacon for the vibrant future of geosciences within the EAGE community. Visit eage.org for more success stories of our communities. You are also welcome to be part of our Local Chapters, Student Chapters, Special Interest and Technical Communities.

Presentation from Jasay Munguía, vice president of the EAGE Student Chapter IPN México.

The EAGE Student Fund supports student activities that help students bridge the gap between university and professional environments. This is only possible with support from the EAGE community. If you want to support the next generation of geoscientists and engineers, go to donate.eagestudentfund.org or simply scan the QR code. Many thanks in advance for your donation!

DONATE TODAY!


EAGE NEWS

Get involved in the conversation at EAGE Digital 2024

The Fourth EAGE Digitalization Conference and Exhibition (25-27 March, Paris, France) promises to be a pivotal event in the energy industry. The Plenary Panel and Strategic Programme will convene experts and leaders to engage deeply on how to leverage people, data, and innovative technologies to make smarter investment decisions, build resilient companies, and drive our journey towards net zero. Here is what you can look forward to.

Day 1: New ways of working
Panel: Demystifying Generative AI - With the advent of technologies like ChatGPT, this session aims to uncover the potential for innovation that generative AI brings and to assess the key risks and challenges associated with its implementation at the enterprise level.
Roundtable: Is the future citizen development? - The advent of low-code and no-code platforms has ushered in a new era in software development, allowing non-programmers to create tailored software solutions: the 'democratisation' of software development in the energy sector. How do such platforms enable geoscientists, engineers, and other domain experts to meet specific business needs through custom software?
Roundtable: Dynamic digital ecosystems - This session focuses on the need for oil and gas companies to develop low-carbon technologies and innovate their business models, and on how open innovation can create robust digital ecosystems that can be leveraged for competitive advantage.
Case histories: Reducing the carbon footprint of geoscientists - The session will provide a comprehensive platform to discuss the various aspects of a geoscientist's role and how these can be adapted to minimise environmental impact.
Roundtable: Automated geoscience insights - The dialogue will focus on the impact of automation and digitalization in the energy sector, especially in light of increasing renewable energy use and decarbonization efforts.

Day 2: Data & Tools
Panel: Challenges for enterprise AI - The session will evaluate critical challenges, including data security, legal concerns in a data-driven environment, the extensive data and computing requirements of AI, and the need to adapt workforce, culture, and existing infrastructure.
Roundtable: Status of OSDU adoption - We will take a critical look at the adoption of the Open Subsurface Data Universe (OSDU) standard across the global energy sector. The session will discuss the current status of industry readiness, exploring the challenges, progress, and strategies different organisations are employing to integrate OSDU.
Roundtable: Managing a highly dynamic technology environment - This panel will address key challenges for tech managers in the O&G industry, specifically navigating a dynamic technology development environment with an increasingly rapid turnover of ideas, solutions and applications, and leveraging disruptive technologies, while managing stability in processes, to develop good industrial solutions.
Case histories: Optimised computing platforms - Case studies will highlight the advancements and efficiencies gained through optimised computing in the energy sector. Various real-world examples will be discussed.
Roundtable: Digital twins and the next generation of integration platforms - Here we will focus on the latest developments and future trajectory of digital twin technology and integration platforms in the energy sector.

Day 3: People
Panel: Human barriers to adopting new digital solutions - The panel will examine the often under-acknowledged human challenges inhibiting the adoption of new technologies in the energy sector.
Roundtable: Efficient change management - We will examine the critical role of change management in ensuring the effective adoption and implementation of digital technologies in the energy sector.
Roundtable: Innovation leadership - Leadership plays a critical role in the success of digital initiatives. Technology and innovation leaders must set the tone for a culture that values experimentation, embraces disruption, and encourages creativity and risk-taking. How can leaders empower the people within their organisations and leverage the latest digital tools?
Case histories: Accelerating geoscience learning - The session will showcase impactful examples of how accelerated learning processes are transforming geoscience education and practice.
Roundtable: Reshaping tomorrow's energy professions - Meeting modern geoscience challenges increasingly relies on the integration of domain expertise and broad data science competencies to develop unique solutions for complex problems. We will analyse the ways in which energy professions are evolving and the challenges in empowering existing professionals to better leverage the power of data and digital tools to accelerate innovation and net zero goals.

Interested in contributing to these critical discussions? Contact us at europe@eage.org for speaking opportunities. Register at www.eagedigital.org before 10 February to benefit from reduced fees.



PERSONAL RECORD INTERVIEW

Dmitry Bozhezha


Pursuing geoscience in time of war

As the war enters its third year, Dmitry Bozhezha continues to provide EAGE's service point in Kyiv, Ukraine, running a busy events and membership programme. In an academic and industry career dedicated to innovation, he is currently R&D director at Geoprom, a small research company pioneering geoelectric prospecting technology.

Upbringing in Kyiv
My childhood is vividly imprinted with memories of accompanying my mother, a hydrogeologist, on field work during the summers. Starting from the age of four, I found myself exploring the mountain rivers of the Carpathians, the vast steppe regions of the Donbas, and the picturesque Crimean Peninsula. These early adventures left an indelible mark on my life. However, this period was not without its challenges, occurring during the Soviet occupation of Ukraine and the disastrous Chernobyl accident.

Student/early career
At 15, I officially began my professional journey at the hydrogeological laboratory of the National Academy of Sciences of Ukraine. Already intrigued by the evolution of personal computers, I eagerly embraced their application in scientific research throughout my studies, which I completed at the Institute of Geological Science, National Academy of Ukraine. My focus initially revolved around the development of unconventional atmogeochemical methods for hydrocarbon prospecting. Subsequently, my research expanded to include innovative geoelectric methods. To date, I am proud to have published over 200 papers, contributing significantly to the advancement of these methodologies and their applications.

Current job at Geoprom
As R&D director at Geoprom, I am involved in harnessing our collective knowledge and expertise to pioneer innovative, cost-effective geoelectric research technology, addressing challenges in the prospecting of hydrocarbon, mineral, and water resources.

Our approach is distinctive for its swift problem-solving capabilities, result reliability, and cost-effectiveness compared to traditional methods. We have successfully executed over 400 projects in Ukraine and internationally.

Everyday life in Kyiv
In the early days of the war, our city lay deserted as numerous residents, including my wife and daughters, left for Poland. My mother and I stayed in the western part of Ukraine. Following our victorious reclaiming of Kyiv, the enemy has retreated, and many people have returned. We strive to maintain a semblance of normality in our lives. However, during air attacks, life comes to a standstill. Residents seek refuge in shelters. The impact of enemy bombardments is evident in the destruction, loss of life, and suffering that every resident of Kyiv has witnessed first-hand. From my window, I witnessed the aftermath of an attack drone just 200 metres from my home. Eleven flats were damaged, serving as a stark reminder of the proximity of danger. From our shelters, we hear the reverberations of hundreds of explosions in various parts of the city.

How EAGE relationship began
In 2007, I took the significant step of becoming a member of EAGE and had the opportunity to attend my first international event, the EAGE Annual Conference and Exhibition held in London. The scale and impact of this meeting left a lasting impression on me. Back in Kyiv, I volunteered with the EAGE Kyiv Local Chapter and was instrumental in organising the Ukrainian conference on Geoinformatics, over the years incorporating European standards. In 2011, a memorandum was signed between EAGE and AUAG to collaboratively organise the Geoinformatics conference. This partnership significantly elevated the conference's profile, attracting a substantial number of international participants.

Achievements with EAGE
Since joining the EAGE team in 2010, my commitment to promoting Ukrainian events has taken on a new dimension. There are now four events in Ukraine: in addition to Geoinformatics, the Monitoring Conference, the Geoterrace Young Scientists Conference, and a workshop on landslides. Unlike other conferences in the post-Soviet space, these events are now in English only, which increases the audience and reputation. Since the outbreak of the war, we have continued to organise these events online, and have made participation not only in these conferences but also in all EAGE events free of charge for all Ukrainians. Also, an EAGE programme of support for scientists from Ukraine has been launched.

Optimistic about future?
I remain optimistic about a swift resolution to the ongoing conflict, anticipating a return to stability. With the restoration of peace, I foresee a growing interest in my country, Ukraine, not only in terms of scientific research but also in the realm of international conferences. Before the war, my daughter accompanied me to some field work from the age of six, as I did in my childhood, so I want to continue this tradition until she grows up. And that requires peace.


Make sure you’re in the know

EAGE MONTHLY UPDATE

EAGE Digital 2024 - Join the leading technical experts and digital leaders, 25-27 March 2024, Paris, France. An All Access Pass is available: one pass for your full event experience (full conference and exhibition, workshops, field trips, short courses, hackathon and conference evening), with up to 45% off. www.eagedigital.org

Important deadlines:
•  18 February - Regular registration, EAGE Sub-Saharan Africa Energy Forum
•  10 March - Early registration, First EAGE Workshop on Advances in Carbonate Reservoirs
•  20 March - Regular registration, EAGE GeoTech 2024
•  31 March - Call for abstracts, Fourth EAGE Marine Acquisition Workshop

EAGE Annual - 'Delivering better energy in a transforming world': register with the early bird rates until 15 March at www.eageannual.org

Marie Tharp Award - a new recognition for sustainable energy young professionals: M.Sc and Ph.D candidates can apply by 1 March at eage.org/awards

Call for Associate Editor applications - Basin Research, published in conjunction with the European Association of Geoscientists & Engineers and the International Association of Sedimentologists (www.wileyonlinelibrary.com/journal/bre). Editors: Atle Rotevatn (University of Bergen), Kerry Gallagher (University of Rennes), Peter Burgess (University of Liverpool), Cari Johnson (University of Utah), Craig Magee (University of Leeds) and Nadine McQuarrie (University of Pittsburgh). Deadline: 15 February 2024

DET Community - Call for volunteers: we are looking for experts willing to share their valuable knowledge on Geothermal Energy | Hydrogen and Energy Storage | CCS/CCUS | Wind Energy | Critical Minerals for Energy Transition. Interested? Reach out to us at communities@eage.org



CROSSTALK BY ANDREW McBARNET

BUSINESS • PEOPLE • TECHNOLOGY

Interest in coal mines is heating up

A hamlet in the English county of Nottinghamshire, so small it has a church but no shops, is earning at least a footnote in Britain's energy transition. This is because the community of Ratcliffe-on-Soar neighbours the country's last coal-fired power station, operated by German energy company Uniper. The plant is definitively due to close in October this year after a brief reprieve related to the threat to European power supplies after the Russian invasion of Ukraine. Ironically, in recent years coal imported from Russia had been its main source of fuel.

Once the dying embers are extinguished at Ratcliffe-on-Soar, Britain will become one of the first countries in the world to give up on the use of coal for generation of electricity, a pledge made in 2015. Since the world's first centralised public coal-fired generator opened in 1882, at Holborn Viaduct in London, Britain had some continuous coal-fired power generation until the first 'coal-free' day was declared in April 2017.

Actual mining of coal in the country has reached almost zero, the final chapter in the extraordinary story of how coal fuelled Britain's emergence as the first great power of the First Industrial Revolution thanks to the manufacture of iron, factory textile mills, steamships, and steam engines for transport and many other purposes. But the UK is not yet done with all its old mineworkings, so the country's association with coal seems destined to live on. The first forays into developing the potential energy from low-enthalpy heat for local district heating, using open-loop ground source technology to recover heat from abandoned flooded coal mines, are underway.

Looking back, possibly unbeknownst to many, the first effective application of steam power, a full century before Robert Stephenson's famous 1830 Rocket passenger locomotive, was patented by Thomas Savery in 1698. Its purpose was to drain water from coal mines. This was followed by more sophisticated systems developed by Thomas Newcomen around 1712 and, 50 years later, by James Watt.

Demand for British coal rocketed in the 19th century and reached its peak in 1913, when 292 million tons were produced for home consumption and export (96 million tons) from 3270 operating mines. In 1920, 1.19 million people were working in the mines, 1 in 20 of the country's workforce. The Mines and Collieries Act of 1842 had forbidden boys under 10 years of age and all females to work underground as so-called hurriers and thrusters (although the legislation was for some time not always adhered to). Pit ponies became the alternative for deep mines; at peak in the early 20th century there were some 70,000 registered animals, the last retiring from work in a Northumberland coalfield in 1994.

After experiencing a visit to the pit face, George Orwell in his pre-Second World War Road to Wigan Pier acknowledged the significance of coal to society: 'Our civilisation, pace Chesterton, is founded on coal, more completely than one realises until one stops to think about it. The machines that keep us alive, and the machines that make machines, are all directly or indirectly dependent upon coal. In the metabolism of the western world the coalminer is second in importance only to the man who ploughs the soil.'

The UK's dependence on coal was still evident in 1950, when 90% of all energy (including industry, railways, heating, cooking, etc) was still sourced from coal. But so-called King Coal had been in steady decline since the 1920s as competition squeezed exports. This was only to become steeper, not fully appreciated by the post-war Labour government under Clement Attlee. It nationalised the industry in 1946-7, honouring the party's Clause 4 ideological commitment to public ownership of key industries (amended in the mid-1990s under Labour Party leader Tony Blair). The 1950 Plan for Coal proved hopelessly optimistic in expecting to increase output from 184 million to 250 million tons by 1970.


In 1956, thanks to government investment, 700,000 men produced 207 million tons of coal, but by 1971 fewer than 290,000 workers were producing 133 million tons at 292 collieries. The industry was soon in further freefall. The advent of North Sea oil and gas plus the nuclear power station programme took its toll on coal demand. Diesel and electric trains rapidly replaced steam engines in the 1970s. On the domestic front, the 1956 Clean Air Act, inspired by the 'great smog' in London in 1952, was instrumental in a switch away from coal-fired central heating.

The legend of Margaret Thatcher maintains that the 'Iron Lady' delivered a fatal blow to both the coal industry and organised trade unions when crushing the mineworkers' strikes of 1984-85 over pit closures and output reductions. It is said to be her act of vengeance for the early 1970s militant mineworkers' action that arguably forced Tory Prime Minister Edward Heath out of office. But, regardless of the ruthless action and damage to mining communities, the truth in retrospect is that she probably only accelerated a process well underway. Indeed, governments (Conservative and Labour) had been implementing mining closures for decades. Immediately before the strike, 170 mines employed 148,000 workers and produced 120 million tonnes of coal. By the time the Tories privatised the coal industry a decade later, around 30 mines produced 50 million tonnes and employed 7000 workers, less than 1% of the 1950 post-war peak. In 1994, coal constituted only 12% of Britain's fuel production, with 80% coming from North Sea oil and gas and the other 8% mainly nuclear power.

Britain's adoption of renewables in the current century effectively spelled the end of coal, the last deep mine closing in 2015. Ex-Prime Minister Boris Johnson, in a pre-COP26 visit to Glasgow in 2021, claimed green credentials for a previous Tory administration. He joked to reporters that 'Thanks to Margaret Thatcher, who closed so many coal mines across the country, we had a big early start and we're now moving rapidly away from coal altogether.' Unsurprisingly, the remark invited swift all-party rebuke, with critics pointing to the devastation caused in mining communities by abrupt closures in Scotland and the North of England.

Carbon Brief reports that, as of the end of 2023, the 104 terawatt hours (TWh) generated from fossil fuels in 2023 was the lowest level in 66 years. Electricity from fossil fuels fell by two-thirds (199 TWh) since peaking in 2008. Within that total, coal dropped by 115 TWh (97%) and gas by 80 TWh (45%). This is attributed to the rapid expansion of renewable energy (up six-fold since 2008, 113 TWh) and to lower electricity demand (down 21% since 2008, 83 TWh). Low-carbon sources made up 56% of the total, of which renewables were 43% and nuclear 13%. The remainder was from imports (7%) and other sources (3%) such as waste incineration. Overall, the electricity generated in the UK in 2023 had the lowest-ever carbon intensity, with an average of 162 g of carbon dioxide per kilowatt hour (gCO2/kWh), but Carbon Brief notes that this is well short of the government's ambition for 95% low-carbon electricity by 2030 and a fully decarbonised grid by 2035.

As for Boris Johnson's misguided aside in Glasgow, he would have been much better served boosting the promise of environmentally friendly district heating from the naturally heated water in flooded coal mines. The city is home to the recently established Glasgow Geoenergy Observatory operated by the British Geological Survey (BGS). Its mission and facilities are well described in a paper by A. Monaghan et al. at the EAGE's 2023 Near Surface Geoscience conference in Edinburgh (available in Earthdoc). The mine water geothermal research facilities are intended to provide 'an at-scale underground laboratory to facilitate collaborative research to improve understanding of subsurface processes, environmental and induced change. It offers scientific infrastructure for investigating the shallow, low-temperature geothermal energy and thermal storage resources available in abandoned and flooded coal mine workings at depths of around 50-90 m. Such resources could provide sufficient heat for community-scale district heating networks and extensive thermal storage.'

An open-access Energy Reports paper in 2020 on The theoretical potential for large-scale underground thermal energy storage (UTES) within the UK by J.G. Gluyas et al. concluded: 'Our calculations indicate that the theoretical potential for large-scale underground thermal-energy storage in the UK is substantial, much larger than might ever be needed, and the location of such storage is well matched to the places where people live and work and therefore where the demand for heat occurs.'

An EAGE Annual 2023 conference paper from Oldfield et al. on Regional-Scale 3D Geothermal Prospecting to Support Local Authorities in Delivering National Strategies focused on the potential of the old mineworkings of Selby, Yorkshire. Although research and project development interest is taking off in a number of UK regions, the idea is not new and is being explored in a number of countries. Springhill, Nova Scotia, Canada inaugurated an ongoing system as far back as 1994, and the world's first mine water power station opened in Heerlen, The Netherlands, in 2008.

However, Gateshead Council, NE England, has already got off the mark with the largest mine water heat network in Britain and one of the largest in Europe, providing hot water and heat to hundreds of homes and businesses. Three years in development, heat is extracted from mine water 150 m below Gateshead town centre via three boreholes drilled into old flooded mine workings. The warm water is then fed into a 6 MW water source heat pump that boosts the temperature of the water before it is distributed over a 5 km-long heating network.

It seems mineworkings in the future may not be such a dirty word.
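For a sense of the scale involved in schemes like Gateshead's, the heat that can be drawn from mine water follows from the pumped flow rate and the temperature drop taken across the heat pump, plus the electrical work the compressor adds. The short Python sketch below is a back-of-envelope illustration only: the flow rate, temperature drop and coefficient of performance (COP) are assumed round numbers, not figures reported for the Gateshead network.

```python
# Back-of-envelope mine water heat estimate (illustrative, assumed values).
RHO = 1000.0   # water density, kg/m3
CP = 4186.0    # specific heat capacity of water, J/(kg K)

def mine_water_heat(flow_m3_s, delta_t_k, cop):
    """Return (heat drawn from mine water, heat delivered, electrical input) in MW.

    flow_m3_s : pumped mine water flow rate, m3/s (assumed)
    delta_t_k : temperature drop across the heat pump evaporator, K (assumed)
    cop       : heat pump coefficient of performance (assumed)
    """
    q_source = RHO * flow_m3_s * CP * delta_t_k   # W extracted from the mine water
    q_delivered = q_source / (1.0 - 1.0 / cop)    # W supplied to the heat network
    w_electric = q_delivered / cop                # W of compressor work
    return q_source / 1e6, q_delivered / 1e6, w_electric / 1e6

# Example: 0.1 m3/s of mine water cooled by 5 K with a COP of 3.5 (all assumed)
src, out, elec = mine_water_heat(0.1, 5.0, 3.5)
print(f"source {src:.1f} MW, delivered {out:.1f} MW, electricity {elec:.1f} MW")
```

On these assumed numbers the scheme delivers roughly 3 MW of heat; doubling the pumped flow would bring it close to the 6 MW heat pump capacity quoted for Gateshead, which is the point of the illustration.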

Views expressed in Crosstalk are solely those of the author, who can be contacted at andrew@andrewmcbarnet.com.



FEATURING OFFSHORE WIND ENERGY

HYDROGEN & ENERGY STORAGE

CARBON CAPTURE & STORAGE

GEOTHERMAL ENERGY

HELP US SHAPE THE FUTURE OF ENERGY

SAVE THE DATE! 4-7 NOVEMBER 2024

ROTTERDAM, THE NETHERLANDS

EAGEGET.ORG #EAGEGET2024


HIGHLIGHTS
INDUSTRY NEWS
24 Norway approves seabed mining
25 Trillion reprocesses Black Sea data
27 PGS expands survey offshore Angola

TGS wins series of Ocean Bottom Node projects

TGS has won or completed a series of Ocean Bottom Node (OBN) projects offshore Guyana, in the Gulf of Mexico and the North Sea as the company focuses more of its investment on the technique. The company has completed a number of marine OBN surveys offshore Guyana. The final node recovery marked the culmination of three exclusive OBN surveys commissioned by ExxonMobil Guyana. TGS acquired 2400 km2 of OBN data within a span of 410 days, concluding the data acquisition process 20 days ahead of the projected schedule. This achievement not only sets a record for the longest deepwater node survey but also showcases TGS' cutting-edge ZXPLR node technology utilised throughout the surveys.

Carel Hooijkaas, executive vice-president of acquisition at TGS, said, 'This project is the longest deepwater node survey successfully acquired for ExxonMobil Guyana. We did this safely and efficiently in one of the most congested fields in the world.' During field operations, TGS recovered 1.2 metric tonnes of marine debris, removing a large amount of discarded fishing gear, plastics and other harmful debris from the marine environment in Guyana.

Meanwhile, TGS has also won a two-month proprietary OBN survey in the Gulf of Mexico with acquisition due to start in Q1 2024.

Kristian Johansen, CEO at TGS, said: 'This project highlights the key role OBN data plays in this vital basin. OBN technology provides the essential data needed to visualise and understand the intricate structures within the Gulf of Mexico, enabling our clients to make well-informed, data-driven decisions in their field development strategies.'

The company has also won a three-month proprietary OBN data acquisition contract in the North Sea for a repeat customer. The project's acquisition will begin in Q2 2024. Kristian Johansen, TGS CEO, said: 'This project, for a major energy customer, further highlights the integral role OBN acquisition has in providing our clients improved seismic data quality and helping them make better reservoir development decisions.'

Finally, TGS has completed the imaging phase for the priority area of its NOAKA OBN multi-client seismic survey in the Norwegian North Sea. It has applied its proprietary OBN processing and imaging technology, including dynamic matching full-waveform inversion (DM/FWI), to this data, creating a 3D volume that will enhance the resolution and structural definition of the complex geology and reservoirs in the region.

Will Ashby, EVP of Eastern Hemisphere at TGS, said: 'The area between Oseberg and Alvheim in the Norwegian North Sea has recently witnessed significant infrastructure-led exploration. TGS remains committed to further developing the coverage in the NCS region.' The NOAKA OBN survey was acquired over two seasons in 2021-22 and comprises 434 km2 of multi-client OBN data. Processing has now been completed over a priority area of 198 km2. Processing utilises full azimuth and ultra-long offsets to understand and illuminate the complex subsurface geology in the region. In addition, this enables a detailed velocity model to be built using TGS' DM-FWI technology. The imaging results have also benefited from the enhanced low frequencies provided by OBN data, allowing clear and accurate depth images to be produced. Multi-component processing enabled the up- and down-going wavefields to be processed and imaged separately. The down-going wavefields provide detailed near-surface imaging products. Imaging with FWI also shows improved imaging of deep faulting and structures below the Base Cretaceous Unconformity (BCU), which is key to understanding the Jurassic and Triassic prospectivity.
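As background to the up- and down-going wavefield separation mentioned above, a common textbook approach for ocean-bottom data is PZ summation: after the vertical geophone (Z) is calibrated to the hydrophone (P), their sum emphasises the up-going wavefield while their difference emphasises the down-going wavefield. The sketch below is a generic illustration on synthetic traces, with an assumed calibration scalar and sign convention; it is not a representation of TGS' proprietary processing.

```python
import numpy as np

def pz_separation(p, z, calib=1.0):
    """Split ocean-bottom records into up- and down-going wavefields.

    p     : hydrophone trace (pressure)
    z     : vertical geophone trace, calibrated to the hydrophone by `calib`
    Under one common sign convention, up-going energy adds constructively
    in P + Z and down-going energy in P - Z.
    """
    pc = np.asarray(p, dtype=float)
    zc = calib * np.asarray(z, dtype=float)
    up = 0.5 * (pc + zc)
    down = 0.5 * (pc - zc)
    return up, down

# Synthetic demo: an up-going primary (same polarity on P and Z) and a
# down-going receiver-side ghost (opposite polarity on Z).
n = 200
p = np.zeros(n); z = np.zeros(n)
p[50] += 1.0; z[50] += 1.0    # up-going primary at sample 50
p[80] += 0.8; z[80] -= 0.8    # down-going ghost at sample 80
up, down = pz_separation(p, z)
print(np.argmax(np.abs(up)), np.argmax(np.abs(down)))  # 50 80
```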



INDUSTRY NEWS

Norway approves mining of seabed minerals

The Norwegian parliament has decided to open the Norwegian shelf for exploration and possible extraction of seabed minerals. After a vote in the Norwegian Parliament, in which the governing coalition received backing from other parties, some 300,000 km2 of waters in the Norwegian Sea will be opened up for licensing, stretching as far north as Svalbard in the Arctic Ocean. The first mining is estimated to start in 2032. Seabed minerals such as cobalt are essential for the production of products such as smartphones as well as electric car batteries.

The Norwegian Offshore Directorate said: 'We have mapped vast areas in the northern Norwegian Sea since 2017. We've taken samples and collected data about minerals and metals found on the seabed. We've done this by means of our own expeditions, and also in cooperation with expert communities at the universities in Tromsø and Bergen.'

However, the decision has been criticised by environmental groups, especially as it is claimed that some of the mining could take place in Arctic waters. Environmental groups have expressed concern at the plans, warning of the impact of plumes of sediment that would be created by extracting minerals from bare rock on the ocean bed.

Greenpeace activists protest in front of a deep-sea mining vessel. Photograph: Gustavo Graf Maldonado/Reuters.

PGS signs deal to provide geoscience data for carbon capture projects offshore Australia

PGS has signed a deal to provide seismic interpretation and geophysical advisory services to deepC Store and Azuli to jointly pursue and acquire Greenhouse Gas (GHG) storage acreages offshore Australia. In a service-for-equity agreement, deepC Store and Azuli will issue shares to PGS for its services. DeepC Store and Azuli have jointly bid for GHG Assessment Permits released by the Australian government. PGS will provide its full suite of seismic interpretation and deep knowledge from geological and geophysical studies to build high-resolution injection site modelling and simulation services.

DeepC Store managing director, Daein Cha, said: 'Australia has potential CO2 storage capacity of 434 billion tonnes (with 73% of the storage residing offshore), which is equivalent to ~870 years of Australia's net emissions. Australia is well positioned to offer significant CO2 reduction contributions for its own hard-to-abate industrial sectors via CCS, as well as to play a central role in the decarbonisation of the Asia Pacific region.'

Azuli managing director, Hamish Wilson, said: 'This offshore venture will provide a viable CO2 sequestration option for Australian emissions for major industrial companies such as fertiliser, cement and steel making. We also believe that Australia will play an important role in enabling Japan and Korea's decarbonisation plans, imperative in achieving net zero by 2050. This will be enabled through shipping transboundary CO2 imports to a number of Floating Storage & Injection (FSI) facilities in Commonwealth waters.'

PGS EVP New Energy, Berit Osnes, said: 'Geophysical data and expertise are important tools to secure safe and reliable reservoirs for carbon storage. As a partner, PGS continues to support carbon storage developers around the world by providing quality subsurface data and advisory services.'

Illustration only. Source: deepC Store.


INDUSTRY NEWS

Shearwater, PGS and DNV develop standard for emissions reporting

Shearwater, PGS and DNV have launched a project to define Key Performance Indicators for energy and emission reporting for the seismic sector. The joint industry project aims to establish a standardised reporting framework for the seismic acquisition carried out offshore by vessels, based on currently gathered data and operating modes enabled by the DNV reporting tool. In 2023 the groundwork was carried out, mapping needs and available information and preparing for data tests. The work is expected to be concluded in Q1 2024.

Mikael Johansson, senior principal consultant of DNV Maritime Advisory, said: 'To be able to work jointly with the industry to develop relevant metrics to monitor energy and emission efficiency of the work carried out for seismic vessels that can be verified by third parties is very constructive for all stakeholders in the value chain.'

Shearwater and PGS currently own the bulk of vessels being used for seismic acquisition, with Shearwater operating 21 towed-streamer and OBN vessels and PGS operating eight vessels carrying out acquisition.

ModelVision Magnetic & Gravity Interpretation System: all sensors, processing, 3D modelling, 3D inversion, visualisation, analysis, utilities; for minerals, petroleum, near surface, government, contracting, consulting and education. Tensor Research, support@tensor-research.com.au, www.tensor-research.com.au

Trillion reprocesses Black Sea data

Trillion Energy International has completed its 3D seismic reprocessing for the SASB gas field in the southwest Black Sea. The company is in the process of interpreting the data and tying the results into the existing drilled wells. Significant improvements in seismic technology since the initial processing of the 2004 seismic data offer enhanced data imaging for reservoir characterisation, faults, and reservoir layers, said Trillion. Additionally, the new PSDM seismic volumes and velocity models have uncovered continuous reflectors, much clearer fault cuts, amplitude preservation, improved imaging of subsurface geology and enhanced attribute results. The reprocessing has also provided improved inversion and AVO results and higher-quality data, revealing thin sand bodies; validation of the presence of hydrocarbons using different AVO attributes (such as intercepts, gradients and fluid factors) and gas indicators; and detection of sweet spots for new gas prospects.

Arthur Halleran, CEO of Trillion, said: 'Our reprocessed seismic data has unveiled a promising chapter for the SASB gas field. The revelation of extensive channel sands, particularly in the D and De-E zones, surpasses our earlier understanding of the field.'

Source: Trillion Energy vessel.
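For readers less familiar with the AVO attributes mentioned, the intercept and gradient are conventionally obtained by fitting the two-term Shuey approximation R(θ) ≈ A + B·sin²θ to reflection amplitudes picked over a range of incidence angles. The snippet below is a minimal, generic illustration on synthetic numbers; it is not Trillion's data or workflow.

```python
import numpy as np

def avo_intercept_gradient(angles_deg, amplitudes):
    """Least-squares fit of the two-term Shuey model R(theta) = A + B*sin^2(theta).

    angles_deg : incidence angles in degrees
    amplitudes : picked reflection amplitudes at those angles
    Returns (A, B) = (intercept, gradient).
    """
    theta = np.radians(np.asarray(angles_deg, dtype=float))
    x = np.sin(theta) ** 2
    G = np.column_stack([np.ones_like(x), x])        # design matrix [1, sin^2]
    (A, B), *_ = np.linalg.lstsq(G, np.asarray(amplitudes, dtype=float), rcond=None)
    return A, B

# Synthetic gather with intercept -0.05 and gradient -0.20, plus picking noise
rng = np.random.default_rng(0)
angles = np.arange(0, 41, 5)                          # 0-40 degrees
amps = -0.05 - 0.20 * np.sin(np.radians(angles)) ** 2
amps = amps + rng.normal(scale=0.002, size=angles.size)
A, B = avo_intercept_gradient(angles, amps)
print(f"intercept {A:.3f}, gradient {B:.3f}")         # close to -0.05 and -0.20
```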



INDUSTRY NEWS

Offshore wind had a record year in 2023, says TGS report

TGS – 4C Offshore has declared that 2023 was a record year for offshore wind investment, with projects totalling 12.3 GW having closed during the year. This represents a strong recovery from last year, when only 0.8 GW reached final investment decision (FID). TGS – 4C Offshore's latest Global Market Overview also states that 2024 could be another record year, with up to 13 GW possible. Final investment decisions were made by eight European projects in 2023, totalling 9.3 GW, with Hornsea closing just in time for Christmas. In Asia-Pacific, 2.3 GW closed across Taiwan and South Korea, and 704 MW in the USA at Revolution Wind. The US is experiencing record offtake activity, with five auctions in process. Candidates include new and existing projects.

Overall, offtake needs to make up ground, according to the report, which states that offtake contracts are down almost 2 GW to 9.5 GW in 2023, primarily driven by a no-show in the UK's Contracts for Difference (CfD) auction. However, of those offtake contracts that were awarded, the price is markedly higher than in previous years (an average of $1.15/MWh in 2023), reflecting recent inflation, supply chain constraints and interest rates on the cost of energy. The authors of the report expect offtake contracts to remain an area of focus over the next two years.

Richard Aukland, director of research at TGS – 4C Offshore, said: 'Despite ongoing project delays and cancellations, 2023 has still managed to produce record progress in offshore wind. With high activity and a significant year of offtake auctioning expected in 2024 as countries work to hit their 2030 installation targets, a positive scene is being set for the next 12 months, and this will translate into record construction activity later in the decade.'

The offtake auction schedule continues to look healthy, with 47.5 GW slated for 2024 (40 GW in Europe), and it's a similar story for lease rounds, with 33.5 GW of leases under the hammer, including in Australia, Belgium, Colombia, Denmark, Estonia, Finland, France, India, Japan, Lithuania, Netherlands, Norway, Portugal, Spain, the UK, Uruguay and the US. The rate of leasing activity has ramped up in the last two years, with 43.8 GW-worth of sites awarded in 2023, one-third of which was for floating wind.

The report includes market indicators, which examine global additions, permitting, offtake, and timelines since 2010. These are used to identify trends for the year ahead. Latest market indicators show that, with 879 GW accounted for by existing projects and national targets, a further 1121 GW would still be needed to meet IRENA/IEA's estimated net-zero requirement of 2000 GW by 2050. On floating wind, the UK comes out on top overall, followed by Norway. Other markets scoring highly include South Korea, which currently has the greatest perceived potential, and the US, which has the greatest ambition. However, 4C Offshore's forecast for floating wind underway by 2030 has been reduced, for the sixth quarter in a row, to 10 GW of capacity underway.

CGG and C-Questra pursue carbon capture projects

CGG is working to cut industrial emissions.

CGG and C-Questra have signed an agreement to collaborate on carbon capture, utilisation, and sequestration (CCUS). CGG intends to provide expertise and technological support to C-Questra, a European technology company founded in 2023 specialising in the field of CCUS from emission sources to sequestration sites, to accelerate the development of certain carbon storage projects.

Walid Sinno, CEO, C-Questra, said: 'The purpose of the cooperation agreement is quite simply to go much faster. CGG contributes human and technological resources, while C-Questra implements the projects, particularly in France, by leveraging the combined expertise and experience of our technical teams that span almost 100 years.'

Sophie Zurquiyah, CEO, CGG, said: 'Carbon storage is one of the key processes in the energy transition and CGG is continuing to diversify in this area where we bring real know-how and cutting-edge technologies. After concluding several projects and agreements in 2023, CGG is looking to accelerate the deployment of its offerings in 2024.'


INDUSTRY NEWS

PGS expands survey offshore Angola

PGS has added more than 4200 km2 of 3D data to its regional MegaSurvey dataset over Block 33 in the deepwater Lower Congo Basin offshore Angola. The additional 4238 km2 of 3D data released by PGS in partnership with ANPG covers open acreage that is available for direct award from ANPG through its Permanent Offer regime. Total coverage of the Angola MegaSurvey dataset now stands at more than 49,500 km2. The data extension covers the majority of Block 33, which is located outboard and south of the prolific blocks 17 and 32. Combined, these currently produce around 550,000 barrels of oil per day from post-salt Oligo-Miocene plays.

Block 33 contains just six well penetrations, which have yielded two Oligo-Miocene oil discoveries, and is highly underexplored in comparison to the neighbouring acreage. No well in the deepwater Lower Congo Basin has yet penetrated the pre-salt megasequence and few wells have tested the post-salt Cretaceous section. One such test of the Cretaceous post-salt, however, has proven excellent source rocks with high oil potential in the Lower Albian-aged stratigraphy on Block 33. Upper Cretaceous-aged reservoirs are highly productive in the shelfal areas of the Lower Congo Basin and could be meaningful exploration targets in the outboard. Block 33, located on the flank of the main Congo Fan depocentre, is a favourable area for these deeper targets. Here overburden is less, and seismic imaging challenges are fewer due to reduced halokinesis of the Loeme Formation salt. The Angola MegaSurvey also provides 3D data coverage over neighbouring Permanent Offer blocks 32 and 34, where the same plays can be targeted.

Santos develops CCS offshore Australia with Japanese help

Santos is collaborating with Japanese energy companies on CCS offshore Australia. The agreement between Santos, JX Nippon Oil & Gas Exploration Corporation and ENEOS Corporation paves the way for a joint feasibility study that will evaluate the potential to capture, transport and sequester emissions from Japan, supporting expansion of the Moomba CCS project. The aggregation and management of carbon at Moomba complements current studies with Tokyo Gas and Osaka Gas for potential low-carbon e-methane production in the Cooper Basin. This would facilitate the export of e-methane, made by combining green hydrogen with CO2 obtained from industrial emissions or direct air capture in a circular economy.

The MOU seeks to jointly identify and define commercial and investment opportunities covering the potential importation of up to 5 Mtpa of CO2 by 2030, 10 Mtpa by 2035 and 20 Mtpa by 2040 from Japan to the Moomba CCS project, via either Port Bonython in South Australia or Gladstone in Queensland. This would potentially provide a large-scale source of CO2 to support Phase 2 of the Moomba CCS project and provide feedstock for future e-methane production.

BRIEFS

PXGEO has acquired 100% of the share capital of AmpSeis, a geophysical company that has developed an Ocean Bottom Node. 'The acquisition secures the company's access to core technology,' said PXGEO. The first batch of the nodes will be delivered in Q1 2024. 'The acquired node technology, combined with the MantaRay OBN handling system using Hovering Autonomous Underwater Vehicles (HAUV), takes the efficiency of OBN seismic data acquisition to the next level.'

Autoridade Nacional do Petroleo (ANP) has signed a PSC for TL-SO-22-23 (Block P), offshore Timor Leste, with Eni. The PSC lies adjacent to the Greater Sunrise field. The work programme includes 2D and 3D seismic acquisition and a one-well commitment in the first exploration period.

Geoteric and Petronas have announced an agreement for Geoteric AI Horizons technology to help digitise and decarbonise Petronas' upstream exploration effort using its AI seismic interpretation technology.

SLB is partnering with Geminus AI in a deal that will give SLB exclusive access to deploy the first physics-informed artificial intelligence model builder for oil and gas operations. The Geminus model builder fuses physics-based approaches with process data to produce AI models that can be deployed faster and at much less cost than traditional AI approaches, said SLB.

DNV's Alternative Fuels Insight (AFI) platform has found that a total of 298 ships with alternative fuel propulsion were ordered in 2023, an 8% increase year on year. There has also been a sharp increase in methanol orders (138), putting it neck and neck with LNG (130). Additionally, 2023 marked a breakout year for ammonia, with 11 orders for vessels run on this fuel.

Impact Oil and Gas has farmed out 10% of its interests in Blocks 2912 and 2913B offshore Namibia to TotalEnergies, which brings the French energy company's interests up to 42% in both blocks.



INDUSTRY NEWS

Six European states sign landmark hydrogen deal

A group of Baltic and Nordic natural gas transmission operators have signed a contract to conduct a pre-feasibility study for a European hydrogen transport infrastructure network. AFRY Management Consulting has been awarded the contract to carry out the pre-feasibility study for the Nordic-Baltic Hydrogen Corridor between Finland and Germany, passing through the Baltic countries of Estonia, Latvia and Lithuania, as well as Poland. The contract follows a provisional agreement between the European Council and Parliament on rules to create a regional hydrogen market and decarbonise the European Union gas industry. Expected to be completed mid-2024, the study 'will provide a comprehensive, fact-based framework to allow decisions to be made,' said German project partner Ontras Gastransport.

The other participating companies in the project are Finland's state-owned Gasgrid Finland, Estonia's state-owned Elering, Latvia's Conexus Baltic Grid, Lithuania's state-owned Amber Grid and Poland's state-owned Gaz-System. Expected to go onstream in 2030, the corridor will be 'the connection between green energy production regions in Northern Europe with the main consumption centres in Central Europe', Ontras said. Amber chief executive Nemunas Biknius said: 'The pre-feasibility study will provide recommendations on the scope of the project, pipeline routing, capacities, financing, and risk management. In addition, potential hydrogen storage sites will be investigated.'

In December EU countries agreed on plans to promote the uptake of renewable and low-carbon gases, including hydrogen, into the EU energy market. Ontras said: 'With the dynamic changes related to energy transformation and decarbonisation of the economy taking place in the Member States of the European Union, it is expected that hydrogen will become one of the main energy carriers in Europe. Hydrogen will strengthen the EU's energy security and will play an important role in reducing the dependence of energy-intensive European industries on imported fossil fuels and energy.'

Baltic and Nordic gas transporters signed a contract for a pre-feasibility study for an EU hydrogen infrastructure network.

CGG set to report improved full-year revenues of more than $1 billion

Sophie Zurquiyah, CGG CEO.

CGG is expected to report Q4 2023 segment revenue of $316 million, down 1% year on year, but full-year segment revenue of around $1.12 billion, up 21%. Geoscience segment revenue is expected to be around $97 million, up 40% year-on-year. Earth Data segment sales are expected to be around $100 million, down 31% year-on-year, mainly due to delayed year-end licensing rounds in Brazil and the Gulf of Mexico. Sensing and Monitoring segment sales are expected to be around $119 million, up 14% year-on-year. Improved full-year performance is driven in particular by very large deliveries of OBN and land equipment for mega-crew projects. Full-year 2023 segment EBITDA is expected to be around $390-$400 million.

Sophie Zurquiyah, CEO of CGG, said: 'I am pleased to see the positive effects of our strategy, with GEO and SMO performing at near pre-covid levels, our new business initiatives reaching around $90 million in revenue generation, and the company organically delivering around $30 million positive net cash flow in 2023, despite $(65) million of penalty fees from vessel commitments. We expect 2024 performance to improve compared to 2023, while we anticipate the market to moderately grow through 2026, yet unevenly over the quarters, based mainly on mega crew activity and multi-client spending.'

The group's liquidity at the end of December 2023 is expected to be around $415 million, including around $325 million cash liquidity and $90 million undrawn RCF. CGG anticipates year-end 2023 net debt before IFRS 16 to be around $875 million, and net debt after IFRS 16 to be around $980 million. CGG anticipates net cash flow generation to be flat in 2024 but to accelerate significantly to $75-100 million during 2025-2026, based on 'continued operational optimisation', including the end of its contractual vessel commitments, and the further development of new businesses.


INDUSTRY NEWS

India signs contracts for 13 hydrocarbon blocks

India's Petroleum and Natural Gas Ministry has signed contracts for 13 hydrocarbon blocks opened for bidding in the last two years. Of the new contracts, 10 came from the 2023 Open Acreage Licensing Policy (OALP) Bid Round VIII and three were under the 2022 Special Coalbed Methane (CBM) Bid Round. In India the OALP allows companies to carve out development areas of their choice based on resource data at the National Data Repository (NDR), which stores exploration and production research results for Indian sedimentary basins. They then submit expressions of interest (EOIs), in a process that can be done at any time without waiting for a bidding round.

Under the eighth round of the OALP, 13 companies made offers for the 10 blocks put forward. Four companies emerged successful, committing a total of $233 million for exploration work. The 10 blocks are 'spread across nine sedimentary basins and included two land blocks, four shallow-water blocks, two deep-water blocks and two ultra-deep water blocks'. In the Special CBM round, 16 blocks were offered, spread across seven states. Three blocks got bidders. Of the six companies that made submissions, two won awards, for a total area of 717 km2. Pledges for exploration work from the successful bidders totalled $7.4 million.

Simultaneously, the petroleum ministry opened applications for the ninth round of the OALP. Twenty-eight blocks covering 136,596 km2 are on offer. Bidders have until February 29 to make submissions. 'It is estimated that after award of blocks under forthcoming OALP-IX and X Bid Rounds, about 560,000 km2 [16% of Indian sedimentary basins] will come under exploration by end of year 2024,' the PIB news release said. 'Offshore acreage of more than 1 million km2 has been made available in the recent past for exploration and production operations which were earlier so-called 'No-Go' areas,' the PIB said. Only 10% of Indian sedimentary basins are under active exploration.

The PIB added: 'Several initiatives have been taken by the government for making available good quality data of Indian sedimentary basins to investors, such as the National Seismic Programme (NSP) in onshore areas, EEZ [exclusive economic zone] survey in offshore areas, opening of Andaman Basin, and upgrade of NDR. Other initiatives planned are the Mission Anveshan Project (Part II of NSP), Continental Shelf Survey, drilling of stratigraphic wells and the Hydrocarbon Resource Reassessment Study. Several IOCs have visited the NDR and purchased a large volume of E&P Data for analysis.'


Lebanon launches third offshore licensing round

Lebanon has launched its Third Offshore Licensing Round with a closing date for bids on July 2, 2024. Blocks on offer are Blocks 1, 2, 3 and 4 in the northern half of the country's maritime exclusive economic zone (EEZ) and Blocks 5, 6, 7, 8 and 10 in the southern half. Blocks 8 and 10 are still under negotiation since the Second Offshore Licensing Round. 'The set of open blocks was selected based on the priorities and goals of the upstream oil and gas sector and the objectives of the Third Offshore Licensing Round,' said the Lebanese Petroleum Administration in a statement.



INDUSTRY NEWS

TGS expects fourth quarter revenues of $189 million

TGS expects IFRS revenues for Q4 2023 to be approximately $189 million, compared to $219 million in Q4 2022. POC revenues are expected to be $205 million, compared to $227 million in Q4 2022. Proprietary revenues are expected to be $88 million, up from $60 million in Q4 2022. POC multi-client revenues are estimated at approximately $118 million, down from $167 million in Q4 2022, with early sales of $59 million, up from $31 million in Q4 2022, and late sales of approximately $59 million, compared to $137 million in Q4 2022. This results in POC revenues of $968 million for the full year of 2023, a growth of 14% compared to pro-forma POC revenues (incl. Magseis) in 2022. POC multi-client revenues are expected to be $549 million, up 8% compared to 2022.

The POC contract backlog is estimated at $545 million, compared to $475 million on 30 September 2023 and $451 million on 31 December 2022. The cash balance on 31 December 2023 was approximately $200 million.

Kristian Johansen, CEO of TGS, said: 'While we are pleased to deliver an annual revenue growth of 14% in 2023, we are disappointed with late sales in Q4. Delayed licensing rounds, supermajors focusing their exploration spending on drilling and new seismic data acquisition, as well as ongoing M&A processes among some of our key customers, partly explain why we did not see the normal year-end spending in Q4. On a positive note, we saw increased activity from independents, good order inflow and positive momentum in our Acquisition business, which continues to outperform our expectations. Further, the strong development in the Digital Energy Solutions business continued, with more than a doubling of revenues compared to Q4 2022. I'm increasingly optimistic for 2024, based on positive signals from our customers. Our contract backlog going into 2024 is 21% higher than a year earlier and the pipeline of further business opportunities looks promising.

'In a market characterised by high volatility in both revenue mix and regional focus, diversification is increasingly important. The PGS transaction, which is expected to close during H1 2024, will ensure that TGS has exposure towards all parts of the energy data market, including streamer, OBN acquisition and products and services for new energy. Finally, the transaction will add geographical diversification and thereby reduce volatility of the multi-client business,' added Johansen.

PGS to report increased Q4 revenues of $265 million

Rune Olav Pedersen, CEO, PGS.

PGS expects to report Q4 2023 income of approximately $265 million, compared to $216.7 million in Q4 2022. The company expects produced revenues for Q4 2023 of approximately $227 million, compared to $250.7 million in Q4 2022. Contract revenues ended at approximately $84 million in Q4 2023, compared to $111.2 million in Q4 2022. Multi-client late sales revenues were approximately $82 million in Q4 2023, compared to $92 million in Q4 2022. Estimated produced multi-client pre-funding revenues were approximately $56 million, compared to $42.6 million in Q4 2022. Multi-client pre-funding revenues based on IFRS, where revenues are recognised at the time of delivery of finally processed data, were approximately $94 million in Q4 2023, compared to $8.6 million in Q4 2022.

In October 2023 PGS announced an award in the first part of an arbitration process relating to a transfer fee dispute. The second part of the arbitration process, for which the company recognised $15 million in Q4 2022, was settled in Q4 2023. The result more than fully covered the amount recognised, said the company.

President and CEO Rune Olav Pedersen said: 'I am pleased to see Q4 multi-client late sales doubling compared to the average of the three first quarters of 2023. In addition, we recorded significant sales from surveys in the processing phase. The multi-client pre-funding level in Q4 was strong at approximately 150% of the capitalised cash investment, driven by these sales and attractive multi-client programmes in Brazil and Malaysia. We used 25% of available vessel capacity for contract work and experienced flat pricing development compared to the seasonally stronger summer rates. We commenced a large offshore wind site characterisation project in early October, which contributed approximately $13 million of the Q4 contract revenues.'

PGS had seven active 3D vessels in Q3 and Q4 2023, while the company had six active 3D vessels in Q4 2022. Meanwhile, the reflagging of the PGS vessel Ramform Tethys on 5 December 2023 marked the completion of an initiative to bring the entire PGS fleet of owned vessels under the Norwegian flag. PGS has previously flagged its vessels to the Bahamas, Singapore, Vanuatu and elsewhere, but in September 2022 declared its intention to reflag all active vessels to Norway, citing the global geopolitical situation as a driving factor. The Ramform Titan led the way, with registration in the Norwegian International Ship Register (NIS) on 20 September 2022, and other active vessels in the PGS fleet have followed suit. The reflagging journey commenced with the Ramform Vanguard, registered with NIS in August 2021 before a project in the Black Sea.


INDUSTRY NEWS

UK fossil fuel power generation at lowest since 1957 Power generated from coal, gas and oil in the UK dropped in 2023 to the lowest since 1957 on renewable energy growth and electricity use slowdown, according to an analysis. However, the rate of increase in low-carbon generation was still below the level needed to meet the government’s goal of getting 95% of power from low-carbon sources by 2030, according to a report from Carbon Brief.

Power generated from coal, gas and oil in the UK dropped in 2023 to the lowest since 1957 on renewable energy growth and demand slowdown.

Last year the share of fossil fuels in the country’s power generation decreased 22% compared to 2022. Fossil fuels made up 33% or 104 terawatt hours (tWh) of power produced in the UK in 2023, the country’s lowest for such sources in 66 years, Carbon Brief said. Natural gas comprised 31%, coal just over 1% and oil just below 1%, Carbon Brief reported using data from the government and Elexon Ltd, which operates the power supply-demand balancing system in the UK. ‘These declines have been caused by the rapid expansion of renewable energy (up six-fold since 2008, some 113TWh) and by lower electricity demand (down 21% since 2008, some 83TWh),’ it wrote. Low-carbon sources accounted for 56% of power generated in the UK in 2023, of which 43% came from renewables and 13% from nuclear energy, Carbon Brief said.

UK power generation in 2023 had the lowest-ever carbon dioxide (CO2) intensity, or the amount of CO2 released per unit of electricity generated. Emissions of the planet-warming gas from the power sector averaged 162 grams per kilowatt hour. 'This remains a long way from the government's ambition for 95% low-carbon electricity by 2030 – just years from now – and a fully decarbonised grid by 2035,' Carbon Brief said. The fastest rate of growth in the share of low-carbon generation in the UK's power mix has been 25 percentage points in seven years, from 23% in 2010 to 48% in 2017. This rate is still below the 39-percentage-point increase needed, based on the 2023 share of low-carbon sources, to reach the 95% aim by the end of the decade. 'The rise of renewables since 2008 has been nearly as steep as the fall for fossil fuels,' Carbon Brief said. 'Notably, however, since reaching 134 TWh in 2020, renewables have effectively stood still, with output of 135 TWh in 2023, matching the record 135 TWh set in 2022. This reflects the balance between continued increases in wind and solar capacity, variations in average weather conditions and reduced output in the past two years from bioenergy.' It noted that only one offshore wind generation project was completed in 2023. Equinor announced in October that the Dogger Bank Wind Farm, a North Sea project with a planned capacity of 3.6 GW, has been partially put onstream. Full capacity of the world's biggest offshore windfarm is planned to be achieved by 2026 in three phases of 1.2 GW each, through a total of 277 turbines, according to Equinor. Carbon Brief added that since 2008 'coal has nearly disappeared from the UK electricity system, falling from 119 TWh in 2008 to 4 TWh in 2023 (down 115 TWh, 97%)'. 'Gas, meanwhile, is now down to levels rarely seen since the mid-1990s, falling from 178 TWh in 2008 to just 98 TWh in 2023 (down 80 TWh, 45%),' Carbon Brief said.

ENERGY TRANSITION BRIEFS
Woodside Energy has agreed with four Japanese companies to study a potential carbon capture and storage value chain between Japan and Australia. Woodside, Sumitomo Corporation, JFE Steel Corporation, Sumitomo Osaka Cement Co and Kawasaki Kisen Kaisha will study the capture, storage and transportation of carbon dioxide (CO2) emissions from the Setouchi and Shikoku regions of Japan and the injection and storage of the CO2 at Australian storage sites.

Norway has approved the appointment of PGNiG as operator and partner with Horisont Energi in Polaris (CO2 exploration licence EXL003), the only CO2 storage licence located in the Barents Sea.

Statkraft is investing 6 billion euros in Norwegian hydro and wind power facilities. The investment will include 1 billion euros in onshore wind farms, adding 2500 GWh or more of wind power production (more than double the current production).

Equinor and Linde have agreed to develop the H2M Eemshaven low-carbon hydrogen project in The Netherlands by 2028. Equinor will secure access to carbon transport and storage capacity and offer low-carbon hydrogen to the market. Linde will build, co-own and operate the hydrogen production and carbon capture and transfer facility. The facility in the Eemshaven industrial area will reform natural gas from the Norwegian continental shelf to low-carbon hydrogen with CO2 capture and storage (CCS). More than 95% of the CO2 will be captured and stored under the seabed offshore Norway.

The US Bureau of Ocean Energy Management (BOEM) has launched a public consultation on a draft development of six wind lease areas offshore New York and New Jersey (the New York Bight). Development of the leases, totalling over 488,000 acres, could generate 5.6 to 7 GW.

The US Energy Information Administration (EIA) expects solar electric generation to account for 7% of total US electricity generation in 2025, up from 4% in 2023.



INDUSTRY NEWS

SLB, TGS and PGS expand Malaysia survey
TGS, PGS and SLB have secured pre-funding to expand multi-client seismic data coverage in the Sabah Basin offshore Malaysia. The seventh phase of this multi-year project off the coast of Sabah encompasses over 5000 km of new 2D seismic data acquisition, over 2600 km of legacy seismic data processing, and 2800 km² of 2D-cubed processing as part of a multi-year contract originally awarded by Petronas in 2016. Kristian Johansen, CEO at TGS, said: 'High-quality 2D seismic data across Sabah will be instrumental in promoting future bid rounds. It also allows E&P

The Eagle Explorer vessel. Source: SeaBird.

companies to further assess exploration opportunities in an under-explored proven petroleum system surrounded by prolific hydrocarbon provinces. ‘The 2D multi-client project will provide high-quality seismic data across Sabah.’ The Eagle Explorer vessel mobilised in November 2023, with acquisition completion anticipated in February 2024. Fast-track results are anticipated to be available for evaluation during the 2024 Malaysia Bid Round. In the summer of 2023, PGS, TGS and SLB secured pre-funding to expand 3D coverage in the nearby Sarawak Basin.

Oil and gas round-up
ConocoPhillips has made a final investment decision on the Willow project in Alaska. The decision follows the US Department of the Interior March 2023 Record of Decision and recent positive court orders, including this week's Ninth Circuit Court of Appeals denial of plaintiffs' request for an injunction. According to the US Bureau of Land Management, the Willow project is projected to deliver $8 billion to $17 billion in new revenue for the federal government, the state of Alaska and Alaska Native communities. When completed, Willow is estimated to produce approximately 600 million barrels across the lifetime of the project. The Willow project underwent five years of rigorous regulatory and environmental review. TotalEnergies (40%), QatarEnergy (30%) and Petronas (30%) have signed a production sharing contract for Block 64 with Staatsolie, the state-owned oil company of Suriname. Block 64 was awarded to TotalEnergies and its partners in the Bid Round 2022-2023. Block 64 is a 6262 km² block, located 250 km from shore. Petronas has an agreement with PTTEP to develop the PTTEP-operated Blocks SK405B and SK410B off the coast of Sarawak, offshore Malaysia.

Tethys Oil has completed drilling of exploration well Menna-1 in Oman, indicating hydrocarbons in three separate zones. The Menna-1 well was drilled vertically to a total depth of approx. 1600 m. The well logs indicate hydrocarbons in the Al Khlata, Karim and Birba formations. The prospect is one of several identified on the Eastern Flank trend, stretching alongside the border of Block 6's productive Karim Small Fields. In total, the trend currently has more than a dozen prospects, plus leads that have not yet matured into prospects. A full prospect and lead inventory is expected to be finalised in the first quarter of 2024. Galp (80%, operator) and partners Namcor and Custos (10% each) have drilled and logged the first exploration well (Mopane-1X) in block PEL83, offshore Namibia, and confirmed the discovery of a significant column of light oil in reservoir-bearing sands of high quality. Galp will continue to analyse the acquired data and anticipates performing a Drill Stem Test (DST) in the coming weeks to assess the commerciality of this discovery. The drilling operations at Mopane-1X will proceed to explore deeper targets. BlueNord has made a final investment decision on the Harald East Middle


Jurassic well (HEMJ) in the Danish sector of the North Sea, expected to be spudded during the summer of 2024. Harald East is operated by the Danish Underground Consortium (DUC), a JV between TotalEnergies (43.2%), BlueNord (36.8%) and Nordsøfonden (20%). If successful, the well could deliver production by the end of 2024. The expected gain from the well is up to 8 mmboe net to BlueNord, of which approximately 80% is gas. The well will be drilled into Jurassic reservoirs with good properties. Beacon Offshore Energy has taken a final investment decision to develop the Winterfell discovery in the Gulf of Mexico, which is operated by BOE and will be developed as a subsea tieback. The Miocene-aged project is located in Green Canyon blocks 943, 944, 987, and 988 in a water depth of approx. 5200 ft (1585 m). Winterfell was discovered in 2021, with subsequent successful appraisal drilling conducted in 2022. The field will be developed via a subsea tieback to the Heidelberg spar located in Green Canyon Block 860. First oil is expected early in the second quarter of 2024, with three initial wells projected to deliver gross production of approx. 22,000 boepd.


TECHNICAL ARTICLE

DC resistivity inversion using conjugate gradient and maximum likelihood techniques with hydrogeological applications Cassiano Antonio Bortolozo1,2*, Jorge Luís Porsani2, Fernando Acácio Monteiro dos Santos3 and Tristan Pryer4.

Abstract
This study introduces a DC 2D inversion algorithm that employs conjugate gradient relaxation to solve the maximum likelihood inverse equations. The adoption of the maximum likelihood algorithm was motivated by its advantage of not requiring the calculation of electrical field derivatives. While the inversion algorithm based on the maximum likelihood inverse theory has been extensively described for 3D DC inversion using finite difference modelling, its application in the 2D finite element method has received limited attention. A significant difference between 3D finite difference modelling and 2D finite element methods lies in the integration variable lambda. In our 2D case, the electrical potential is initially calculated in the Laplace and Fourier domains, which include the stiffness matrix. However, to obtain the stiffness matrix in the Cartesian domain, we had to develop a suitable transformation method since no existing resources in the literature addressed this specific condition. In this study, we successfully transformed the stiffness matrix using a similar approach to the potential calculation. The results obtained from synthetic and real models demonstrate the method's potential for various applications, as exemplified by the hydrogeological study presented in this work.

Keywords: Electrical resistivity (ER), Finite Elements, 2D ER inversion, Sedimentary aquifers, Paraná Basin.

Introduction
Advancements in geophysical research are marked by the continuous development and adaptation of inversion algorithms focusing on robustness and computational efficiency. This study introduces a refined version of Zhang et al.'s (1995) algorithm, initially developed for 3D electrical resistivity inversion and optimised for speed through implicit computation of the partial derivatives required for inversion. In our adaptation, the algorithm is employed for 2D electrical resistivity inversion problems, which present unique challenges and computational requirements. The inherent dimensionality reduction involves computations in a transformed domain, contrasting with the original's Cartesian domain approach. This work employs finite element methods due to their proficiency in approximating complex geometries and potential for local mesh adaptivity, promising high-accuracy solutions, as demonstrated by Ashby et al. (2021). Our proposed finite element method operates in the transformed domain, converting to the Cartesian domain at the final stage, necessitating novel strategies and routines for matrix transformation to maintain algorithm functionality.

Extensive testing confirms the algorithm's capability for 2D DC inversion, with synthetic data inversions substantiating its applicability and reliability, and real data inversion furnishing insightful subsurface resistivity structures, like the mapping of the sedimentary aquifer in the Ibirá region. At its core, this work furnishes a streamlined adaptation of Zhang et al.'s algorithm for 2D DC electrical resistivity inversion, yielding crucial insights into subsurface resistivity structures and substantiating the algorithm's utility and effectiveness in advancing regional aquifer system knowledge.

2D electrical resistivity modelling using finite elements
In the 2D case, the geoelectric model consists of a discretised section composed of small polygons representing the subsurface. Each polygon is assigned an electrical resistivity value that is independent of the others. Therefore, the direct problem involves determining the distribution of the electric field in the section due to a known source. There are various ways to discretise the medium, with the most commonly used methods in geophysical electrical studies being finite difference and finite element approximations. In this research, we developed a 2D modelling and inversion of electrical resistivity using finite elements.

1 São Paulo State University (Unesp) | 2 Universidade de São Paulo | 3 University of Lisbon | 4 Department of Mathematical Sciences

* Corresponding author, E-mail: cassianoab@gmail.com

DOI: 10.3997/1365-2397.fb2024011




Triangular elements were employed, allowing for excellent discretisation of both the medium and the topography of the area, although more general polygonal elements are available, as described in Dong et al. (2020), which would offer additional complexity reduction. The direct calculation of electrical resistivity is described in detail in Rijo (1977) and serves as a reference for this work. In this formulation, the equation governing the electric potential must be transformed into Fourier space to remove the variable corresponding to the geoelectric strike direction (y-axis). In Fourier space, the electric potential can be obtained through the following linear system:

K̃ Φ̃ = v, (1)

where K̃ represents the stiffness (capacitance) matrix, Φ̃ is the vector of the transformed potential for each λ across the mesh, and v represents the current vector. The potential in Fourier space φ̃(x, k, z) obtained from solving the linear system (1) is the transformed potential. To obtain the electric potential in the Cartesian domain, an inverse transform is required. To perform this transformation, a set of 15 optimal values of k is used to discretise the interval 0 < k < ∞. The inverse Fourier transform is then performed using a trapezoidal method. To close the system, one must also consider boundary conditions. We apply Neumann boundary conditions at the surface, where z = 0, and a mixed boundary condition along the external left, right and base boundaries by exploiting the asymptotic behaviour of the electrical potential and its expected derivative at large distances from the source, as described by Dey and Morrison (1979); see also Bortolozo (2016).
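For readers who want to experiment with this final step, the sketch below brings transformed potentials back to the Cartesian domain with a trapezoidal rule. It assumes the usual 2.5D cosine-transform pair and uses illustrative, log-spaced wavenumbers rather than the 15 optimised k values referred to above; the helper name inverse_k_transform and the homogeneous half-space check are ours, not the authors' code.

```python
import numpy as np
from scipy.special import k0  # modified Bessel function K0, used only for the check below


def inverse_k_transform(phi_k, k_values, y=0.0):
    """Return phi(x, y, z) from transformed potentials phi~(x, k, z).

    Assumes the standard 2.5D cosine-transform pair
        phi(x, y, z) = (2/pi) * integral_0^inf phi~(x, k, z) * cos(k*y) dk,
    evaluated here with a simple trapezoidal rule along the k axis.
    phi_k has shape (n_k, ...) with one row per wavenumber in k_values.
    """
    integrand = phi_k * np.cos(k_values * y)[:, None]
    dk = np.diff(k_values)
    avg = 0.5 * (integrand[1:] + integrand[:-1])      # trapezoid: average of interval endpoints
    return (2.0 / np.pi) * np.sum(dk[:, None] * avg, axis=0)


# Check against a homogeneous half-space: a unit point source has
# phi~(r, k) = K0(k*r) / (2*pi), which should transform back to ~ 1/(2*pi*r).
r = np.array([10.0, 20.0, 40.0])                      # source-receiver distances (m)
k_values = np.logspace(-3, 0.5, 15)                   # placeholder abscissae, not the optimal set
phi_k = k0(np.outer(k_values, r)) / (2.0 * np.pi)
print(inverse_k_transform(phi_k, k_values))           # roughly 1/(2*pi*r)
```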


Maximum likelihood algorithm
The data inversion we utilise is based on an adaptation of the algorithm described in Zhang et al. (1995), applied for 3D resistivity inversion using finite difference discretisation. However, there is a significant difference between the calculation of 3D resistivity using finite differences and 2D finite element modelling: the 3D calculation is not performed in the transformed domain. Therefore, in the direct calculation of Zhang et al.'s (1995) work, the authors do not have to deal with the transformed-domain capacitance matrix, only the capacitance matrix in Cartesian space. This poses a potential problem in the inversion algorithm, as the calculation requires the real capacitance matrix and its Jacobian, which is not computed in a finite element method. Our solution to this is an appropriate representation of the capacitance matrix and its Jacobian, which we will describe in the sequel. Thus, the inversion algorithm not only functioned properly but also generated highly satisfactory results. In this research, the maximum likelihood algorithm initially developed by Tarantola and Valette (1982) was employed. The mathematical formulation of the maximum likelihood used follows Mackie et al. (1988) and Madden (1990), which, in turn, build upon the works of Tarantola and Valette (1982) and Tarantola (1987). The formulation can be expressed as an iterative method. Given an initial model vector m0, let k = 0, 1, … be the iteration parameter; then we seek an update δmk such that (2) where:
•  Ã denotes the sensitivity matrix,
•  d is the vector of observed data,
•  mk is the vector of the model at the kth iterate,
•  G̃ is the forward modelling operator, and
•  R̃dd and R̃mm are the data and model covariance matrices respectively.
For a single source with m receivers and n elements, let q denote the vector of measurements at the receivers and ρ the electrical resistivity on the elements. We define the sensitivity matrix Ã with components Ãij = ∂qi/∂ρj. In the conventional techniques in the literature, with potentially many elements and sources, the matrix Ã can become cumbersome, leading to computational issues. With the relaxation approach adopted here, there is no need for any numerical computation of partial derivatives and sensitivity matrices. For each relaxation iteration, only two direct problem calculations need to be carried out to update the step δmk. The total number of direct calculations per inversion iteration is determined by the number of iterations in the relaxation and is equal to 2nrel, twice the number of relaxation iterations. This value is considerably smaller than that in traditional methods, which require a direct calculation for each element of the mesh when numerical partial derivatives are computed. For this work, we utilised a logarithmic parameterisation of the data (logarithm of apparent resistivity) and the model (logarithm of resistivity). However, for exposition of the ideas we abuse notation and do not redefine them. The following argument is adapted from Zhang et al. (1995), which we include for completeness and to show the main differences. A single potential measurement qi can be defined as: (3) where i = 1, 2, …, m represents the receiver index, v ∈ Rm is the vector of voltages at all mesh nodes, and ei is the ith canonical basis vector of Rm. The partial derivatives with respect to ρ in Equation 1 can be given as: (4) Substituting Equation 4 into Equation 3 gives the sensitivity term of the sensitivity matrix as: (5) As the matrix ∂K̃/∂ρ is composed of partial derivatives corresponding to each element, it can be observed that it contains only



a few non-zero terms associated with the resistivity medium ρ in the matrix K̃. Therefore, it is possible to analytically obtain the matrix ∂K̃/∂ρ through derivation. Consequently, the sensitivity matrix Ã can be obtained by multiplying it with an arbitrary vector x ∈ Rn in the following form: (6) Similarly, the transpose matrix ÃT multiplied by a vector y ∈ Rm is given by: (7)
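Because Equations 6 and 7 did not survive extraction, the following is a generic, dense-matrix illustration of the adjoint-state identities they rely on: for measurements qi = eiᵀu with K u = s, one has Aij = -eiᵀ K⁻¹ (∂K/∂ρj) u, so Ãx needs one additional solve and ÃTy one adjoint solve — matching the two direct calculations per relaxation iteration mentioned above. The matrices here are random stand-ins, not the authors' transformed-domain K̃.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N, m = 6, 20, 4                                   # model cells, mesh nodes, receivers (illustrative)

# Symmetric positive-definite stand-in for the stiffness matrix K and its derivatives.
B = rng.standard_normal((N, N))
K = B @ B.T + N * np.eye(N)
dK = []
for _ in range(n):                                   # each dK/drho_j has only a few non-zero entries
    M = np.zeros((N, N))
    idx = rng.choice(N, size=3, replace=False)
    M[np.ix_(idx, idx)] = rng.standard_normal((3, 3))
    dK.append(0.5 * (M + M.T))

s = rng.standard_normal(N)                           # current source vector
E = np.zeros((m, N)); E[np.arange(m), np.arange(m)] = 1.0   # picks the receiver nodes

u = np.linalg.solve(K, s)                            # one forward solve
# Explicit sensitivity matrix, only for verification:  A_ij = -e_i^T K^{-1} (dK/drho_j) u
A = np.stack([-E @ np.linalg.solve(K, dKj @ u) for dKj in dK], axis=1)

x = rng.standard_normal(n); y = rng.standard_normal(m)
# Matrix-free products: A x costs one extra solve, A^T y one adjoint solve (K symmetric).
Ax = -E @ np.linalg.solve(K, sum(xj * dKj for xj, dKj in zip(x, dK)) @ u)
w = np.linalg.solve(K, E.T @ y)
ATy = np.array([-(w @ (dKj @ u)) for dKj in dK])

assert np.allclose(Ax, A @ x) and np.allclose(ATy, A.T @ y)
```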

It is noteworthy that the matrix ∂K̃/∂ρi in these equations contains only a few non-zero terms, which can be calculated analytically. As mentioned earlier, these vectors need to be updated in each relaxation iteration, but the computational cost of calculating these vectors is insignificant compared to the numerical computation of the Jacobian. When conducting a survey with multiple electrode positions, which is the case in most situations, a direct calculation is necessary for each source position. Thus, Equations 6 and 7 are slightly modified to include M equations in column form for Ãx and in row form for ÃTy, with: (8) and (9). Notice that x remains of dimension n (representing the number of parameters), but y grows in size to M(M − 1). Thus, the computation of Ãx and ÃTy takes the form of:

Ãx = (f1, f2, …, fM)T, (10)

and

ÃTy = g1 + g2 + … + gM, (11)

where fi and gi (i = 1, 2, …, M) are given by Equations 8 and 9, respectively.

However, as mentioned earlier, in the case of 2D electrical resistivity using finite element methods, the capacitance matrix K and, consequently, the matrix ∂K/∂ρ in the Cartesian domain are not computed directly. Instead, the transformed-domain matrices K̃ and ∂K̃/∂ρ are obtained. This is the reason why there is not yet an algorithm that utilises this type of approximation for partial derivatives in 2D electrical resistivity using finite element methods. Therefore, what was done was to calculate the transformed-domain matrices K̃ and ∂K̃/∂ρ, and then perform the inverse transform in the same way as done with the electric potential. The lambda values used for potential calculations were also employed here. The results obtained with this algorithm were analysed and found to be compatible with those expected from a conventional inversion algorithm, with the advantage of not requiring the calculation of partial derivatives, which reduces the computational cost.

Regularisation
To improve the inversion results and make the process more stable, two types of regularisation were incorporated into the process: smoothing regularisation and damping regularisation. The regularisation methods used in this study are described in Zhang et al. (1995), and they are based on the Tikhonov regularisation method (Tikhonov and Arsenin, 1977). The damping term reduces the influence of small eigenvalues in the early stages of inversion, thereby increasing their influence in the later stages. Consequently, the damping term keeps the convergence rate of the maximum likelihood inversion stable throughout the process.

Synthetic inversions
To validate the algorithms and understand the potential of the developed inversion method, a series of tests were performed using synthetic models, two of which are presented in this article. The developed models aim to represent typical geological situations encountered in hydrogeological studies within geophysical research. In the first case, a regular grid was employed to simulate simple models with a flat topography. In the second model, routines for generating an irregular mesh were utilised, matching the size and topography of the actual study area, which will be discussed further in the text. In synthetic survey 1, a Pole-Dipole array was simulated with an electrode spacing of 5 m and nine levels of investigation. The total electrode spread covers a distance of 280 m, ranging from -140 m to 140 m on the graph. In the second case, a Pole-Dipole survey was simulated with an electrode spacing of 25 m and five levels of investigation, spanning a distance of 1000 m. More details of the DC method and the Pole-Dipole array configuration and field procedure are presented in Telford et al. (1990). In each of these synthetic test cases we compute the normalised root mean square error (NRMSE) as a metric to assess the quality of the inversion process, defined as (12) for m and d denoting the vectors of model and observed data respectively.
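As the NRMSE formula in Equation 12 did not survive extraction, a common variant (RMS misfit normalised by the observed data range, expressed in per cent) is sketched below as a stand-in; the authors' exact normalisation may differ.

```python
import numpy as np


def nrmse(d_obs, d_pred):
    """Normalised root mean square error between observed and predicted data (per cent).

    Illustrative definition only: root mean square misfit normalised by the
    range of the observed data.
    """
    d_obs = np.asarray(d_obs, dtype=float)
    d_pred = np.asarray(d_pred, dtype=float)
    rmse = np.sqrt(np.mean((d_obs - d_pred) ** 2))
    return 100.0 * rmse / (d_obs.max() - d_obs.min())


# Example with made-up apparent resistivities (Ohm.m), observed vs. modelled:
print(nrmse([102.0, 98.5, 110.2, 95.0], [100.0, 99.0, 108.0, 97.0]))
```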




We also mention that the number of relaxation iterations we used varied, but never exceeded eight iterations. With experimentation, we found that this was optimal in the trade-off between complexity and accuracy.

Model 1 — Buried conductive block
The first model used in the synthetic studies is a homogeneous medium model with a resistivity of 1000 Ohm.m, containing a conductive block with a resistivity of 100 Ohm.m embedded within it. This model was selected due to its frequent use for algorithm validation and its ability to represent geological structures such as paleochannels. The model can be seen in Figure 1a. Notice that the underlying mesh is graded in certain regions; this is to facilitate the imposition of boundary conditions. The inversion results are presented in Figure 1b. The obtained result is deemed highly satisfactory, with the final model closely resembling the true model in both dimensions and resistivity. The NRMSE obtained from the inversion was 4.7%.

Model 2 — Resistive block on the surface
Model 2 was developed based on the results obtained from an electrical profiling survey conducted in the city of Ibirá, SP, Brazil, which will be presented in the next section of this article. In this synthetic case, the mesh used follows the electrode positions obtained from the actual survey measurements. The simulated model is quite simple, consisting of a resistive block within a conductive medium, as depicted in Figure 2a. However, the difference between this model and the previous one is that, this time, the resistive block is on the surface, posing a detection challenge due to the 25 m electrode spacing. These variations between the

models aim to illustrate different scenarios and how inversions handle them. The initial model used in the inversion for this case is a homogeneous model with a resistivity of 100 Ohm.m. The inversion results are presented in Figure 2b, with an NRMSE of 5.3%. Despite being a simple body, the topography could potentially influence the determination of the block; however, this was not observed. The inversion results accurately recovered the location and shape of the resistive block. The region where the base of the block in the inversion exhibits the largest discrepancy with respect to the true depth corresponds precisely to the elevated topography of the area. This suggests an increase in the depth of the block's base in that region and demonstrates that the depth accuracy of the inversion is influenced by the topography. Regarding the resistivity of the block, it is underestimated compared to the true value, indicating the difficulty in defining high resistivity contrasts in some cases. Overall, the finite element inversion method effectively delineates the resistive block, showcasing the capability of the employed modelling and inversion methodology.

Real world test results in Ibirá, São Paulo, Brazil
The study area in Ibirá (Figure 3a), located in the northwest region of São Paulo state, encompasses the Paraná Sedimentary Basin. Groundwater exploitation in the region primarily occurs in the Bauru Aquifer (sedimentary) and the shallow portion of the Serra Geral Aquifer (crystalline), mainly through dug wells (Bauru Aquifer) or deep tube wells (Serra Geral Aquifer). The Bauru Aquifer is extensively utilised in the region, particularly in small rural properties and remote neighbourhoods, due to its accessibility and lower cost as a water source.

Figure 1 Synthetic Model 1. In a) the model simulating a buried conductive block (100 Ohm.m) in a half-space with a resistivity of 1000 Ohm.m. In b) the inversion results of Synthetic Model 1. Notice that the profile is smoothed, which is a result of the regularisation. It is also skewed to the right, a result of the geometry of the electrical profile and the unequal sampling or asymmetry in the electrode setup. Since the injection electrodes (current) are on one side and the potential electrodes (measuring voltage) are on the opposite side, certain portions of the subsurface are interrogated more than others, leading to this skewness.




Figure 2 Synthetic Model 2. In a) the model simulating a resistive block (1000 Ohm.m) outcropping on the surface, in a half-space with a resistivity of 100 Ohm.m. In b) the inversion results of Synthetic Model 2. Notice the diffusion of the block in the x-direction. This is indicative of the spatial resolution as well as the regularisation parameter. Regularisation smooths out the model to stabilise the inversion, but over-smoothing can lead to loss of detail. The inversion also underestimates the depth of the block which is a limitation in the depth of investigation due to electrode spacing.

Figure 3 Study Area in Ibirá, state of São Paulo. In a) the location of the city in Brazil, b) the ERT carried out in the city, in c) the location of the survey and in d) the stream at the position of -400m.

For this reason, the region is the subject of numerous studies, including geophysical investigations employing electrical and electromagnetic methods (Bortolozo et al., 2023; Almeida et al., 2017; Leite et al., 2018; Campana et al., 2017; Bokhonok et al., 2015; Bortolozo et al., 2017; Bortolozo et al., 2015; Bortolozo et al., 2014). To minimise interference from external noise sources, the survey was conducted in a noise-free rural area (Figure 3b). The survey area exhibited an undulating topography, with increasing elevation from south to north along the profile (Figure 3c). The surface of the area consisted mainly of sugarcane plantations,

except for a stream crossing the area at the position of -400 m (Figure 3d). Prior to the survey, the region experienced rainfall, resulting in water accumulation in the lower areas of the plantation. Consequently, the soil was moist, enabling good electrical contact with the electrodes used in the pole-dipole electrical profiling. The field campaigns were conducted in October and November 2014, employing the SYSCAL Pro equipment manufactured by IRIS. The electrical profiling was performed using a pole-dipole array, with an electrode spacing of 25 m and a total profile length of 1000 m.
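For readers less familiar with the array, the standard pole-dipole apparent-resistivity calculation (see Telford et al., 1990) used to turn such field readings into the pseudosections inverted here can be sketched as follows; the example numbers are illustrative, not survey values.

```python
import numpy as np


def pole_dipole_rho_a(delta_v, current, a, n):
    """Apparent resistivity (Ohm.m) for a pole-dipole array with the remote electrode at infinity.

    delta_v : measured potential difference across the potential dipole (V)
    current : injected current (A)
    a       : dipole spacing (m); 25 m in the Ibirá survey, 5 m in synthetic survey 1
    n       : level of investigation (separation factor 1, 2, ...)
    """
    k = 2.0 * np.pi * n * (n + 1) * a        # geometric factor
    return k * delta_v / current


# Illustrative reading (made-up values):
print(pole_dipole_rho_a(delta_v=0.012, current=0.5, a=25.0, n=3))
```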




Geology
The Bauru Group, consisting of the Caiuá and Bauru groups, represents the uppermost formation in the Ibirá region. These sandstones were deposited in semi-arid environments and include alluvial fan and ephemeral fluvial systems (Milani et al., 2007). The Bauru Aquifer is a significant water source covering a large portion of western São Paulo, supplying water to numerous municipalities (Iritani & Ezaki, 2012). Underlying the Bauru Group is the Serra Geral Formation, primarily composed of Cretaceous basalts and basaltic andesites. This fractured aquifer lacks primary porosity and permeability, with water circulation occurring through fractures in the basalt layer. The Serra Geral Aquifer plays a vital role in urban water supply and irrigation in western São Paulo due to its high water-yield potential.

Results with real data
For the inversion, the initial model is a homogeneous model with a resistivity of 30 Ohm.m, and the inversion results are presented in Figure 4, with an NRMSE of 11.2%. In this inversion, we can observe the resistive near-surface regions associated with the soil and the dry region of the sedimentary layer in the Bauru formation. With the influence of the topography, there is a clear association between the topography of the region and the near-surface resistivity along the profile. In the northwest corner of the profile, where the topography is at its maximum, the near-surface layers exhibit higher conductivity. This could be attributed to recent rainfall in the area, which may have accumulated in the topographic depressions. In the shallower regions

between the stream area and the higher topographic region, higher resistivity values are observed. The stream area, clearly delineated by the topography in the profile, is characterised by its high conductivity, covering a significant portion of the profile's extent (approximately 300 m). This lower resistivity is likely to be due to the presence of moisture from plants in the stream, deposited organic matter, and potentially surface clay deposition. Below this more conductive region, near the southeastern end of the profile, a more resistive zone is observed, which is likely to correspond to the top of the basalt layer (Serra Geral Formation). Since the topography of the area is known, this resistive region can be confidently associated with the top of the basalt layer, as it coincides with the lowest elevation. The green region in the profile, with resistivity around 100 Ohm.m, is likely to be associated with the saturated region of the Bauru formation (upper sedimentary aquifer), revealing the variation in the top of the aquifer within the study area. Consequently, a geological/geoelectric model of the area was developed (Figure 5), providing a better correlation between the resistivity distribution and the geological model.

Conclusion
In this paper, we introduced a novel DC inversion method utilising a maximum likelihood algorithm. Through the synthetic tests, we observed that the algorithm yielded accurate and reliable results. Furthermore, when applying the developed algorithm to real data from Ibirá, we obtained highly interesting results. The inversion of the real data provided valuable insights into the resistivity profiles and facilitated aquifer characterisation

Figure 4 2D inversion results of the electrical profile from real data in Ibirá.

Figure 5 Geological/Geophysical model obtained from electrical resistivity profile conducted in Ibirá - SP.




in the study area. Overall, the presented DC inversion method offers a powerful tool, enabling efficient and robust mapping of subsurface resistivity distributions. The promising outcomes from both synthetic and real data inversions highlight the effectiveness and applicability of the proposed algorithm. In conclusion, the proposed DC inversion method, applied to the hydrogeological studies carried out in the northwest portion of São Paulo State, contributes to improved understanding and management of groundwater resources, supporting sustainable development in hydrogeological studies.

Acknowledgements
Cassiano Bortolozo thanks the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) for the Postdoctoral Scholarship (grant 152269/2022-3) and for the Research Fellowship Program (grant 301201/2022-6). Cassiano Bortolozo also thanks Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP) for the scholarship (grant 2011/06404-0) that allowed the data acquisition. Jorge Luís Porsani acknowledges Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP) for providing financial support to this research (Grants 2009/084663 and 2012/15338-4). TP is grateful for partial support from EPSRC (EP/X017206/1, EP/X030067/1 and EP/W026899/1) and the Leverhulme Trust (RPG-2021-238).

References
Almeida, E.R., Porsani, J.L., Monteiro dos Santos, F.A. and Bortolozo, C.A. [2017]. 2D TEM Modeling for a Hydrogeological Study in the Paraná Sedimentary Basin, Brazil. International Journal of Geosciences (online), 8, 693-710.
Ashby, B., Bortolozo, C., Lukyanov, A. and Pryer, T. [2021]. Adaptive modelling of variably saturated seepage problems. The Quarterly Journal of Mechanics and Applied Mathematics, 74(1), 55-81.
Bokhonok, O., Diogo, L.A., Bortolozo, C.A., Mendonça, C.A. and Slob, E. [2015]. Residual function dispersion maps to evaluate multidimensional objective function topography: Near surface geophysical inverse problems. Resultados Preliminares, 14th International Congress of the Brazilian Geophysical Society & EXPOGEF.
Bortolozo, C.A. [2011]. Inversão conjunta 1D e 2D de dados de Eletrorresistividade e TDEM aplicados em estudos de hidrogeologia na bacia do Paraná. PhD Thesis, Universidade de São Paulo, Instituto de Astronomia, Geofísica e Ciências Atmosféricas, Departamento de Geofísica.
Bortolozo, C.A., Couto, M.A., Porsani, J.L., Almeida, E.R. and Monteiro dos Santos, F.A. [2014]. Geoelectrical characterization using joint inversion of VES/TEM data: A case study in Paraná Sedimentary Basin, São Paulo State, Brazil. Journal of Applied Geophysics, 111, 33-46.
Bortolozo, C.A., Porsani, J.L., Monteiro dos Santos, F.A. and Almeida, E.R. [2015]. VES/TEM 1D joint inversion by using Controlled Random Search (CRS) algorithm. Journal of Applied Geophysics, 112, 157-174.
Bortolozo, C.A., Bokhonok, O., Porsani, J.L., Monteiro dos Santos, F.A., Diogo, L.A. and Slob, E. [2017]. Objective Function Analysis for Electric Soundings (VES), Transient Electromagnetic Soundings (TEM) and Joint Inversion VES/TEM. Journal of Applied Geophysics, 146, 120-137.
Bortolozo, C.A., Porsani, J.L., Pryer, T., Benjumea, J.L.A., dos Santos, F.A.M., Couto Jr., M.A., Pampuch, L.A., Mendes, T.S.G., Metodiev, D., de Moraes, M.A.E., Mendes, R.M. and de Andrade, M.R.M. [2023]. Curupira V1.0: Joint Inversion of VES and TEM for Environmental and Mass Movements Studies. International Journal of Geosciences, 14, 1160-1176.
Campana, J.D.R., Porsani, J.L., Bortolozo, C.A., Serejo, G. and Monteiro dos Santos, F.A. [2017]. Inversion of TEM data and analysis of the 2D induced magnetic field applied to the aquifers characterization in the Paraná basin, Brazil. Journal of Applied Geophysics, 233-244.
Dey, A. and Morrison, F. [1979]. Resistivity modeling for arbitrarily shaped two-dimensional structures. Geophysical Prospecting, 27, 106-136.
Dong, Z., Georgoulis, E.H. and Pryer, T. [2020]. Recovered finite element methods on polygonal and polyhedral meshes. ESAIM: Mathematical Modelling and Numerical Analysis, 54(4), 1309-1337.
Iritani, M.A. and Ezaki, S. [2012]. As águas subterrâneas do Estado de São Paulo. Governo do Estado de São Paulo, Secretaria do Meio Ambiente, Instituto Geológico.
Leite, D.N., Bortolozo, C.A., Porsani, J.L., Couto, M.A., Campana, J.D.R., Monteiro dos Santos, F.A., Rangel, R.C., Hamada, L.R., Sifontes, R.V., Serejo, G. and Stangari, M.C. [2018]. Geoelectrical Characterization with 1D VES/TDEM Joint Inversion in Urupês-SP Region, Paraná Basin: Applications to Hydrogeology. Journal of Applied Geophysics, 151, 205-220.
Mackie, R.L., Bennett, B.R. and Madden, T.R. [1988]. Long-period magnetotelluric measurements near the central California coast: A land-locked view of the conductivity structure under the Pacific Ocean. Geophysical Journal, 95, 181-194.
Mackie, R.L. and Madden, T.R. [1993]. Three-dimensional magnetotelluric inversion using conjugate gradients. Geophysical Journal International, 115, 215-229.
Madden, T.R. [1990]. Inversion of low-frequency electromagnetic data. In: Oceanographic and Geophysical Tomography. Elsevier Science Publ., 337-408.
Milani, E., Melo, J., Souza, P., Fernandes, L. and França, A.B. [2007]. Bacia do Paraná. Boletim de Geociências da Petrobrás, 15(1), 265-287. Rio de Janeiro, Brasil.
Rijo, L. [1977]. Modeling of electric and electromagnetic data. Ph.D. Thesis, University of Utah.
Tarantola, A. and Valette, B. [1982]. Generalized Nonlinear Inverse Problems Solved Using the Least Squares Criterion. Reviews of Geophysics and Space Physics, 20, 219-232.
Tarantola, A. [1987]. Inverse Problem Theory. Elsevier.
Telford, W.M., Geldart, L.P. and Sheriff, R.E. [1990]. Applied Geophysics. 2nd Edition, Cambridge University Press, Cambridge, 770 pp.
Tikhonov, A.N. and Arsenin, V.I. [1977]. Solutions of ill-posed problems. VH Winston & Sons, Washington, DC.
Zhang, J., Mackie, R.L. and Madden, T.R. [1995]. 3-D resistivity forward modeling and inversion using conjugate gradients. Geophysics, 60(5), 1313-1325.



14-16 OCTOBER 2024 I NAPLES I ITALY — CONTRIBUTIONS WELCOMED! GO TO SEISMICINVERSION.ORG


TECHNICAL ARTICLE

Innovative environmental monitoring methods using multispectral UAV and satellite data Benjamin Haske1,2*, Tobias Rudolph1, Bodo Bernsdorf1 and Marcin Pawlik1,2.

Abstract
Protecting natural and near-natural ecosystems is becoming increasingly important on an ever more densely populated and intensively used earth surface. Climate change-induced extreme weather events have accelerated the environmental degradation that has been taking place for centuries. Comprehensive and precise environmental monitoring is therefore essential, especially around mining, post-mining and industrial sites. Traditional in-situ measurements are inadequate for wide areas, necessitating the integration of satellite and unmanned aerial vehicle (UAV) remote sensing data for more comprehensive monitoring. This multi-level approach utilises satellites for large-scale and high-temporal remote sensing and UAV data for medium-area and high-precision monitoring, while in-situ measurements serve as validation for both data sources. Different case studies at the Research Center of Post-Mining demonstrate the approach's effectiveness in geomonitoring post-mining processes and risk management in the oil and gas industry. Integrating diverse data sources enables comprehensive monitoring and analysis and allows the creation of user-friendly web applications to facilitate efficient risk management decisions. This multi-level monitoring concept offers an efficient approach to understanding and addressing environmental changes and risks in various industries and conservation projects.

Introduction
The need to protect natural and near-natural ecosystems is becoming an increasingly important factor in today's society. Millennia of agricultural, mining, industrial and infrastructural use of soils, the subsurface and waters have significantly changed conditions on our planet. This development has progressed to such an extent that in 2020, the sum of the weight of all things produced by humans exceeded the sum of the weight of biomass on Earth for the first time (Elhacham, et al., 2020), which might further drive the ongoing consideration of introducing an 'Anthropocene' into the geologic time scale (Waters, et al., 2016). Extreme weather events facilitated by climate change have further accelerated the rate at which habitats, soils, and infrastructure have been damaged in recent years (Kulkarni, 2021). In order to preserve or, if possible, to improve natural, near-natural and inhabited areas, extensive and precise environmental monitoring is necessary, especially near active mining operations or underground storage sites. The increasing attention on the oil and gas industry and its influence on both local and global ecosystems has reinforced the importance of maintaining a social licence to operate (Goerke-Mallet, et al., 2020). This necessitates a shift away from relying solely on in-situ measurements and local observations, as they are no longer adequate for addressing the magnitude of the industry's

impact (Haske, 2021). Additionally, changes in raw material availability, legislation, and mining project requirements make it essential to implement integrated monitoring systems. Current methods of remote sensing and photogrammetry offer promising solutions to bridge the spatial and temporal data gaps. The Research Center of Post-Mining has developed a three-stage monitoring system that has been successfully employed and tested in various projects (Figure 1) (Pawlik, et al., 2022; Bernsdorf, et al., 2022; Haske, et al., 2021):
•  Level 1: Large-scale and high-temporal monitoring with satellite data
•  Level 2: Medium-area and high-precision monitoring with multispectral (RGB, near-infrared, thermal-infrared) UAV (unmanned aerial vehicle) data
•  Level 3: In-situ measurements as ground truth for validation purposes.
Regarding environmental monitoring, this method begins by understanding the processes occurring in the different ecosystems. Precise on-site measurements provide insights into specific processes within the system, while aerial measurements offer an initial extrapolation mechanism. If the correlation between the two is established, satellite data can be added and utilised to provide large-scale information with significantly enhanced accuracy (Figure 5).

1 Technische Hochschule Georg Agricola University | 2 Technical University Bergakademie Freiberg

* Corresponding author, E-mail: Benjamin.Haske@thga.de

DOI: 10.3997/1365-2397.fb2024012




Figure 1 Multi-level environmental monitoring approach using satellite, UAV and in-situ data. Modified figure from (Haske, et al., 2021).

The effects of active mining, underground oil and gas storage or post-mining processes, such as subsidence and water leaks, can affect the health of the vegetation. One method of geomonitoring these sites involves utilising vegetation indices derived from satellite images and UAV flights. This allows for the identification and presentation of changes and trends on the earth's surface, which can then be calibrated and validated by in-situ measurements (Figure 2).

Methodology
In order to establish the process understanding required to ground truth satellite and UAV data, different types of sensor data were tested, following this shortened methodology:
1. Literature research on the sensor and platform
2. Specification of data output, comparison with existing sensors
3. Adjustment (e.g. georeference) and data-fusion
4. Application to case studies.

Satellite data

Satellite remote sensing is of great significance for long-term and wide-area geomonitoring, particularly for risk management in industries like oil and gas. However, the availability of current remote sensing data is continually changing. Advances in sensor technology, aerospace, and space technology lead to the development of new data sources while rendering others obsolete due to outdated technology or economic impracticality. In this context, the European Copernicus program by ESA stands out due to its comparatively good sensor technology, modern infrastructure, and the availability of free data (European Space Agency, 2023). Though some commercial satellite constellations offer higher resolution, their high costs make them less suitable for long-term monitoring. Three of the systems, Sentinel-1, Sentinel-2 and, to some extent, Sentinel-5P, are particularly valuable for detecting ground movements, vegetation changes, and emissions, making them excellent tools for geomonitoring purposes (Haske, et al., 2022).

UAV data

For some years now, UAVs, colloquially referred to as 'drones', have been increasingly used for monitoring tasks. As with other platforms, a wide range of different active and passive sensors has been developed in recent years (Toth & Jutzi, 2017), allowing users to map various physical properties of the earth's surface and the infrastructure located on it (Figure 4). On one hand, in contrast to aeroplanes, helicopters, and satellites, they offer more flexible deployment options, lower flight costs, and significantly higher resolutions due to lower flight altitudes. On the other hand, there are legal aspects (no-fly zones, permits), weather-related restrictions, lower spatial coverage and the lower temporal resolution associated with these constraints. Therefore, to balance these advantages and disadvantages, drone and satellite

Figure 2 Building a process understanding based on a multi-level approach (Bernsdorf, et al., 2023).

Figure 3 The Sentinel satellites of the European Copernicus program. In focus: The air quality sensor technology of Sentinel-5P. Modified after (European Space Agency, 2023).




Figure 4 Orthophotos generated using different UAV sensor technologies at an undisclosed gas storage location. From left to right: RGB image representing standard colour capture, Normalised Difference Vegetation Index (NDVI) derived from multispectral imagery, and thermal infrared image highlighting temperature variations (Haske, et al., 2021).

Figure 5 Example comparison of the spatial, temporal and spectral resolution from multispectral Sentinel-2 images with images from a UAV-based MicaSense RedEdge-MX camera. The high-res images can ‘sharpen’ the regularly acquired satellite images (Bernsdorf, et al., 2023).

data should always be used together for monitoring (Haske, et al., 2022). Both data sources can validate each other through their highly accurate georeferencing (Figure 5).
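As a concrete illustration of how the UAV and satellite levels can be tied together, the sketch below computes an NDVI map from red and near-infrared reflectances and block-averages it onto a coarser grid for comparison with, for example, 10 m Sentinel-2 NDVI pixels. The band values, pixel sizes and function names are illustrative assumptions, not the workflow used in the projects described here.

```python
import numpy as np


def ndvi(nir, red, eps=1e-10):
    """Normalised Difference Vegetation Index from NIR and red reflectance bands."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)


def aggregate_to_satellite_grid(uav_index, factor):
    """Block-average a high-resolution UAV index map onto a coarser satellite grid.

    factor : UAV pixels per satellite pixel along each axis
             (e.g. 10 m Sentinel-2 pixels over ~1 m UAV pixels -> factor = 10).
    """
    h, w = uav_index.shape
    h, w = h - h % factor, w - w % factor                    # crop to whole blocks
    blocks = uav_index[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))


# Randomly generated reflectances standing in for multispectral UAV bands:
rng = np.random.default_rng(1)
red = rng.uniform(0.02, 0.15, size=(200, 200))
nir = rng.uniform(0.20, 0.55, size=(200, 200))
uav_ndvi = ndvi(nir, red)
coarse_ndvi = aggregate_to_satellite_grid(uav_ndvi, factor=10)   # compare with satellite NDVI pixels
print(uav_ndvi.shape, coarse_ndvi.shape)                          # (200, 200) (20, 20)
```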

In-Situ Sensors
To validate the remote sensing data, ground control points were surveyed and marked using classical surveying methods (GNSS, tacheometry). In addition, it makes sense to incorporate live data from the respective operators, meteorological data, and the residents' on-site location knowledge into the analyses. To gain a comprehensive understanding of soil physics and plant health, the use of various on-site sensors and establishing correlations is crucial. To effectively analyse soil moisture processes and evaluate the plant response to events such as droughts or heavy rainfall, a careful selection of specific sensors is necessary. In the mentioned projects, a combination of reliable high-tech sensors (Meter TEROS 11) and affordable low-cost sensors (MicroSensys TELID) was tested to enhance spatial resolution for measuring soil moisture, soil temperature, and air temperature. Both types of sensors are equipped with software that provides indications of volumetric water content (VWC). Additionally, classical statistical analysis can be easily performed using tools like MS Excel (Figure 6). For the collection of soil samples, a soil drilling stick following the Pürckhauer method and classical extraction cylinders can be used. The soil samples are then subjected to various laboratory analyses, including determining gravimetric water content, soil storage density, grain size analysis, and humidity content. These analyses provided valuable insights that played a crucial role in calibrating the measurements obtained from the sensors.

Data processing, adjustment, and fusion
All data collected, provided, and added by regular drone and satellite overflights as well as in-situ measurements are combined in a 3D geoinformation system (GIS). High-precision georeferencing of all data allows them to be analysed and displayed in various thematic layers, from topography, geology, hydrology, historical data and mining cracks to remote sensing data, in an application-specific manner. To handle the diverse data sources, a wide range of software products, both open-source and proprietary, were utilised in the following case studies. These included the Sentinel Application Platform (SNAP) for satellite data processing, OpenDroneMap, Drone2Map, and Metashape Professional for UAV data processing, ArcGIS Pro and QGIS for data fusion, and ArcGIS Survey123 for in-situ validation of remote sensing data.

Case studies
In situ components in Geomonitoring – Projects C2M2 and MuSe

The multi-level approach for process understanding (Figure 2) was tested and implemented in consecutive geomonitoring projects, 'C2M2 - Climate Chance – Management and Monitoring' (Bernsdorf, et al., 2022), and 'MuSe – Multisensory Geomonitoring




Figure 6 Comparison of professional (top) and cheap (bottom) soil moisture sensors showing trends over annual cycles (Bernsdorf, et al., 2023, modified).
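A minimal sketch of the kind of trend comparison shown in Figure 6 is given below: synthetic hourly readings standing in for a TEROS 11 reference and a TELID low-cost sensor are resampled to daily means and correlated. All values, column names and the sampling interval are invented for illustration, not project data.

```python
import numpy as np
import pandas as pd

# Hypothetical volumetric water content (VWC) series for one year, hourly sampling.
idx = pd.date_range("2022-01-01", periods=24 * 365, freq="h")
rng = np.random.default_rng(2)
seasonal = 0.25 + 0.08 * np.sin(2 * np.pi * idx.dayofyear / 365.0)
reference = seasonal + 0.01 * rng.standard_normal(len(idx))            # low-noise reference sensor
low_cost = 0.9 * seasonal + 0.05 + 0.03 * rng.standard_normal(len(idx))  # biased, noisier low-cost sensor

df = pd.DataFrame({"teros_vwc": reference, "telid_vwc": low_cost}, index=idx)

# Daily means smooth out sensor noise; the correlation of the smoothed series
# indicates whether the cheap sensor tracks the seasonal trend even if its
# absolute readings are offset.
daily = df.resample("D").mean()
print(daily.corr().round(3))
```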

Figure 7 Drone-based NDVI mapping and soil moisture interpolation using mobile GIS mapping results (Own illustration).

for Optimizing post-mining water management' (Yin, et al., 2022). Both projects focused on studying the water regime of the Boye catchment, a subsidiary brook of the river Emscher. The main objective of these projects was to use vegetation as an integral indicator of environmental changes. Vegetation reacts differently to changes in water supply: both undersupply and overprovision can influence plant health. In the 'C2M2' project, the question was whether the catchment could provide sufficient water for the healthy development of the close-to-nature decommissioning of the river Emscher. In contrast, the 'MuSe' project focused on water management in subsidence areas. A similar approach was employed in both projects. The first step was to install in-situ sensors at a local level. Besides conventional weather stations measuring temperature, wind, rain, and air pressure, special attention was given to implementing soil moisture and soil temperature sensors. The soil moisture sensors mainly utilised capacitive measurements and were cost-effective


(Bernsdorf & Phyu, 2023). While individual measurements from these sensors were not highly accurate, the data analysis revealed significant trends that could be compared with high-quality soil moisture measurements (Figure 6), according to Bernsdorf & Phyu (2023). In the second step, the findings from the local level were applied to the drone level, providing an initial areal perspective. The objective was to identify plant reactions to drought or other limiting factors, such as rising groundwater levels due to subsidence in coal mining areas. In both cases, a decline in plant health was observed and reflected in vegetation indices like the Normalized Difference Vegetation Index (NDVI) and others (Pawlik, et al., 2022). Current work in progress involves consolidating the in-situ measurements using mobile GIS and mobile sensors for soil moisture and soil temperature. The results are then interpolated to obtain an initial aerial view that is calibrated based on the fixed



in-situ sensor installations. This calibration allows a correlation between the drone-based NDVI mapping and the interpolated soil moisture to be established and analysed (Figure 7). These steps elevate the local in-situ measurements to a preliminary areal level. They provide insights into how water regime-related processes affect vegetation and allow for mapping and correlation with vegetation indices such as NDVI. Once the connection between water regime and vegetation is confirmed, the next phase involves developing a forecast for an entire region using satellite sensors similar to the drone sensors (Pawlik, et al., 2021).
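A hedged sketch of such a correlation analysis is given below: synthetic point measurements of volumetric water content are interpolated onto the NDVI grid and compared with a Pearson correlation. Coordinates, values and grid spacing are placeholders rather than project data.

```python
import numpy as np
from scipy.interpolate import griddata
from scipy.stats import pearsonr

# Hypothetical point measurements of soil moisture (VWC) from a mobile GIS campaign.
rng = np.random.default_rng(3)
pts = rng.uniform(0, 100, size=(40, 2))                    # x, y positions in metres
vwc = 0.15 + 0.002 * pts[:, 0] + 0.01 * rng.standard_normal(40)

# Interpolate the point data onto the same grid as the drone-based NDVI map.
gx, gy = np.meshgrid(np.arange(0, 100, 2.0), np.arange(0, 100, 2.0))
vwc_grid = griddata(pts, vwc, (gx, gy), method="linear")

# Stand-in NDVI map loosely coupled to the moisture gradient plus noise.
ndvi_grid = 0.3 + 1.5 * vwc_grid + 0.02 * rng.standard_normal(vwc_grid.shape)

# Correlate only where the interpolation is defined (inside the convex hull of the points).
mask = ~np.isnan(vwc_grid)
r, p = pearsonr(ndvi_grid[mask], vwc_grid[mask])
print(f"Pearson r = {r:.2f} (p = {p:.1e})")
```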

KaMonSys
In the research project KaMonSys (German acronym for 'Monitoring system for the safety of cavern storage facilities using satellite and UAV data'), safety solutions for critical infrastructures have been implemented in an interdisciplinary approach combining remote sensing and geoscientific methods. Using underground storage facilities as an example, the multisensory approach was used to monitor the facilities as well as their surroundings with satellites and UAVs to detect possible emissions, such as methane, hydrogen, and carbon dioxide (Bernsdorf, et al., 2020). This monitoring approach was extended by the fact that emissions can be detected not only directly but also via secondary effects, such as plant damage or soil movement, so that appropriate measures can be initiated (Haske, et al., 2021). The result was not only a comprehensive geodatabase with extensive analysis results and algorithms for emission detection (Figure 8), but also a fully immersive 3D visualisation (Haske, 2023). The ultimate objective was to develop a user-friendly web application for efficiently utilising the diverse data collected. The primary target users of this risk management concept are personnel in control rooms and control facilities of storage and production sites in the oil and gas industry. As such, the interface needed to be accessible even to individuals without a geoscientific background. To achieve this, a straightforward dashboard was designed, where all measured values, geo-data, and monitoring results could be easily accessed through a web browser. The interface was designed to be intuitive and interactive, with all geo-analyses running in the background (Figure 9).

Through active buttons, potential leaks identified by the satellite segment could be quickly detected, compared with existing data, and evaluated. If required, initial measures could be promptly implemented. For instance, one such measure could involve directly planning a drone flight with the relevant sensor technology through the portal and launching it manually. This integrated approach ensures swift decision-making and effective management of potential risks.

Digital Twin – Integrated Geomonitoring

Post-mining processes have long-term and profound effects on the surrounding environment. The project 'Digital Twin - Integrated Geomonitoring' aims to comprehend these processes at the closed Prosper-Haniel mine in the Ruhr area, located in western Germany. Utilising modern integrated geo-monitoring methods, the project aims to visualise and gain insights into the various processes that occur after the mine's closure (Figure 10). The articles by Pawlik et al. (2022) and Rudolph et al. (2023) present the concept of integrated geo-monitoring, which consists of using, interpreting, and analysing different datasets. The data sources are:
•  Satellite imagery:
   - Landsat 4, 5, 7, 8, 9 space missions (NASA),
   - Sentinel-2 space mission (ESA),
•  Aerial imagery,
•  Images from drone flights equipped with multispectral and thermal cameras,
•  In-situ measurements,
•  Topographic, geological, hydrogeological, mining and tectonic maps,
•  Mining documentation of the Prosper-Haniel mine and
•  Documentation compiled using mobile GIS.
This project comprises two main components that involve the three-dimensional modelling of the geological structure and its integration with data obtained from space missions' images (Pawlik, et al., 2021):
•  The three-dimensional geological modelling aims to identify potential tectonic faults that may have been triggered by mining activities and visualise rock formation layers within a specific area.

Figure 8 The interactive GIS from the KaMonSys project. Exemplary layers with information about pipelines, boreholes, caverns, land use, groundwater distances, and geological faults are displayed together with analyses of multispectral UAV flights (RGB, thermal infrared, NDVI) and subsidence derived from Sentinel-1 data (Bernsdorf, et al., 2023). For data protection reasons without scale and exact coordinates.


•  Utilising satellite images from the Landsat space missions, the project enables long-term geomonitoring dating back to 1972. Through the analysis of these satellite images, changes in the ground surface and vegetation health within a given area are visually represented using remote sensing indicators. The article by Pawlik et al. (2023) presents the results of remote sensing analysis using the subsidence lakes Weihnachtssee and Pfingstsee as examples. The authors demonstrate that between 2002 and 2012 there was a reduction in several vegetation indices, along with an increase in the water indices Normalised Difference Water Index (NDWI) and Modified Normalised Difference Water Index (MNDWI). It is essential to emphasise that all research findings need to be thoroughly checked and verified.
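For reference, both water indices are simple band ratios. The sketch below computes them from green, NIR and SWIR reflectance arrays in a Sentinel-2 style band layout; the band choice and arrays are illustrative and not tied to the project's actual processing chain.

import numpy as np

def ndwi(green: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalised Difference Water Index (McFeeters): (Green - NIR) / (Green + NIR)."""
    return (green - nir) / (green + nir + 1e-12)

def mndwi(green: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """Modified NDWI (Xu): (Green - SWIR) / (Green + SWIR), more robust over built-up areas."""
    return (green - swir) / (green + swir + 1e-12)

# Illustrative use with Sentinel-2 bands (B3 = green, B8 = NIR, B11 = SWIR), values in reflectance:
green, nir, swir = (np.random.rand(64, 64) for _ in range(3))   # placeholder rasters
water_mask = mndwi(green, swir) > 0.0                           # a common, simple threshold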

In the project Digital Twin, verification methods include drone imaging (Pawlik et al., 2022; Pawlik et al., 2023) and the utilisation of mobile GIS (Pawlik et al., 2023). These methods enhance the spatial resolution of visual terrain assessments through orthophoto maps and enable the computation of remote sensing indices. Mobile GIS facilitates photographic documentation with terrain descriptions and their precise locations.

Figure 9 The KaMonSys demonstrator as an example of a simple web dashboard. The 2D and 3D maps can be used interactively, possible leakages from the satellite segment can be assessed and drone flights can be planned if required (Haske, 2023).

Figure 10 Conceptualisation of the project Digital Twin. Modified after Pawlik et al. (2022), Rudolph et al. (2023) and Pawlik et al. (2023).


Conclusion and outlook
The use of both satellite and UAV data has been found to be of great benefit for effective and efficient spatio-temporal environmental monitoring. Sensor data fusion and GIS-based approaches make the monitoring efforts, e.g. for the reduction of emissions and risks at production and storage sites in the oil and gas industry, much more effective and efficient. The multi-level monitoring concept makes it possible to balance the strengths and weaknesses of the individual systems and sensors, thus ensuring comprehensive, spatio-temporal risk management. The individual system components can be modularly adapted and combined to suit the specific application and study area, making this approach suitable for many different industries and nature conservation projects. Further projects on 3D modelling and multisensory drone flights have already started or are planned at the Research Center of Post-Mining. These include the monitoring of production and storage sites in the oil and gas industry, vegetation analysis for the detection of gas leaks at old wells, research and training for the use of multisensory drones in fire and disaster protection, as well as further cooperation on drone-based building inspection and area monitoring. The knowledge gained in the presented case studies regarding data acquisition, geo-data management, geo-analysis, drone flight planning, and data processing can be actively used for other projects.

Acknowledgements
The authors would like to thank the sponsors RAG Stiftung, BMBF and EGLV for the funding of the respective projects and the project partner EFTAS Fernerkundung Technologietransfer GmbH for the good cooperation in the KaMonSys project. Further thanks go to the associated project partners Uniper, Salzgewinnungsgesellschaft Westfalen mbH and Gaswärmeinstitut Essen e.V. for providing data, research areas and use-case validation. In addition, we would also like to thank the StartING@THGA initiative and the RAG Stiftung for providing different UAVs for the research projects.

References
Bernsdorf, B., Formaniuk, A. and Rudolph, T. [2020]. Possibilities of a method for copter-supported gas leak detection with thermal imaging cameras in industry and hazard prevention. Oil & Gas - European Magazine, 46th Edition, Issue 4/2020, 12, 13-20.
Bernsdorf, B. and Phyu, K.Z. [2023]. Zur Bewertung von in situ-Sensoren bei der Einschätzung von Prozessabläufen im Geomonitoring. Zeitschrift der Deutschen Gesellschaft für Geowissenschaften (ZDGG), 1/2023.
Bernsdorf, B., Rudolph, T. and Khaing Zin, P. [2022]. Climate Change | Management and Monitoring of Soil and In Situ Data as the Key to Process Understanding. Mining Report Glückauf, 158(1), 32-52.
Bernsdorf, B., Tiganj, J. and Haske, B. [2023]. Geomonitoring as an instrument to accompany structural changes in post-mining areas. In: Managing the Change: Tasks of Post-Mining in Ukraine. Bochum: s.n., 92-117.
Elhacham, E. et al. [2020]. Global human-made mass exceeds all living biomass. Nature, 9 12, 1-3.
European Space Agency [2023]. Europe's Copernicus programme. [Online] Available at: https://www.esa.int/Applications/Observing_the_Earth/Copernicus/Europe_s_Copernicus_programme [Accessed 27 07 2023].
Goerke-Mallet, P., Rudolph, T., Kretschmann, J. and Brune, J. [2020]. The Importance of "Social License to Operate" for the Mining Life Cycle. Mining Report Glückauf, 156(4), 323-332.
Haske, B. [2021]. Die Anwendbarkeit frei verfügbarer Fernerkundungsdaten bei Fragestellungen in Risikomanagement-Systemen des Alt- und Nachbergbaus. Bochum: Masterarbeit an der Technischen Hochschule Georg Agricola.
Haske, B. [2023]. ESRI Map Book - Underground Oil and Gas Caverns at a Storage Facility in Germany. [Online] Available at: https://www.esri.com/en-us/esri-map-book/maps#/details/7/4 [Accessed 23 07 2023].
Haske, B. [2023]. Satelliten- und Drohnen-gestützte Monitoring-Konzepte für Produktions- und Speicherstandorte der Öl- und Gasindustrie. NACHBergbauzeit in NRW.
Haske, B., Rudolph, T. and Bernsdorf, B. [2021]. Sustainability in energy storages - How modern geoscience concepts can improve underground storage monitoring. Online, s.n.
Haske, B., Rudolph, T. and Goerke-Mallet, P. [2022]. The Application of freely available Remote Sensing Data in Risk Management Systems for abandoned Mines and Post-Mining. GeoResources Journal, 20(4), 22-26.
Kulkarni, S. [2021]. Climate Change, Soil Erosion Risks, and Nutritional Security. In: Climate Change and Resilient Food Systems. Singapore: Springer.
Pawlik, M. et al. [2021]. Digital Twin – as instrument for observing changes and trends on the Earth's surface. Online, s.n.
Pawlik, M. et al. [2022]. Digital-Twin – How to Observe Changes and Trends on the Post-Mining Areas? International Journal of Earth & Environmental Sciences, Volume 7 (2022), 12 02.
Pawlik, M. et al. [2022]. Analyse des Zustands der Vegetation auf dem Gelände des stillgelegten Bergwerks Prosper-Haniel anhand von multispektralen Satellitenbildern der Sentinel-2 und Drohnenflüge. Markscheidewesen, 129(1), 37-44.
Pawlik, M. et al. [2023]. The use of Mobile GIS in scientific research - Case Studies. IOP Conference Series: Earth and Environmental Science, Volume 1189, 012023, 1-19.
Pawlik, M., Rudolph, T. and Bernsdorf, B. [2023]. Analysis of changes of the vegetation condition on the area of the closed Prosper-Haniel mine in 1984-2021 using multispectral satellite images. IOP Conference Series: Earth and Environmental Science, Volume 1189, 012022, 1-21.
Rudolph, T., Yin, X. and Goerke-Mallet, P. [2023]. Umfassende Definition des Geo- und Umweltmonitoring aus den nachbergbaulichen Erfahrungen im Ruhrgebiet. Zeitschrift der Deutschen Gesellschaft für Geowissenschaften, 173(4), 513-531.
Toth, C. and Jutzi, B. [2017]. Plattformen und Sensoren für die Fernerkundung. In: Photogrammetrie und Fernerkundung. s.l.: Springer, 29-64.
Waters, C.N. et al. [2016]. The Anthropocene is functionally and stratigraphically distinct from the Holocene. Science, 8 01.
Yin, X., Bernsdorf, B., Rudolph, T. and Goerke-Mallet, P. [2022]. Die „Muse" im nachbergbaulichen Geomonitoring – Neue Ansätze für das Poldermonitoring. Markscheidewesen, 1/2022, 129. Jg., 29-36.



Special Topic

DIGITALIZATION / MACHINE LEARNING
Seismic acquisition, monitoring, modelling, interpretation, processing and imaging are being transformed by artificial intelligence and machine learning, producing faster results and rendering data more accessible and better integrated with other disciplines. Geoscientists are using AI and machine learning to do work that was once done by human minds and, as a result, have been freed to focus more on the bigger picture. Hadyan Pratama et al. present a series of experiments designed to improve fault likelihood predictions of a pre-trained machine learning model on an unseen dataset with real fault interpretations. Sergey Alyaev et al. adapt the latest generation of generative artificial intelligence algorithms and provide the starting point for future multi-scale enhancements. Marvee Dela Resma et al. illustrate a field application of a robust and reliable approach for assessing geomechanical parameters during drilling operations or as a post-mortem analysis. Erik Ewig et al. present the PGS journey in adopting the cloud and digitalisation to ensure the company remains highly competitive. Neil Hodgson et al. compare the job of a geoscientist in 2024 with how it might be in 2050. Edward Jarvis et al. discuss how machine learning and artificial intelligence applications are identifying target images from a larger corpus of documents for further analysis, quantifying porosity and identifying core-scale sedimentary facies, hydrocarbon shows and thin section microfacies. John McGaughey et al. present a data structure for integration and storage of geological models, data, and metadata where dissemination, ease of access, and persistence are required without commercial encumbrance.

Submit an article
First Break Special Topics are covered by a mix of original articles dealing with case studies and the latest technology. Contributions to a Special Topic in First Break can be sent directly to the editorial office (firstbreak@eage.org). Submissions will be considered for publication by the editor.
It is also possible to submit a Technical Article to First Break. Technical Articles are subject to a peer review process and should be submitted via EAGE's ScholarOne website: http://mc.manuscriptcentral.com/fb
You can find the First Break author guidelines online at www.firstbreak.org/guidelines.

Special Topic overview
January | Land Seismic
February | Digitalization / Machine Learning
March | Reservoir Monitoring
April | Underground Storage and Passive Seismic
May | Global Exploration
June | Technology and Talent for a Secure and Sustainable Energy Future
July | Modelling / Interpretation
August | Near Surface Geo & Mining
September | Reservoir Engineering & Geoscience
October | Energy Transition
November | Marine Acquisition
December | Data Management and Processing

More Special Topics may be added during the course of the year.



SPECIAL TOPIC: DIGITALIZATION / MACHINE LEARNING

Lessons learnt for tuning a machine learning fault prediction model

Hadyan Pratama1*, Matthew Oke1, Wayne Mogg1, David Markus1, Arnaud Huck1 and Paul de Groot1 present a series of experiments designed to improve fault likelihood predictions of a pre-trained machine learning model on an unseen dataset with real fault interpretations.

Abstract
We describe a series of experiments designed to improve fault likelihood predictions of a pre-trained machine learning model on an unseen dataset with real fault interpretations. The goal is to establish a best-practice workflow for tuning fault prediction models. The model is a 3D model that is tuned by training with a Masked Dice loss function on 2D interpretations. In our experiments we vary the number of interpreted sections in the training set and vary the number of epochs to train the model. We also compare continuous training versus transfer training. To optimise transfer training we conduct experiments with freezing different parts of the model.

Introduction
In recent years, deep-learning-based fault likelihood prediction models have proved to be a valuable addition to the seismic interpreter's arsenal of interpretation techniques. Models trained on synthetic data can be applied to unseen real datasets to produce meaningful fault likelihood volumes in record time (e.g. de Groot

and Refayee, 2023). These volumes can guide the user in the process of interpreting fault planes, and they can serve as input for automated fault plane extraction. In OpendTect, the software used in this work, this requires post-processing of the machine learning predicted fault likelihood volume. The automated fault plane extraction algorithm needs three inputs: fault likelihood, fault dip and fault strike. Most Machine Learning fault prediction models only generate one output: fault likelihood. Therefore, in most cases, fault dip and fault strike need to be computed before automated fault plane extraction can be applied. Applying a trained model ‘AS IS’ to an unseen data set generates meaningful results very fast but the results are not optimal. Many workers in the field, e.g. Dou et al., 2022, Zhu et al., 2022, and An et al., 2022, have shown that fault predictions can be improved by tuning the model to the data set at hand through additional training on real interpretation examples. In this paper, we focus on practical aspects of tuning. We seek answers to questions such as: how much interpretation is needed, what kind

Figure 1 Outline of the Thebe survey and the split in training area and blind test area.

1 dGB Earth Sciences BV
* Corresponding author, E-mail: hadyan.pratama@dgbes.com

DOI: 10.3997/1365-2397.fb2024013


of training is better: continued training or transfer training, and how many epochs of training are required? The paper is organised as follows: first, we describe the trained model that we use in our experiments. Next, we describe the dataset and then the experiments, before ending with our conclusions and recommendations on best practices.

Fault-Net
The trained model chosen for this study is known under the name Fault-Net (Dou et al., 2022). Fault-Net is a 3D deep learning model developed in PyTorch and designed to capture the characteristics of seismic faults. Traditionally, machine learning fault predictors have used the encoder-decoder U-Net architecture (Ronneberger et al., 2015). The encoder part decomposes the input image sequentially into smaller-size features. The decoder reassembles

the decomposed features into larger-size components until the target image emerges. In Fault-Net, features are propagated forward at different scales in parallel and subsequently fused together. The Multi-Scale Compression Fusion (MCF) block decomposes the convolution process into feature selection and channel fusion. This prevents image details from being weakened during fusion. Fault-Net fully preserves the edge information of the faults. The model is also smaller than competing fault prediction models, which means it uses less computational resources. In internal comparison studies we learnt that, in general, Fault-Net produces cleaner, more reliable results than a 3D U-Net fault predictor. Fault-Net transforms a 3D block of input into a 3D block of fault likelihood predictions. The model that we apply in our experiments 'AS IS' was trained by Dou et al. on 3D synthetic data and 2D interpretation examples from real data. Training a

Figure 2 Fifty four 3D pairs of seismic inputs and fault mask targets are extracted from the training area. The image shows the seismic data in the training area overlain by a fault mask for all available interpreted data (every 5th inline). Note the dashed appearance of faults on time-slices and cross-lines. Fault masks have a width of three samples in the inline direction. The grey-red cube in the figure shows one fault mask block of data from the training set. The dimensions are 256x256x256 samples.

Figure 3 Inline 1850: a) seismic data; b) with fault mask overlay; c) with Fault-Net ‘AS IS’ prediction overlay.


3D model on 2D examples is achieved by training with a Masked Dice (MD) loss function. The MD loss function computes the loss on interpreted sections in regions with actual faults. The mask is added so that sections without fault interpretations do not contribute to the model loss.

Thebe dataset
The dataset in our experiments is a public-domain dataset covering the Thebe Gas Field in the Exmouth Plateau of the Carnarvon Basin on the NW shelf of Australia (Yu et al., 2022). The faults were manually interpreted by expert interpreters of the Fault Analysis Group, University College Dublin (Yu et al., 2021). The fault interpretation is limited to faults with vertical displacements greater than 20 m in a depth range between approximately 2 and 4 km. The faults are interpreted on every 5th inline. Shallower, deeper and smaller faults have not been picked. We converted the interpreted fault sticks to a binary fault mask volume with values 0 (no-fault) and 1 (fault). To compensate for spatial inaccuracies, the width of faults in our mask volume is three samples in the inline direction (one additional sample on either side of the interpretation). Next, we split the data into two sets: one for extracting training examples, and one for blind testing. Figure 1 shows the two areas.

Experiments
Fault-Net transforms 3D blocks of seismic data into similarly sized blocks of fault likelihood. The block size in all experiments is 256 × 256 × 256 samples (Figure 2). In principle, we extract blocks of data without overlap. However, to optimise the number of blocks that can be extracted from the training area, we extract the last block in each direction counting backwards from the edge. This leads to overlapping examples in all three directions for the last and last-but-one example in the range. Operating in this way, we extract 54 examples for tuning the Fault-Net model.
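As a sketch of that extraction scheme (our own illustration, not the authors' code), the start indices along one axis can be computed so that the final block is anchored to the far edge, which is what creates the overlap for the last two examples:

def block_starts(axis_len: int, block: int = 256) -> list[int]:
    """Non-overlapping block starts, plus one last block counted back from the edge."""
    starts = list(range(0, axis_len - block + 1, block))
    last = axis_len - block                     # anchor the final block to the far edge
    if last not in starts:
        starts.append(last)                     # overlaps the previous block if axis_len % block != 0
    return starts

# Example: an axis of 600 samples yields starts [0, 256, 344]; the block at 344 overlaps the one at 256.
print(block_starts(600))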

Figure 4 Crossline 2885: a) seismic data; b) with fault mask overlay; c) with Fault-Net 'AS IS' prediction overlay.

We conduct our experiments on a Linux-operated workstation with 11 GB of GPU memory, 1 CPU with 10 cores, and 256 GB RAM. Training and model application are done on the GPU. We use cross-validation in the training phase. Blind test results are shown on inline 1850 and crossline 2885. Figure 3 shows inline 1850 with and without interpretation and with the Fault-Net prediction when applied 'AS IS'. Figure 4 shows the same for crossline 2885. All experiments are compared to each other and to the 'AS IS' results in Figures 3c and 4c, respectively. To assess the quality of the predictions we use visual comparison and we compute the Fault Detection Accuracy (FDA) index introduced by Li et al. (2019). The FDA index is a simple evaluation metric that is also known as the Jaccard index. It returns the normalised value (0 to 1) of the Intersection over Union (IoU) of two sets: in this case, the IoU of the interpreted faults (ground truth) and the predicted fault likelihood. The FDA for the 'AS IS' application of the model on the test dataset is 0.68.
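A minimal sketch of how such an IoU-based score can be computed for two binary volumes follows; this is our illustration, and the thresholding of the predicted likelihood is an assumption, as the paper does not state the cut-off used.

import numpy as np

def fda_index(fault_mask: np.ndarray, likelihood: np.ndarray, threshold: float = 0.5) -> float:
    """Jaccard index (IoU) between an interpreted fault mask and a thresholded prediction."""
    pred = likelihood >= threshold
    truth = fault_mask.astype(bool)
    intersection = np.logical_and(truth, pred).sum()
    union = np.logical_or(truth, pred).sum()
    return float(intersection) / float(union) if union else 1.0

# Example with two small random volumes:
rng = np.random.default_rng(0)
mask = rng.random((32, 32, 32)) > 0.9
pred = rng.random((32, 32, 32))
print(round(fda_index(mask, pred), 3))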

Experiment 1: Varying the number of training epochs
With a sparse training set of only 54 examples, there is a fair chance of overfitting the data. Although we use cross-validation to avoid overfitting, this does not really help us to determine the best stopping point. In cross-validation a new test set is randomly selected after each epoch. With an increasing number of epochs, all examples will eventually be used for training. Consequently, error functions continue to decrease on both the training set and the cross-validation set in all experiments. To determine the best stopping point we therefore employ a different strategy: we

Number of epochs | Fault Detection Accuracy (FDA) Index | Training Time (GPU)
5 | 0.65 | 55 minutes
10 | 0.60 | 1 hour 40 minutes
15 | 0.58 | 2 hours 5 minutes
20 | 0.60 | 2 hours 40 minutes

Table 1 Statistics for the number of epochs experiment (continued training).


Figure 5 Determining the number of epochs to train a model with continued-training on all available interpretations (every 5th inline). Results on inline 1850: a) 5 epochs; b) 10 epochs; c) 15 epochs; d) 20 epochs.

Model | Fault Detection Accuracy (FDA) Index | Training time (GPU)
Transfer-train model 1 | 0.63 | 3 hours 20 minutes
Transfer-train model 2 | 0.70 | 5 hours 8 minutes
Transfer-train model 3 | 0.59 | 4 hours 20 minutes
Continue-train model | 0.63 | 3 hours 10 minutes

Table 2 Statistics for transfer-train vs. continue-train experiments.

evaluate prediction performance on our blind test lines after stopping training at 5, 10, 15 and 20 epochs. The model is continue-trained on all available interpretations (every 5th inline). The results are given in Figure 5 for inline 1850 and Figure 6 for crossline 2885. Table 1 shows the relevant statistics. All continued-training experiments yielded lower FDA metrics on this dataset than the application of the 'AS IS' model.

Experiment 2: Continued training vs. transfer training

The goal of this experiment is to determine which training approach gives optimal results in terms of accuracy and efficiency. In continued training, all weights of the model are updated.


In transfer training, only a subset of the weight set is updated. The benefit of this approach is that features in the convolutional layers that capture generic characteristics of faults are kept intact. The model will be less prone to overfitting. It should give a better prediction on faults seen in the new training set without losing the ability to predict generic faults. Which part of a model should be frozen, and which part should be updated, is an open question that needs to be addressed per model. Here, we test three different transfer-train models, freezing weights as sketched below:
1. Update the last layer only (transfer model 1)
2. Update the last two layers only (transfer model 2)
3. Update the first layer only (transfer model 3)
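A minimal PyTorch sketch of this kind of selective freezing is given below; the module names are generic stand-ins, not Fault-Net's actual layer names.

import torch.nn as nn

def freeze_all_but_last(model: nn.Module, trainable_last_n: int = 1) -> None:
    """Freeze every parameter, then re-enable gradients for the last n top-level children."""
    for p in model.parameters():
        p.requires_grad = False
    children = list(model.children())
    for child in children[-trainable_last_n:]:
        for p in child.parameters():
            p.requires_grad = True

# Example on a stand-in network (three conv layers); only the last layer is updated,
# mirroring 'transfer model 1'. An optimiser would then be built on the trainable subset.
net = nn.Sequential(nn.Conv3d(1, 8, 3, padding=1), nn.Conv3d(8, 8, 3, padding=1), nn.Conv3d(8, 1, 3, padding=1))
freeze_all_but_last(net, trainable_last_n=1)
trainable = [p for p in net.parameters() if p.requires_grad]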



In Figures 7 and 8, we compare the transfer-train model results with a continue-trained model on inline 1850 and crossline 2885,

respectively. All models are trained on all available interpretation data (every 5th inline) with cross-validation for 20 epochs. Table 2 gives the relevant statistics.

Experiment 3: Varying the number of interpreted lines

The goal of this experiment is to determine how much interpretation is needed for tuning our Fault-Net model. Faults are interpreted at every 5th inline. In this experiment, we reduce the number of interpreted lines in the target fault mask volume from every 5th, to every 10th, 20th, 40th, and 80th inline. We transfer-train model 1, the one in which only the last layer is updated, for 20 epochs. The results are shown in Figure 9 for inline 1850. Table 3 gives the statistics for this experiment.

Figure 6 Determining the number of epochs to train a model with continued training on all available interpretations (every 5th inline). Results on crossline 2885: a) 5 epochs; b) 10 epochs; c) 15 epochs; d) 20 epochs.

Discussion
Figures 3 and 4 show that Fault-Net is a powerful fault likelihood predictor that generates accurate predictions on an unseen dataset. The 'AS IS' model application predicts faults at all interpreted locations, including many clearly real (smaller) faults in the shallow section that were not included in the manual fault interpretation. The Fault Detection Accuracy (FDA) metric that we compute only quantifies the accuracy of the predictions relative to the given manual interpretations (the ground truth in the FDA index). The FDA metric does not measure the extent of either false negatives or false positives in the predicted results. For an assessment of these we focus on qualitative observations made as seismic interpreters. The FDA metrics for experiment 1, continued training (0.58-0.60), were significantly lower than for the application of the 'AS IS' model (0.68). Continued training also results in models that lose the ability to predict the many small faults in the shallow section (Figures 5 and 6). Some of the large-scale faults appear to be much sharper in all the prolonged training experiments. This may be a function of overfitting, which would explain why some faults appear better in one experiment and worse in another, and vice versa. The transfer-training versus continued-training experiment (Figures 7 and 8) shows that updating more layers decreases the model's ability to predict generic (unseen) faults. Transfer-train model 1, the one in which only the last layer is updated, is considered the best model. However, comparing the predictions of this model on our blind test lines (Figures 7a and 8a) with the predictions of the 'AS IS' model (Figures 3c and 4c, respectively), we only observe marginal differences. The FDA metrics confirm that the 'AS IS' model (0.68) gave a better prediction of the manual fault interpretation in all cases except transfer training

Interpreted inlines | Interpreted %* | Fault Detection Accuracy (FDA) Index | Training Time (GPU)
Every 5th | 5.1% | 0.63 | 3 hours 20 minutes
Every 10th | 2.6% | 0.68 | 3 hours
Every 20th | 1.3% | 0.66 | 2 hours 46 minutes
Every 40th | 0.6% | 0.62 | 2 hours 47 minutes
Every 80th | 0.3% | 0.61 | 2 hours 50 minutes

Table 3 Statistics for the experiment with a varying number of interpreted lines. *Percentage = training area / seismic area / N × 100, where N is every Nth interpreted inline.


Figure 7 Transfer-train vs continued-train results on blind test inline 1850: a) Transfer-train model 1; b) Transfer-train model 2; c) Transfer-train model 3; d) Continued-train model. Fault-Net architecture figures adapted from Dou et al. (2022); blue square indicates frozen layers in training.

Figure 8 Transfer-train vs continued-train results on blind test crossline 2885: a) Transfer-train model 1; b) Transfer-train model 2; c) Transfer-train model 3; d) Continued-train model. Fault-Net architecture figures adapted from Dou et al. (2022); blue square indicates frozen layers in training.


Figure 9 Determining how many lines need to be interpreted for a satisfactory result on Inline 1850: a) every 5th inline; b) every 10th inline; c) every 20th inline; d) every 40th inline; e) every 80th inline.

model 2 (updating only the last two layers) (0.70). This marginal improvement in the FDA metric is accompanied by a failure of transfer-train model 2 to predict any of the shallow, small-scale faults. On inline 1850 (Figures 7a and 3c) the transfer-train model appears to be marginally better in some places and marginally worse in others. On crossline 2885 (Figures 8a and 4c) the differences are larger. The 'AS IS' model predicts more faults on the left-hand side. Also, the large fault that is cut in the strike direction extends further to the right. However, when we compare these results with the interpreted fault mask (Figure 4b) we notice that these predictions were not mapped by the interpreter. Also, a comparison with the original data does not show clear evidence of faults being present at these locations. In other words, the 'AS IS' model may have predicted false positives.

The experiment in which we reduce the number of interpreted lines from every 5th to every 80th (Figure 9) does not yield a clear answer. Intuitively, we expect better results with more interpreted lines. The results do not confirm this expectation, as the differences in Figure 9 are marginal. Apparently, the result of the 'AS IS' model is already so close to the desired outcome that tuning does not lead to significant improvements in this dataset.

Conclusions
We presented a series of experiments that were designed to establish a best-practice workflow for tuning a pre-trained deep learning fault likelihood prediction model. The workflow centres on Fault-Net, a pre-trained model that is included in the library of pre-trained models within our machine learning


solution. As a first step in the workflow, we recommend applying Fault-Net ‘AS IS’. This step generates valuable results in a matter of minutes. If the results are not satisfactory, we can tune the model using transfer-training on a sparse training set. In transfer-training we only update the last layer of the Fault-Net model. The training set does not need to cover the entire seismic survey area but it should cover an area with representative faults. We recommend interpreting this area in a regular fashion, every Nth inline, or crossline. The value of N is data dependent and can be established by trial-and-error. For example, we can start interpreting every 20th line, tune the model and QC the prediction results. If the results are not satisfactory, we interpret more lines (e.g. every 10th line) and we use the prediction results to update existing interpretations. This cycle is repeated until we are satisfied.

Acknowledgment
We thank the Australia National Offshore Petroleum Information Management System (NOPIMS) for providing the original Thebe dataset and the Fault Analysis Group of the University College of Dublin for releasing the fault interpretations. We also thank Dou et al. (2021) for releasing Fault-Net.

References
An, Y., Guo, J., Ye, Q., Childs, C., Walsh, J. and Dong, R. [2022]. Deep convolutional neural network for automatic fault recognition from 3D seismic datasets. Computers & Geosciences, 153 (2021), 104776.
An, Y., Guo, J., Ye, Q., Childs, C., Walsh, J. and Dong, R. [2021]. A gigabyte interpreted seismic dataset for automatic fault recognition. Data in Brief, 37 (2021), 107219.
de Groot, P. and Refayee, H. [2023]. How reusing trained machine learning models accelerates and improves the work of operational geoscientists. First Break, February 2023.
Dou, Y., Li, K., Zhu, J., Li, T., Tan, S. and Huang, Z. [2022]. MD Loss: Efficient Training of 3-D Seismic Fault Segmentation Network Under Sparse Labels by Weakening Anomaly Annotation. IEEE Trans. Geosci. Remote Sens., 60, 1-14.
Li, Si, Yang, C., Sun, H. and Zhang, H. [2019]. Seismic fault detection using an encoder-decoder convolutional neural network with a small training set. Journal of Geophysics and Engineering, 16, 175-189. doi:10.1093/jge/gxy015
Ronneberger, O., Fischer, P. and Brox, T. [2015]. U-Net: Convolutional Networks for Biomedical Image Segmentation. Medical Image Computing and Computer-Assisted Intervention (MICCAI), Springer, LNCS, Vol. 9351, 234-241.
Zhu, D., Li, L., Guo, R., Tao, C. and Zhan, S. [2022]. 3D fault detection: Using human reasoning to improve performance of convolutional neural networks. Geophysics, 87(4), IM143-IM156. doi:10.1190/GEO2020-0905.1



SPECIAL TOPIC: DIGITALIZATION / MACHINE LEARNING

Ensemble history-matching workflow using interpretable SPADE-GAN geomodel

Kristian Fossum1, Sergey Alyaev1* and Ahmed H. Elsheikh2 adapt the latest generation of generative artificial intelligence algorithms for describing the subsurface and provide the starting point for future multi-scale enhancements.

Abstract
Ensemble history matching adjusts multiple geomodels used for reservoir simulation, conditioning them to historical data. It reduces and quantifies the uncertainty in the unknown model parameters to increase the models' reliability for decision support. In this study, we adapt the latest generation of generative artificial intelligence algorithms, SPADE-GANs. In geosciences, Generative Adversarial Networks (GANs) learn to simulate complex geological patterns. SPADE (SPatially Adaptive DEnormalisation) layers in the GAN generator learn conditioning to the coarse geological structure provided as coarse-scale maps, enabling explainable output, stable training, and higher variability of the resulting outputs. Our statistical method, an iterative ensemble smoother, assimilates data into an ensemble of these maps, interpreted as the channel proportions. This Bayesian data assimilation conditions the ensemble of GAN-geomodels to a combination of well data and flow data, thus extending the usability of pre-trained SPADE-GANs in subsurface applications. Our numerical experiments convincingly demonstrate the method's capacity to replicate previously unseen geological configurations beyond the GAN's training data. This proficiency is particularly valuable in data-scarce scenarios typical for renewable geo-energy, where the GAN captures realistic geology, but its output geomodels must be adjusted to match observed data. Furthermore, our fully open-source developments lay the foundation for future multi-scale enhancements of history-matching workflows. The extended abstract for this article is published in the proceedings of the Fifth EAGE Conference on Petroleum Geostatistics (November 27-30, 2023; Porto, Portugal).

Introduction
Generative Adversarial Networks (GANs) and their derivatives are now the go-to methods in image generation (Rombach et al., 2022). Recognising their potential beyond image processing, our study aims to investigate the effectiveness of GANs in geostatistical applications, focusing on their ability to generate realistic geological models that can significantly aid in resource exploration and management. This ability is particularly crucial in addressing the increasing complexity of geological models and the need for advanced yet efficient tools for accurate interpretation and simulation.

1 NORCE Norwegian Research Centre | 2 Heriot-Watt University
* Corresponding author, E-mail: saly@norceresearch.no
DOI: 10.3997/1365-2397.fb2024014

In recent years, generative artificial intelligence has also become a hot topic within geostatistics and was widely presented at the latest EAGE Conference on Petroleum Geostatistics of 2023 (Escada and Azevedo, 2023; Ovanger et al., 2023; Miele et al., 2023). This highlights a growing interest and the untapped potential of applying AI technologies like GANs in geosciences. Once trained, a GAN maps basic numerical inputs (vectors) into complex geological patterns through convolutional layers, which gradually increase the resolution and level of detail of an image or a geomodel as they pass through the network layers. This approach outperforms classical geostatistical simulators such as SNESIM in accuracy and speed (Chan and Elsheikh, 2019). Ensemble history matching can exploit the model parametrisation learnt by a GAN to effectively condition geomodel realisations to data (Canchumuni et al., 2021). The resulting ensemble-based subsurface representation combines the geological knowledge embedded in GANs with conditioning to the observed data. Thus, it possesses powerful predictive abilities that can improve various decision-support workflows. Many of the latest GAN methods can themselves learn conditioning to data during training. This extends data integration possibilities beyond the ensemble conditioning methods. Here is a brief account of noteworthy developments. Mohd Razak and Jafarpour (2022) explored the application of a 'Conditional' GAN, a network that can be conditioned to data: in this case, to labels representing subsurface flow responses. Laloy et al. (2021) introduced a 'vec2pix' algorithm building on a convolutional GAN. It produces subsurface realisations conditioned to crosswell time-series data (vec) for recovering 2D rock properties on a pixelated grid (pix). Zhang et al. (2022) combined GANs with auxiliary networks for a low-dimensional representation of data (called autoencoders) that learn to generate parts of the GAN's input vector, thus conditioning the GAN's output to some desired geological properties. The authors also combined the resulting model with ensemble-based data assimilation for history matching. Abdellatif et al. (2022) introduced a so-called SPatially Adaptive DEnormalisation (SPADE) layer that conditions the GAN's inner parametrisation on a coarse-scale parameter map. Conditional GANs open possibilities for faster and more advanced geostatistical workflows. However, one needs to take


Figure 1 The links and the QR codes of the open-source components used in this paper and the open-source code to reproduce the presented results.

Figure 2 Workflow that uses a trained SPADE-GAN geomodel generator in an ensemble data assimilation of different data types.

care when integrating them into ensemble-based history-matching workflows to ensure that different sources of data are assimilated in a statistically consistent manner. This paper presents an ensemble history-matching workflow that performs coarse-scale conditioning, interpreted as the channel proportion, using the SPADE-GAN geomodel structure. Our numerically efficient and parallelisable workflow integrates 'hard' well data and 'soft' flow data, reducing the uncertainty in an ensemble of the GAN's conditional inputs without retraining. The resulting GAN-geomodel ensemble reproduces configurations consistent with data not seen during training, thus extending the practical applicability of pre-trained conditional GANs in subsurface workflows. Our workflow implementation is a combination of open-source tools for the GAN (Abdellatif et al., 2022), history matching (Lorentzen et al., 2023), reservoir simulation (Rasmussen et al., 2021), and the geostatistical simulator producing the synthetic truths (Hansen et al., 2016). The links and the QR codes to the open-source repository for this paper and the components described above are given in Figure 1.

Workflow
The conditioning workflow has several stages. During the initial stage, the SPADE-GAN generator is trained to reproduce geological patterns and to learn conditioning on the coarse-scale distribution of channels. This paper uses the pre-trained 2D generator from Abdellatif et al. (2022). Equipped with the SPADE-GAN, we


propose to condition the ensemble stepwise: assimilate the 'hard' data from well picks first and the 'soft' flow data second; see Figure 2. The trained GAN model is used in ensemble-based data assimilation to match observations using a version of an iterative ensemble smoother (Chen and Oliver, 2013) that optimises the GAN's input parameters starting from a prior ensemble of realisations. Finally, the ensemble-geomodel statistics over multiple scales form a basis for decision-making for future field development.

SPADE-GAN

GANs are designed to learn from training samples, capturing their underlying patterns. For the case of facies modelling, we want the GAN to learn the realistic distribution of facies from the training images. During adversarial training, a GAN includes a generator G and a discriminator D. The generator creates new samples from a normally distributed latent vector z while the discriminator evaluates their authenticity. This setup creates a dynamic where both networks gradually improve each other, learning to generate and distinguish realistic samples respectively. In the simplest case, this is accomplished by reducing the GAN loss L(G,D):

$L(G,D) = \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]$ (1)

where $\mathbb{E}_{x \sim p_{\mathrm{data}}}[\cdot]$ denotes the expected value when facies realisation x is sampled from the real data $p_{\mathrm{data}}$, while $\mathbb{E}_{z \sim p_z}[\cdot]$ denotes the



expected value when the latent vector z is sampled from the selected input stochastic distribution $p_z$ (Goodfellow et al., 2014).
SPADE-GAN introduces a significant enhancement to the GAN framework by incorporating spatial semantic information into the training process through semantic masks M. In the original image-generation tasks, a mask represented a segmentation of the image into several objects, like trees, sky, grass, etc. (Park et al., 2019). Abdellatif et al. (2022) proposed to use the SPADE-mask layers to represent the spatial distribution of continuous geological properties to adjust the generator's output. Their pre-trained model that we adopt uses a 4×4 coarse channel-density map. The semantic masks are injected into the architecture through a series of Spatially Adaptive Denormalisation (SPADE) layers developed by Park et al. (2019). The SPADE-GAN generator learns to modify each part of the generated image based on the corresponding part of the spatial map. The SPADE-GAN discriminator is enhanced to evaluate the 'true sample' probability in terms of both the geological authenticity and the consistency with the spatial map. As a result, the final generated facies images closely resemble natural geological formations and are consistent with the provided coarse map. The adversarial loss L(G,D) conditioned on the masks can be formally written as

$L(G,D \mid M) = \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log D(x \mid M)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z,M) \mid M))]$ (2)

where D(x|M) is the 'true' sample probability given a semantic mask M, and the generated facies image G(z,M) is a function of both a latent vector z and the semantic mask. We recommend referring to Abdellatif et al. (2022) for full implementation details. The multiscale architectures of the SPADE-GAN networks used are shown in Figure 3. The SPADE-GAN generator and the ResNet-based discriminator are deep CNNs, meaning that the same generative operator is applied to all image parts. This structure ensures that SPADE-GAN can generate consistent facies distribution output for spatial maps not seen during training and reduces the required size of the training data. The pre-trained SPADE-GAN model in our study interprets the masks as the coarse density of channel bodies within the area, making

its outputs explainable with respect to the coarse-scale features. It automatically distributes levees surrounding the channels, mimicking the geological system in the training dataset. In our flow simulation, we merge the channel and levee system to form the preferential flow paths. Future uses of semantic masks can include conditioning the generated realisations to more complex properties such as seismic images.
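For intuition, a minimal PyTorch sketch of a SPADE modulation layer in the spirit of Park et al. (2019) is given below; this is a generic illustration, not the layer implementation used by Abdellatif et al. (2022).

import torch
import torch.nn as nn
import torch.nn.functional as F

class SPADE(nn.Module):
    """Normalise activations, then modulate them with gamma/beta maps predicted from the mask."""
    def __init__(self, channels: int, mask_channels: int = 1, hidden: int = 64):
        super().__init__()
        self.norm = nn.BatchNorm2d(channels, affine=False)           # parameter-free normalisation
        self.shared = nn.Sequential(nn.Conv2d(mask_channels, hidden, 3, padding=1), nn.ReLU())
        self.gamma = nn.Conv2d(hidden, channels, 3, padding=1)
        self.beta = nn.Conv2d(hidden, channels, 3, padding=1)

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        mask = F.interpolate(mask, size=x.shape[2:], mode="nearest")  # resize coarse map to feature size
        h = self.shared(mask)
        return self.norm(x) * (1 + self.gamma(h)) + self.beta(h)

# Example: modulate a 32-channel feature map with a 4x4 channel-proportion map.
feat = torch.randn(1, 32, 16, 16)
prop_map = torch.rand(1, 1, 4, 4)
out = SPADE(32)(feat, prop_map)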

Ensemble-based history-matching methods
In this paper, we use two ensemble smoother methods from the open-source PET library (Figure 1): the Ensemble Randomised Maximum Likelihood (EnRML) method (Chen and Oliver, 2013) and the Ensemble Smoother with Multiple Data Assimilation (ES-MDA) (Emerick and Reynolds, 2013). The EnRML and the ES-MDA represent a significant advancement in geosciences, particularly in reservoir management and simulation. The techniques merge available data with reservoir simulation models, leading to more accurate predictions of reservoir behaviour. By integrating dynamic data with static data from geological studies and well logs, ensemble-based methods play a crucial role in reducing uncertainties related to reservoir parameters. We will consider the channel-proportion coarse map M as our unknown parameter in the experiments.
At the core of the methods is the 'Analysis' step. In this step, the ensemble of model parameters is updated to minimise a statistical misfit with the observed measurements,

$O(M) = \| M - M^{pr} \|_{C_M} + \| g(M) - d \|_{C_d}$ (3)

where $\| \cdot \|_{C} = (\cdot)^{T} C^{-1} (\cdot)$, (4)

$M^{pr}$ is the prior model, $C_M$ is the prior covariance matrix of M, $C_d$ is the measurement error covariance matrix, g(·) is the observation operator, mapping the model state space to the observation space (here, the GAN-geomodel generator and the reservoir flow model), and d denotes the observed data (here, facies type and historical flow rates). The approach involves conditioning the model variables to realisations of the measurements by formulating a minimisation problem.

Figure 3 The architectures of the SPADE-GAN generator and discriminator used during training. The training of the two networks is performed simultaneously as a competitive ‘game’. The SPADE-GAN generator is conditioned to the coarse-scale proportion map after the training.


In the EnRML, this problem is solved using the iterative Gauss-Newton method with a 'Levenberg-Marquardt' modification. The update equation incorporates a sensitivity matrix and a state covariance matrix, both derived from the ensemble. The step is complete when the minimisation has converged. One iteration of the EnRML is given as

(5)

where G is the sensitivity matrix, λ is the Levenberg-Marquardt damping factor, ε denotes a noise perturbation, and n denotes the iteration step. The EnRML method, combined with a GAN, has previously been proven to provide accurate samples from the posterior when compared to samples from MCMC (Fossum et al., 2022).
In the ES-MDA, the conditioning is solved by dividing the data assimilation process into multiple cycles. Each cycle assimilates a fraction of the total available observation data using the ES implementation, approximating the state covariance and state-data cross-covariance from the ensemble. The data is divided into fractions by inflating the data covariance matrix with the MDA inflation factor. The gradual assimilation approach avoids overfitting observations and handles non-linearities better. This allows for a more accurate representation of uncertainties in the model, leading to a refined posterior ensemble that better reflects the observed data. The step is complete when a pre-determined set of cycles is performed. One cycle of the ES-MDA is given as

(6)

where $C_{M,g(M)}$ represents the cross-covariance between the state and the predictions, $C_{g(M)}$ denotes the prediction auto-covariance, and $\alpha_n$ is the MDA inflation factor. To ensure statistically correct results, we enforce that the sum of $\alpha_n$ equals one.
The ensemble-based history-matching methods, which integrate diverse data sets and handle uncertainties effectively, are valuable tools for geoscientists and engineers. Their sophisticated approach to data assimilation and model updating makes them indispensable for advanced reservoir simulation and management, ensuring more reliable predictions and efficient resource extraction. Both the EnRML and the ES-MDA will be employed in the numerical investigation.
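As an illustration of one such analysis update, the following is a hedged numpy sketch of a standard ES-MDA cycle in the spirit of Emerick and Reynolds (2013), written by us; the PET library's actual implementation, options and inflation-factor conventions may differ.

import numpy as np

def esmda_cycle(M, D, d_obs, Cd_diag, alpha, rng=np.random.default_rng(0)):
    """One ES-MDA analysis: M (n_par, n_ens) parameters, D (n_obs, n_ens) predicted data."""
    n_obs, n_ens = D.shape
    # Perturb observations with inflated measurement noise (one realisation per ensemble member).
    d_pert = d_obs[:, None] + np.sqrt(alpha * Cd_diag)[:, None] * rng.standard_normal((n_obs, n_ens))
    dM = M - M.mean(axis=1, keepdims=True)
    dD = D - D.mean(axis=1, keepdims=True)
    C_md = dM @ dD.T / (n_ens - 1)                      # state-data cross-covariance
    C_dd = dD @ dD.T / (n_ens - 1)                      # data auto-covariance
    K = C_md @ np.linalg.inv(C_dd + alpha * np.diag(Cd_diag))
    return M + K @ (d_pert - D)                         # updated ensemble

# Toy usage: 16 parameters, 5 observations, 100 members, one cycle with inflation factor 4.
M = np.random.rand(16, 100); D = np.random.rand(5, 100)
M_new = esmda_cycle(M, D, d_obs=np.random.rand(5), Cd_diag=0.01 * np.ones(5), alpha=4.0)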

Examples
To illustrate our approach, we present two synthetic experiments demonstrating ensemble-based history matching using SPADE-GAN geomodels in a channelised reservoir setting. In both scenarios, the 'true' models are generated from the Strebelle training image with the SNESIM implementation from the open-source MPSlib (Figure 1). Such synthetic truths are not part of the GAN training images. Moreover, they are based on a fundamentally different method. For Case 2, we consider a challenging scenario where the coarse-scale property distribution differs from the GAN training data (a single channel toward the top). Given a 4×4 channel-proportion coarse map M and a random noise vector z, the pre-trained SPADE-GAN generates a 64×64 image with three facies: channel, background, and levee. Due to the limitations of classical geostatistical methods, the synthetic truth model only contains two facies. To obtain a fair comparison, we modified the SPADE-GAN output by mapping the levee facies to channel facies in our implementation. While it is possible to condition both M and z to the data, we relied on the in-built conditioning in this study and only used the coarse map M as the unknown parameter. Figure 4 illustrates how the channel-proportion map (Figure 4(a,f)) produces an image with three facies (Figure 4(b,g)) and the resulting modified image with two facies (Figure 4(d,i)). We generated 100 images with the same proportion map and

Figure 4 Two examples of the spatial distribution of geological features: (a) & (f): Realisations of coarse-scale channel-proportion maps from the prior. (b) & (g): GAN realisations with three facies created by the pre-trained model from Abdellatif et al. (2022). (c) & (h): Point-wise probability of channel facies calculated from 100 such three-facies GAN realisations with the same proportion map (matches the coarse-scale proportion map on average). (d) & (i): The same GAN realisations with two facies (as used in the history-matching examples), obtained by merging levee and channel facies. (e) & (j): Point-wise probability of merged-channel facies calculated from 100 such two-facies GAN realisations with the same proportion map. For figures (b), (d), (g), and (i): blue represents background facies, green levee facies, and red channel/merged-channel facies.


Figure 5 The statistical visualisation of the ensemble of 100 prior realisations, where each proportion map generates one channel distribution. (a): Mean coarse-scale channel-proportion map. (b): Point-wise probability of channel facies from the three-facies GAN model. (c): Point-wise probability of merged channel facies from the prior ensemble. This representation is used in the rest of the history-matching study.

Figure 6 Results after conditioning to the ‘hard’ well data. (a): The posterior point-wise probability of channel facies. (b): GAN image of one realisation from the posterior.

different latent vectors to demonstrate that the coarse-scale channel-proportion map provides consistent models (per-pixel channel probabilities are in Figure 4(c,h)). We also calculate the per-pixel probability of channel facies for the two-facies output used in our test cases (Figure 4(e,j)). Figure 4 illustrates that the coarse channel-proportion map M produces consistent images of channels, and the per-pixel probability has a similar structure and values to M. Mapping the levee facies as merged-channel facies preserves the structure, but the channels become wider, giving more flow paths. In this study, we modify an ensemble of channel-proportion maps. We illustrate the prior configuration in Figure 5, including the mean of 100 realisations of the proportion map (5a), the corresponding per-pixel probability of original channel facies (5b), and merged-channel facies (5c). Note that one realisation of the random vector z was used for each realisation of M. Since we had 100 realisations of M, this produced the 100 models of the prior. Figure 5 demonstrates that this representation is consistent between the coarse map, the original channels, and the merged channels. In the numerical studies, we use the black-oil model from the OPM Flow reservoir simulator on 64×64 Cartesian grids. We place two water injection wells on the left edge of the reservoir and two producers on the right edge. Injectors and producers are controlled by bottom-hole pressure: 275 bar for the injectors and 103 bar for the producers. Oil production, water production, and water injection rates were observed every 2500 days for 25,000 days. The measurements were assumed to have uncorrelated errors with a standard deviation of 8 sm3/day. The model's porosity is considered to be constant at 0.2. The permeability is isotropic but differs between the facies: background: 2.5 milliDarcy; channel:

500 milliDarcy. Corey correlation curves and standard PVT tables were used for all three fluid phases.
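A small numpy sketch of the two post-processing steps described above, merging levees into channels and mapping facies to simulator properties, plus the point-wise channel probability visualised in Figures 4 and 5 (our own illustration with assumed facies codes 0 = background, 1 = channel, 2 = levee):

import numpy as np

BACKGROUND, CHANNEL, LEVEE = 0, 1, 2           # assumed facies codes, for illustration only

def merge_levee_into_channel(facies: np.ndarray) -> np.ndarray:
    """Map the three-facies GAN output to the two-facies model used for flow simulation."""
    return np.where(facies == LEVEE, CHANNEL, facies)

def facies_to_permeability(facies: np.ndarray) -> np.ndarray:
    """Isotropic permeability in mD: 2.5 for background, 500 for (merged) channel."""
    return np.where(facies == CHANNEL, 500.0, 2.5)

# Point-wise channel probability over an ensemble of realisations (n_ens, ny, nx):
ensemble = np.random.randint(0, 3, size=(100, 64, 64))
merged = merge_levee_into_channel(ensemble)
channel_prob = (merged == CHANNEL).mean(axis=0)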

Generating the Prior Model
Before history matching, we need to describe the prior ensemble. It is reasonable to assume that some preliminary information is available through, e.g., seismic data. In this case, we embed this information in the ensemble of coarse-scale (4×4) channel-proportion maps M. We choose almost horizontal channels and define the prior model as multivariate Gaussian with mean zero and a spherical-variogram covariance with variance 1 and range 3×0.3 grid cells. The Gaussian is transformed to a log-uniform distribution between 0.01 and 0.5 to obtain realistic channel proportions. Figure 5a shows the prior mean of M. We sample the prior as 100 realisations from this distribution. In addition, we generate 100 normally distributed realisations of the latent vector z. The history matching did not modify the z-vectors, ensuring more fine-scale stochastic variations.
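A hedged sketch of this prior sampling follows; it is our reading of the description, and the stand-in covariance and the marginal transform may differ from the exact parameterisation used in the paper.

import numpy as np
from scipy.stats import norm

def sample_prior_maps(n_ens: int = 100, n: int = 4, lo: float = 0.01, hi: float = 0.5, seed: int = 0):
    """Gaussian fields on an n x n grid, transformed marginally to log-uniform(lo, hi)."""
    rng = np.random.default_rng(seed)
    # Simple stationary covariance on the coarse grid (a stand-in for the spherical variogram).
    xy = np.array([(i, j) for i in range(n) for j in range(n)], dtype=float)
    dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    cov = np.exp(-dist / 1.0)                                        # assumed correlation range of ~1 cell
    g = rng.multivariate_normal(np.zeros(n * n), cov, size=n_ens)    # (n_ens, n*n) Gaussian draws
    u = norm.cdf(g)                                                  # map to uniform(0, 1) marginally
    props = np.exp(np.log(lo) + u * (np.log(hi) - np.log(lo)))       # log-uniform(lo, hi)
    return props.reshape(n_ens, n, n)

prior_maps = sample_prior_maps()        # 100 coarse channel-proportion maps in [0.01, 0.5]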

Matching hard data
In the first experiment, we condition the channel-proportion map using 'hard' data, i.e., adjusting the models to align with concrete data from the drilling of wells. In this case, the data contains facies information acquired while drilling the wells. The PRO1, INJ1, and INJ2 wells are drilled in the background facies, while PRO2 is in the channels. We indicate the facies type in the wells by the colour of the well name in Figure 6. We define the measurement error as Gaussian with mean zero and an uncorrelated variance of 10^-8. The EnRML method was employed, and convergence was obtained after seven iterations.


Since the channel/background map is a binary field, it is straightforward to calculate the point-wise probability estimates of either facies. Figure 6 compares the probability of the channel facies for the prior model and the posterior model conditioned to the 'hard' data. As expected, the posterior matches the 'hard' data and has a high probability of channel facies in well PRO2 and a low probability for the remaining wells. At the same time, each realisation maintains a realistic channel structure on the fine scale; see Figure 6(b).

Matching soft data - case 1

In this experiment, we utilise the 100 posterior realisations obtained by conditioning to 'hard' data in the previous experiment as the prior ensemble. We then assimilate the soft data of the rate observations generated using the data-generating model for Case 1; see Figure 7(a). The EnRML method was used, and convergence was obtained after three iterations. Comparing the soft-data assimilation results in Figure 7(b) to the hard-data-based prior (Figure 6), we see little change to the ensemble. However, one can see an increased probability of a connected third (central) channel in the reservoir, which coincides with the channel distribution from the synthetic truth. We note that the variability of latent vectors zi (unchanged during history matching) between individual realisations maintains geological realism and diversity on the fine scale; see Figure 7(c). Statistics from the ensemble of predicted water production rates (most sensitive to channelling) from the prior and posterior are shown in Figure 8. The plots show that the update aligns the models more with the data. The well PRO1 has a significant reduction in the spread of produced water. This observation correlates well with the information from the channel distribution. Water from INJ1 is moved along the central channel towards PRO2, efficiently reducing the water in PRO1.

Matching soft data - case 2

In the final experiment, we again utilise the 100 hard-data-conditioned realisations (Figure 6(a)) as the prior. We then assimilate the soft data of the flow-rate observations generated using the data-generating model for Case 2 (Figure 9(a)). For this experiment, we used the ES-MDA with 10 MDA cycles. The point-wise probability for channel facies and one realisation of the ensemble are shown in Figures 9(b) and (c), respectively. In Case 2, the synthetic truth lacks channels in its lower half, which drastically differs from the training configurations and the mean of the prior ensemble. This configuration of the reservoir results in no water production from well PRO1. The update introduces visible new information to the ensemble (see the differences between Figure 6(a) and Figure 9(b)). The update reduces the probability of a connected lower and middle channel to almost zero. Hence, water cannot move efficiently from INJ1 to PRO1, preventing water production in well PRO1, which is consistent with the data-generating model (Figure 9(a)).

Conclusions
This paper introduced conditional GAN geomodels into an ensemble-based reservoir history-matching workflow with 'hard' well data and a 'soft' time series of flow data. Our research successfully demonstrated the integration of new open-source conditional SPADE-GAN geomodels with an ensemble-based history-matching workflow based on our open-source PET library. The workflow matched the historical data by conditioning the coarse-scale channel-proportion map of the models. It reduced the model uncertainty while preserving the variability and geological consistency of the realisations on the fine scale. Notably, in our two test cases, we observed that this integrated approach was effective at identifying geological patterns beyond

Figure 7 Results after conditioning to the soft flow data (Case 1). (a) Channel distribution for the data-generating model. (b) The posterior point-wise probability of channel facies. (c) GAN image for one realisation.

Figure 8 Initial and final Well Water Production Rate (WWPR) forecasts for Case 1. (a) Results from well PRO1; (b) results from well PRO2.




Figure 9 Results after conditioning to the soft flow data (Case 2). (a) Channel distribution for the data-generating model. (b) The posterior point-wise probability of channel facies. (c) GAN image for one realisation.

Hence, the SPADE-GAN realisations' additional coarse-scale conditioning, interpreted as the channel proportion, can significantly extend the GAN's predictive ability. This feature is vital in low-data environments where the pre-trained GAN does capture realistic geology, but the GAN geomodels do not agree with the observed data. In this study, we only updated the coarse conditional map; a complete multiscale workflow that also updates the GAN latent vector remains for future work.

Acknowledgments
This work is part of the project DISTINGUISH (Decision support using neural networks to predict geological uncertainties when geosteering) funded by Aker BP, Equinor, and the Research Council of Norway (RCN PETROMAKS2 project no. 344236).

References
Abdellatif, A., Elsheikh, A.H., Busby, D. and Berthet, P. [2022] Generation of non-stationary stochastic fields using Generative Adversarial Networks. arXiv preprint arXiv:2205.05469.
Canchumuni, S.W., Castro, J.D., Potratz, J., Emerick, A.A. and Pacheco, M.A.C. [2021] Recent developments combining ensemble smoother and deep generative networks for facies history matching. Computational Geosciences, 25, 433-466.
Chan, S. and Elsheikh, A.H. [2019] Parametric generation of conditional geological realisations using generative neural networks. Computational Geosciences, 23, 925-952.
Chen, Y. and Oliver, D.S. [2013] Levenberg-Marquardt forms of the iterative ensemble smoother for efficient history matching and uncertainty quantification. Computational Geosciences, 17, 689-703.
Emerick, A.A. and Reynolds, A.C. [2013] Ensemble smoother with multiple data assimilation. Computers & Geosciences, 55, 3-15.
Escada, C. and Azevedo, L. [2023] Facies Model Generation with Deep Variational Autoencoders. In: Fifth EAGE Conference on Petroleum Geostatistics, 2023. European Association of Geoscientists & Engineers, 1-5.
Fossum, K., Alyaev, S., Tveranger, J. and Elsheikh, A.H. [2022] Verification of a real-time ensemble-based method for updating earth model based on GAN. Journal of Computational Science, 65, 101876.
Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A. and Bengio, Y. [2014] Generative adversarial nets. Advances in Neural Information Processing Systems, 27.
Hansen, T.M., Vu, L.T. and Bach, T. [2016] MPSLIB: A C++ class for sequential simulation of multiple-point statistical models. SoftwareX, 5, 127-133.
Laloy, E., Linde, N. and Jacques, D. [2021] Approaching geoscientific inverse problems with vector-to-image domain transfer networks. Advances in Water Resources, 152, 103917.
Lorentzen, R., Bhakta, T., Fossum, K., Haugen, J.A., Lie, E.O., Ndingwan, A.O. and Straith, K.R. [2023] Ensemble-based history matching of the Edvard Grieg field using 4D seismic data. Submitted to Computational Geosciences.
Miele, R., Levy, S., Linde, N. and Azevedo, L. [2023] Deep Generative Networks for Multivariate Fullstack Seismic Inversion Using Inverse Autoregressive Flows. In: Fifth EAGE Conference on Petroleum Geostatistics, 2023. European Association of Geoscientists & Engineers, 1-5.
Mohd Razak, S. and Jafarpour, B. [2022] Conditioning generative adversarial networks on nonlinear data for subsurface flow model calibration and uncertainty quantification. Computational Geosciences, 26(1), 29-52.
Ovanger, O., Lee, D., Skauvold, J., Hauge, R., Eidsvik, J. and Aune, E. [2023] Using Latent Diffusion Models for Generating Conditional Facies Realisations: a Study Against Truncated Gaussian Random Fields. In: Fifth EAGE Conference on Petroleum Geostatistics, 2023. European Association of Geoscientists & Engineers, 1-5.
Park, T., Liu, M.Y., Wang, T.C. and Zhu, J.Y. [2019] Semantic image synthesis with spatially-adaptive normalization. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2337-2346.
Rasmussen, A.F., Sandve, T.H., Bao, K., Lauser, A., Hove, J., Skaflestad, B., Klöfkorn, R., Blatt, M., Rustad, A.B., Sævareid, O., Lie, K.A. and Thune, A. [2021] The Open Porous Media Flow reservoir simulator. Computers & Mathematics with Applications, 81 (Development and Application of Open-source Software for Problems with Numerical PDEs), 159-185.
Rombach, R., Blattmann, A., Lorenz, D., Esser, P. and Ommer, B. [2022] High-resolution image synthesis with latent diffusion models. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 10684-10695.
Zhang, K., Yu, H.Q., Ma, X.P., Zhang, J.D., Wang, J., Yao, C.J., Yang, Y.F., Sun, H. and Yao, J. [2022] Multi-source information fused generative adversarial network model and data assimilation-based history matching for reservoir with complex geologies. Petroleum Science, 19(2), 707-719.



WWW.EARTHDOC.ORG

Over 75,000 papers and articles covering the main fields of study within geoscience, engineering and energy

All event papers from EAGE Events and partner associations held around the world

Works on all devices, IP address authenticated access from your smart phone, tablet, laptop or PC

Access Cutting-Edge Journals: Full access to leading scientific journals, including EAGE's flagship magazine First Break, featuring top-quality submitted and commissioned research articles, news, special monthly topics, industry features and company profiles

Connect your institution with cutting edge geoscience, engineering and energy content Email earthdoc@eage.org to start your FREE trial today!


SPECIAL TOPIC: DIGITALIZATION / MACHINE LEARNING

Geomechanical parameter derivation while drilling in unconventional plays: a combination of surface drilling data, gamma ray data, and machine learning techniques

Marvee Dela Resma1* and Ivo Colombo1 illustrate a field application of a robust and reliable approach for assessing geomechanical parameters during drilling operations or as a post-mortem analysis.

Abstract
This study illustrates a field application of a robust and reliable approach for assessing geomechanical parameters during drilling operations or as a post-mortem analysis. The methodology leverages surface logging drilling data (such as Rate of Penetration, Rotation Per Minute, Weight on Bit, Torque, Standpipe Pressure, and Flow rates) along with well log data (including Sonic log, Bulk Density log, and Gamma Ray log) as input. The model is grounded on the integration of various data processing techniques and machine learning algorithms (encompassing Multiple Linear Regression, Support Vector Regression, Random Forest, Artificial Neural Network, and XGBoost), ensuring a comprehensive and accurate evaluation of geomechanical parameters in the area under evaluation. The methodology is applied to a dataset of six wells drilled in the same geological units of an area located towards the eastern limit of the Neuquén Basin, north of the 'Dorsal de Huincul' (González et al., 2016). The results presented here for Young's Modulus, Density, and UCS provide evidence of its successful use in a challenging geological context such as the unconventional plays of Argentina.

Introduction
Accurate characterisation of geomechanical parameters plays a pivotal role in tackling various challenges related to drilling optimisation and reservoir characterisation. These challenges include ensuring borehole stability, setting an effective hydraulic fracturing programme, and facilitating precise reservoir simulation (Martinelli et al., 2021 and references therein). Traditionally, hydrocarbon reservoirs rely on core samples and/or Sonic and Density logs for estimating these parameters. While the analysis of cores stands as the most precise method for characterising geomechanical parameters, the constraints of time, cost, and associated risk compel companies to restrict its usage. In addition to time and cost considerations, the spatial distribution of cores must be considered, as their analysis provides only localised information.

1 Geolog Technologies
* Corresponding author, E-mail: m.delaresma@geolog.com

Alternatively, well logs offer continuous measurements for deriving geomechanical parameters. Sonic logs provide both compressional and shear wave velocities, while Density logs provide the bulk density of the rocks. Conversion of these data into geomechanical parameters typically involves the application of site-specific empirical equations (Castagna et al., 1985; Goodman, 1989; Lashkaripour, 2002; Ohen, 2003; Ameen et al., 2009; Asfari et al., 2010; Elkatatny et al., 2018). Nevertheless, the acquisition of these datasets comes at a significant expense, requiring careful cost-benefit consideration. Moreover, they introduce risks related to borehole stability over time and the potential for tools, including those containing radioactive sources, to become stuck or, worse, lost in the hole. These challenges become particularly pronounced in areas of instability and in extended high-angle sections, further skewing the cost-benefit analysis away from data collection. To address these challenges, numerous methods have been introduced to compute geomechanical parameters during drilling using surface logging data, commonly accessible for all wells, and Gamma-ray logs (either LWD/MWD or SGR), widely available in drilling operations. Anemangely et al. (2019) propose the use of Mechanical Specific Energy (MSE) in conjunction with an artificial neural network for the prediction of Poisson's Ratio, Uniaxial Compressive Strength (UCS), and Confined Compressive Strength (CCS). Similarly, Hamada et al. (2018) rely on the Equivalent Strength (EST) parameter, utilising torque, drilling depth, and bit characteristics to evaluate rock mechanical properties. Jamshidi et al. (2013) have devised a method that integrates an artificial neural network with drilling parameters, tectonic stress, and bit characteristics for predicting Young's Modulus and UCS during drilling. Additionally, other authors have introduced ROP models or inverted ROP models (Galle and Woods, 1960; Bourgoyne and Young, 1974; Warren, 1987; Hareland and Hoberock, 1993; Rampersad et al., 1994; Hareland and Nygård, 2007) capable of predicting the rock's UCS by incorporating key drilling parameters into their approaches.
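For reference, the conventional log-based route mentioned above reduces to standard isotropic dynamic-elasticity relations; the sketch below applies them to illustrative values. It is not the calibrated, site-specific workflow used by the authors, and the UCS correlation at the end is a placeholder for whichever local empirical relation applies.

```python
import numpy as np

def dynamic_moduli(vp, vs, rho):
    """Dynamic elastic moduli from sonic velocities and bulk density.
    vp, vs in m/s; rho in kg/m^3. Returns Young's modulus (GPa) and Poisson's ratio."""
    vp, vs, rho = map(np.asarray, (vp, vs, rho))
    nu = (vp**2 - 2.0 * vs**2) / (2.0 * (vp**2 - vs**2))   # Poisson's ratio
    g = rho * vs**2                                        # shear modulus, Pa
    e_dyn = 2.0 * g * (1.0 + nu)                           # Young's modulus, Pa
    return e_dyn / 1e9, nu

# Example log samples (illustrative values only)
vp = np.array([4200.0, 3800.0])    # m/s
vs = np.array([2400.0, 2100.0])    # m/s
rho = np.array([2550.0, 2480.0])   # kg/m^3

e_gpa, nu = dynamic_moduli(vp, vs, rho)

# A UCS estimate would then come from a site-specific empirical correlation,
# e.g. UCS = a * E_dyn**b with locally calibrated a, b (placeholder values below).
a, b = 2.0, 1.0
ucs_mpa = a * e_gpa**b
```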

DOI: 10.3997/1365-2397.fb2024015




A proprietary methodology is presented here to derive geomechanical parameters while drilling, concurrently mitigating the risks and costs associated with conventional approaches. This approach, designed to address limitations highlighted in the literature, involves the combination of surface logging data (Rate of Penetration, Torque, Weight on Bit, Standpipe Pressure, Rotation Per Minute, Flow rates), well log data (Sonic log, Bulk Density log, Gamma Ray log), and multiple machine learning algorithms (Support Vector Regression, Random Forest, Artificial Neural Network, XGBoost). In the following, this methodology is illustrated, and the results of its application in an unconventional play for Young's Modulus, Density, and Unconfined Compressive Strength are presented.

Methodology
The methodology, outlined in Figure 1, adopts a two-phase approach: model building and model application. The objective is to derive geomechanical data for wells lacking the petrophysical logs necessary for conventional geomechanical dataset generation. This is achieved by leveraging datasets from previous wells where such data are available. During the first phase, the model undergoes training and validation using a dedicated dataset from offset wells. This dataset includes Sonic, Density, and Gamma-ray logs, drilling parameters, and XRF analysis on cuttings (which is optional in our approach). Additionally, empirical relationships between well logs and geomechanical parameters are incorporated. In the second phase, the trained model is deployed to predict geomechanical parameters in a test well. The dataset for this new well includes only drilling parameters and a Gamma-ray log. Alternatively, a Synthetic Gamma-ray log, derived from XRF data, can serve as a backup or substitute for the downhole logging tool in both the first and second phases.
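A minimal sketch of the two phases is given below, using XGBoost as one of the candidate algorithms listed above; the column names, synthetic data and hyperparameters are invented for illustration and are not the authors' schema or tuned settings.

```python
import numpy as np
import pandas as pd
from xgboost import XGBRegressor

# Assumed layout: one row per depth sample, with a well identifier, surface
# drilling parameters, gamma ray, and the target parameter computed from logs.
rng = np.random.default_rng(0)
n = 600
df = pd.DataFrame({
    "well": rng.choice([f"WELL_{i}" for i in range(1, 7)], size=n),
    "ROP": rng.uniform(5, 50, n), "RPM": rng.uniform(60, 180, n),
    "WOB": rng.uniform(2, 25, n), "TORQUE": rng.uniform(1, 20, n),
    "SPP": rng.uniform(100, 300, n), "GR": rng.uniform(20, 150, n),
})
df["YOUNGS_MODULUS"] = 10 + 0.3 * df["WOB"] + 0.05 * df["GR"] + rng.normal(0, 1, n)

features = ["ROP", "RPM", "WOB", "TORQUE", "SPP", "GR"]
blind_well = "WELL_6"

# Phase 1: model building on the offset wells
train = df[df["well"] != blind_well]
model = XGBRegressor(n_estimators=300, learning_rate=0.05, max_depth=5)
model.fit(train[features], train["YOUNGS_MODULUS"])

# Phase 2: model application to the blind well (drilling data and gamma ray only)
test = df[df["well"] == blind_well]
predicted_log = model.predict(test[features])
```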

Two metrics are used to evaluate the performance of the model: MAPE (Mean Absolute Percentage Error), which measures the error relative to the magnitude of the parameter being predicted, and the correlation coefficient, which describes how closely the predicted trend follows the actual one.

Example of application
The methodology is here applied to a challenging geological context, specifically the unconventional plays of Argentina. An unconventional play refers to hydrocarbon reservoirs that do not exhibit the conventional characteristics of easily flowing oil or gas. These reservoirs typically have low permeability and require advanced extraction methods, such as hydraulic fracturing (fracking) and horizontal drilling, to recover hydrocarbons effectively, introducing significant risk- and cost-related issues. The dataset is composed of six wells drilled in the Neuquén Basin, located in Argentina on the eastern side of the Andes chain (Howell et al., 2005). The geological context involves a Late Triassic to Early Cenozoic succession comprising continental and marine siliciclastics, carbonates, and evaporites. The wells are located in the Lindero Atravesado block. The studied area is positioned at the eastern boundary of the Neuquén Basin, to the north of the 'Dorsal de Huincul' (González et al., 2016). Within this region, the sedimentary setting of the Vaca Muerta Formation exhibits a basin-to-ramp configuration of mixed composition, transitioning from an external ramp at the base to a middle ramp towards the top (Mitchum and Uliana, 1986; Legarreta and Uliana, 1991). The primary drilling targets are the formations between the Centenario and the Lotena (Centenario, Quintuco, Vaca Muerta, Catriel, Sas. Blancas, and Lotena), as depicted in Figure 2.

Figure 1 Methodology involving the input and output data for the model-building phase and the model application phase.

Figure 2 Stratigraphy of the wells used to train and test the machine learning model.




Figure 3 Comparison between the machine learning-derived Young’s modulus, Density and UCS (red curve) and the conventionally calculated values (blue curve) associated with.

Parameter       | R2   | MAE   | MAPE   | Correlation
Young's Modulus | 0.72 | 4.388 | 16.78% | 0.849
Density         | 0.63 | 0.045 | 1.84%  | 0.839
UCS             | 0.74 | 3.718 | 9.20%  | 0.864

The Vaca Muerta Formation serves as both the principal source rock in the Neuquén Basin and a primary target. Thanks to advancements in technology and successful production from this shale-type reservoir, the Vaca Muerta has emerged as a prominent unconventional resource drilling target, distinguished by its considerable extent, thickness, and high levels of total organic carbon (González et al., 2016).

Results
The dataset, comprising six wells, was partitioned into a training set of five wells and one validation (blind test) well. The model used drilling parameters, gamma ray values and computed geomechanical parameters for training. Some of the results for Young's Modulus, Density and UCS are presented in the logs (Figure 3). Various machine learning algorithms were tested to determine their effectiveness, and among them XGBoost emerged as the optimal choice for this application. The model's results demonstrate its ability to generate synthetic values for various geomechanical parameters, with differing levels of error. The trend observed in the predicted values aligns notably well with the actual or computed parameters. This closeness is evident in the high positive correlation (more than 0.8) indicated by the correlation coefficient.
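The four metrics reported in Table 1 can be reproduced from any pair of actual and predicted curves; a minimal sketch, with synthetic arrays standing in for the blind-well logs:

```python
import numpy as np

def evaluate(y_true, y_pred):
    """Return R2, MAE, MAPE (%) and Pearson correlation for a predicted log."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    resid = y_true - y_pred
    r2 = 1.0 - np.sum(resid**2) / np.sum((y_true - y_true.mean())**2)
    mae = np.mean(np.abs(resid))
    mape = 100.0 * np.mean(np.abs(resid / y_true))
    corr = np.corrcoef(y_true, y_pred)[0, 1]
    return r2, mae, mape, corr

# Illustrative use with a synthetic Young's modulus trace (GPa)
rng = np.random.default_rng(1)
actual = 30.0 + 5.0 * np.sin(np.linspace(0, 6, 200)) + rng.normal(0, 1.0, 200)
predicted = actual + rng.normal(0, 2.0, 200)
print(evaluate(actual, predicted))
```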

Table 1 Numerical comparison of the metrics used for the specific geomechanical parameters predicted.

Meanwhile, Table 1 presents the results quantitatively for the three selected parameters in terms of four metrics: the r-squared coefficient (R2), the Mean Absolute Error (MAE), the Mean Absolute Percentage Error (MAPE), and the correlation coefficient. In general, these results highlight the model's applicability in unconventional settings, contingent upon the intended purpose of parameter prediction. Depending on the specific objective, the focus may be directed toward exact values or toward understanding the general trends and variations along the well. The model remains open to further enhancement through activities such as feature engineering, integrating spatial information, fine-tuning hyperparameters, and implementing models tailored to specific formations.

Conclusions
Accurate derivation of geomechanical parameters is crucial for various applications, including drilling optimisation and reservoir characterisation. However, this process is often linked to significant economic and technical risks. To mitigate certain risks and address limitations outlined in the literature, a novel approach has been developed and is presented here with a specific application in unconventional reservoirs.




The results showcase the methodology's efficacy in deriving geomechanical parameters (Young's Modulus, Density and Unconfined Compressive Strength) by integrating surface drilling parameters, well log data, and machine learning algorithms. This machine learning-driven solution proves to be an efficient means of obtaining geomechanical parameters in post-drilling phases, thereby reducing the costs and operational risks associated with acquiring Sonic and Density logs in newly drilled wells. The methodology offers an alternative to downhole tools in situations where their use is not feasible or financially viable. Additionally, it represents a reliable and cost-effective backup to downhole tools when they are available. Moreover, it can be applied retrospectively to legacy data, enhancing the optimisation of future drilling efforts.

References
Ameen, M.S., Smart, B.G., Somerville, J.M., Hammilton, S. and Naji, N.A. [2009]. Predicting rock mechanical properties of carbonates from wireline logs (A case study: Arab-D Reservoir, Ghawar Field, Saudi Arabia), Marine and Petroleum Geology, 26(4), 430-444. DOI: 10.1016/j.marpetgeo.2009.01.017.
Anemangely, M., Ramezanzadeh, A., Amiri, H. and Hoseinpour, S.A. [2019]. Machine learning technique for the prediction of shear wave velocity using petrophysical logs, Journal of Petroleum Science and Engineering, 174, 306-327.
Asfari, M., Amani, M., Razmgir, S.A.M., Karimi, H. and Yousefi, S. [2010]. Using drilling and logging data for developing 1D mechanical earth model for a mature oil field to predict and mitigate wellbore stability challenges, Paper SPE-132187 presented at the CPS/SPE International Oil & Gas Conference and Exhibition, Beijing, China, 8-10 June. DOI: 10.2523/132187-MS.
Bourgoyne, A.T. and Young, F.S. [1974]. A Multiple Regression Approach to Optimal Drilling and Abnormal Pressure Detection, SPE Journal, 14(4), 371-384. DOI: 10.2118/4238-PA.
Castagna, J.P., Batzle, M.L. and Eastwood, R.L. [1985]. Relationships between compressional-wave and shear-wave velocities in clastic silicate rocks, Geophysics, 50(4), 571-581.
Elkatatny, S. [2018]. New Approach to Optimize the Rate of Penetration Using Artificial Neural Network, Arabian Journal for Science and Engineering, 43(11), 6297-6304.
Galle, E.H. and Woods, H.S. [1960]. How to Calculate Bit Weight and Rotary Speed for Lowest Cost Drilling, Oil and Gas Journal.
González, G., Vallejo, M.D., Kietzmann, D., Marchal, D., Desjardins, P., Tomassini, F.G., Rivarola, L.G. and Domínguez, R.F. [2016]. Transecta regional de la Formación Vaca Muerta, IAPG.
Goodman, R.E. [1989]. Introduction to Rock Mechanics, 2nd Edition, John Wiley & Sons Ltd., New York.
Hamada, Y., Kitamura, M., Yamada, Y., Sanada, Y., Sugihara, T., Saito, S., Moe, K. and Hirose, T. [2018]. Continuous depth profile of the rock strength in the Nankai accretionary prism based on drilling performance parameters, Scientific Reports, 8(1), 1-9.
Hareland, G. and Hoberock, L.L. [1993]. Use of Drilling Parameters to Predict In-Situ Stress Bounds, Paper SPE-25727 presented at the SPE/IADC Drilling Conference, Amsterdam, Netherlands, 22-25 February. DOI: 10.2118/25727-MS.
Hareland, G. and Nygård, R. [2007]. Calculating Unconfined Rock Strength from Drilling Data, Paper ARMA-07-214 presented at the 1st Canada-US Rock Mechanics Symposium, Vancouver, Canada, 27-31 May.
Howell, J.A., Schwarz, E., Spalletti, L.A. and Veiga, G.D. [2005]. The Neuquén Basin: An Overview, Geological Society, London, Special Publications, 252, 1-14. http://dx.doi.org/10.1144/GSL.SP.2005.252.01.01.
Jamshidi, E., Arabjamaloei, R., Hashemi, A., Ekramzadeh, M.A. and Amani, M. [2013]. Real-Time Estimation of Elastic Properties of Formation Rocks Based on Drilling Data by Using an Artificial Neural Network, Energy Sources, Part A: Recovery, Utilization, and Environmental Effects, 35(4), 337-351. DOI: 10.1080/15567036.2010.495971.
Lashkaripour, G.R. [2002]. Predicting mechanical properties of mudrock from index parameters, Bulletin of Engineering Geology and the Environment, 61(1), 73-77. DOI: 10.1007/s100640100116.
Legarreta, L. and Uliana, M.A. [1991]. Jurassic-Cretaceous marine oscillations and geometry of back-arc basin fill, central Argentine Andes. Sedimentation, Tectonics and Eustasy, International Association of Sedimentologists, Special Publication, 12, 368 pp.
Martinelli, M., Colombo, I. and Russo, E.R. [2021]. Predict Geomechanical Parameters with Machine Learning Combining Drilling Data and Gamma Ray, SPE Middle East Oil & Gas Show and Conference, Bahrain (event cancelled). DOI: 10.2118/204688-MS.
Mitchum, R.M. and Uliana, M. [1986]. Seismic stratigraphy of carbonate depositional sequences, Upper Jurassic-Lower Cretaceous, Neuquén Basin, Argentina. In: Seismic Stratigraphy II: An Integrated Approach to Hydrocarbon Analysis, American Association of Petroleum Geologists, Memoir 39, 255-283.
Ohen, H.A. [2003]. Calibrated Wireline Mechanical Rock Properties Model for Predicting and Preventing Wellbore Collapse and Sanding, Paper SPE-82236-MS presented at the SPE European Formation Damage Conference, The Hague, Netherlands, 13-14 May. DOI: 10.2118/82236-MS.
Rampersad, P.R., Hareland, G. and Boonyapaluk, P. [1994]. Drilling Optimization Using Drilling Data and Available Technology, Paper SPE-27034-MS presented at the SPE Latin America/Caribbean Petroleum Engineering Conference, Buenos Aires, Argentina, 27-29 April. DOI: 10.2118/27034-MS.
Warren, T.M. [1987]. Penetration Rate Performance of Roller Cone Bits, SPE Drilling Engineering, 2(1), 9-18.


SPECIAL TOPIC: DIGITALIZATION / MACHINE LEARNING

Adopting technology to revolutionise and accelerate the flow of seismic data from sensor to customer

Erik Ewig1*, John Brittan1, Cerys James1, John Oluf Brodersen1 and Sverre Olsen1 present the PGS journey in embracing the cloud and digitalisation to ensure the company remains highly competitive.

Introduction
Transforming a traditional industrial player into a cloud-native, data-driven energy data company is a monumental journey, one that PGS embarked on in 2020. Faced with ageing on-premises equipment that required substantial capital expenditure, the company found itself at a crossroads, especially coming out of a challenging economic downturn. It was in this context that a pivotal decision was made: to leverage the need for equipment renewal as an opportunity to look at our entire seismic data flow (from sensor to customer) and overhaul our IT and high-performance computing (HPC) landscape. The primary goal was to create a sustainable, cost-effective, and scalable solution that would not only future-proof the company but also deliver added value to our customers. We quickly realised that this was more than just a technology project; it required the whole organisation to adapt to significant changes for both people and processes. At the onset of 2021, PGS consolidated its technological and digitalisation resources and projects into a unified organisation, empowered with the mandate and capabilities essential for executing this transformative initiative. The ultimate objective was articulated as the establishment of a sustainable and future-proof data ecosystem, ensuring the optimal flow of high-quality subsurface data with maximum efficiency, economy, and minimal environmental impact.

The initiative was guided by the key principle of 'cloud first', i.e., always evaluating a cloud-based solution first to see whether it is technically and commercially viable. This gave us the opportunity to use cloud-native technologies and to build on the inherently superior security, flexibility and scalability of the cloud. At the same time, we wanted to minimise data duplication, eliminate data handover points, and facilitate seamless and swift customer access to the subsurface information. In the initial stages, we outlined key focus areas related to seismic data in the cloud. The most intricate was the migration of our on-premises High-Performance Computing (HPC) workload from Cray computers to the Google Cloud Platform (GCP). Simultaneously, we embarked on constructing a cloud-based data delivery environment for our multi-client seismic data library. A third initiative in our digital transformation journey was to use operational data from our vessels to improve efficiency and asset maintenance. Concurrently, we also lifted all our enterprise applications into the cloud, taking full advantage of this technology. Upon successful validation of the maturity of low earth orbit satellite technology at the beginning of 2023, we introduced a fourth focus area concentrating on reducing operational costs through satellite utilisation. The projects have had a transformational impact on the organisation and have touched all parts of PGS.

Figure 1 Traditional data flow (top) vs. optimised data flow (bottom). The traditional flow contains many data duplications and physical data handover points. The PGS optimised data flow transfers the subsurface information as soon as possible via satellite into the cloud for QC, processing and final delivery.

1 PGS
* Corresponding author, E-mail: Erik.Ewig@pgs.com

DOI: 10.3997/1365-2397.fb2024016




We started to work on new business models, opened new business lines, and changed workflows and best practice, and this resulted in reduced turnaround time, a reduced cost base, more satisfied customers and a happier workforce. The following sections provide more insight into the key projects and explore the changes we had to apply to the underlying platforms and infrastructure to make this journey successful.

Revolutionising maritime seismic data transfer from tapes to the cloud
The initial phase of the seismic data journey in PGS generally starts after the seismic data have been recorded by the streamer spread and the onboard recording system. After initial quality control (QC) of both the data and the associated navigation information, the data are stored on the vessel until they are physically shipped to our or our client's onshore facility for further data conditioning and imaging (see Figure 1). Historically, this workflow has remained unchanged due to insufficient satellite bandwidth to facilitate immediate transfer of the seismic data, despite vessels having had internet connectivity via satellite for decades. However, this paradigm has now been disrupted by the arrival of Low Earth Orbit (LEO) satellite technology. LEO satellites provide a compelling alternative to traditional geostationary (GEO) satellites, offering enhanced data transmission capabilities and reduced latency. While we are still in the early stages of exploiting this technology, it is relevant to explore the benefits LEO satellites offer, their potential applications, and the lessons learnt so far.

•  Reduced latency: LEO satellites, operating at a lower altitude than GEO satellites (Figure 2), significantly reduce signal latency, facilitating near-real-time data transfer between seismic vessels and onshore data centres. This real-time data availability enables faster decision-making and operational adjustments.
•  Increased bandwidth: LEO satellites can handle a large volume of data, meeting the high-bandwidth requirements of seismic data transmission. This enhanced bandwidth facilitates the transfer of high-quality seismic data onshore.
•  Reduced operational costs: Seismic data transmission via LEO satellites reduces or even removes the need for on-board data processing, streamlines operations and potentially minimises costs in the long term.

Leveraging the cutting-edge capabilities of LEO satellites, PGS conducted a series of tests throughout 2023. These tests demonstrated the technology's exceptional effectiveness, with PGS successfully transmitting full-integrity 4D seismic data from two surveys directly to the cloud. This eliminated the need for physical data transfer, revolutionising the speed and efficiency of seismic data delivery. The data delivery time was reduced from an average of nine days to just one day (Figure 3). Following these successful tests, and recognising the transformative potential of LEO satellite technology, a strategic decision was taken to implement a base Starlink (LEO) service level alongside our existing VSAT (GEO) service. For the time being, we treat the two services as complementary, choosing between them based on available bandwidth, latency and commercial terms for each use case.
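To put the bandwidth argument in perspective, a rough transfer-time calculation is sketched below; the survey size and link throughputs are assumed round numbers for illustration only, not PGS operational figures.

```python
# Back-of-envelope transfer time for a seismic dataset over a satellite link.
# All numbers below are illustrative assumptions, not PGS operational figures.
dataset_tb = 2.0        # assumed dataset size in terabytes
geo_mbps = 10.0         # assumed usable GEO (VSAT) throughput, Mbit/s
leo_mbps = 200.0        # assumed usable LEO (Starlink-class) throughput, Mbit/s

dataset_megabits = dataset_tb * 1e6 * 8   # TB -> MB -> Mbit (decimal units)

for name, mbps in [("GEO", geo_mbps), ("LEO", leo_mbps)]:
    days = dataset_megabits / mbps / 86400
    print(f"{name}: {days:.1f} days of continuous transfer")
```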

Figure 2 Left: LEO and GEO orbits relative to Earth. Right: Starlink (front) and VSAT dome (back) satellite antennas on board the PGS vessel Vanguard.

Figure 3 Key benefits of sending seismic data via satellite link rather than physical media transport.




Figure 4 The use of GKE (Google Kubernetes Engine) instances in late 2022 and 2023 for our cloud-based imaging. The graph shows the spikiness of usage; at peak points we are using far more capacity than was ever available to us on-premises. However, outside these times we do not pay for any unused capacity.

The successful validation of the new satellite technology provides a huge opportunity for PGS and the seismic industry, both in how we provide clients with the acquired data and in how we enable them to follow a project during the acquisition phase. It has the potential to move some onboard tasks onshore and gives the onboard crew a better connection to the onshore world. The future success of this technology in our industry depends on how we, as individuals, as companies, and as an industry, adapt our traditional processes. Nevertheless, PGS embraces this technological change and is committed to driving sustainable seismic data transfer, ultimately enhancing the efficiency and effectiveness of seismic operations worldwide.

A new dawn for handling huge imaging workloads
After the seismic data have been acquired offshore and moved onshore, the next step is to produce a 3D image of the subsurface. This typically requires multiple steps using different algorithms, often with significant use of high-performance computing (HPC). The HPC workload typically comprises large datasets (>1 TB and <100 TB) but a small number of files within each project (<1000). Before 2019, our imaging software stack, comprising 300+ algorithms, utilised around 200,000 cores of Cray hardware and a 70 PB online parallel file system. Many of the most HPC-intensive codes were written specifically for this on-premises hardware and orchestrated by a homegrown, highly customised job management system, alongside proprietary, self-supported 3D data visualisation. Initially, our cloud approach aimed at a hybrid environment, prioritising on-premises compute and utilising cloud compute for excess capacity via a 'lift and shift' strategy. This meant trying to move on-premises applications to the cloud without redesigning them, and it proved unsuccessful in terms of both user experience and cost. A fundamental reassessment was essential, extending beyond technical considerations to encompass our overall approach to the project. We started to work more closely with our partner Google, reviewing our project set-up as well as our strategic short- and long-term goals. We developed a minimum viable product for one algorithm (3D SRME) on the cloud, integrating it with on-premises infrastructure for an enhanced user experience.

Despite initial scalability and reliability challenges, this experience addressed key usability issues. The onset of the Covid-19 pandemic and the subsequent business retrenchment prompted a shift to prioritise Opex over Capex, leaving us with no option but to pursue a cloud-centric model. The goal was to decommission the Cray platform by mid-2022, focusing on cloud scalability, storage optimisation, and platform independence. This was achieved by focusing our reengineering effort on only the seven to eight key algorithms that account for more than 80% of our compute usage, and by initially keeping bulk storage on-premises. Critical reengineering steps included checkpointing within key algorithms (a generic sketch of this pattern follows the list below), replacing Lustre with Object Store, and adopting Kubernetes and GKE for cloud-native scheduling. The move to Kubernetes proved transformative, giving us the scalability and stability we needed (Figure 4). Here are some key benefits of moving processing to the cloud:
•  Scalability and flexibility: In autumn 2022 we managed to sustain a peak of 1.2 million vCPUs across 12 GKE clusters, three times the capacity we previously had access to, allowing us to compete for larger jobs and run tasks in parallel. We can now routinely push through the 1 million vCPU barrier.
•  Improved turnaround time: Tailoring compute for each large run, enjoying virtually unlimited capacity to run jobs at scale and in parallel, and leveraging the latest software and hardware stack significantly reduces compute time, from weeks to hours in extreme cases.
•  Reduced exposure: Outsourcing tasks such as procuring equipment, maintaining the computer centre and negotiating power supply minimised the risk exposure associated with running an HPC compute centre.
•  Greener compute: Our selected data centre with the preferred cloud provider runs on 100% renewable energy, contributing to our sustainability ambitions.
•  Potential to access new levels of geophysics: Cloud scalability facilitates the development of new algorithms, such as those using elastic wave propagation, by providing short-term access to compute resources that would otherwise be unaffordable for long-term use.
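The checkpoint/restart pattern referenced above can be illustrated generically; the sketch below is not PGS code, and the file layout, limits and function names are invented.

```python
import json
import os

CHECKPOINT = "state/checkpoint.json"   # assumed object-store or local path

def load_checkpoint():
    """Resume from the last completed shot index if a checkpoint exists."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["next_shot"]
    return 0

def save_checkpoint(next_shot):
    os.makedirs(os.path.dirname(CHECKPOINT), exist_ok=True)
    with open(CHECKPOINT, "w") as f:
        json.dump({"next_shot": next_shot}, f)

def migrate_shot(shot):
    """Placeholder for the expensive per-shot imaging kernel."""
    pass

n_shots = 10_000
for shot in range(load_checkpoint(), n_shots):
    migrate_shot(shot)
    if shot % 100 == 0:                 # persist progress so a preempted cloud
        save_checkpoint(shot + 1)       # instance can resume rather than restart
save_checkpoint(n_shots)
```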




Figure 5 High level process flow for making seismic data available in PGS’ cloud solution, providing users fast and secure cloud-based access to contextualised and quality-controlled subsurface data.

The success of this approach can be illustrated with some examples:
•  A large reverse time migration (RTM) on an OBN dataset, which would have taken 6-8 weeks using our entire legacy on-premises compute capacity, took 14 days in the cloud.
•  A multi-azimuth streamer RTM, which would have taken 40 days using our entire compute capacity, was completed in 26 days in the cloud.

A cloud-based solution for seismic storage and accelerated data access
The final leg of our journey involves delivering the final imaged data to our clients, either as part of contracted work or through our multi-client data library for widespread access. Traditionally, this data transfer has occurred via physical media (e.g. tapes), involving multiple phases of global logistics (Figure 1). This created a high risk of data loss, required many points of data duplication and was extremely time-consuming. Managing our seismic data library, a high-value asset for both PGS and data owners, is a complicated task, as the petabyte library spans data acquired decades ago up to data acquired today. This complexity introduces challenges: contract terms evolve, companies merge or acquire others, and regulatory bodies alter rules, requiring vigilance in ownership tracking and licence-honouring by vendors. Creating specific data deliveries, involving the extraction of subsets of data volumes, is a time-consuming process, taking days or weeks and necessitating meticulous planning. This can result in the vendor and end-user storing multiple copies of the data from disparate sources, further complicating matters. Our clients have told us they spend a considerable amount of time on data sourcing and validation due to historical data management intricacies. Recognising an opportunity to revolutionise subsurface data management, PGS, as the owner of one of the world's largest multi-client libraries, started in 2019 to move its multi-client library to the cloud. The goal was to eliminate data duplication, streamline data management through automation, and expedite data delivery. Even though the cloud offers nearly infinite storage capacity with ubiquitous, secure, and granularly controlled access, a mere lift-and-shift approach, as with our processing story above, would not have addressed all the problems, such as consistent data conditioning, client trust, or reduction of data duplication. To tackle these challenges, PGS entered a partnership with Cognite, a company that specialises in industry-scale data-ops challenges, to co-develop a tailored platform for subsurface data, especially multi-client data.


The shared objective was to contextualise data in relation to licences and other metadata, providing PGS with a fully trace-indexed library and giving clients easy, secure and fast access to load or even stream the data into their environment (Figure 5). By late 2022, PGS had implemented fully functioning and cloud-native data ingestion pipelines, automating the reading, conditioning, ingesting, and indexing of post-stack and prestack data from cloud storage. A well-documented connection endpoint (API) and Software Development Kit (SDK) were made available to clients for 24/7 trace and metadata access. This empowered our sales department to explore diverse commercial models with clients for accessing seismic data. Key benefits of enabling cloud-based data storage and access:
•  Improved turnaround time: reducing access time from weeks to less than an hour.
•  Direct access to data in the cloud at scale: data is available via Open Subsurface Data Universe (OSDU)-aligned APIs for further sharing, collaboration, and interoperability within a client's own environment or within the industry. OSDU is becoming an open-source industry standard for storage, retrieval, and sharing.
•  New commercial offers: this solution allows us to condition, ingest and connect our clients' proprietary data, providing a state-of-the-art managed cloud-based solution for the storage, retrieval, and sharing of seismic data. This enables more rapid and informed exploration and development decisions.

Conclusion
Confronted by ageing infrastructure and economic hurdles, PGS seized the opportunity to revolutionise our seismic data flow, information technology, and high-performance computing landscape, transforming into one of the industry's leading cloud-native, data-driven energy companies. A key learning was that the challenge extended beyond mere technical platform upgrades. It necessitated a profound shift in our workforce mindset: embracing new project methodologies and technologies, integrating greater business participation into projects, scrutinising and revising numerous workflows, and persuading stakeholders that the cloud provides substantial business potential. An important role, only briefly mentioned in this article, relates to the IT function within our organisation. Traditionally positioned at the periphery as a service function, it became an integral component of our transformation journey. IT now stands as a driving force behind deploying, operating, and maintaining a comprehensive IT platform that serves the entire organisation.



Embarking on this initiative, we envisioned an exceptionally ambitious future, uncertain of its technical feasibility at the outset. While we haven’t reached all our goals yet, and there’s still work to be done in aligning and engaging with our customer base in this transformative journey, noteworthy milestones have already been attained. The initiative revolutionises the methods by which we obtain, process, and utilise seismic data.

Acknowledgements The journey we have described in this article would not have been possible without the dedication and commitment of a large number of people, both from the Technology and Digitalization group within PGS and across the other business units. We acknowledge that the hard work and innovation from these teams has allowed us to get this far.

ADVERTISEMENT

Publish with EAGE To serve the interests of our members and the wider multidisciplinary geoscience and engineering community, EAGE publishes a range of books and scientific journals in-house. Our extensive, professional marketing network is used to ensure publications get the attention they deserve.

EAGE is continually seeking submissions for both book publishing and articles for our journals. A dedicated and qualified publishing team is available to support your publication at EAGE.

CONTACT OUR PUBLICATIONS DEPARTMENT AT PUBLICATIONS@EAGE.ORG!




30th European Meeting of Environmental and Engineering Geophysics
5th Conference on Geophysics for Mineral Exploration and Mining
4th Conference on Airborne, Drone and Robotic Geophysics

CALLING ALL RESEARCHERS! SUBMIT YOUR PAPER FOR A CHANCE TO JOIN THE TECHNICAL PROGRAMME

8-12 SEPTEMBER 2024 I HELSINKI, FINLAND
WWW.EAGENSG.ORG
#NSG2024


SPECIAL TOPIC: DIGITALIZATION / MACHINE LEARNING

Artificial intelligence and life on Mars

Neil Hodgson1* and Sam Tyler2 compare the job of a geoscientist in 2024 with how it might be in 2050.

Introduction
My name is Sam Tyler. I was a geologist who had an accident in 2050 and woke up in 2024. Am I mad, in a coma or back in time? Whatever's happened, it's like I've landed on a different planet. Back where I come from, the future, an explorer's life is much simpler. ExpCorp collect seismic data using autonomous unmanned vessels operating drone-node armadas. Each mother-ship cum source vessel, protected by sec-units, permanently cruises the oceans, transmitting data to orbital satellite servers, where it is processed on space-cooled quantum computers with near-infinite compute. It is a time when Moore's Law of computer power progression had been found not to have broken down, but to be dreadfully unambitious, thanks to the quantum power storm. Sentient algorithms mix stochastic model builds with geology-free processing to make the data artifact- and noise-free, and positioned accurately in depth. Not that we see the data – it's far too complex for a human. The data is interpreted in depth-frequency space by AI servers that consider all possible geological models, back-stripping through time, accessing literally all the basin data, well data and outcrop models that have ever been peer-reviewed. Then prospects are identified, evaluated, risked and ranked, and the next drilling co-ordinates are spat out to the DrillCorp robots. All that may sound idyllic and utopian because, in terms of getting the job done, it is. Yet here I am in 2024 in my explorationist job, where I have no data 'Feed' to talk to and guide me. I have an infinity of choices to make minute by minute that are not constrained but only intuited. I work in a team of other techs who have spent their lives learning not quite enough, and we are all making stuff up. We have a fraction of the data, and that data is a fraction of the quality, so we distil the geologic stories from our imaginations. I am slow and make mistakes without a Feed and big-data AI to take the choices off my shoulders. I am drowning in fuzzy facts and an unconstrained, imagined 3D vision of a 'most likely model'. On one level it feels like a dystopian-past nightmare. On another level, it feels like I'm alive.

How Asimov was proved right
It seems inconceivable that Isaac Asimov could have anticipated the technology one hundred years before 2050 with his collection of short stories I, Robot, published in 1950. At that time the computer as an idea was only just being born, yet his stories were preceded by at least 30 years by fears of an overly rational robot 'overlord' (such as in Fritz Lang's Metropolis of 1927).

Figure 1 AI is actually happening in 2024 on Mars. Nasa has been driving vehicles on Mars since 1997, beginning with the Sojourner rover mission that used a stop-start system of Earth sending instructions, waiting to see what happened, then sending another instruction. In 2016 Nasa remotely uploaded an AI system onto the Curiosity rover in the Gale Crater on Mars. The system, called AEGIS (Autonomous Exploration for Gathering Increased Science), lets the Curiosity rover choose rocks for the ChemCam analytical laser system to target, determining the chemistry of that rock automatically and mapping the geology without any intervention from Nasa scientists on Earth. The newer rover, Perseverance, in the Jezero Crater of Mars again uses AEGIS with its SuperCam laser and its Remote Micro-Imaging camera to identify and characterise rocks, but it has a significant AI 'wheel-up' on Curiosity. Perseverance has been designed with a wide-field navigation camera, 'Navcam', and a Field Programmable Gate Array (FPGA) that gives fast image processing. The product is Visual Odometry (VO), which tracks the motion of features in images taken as the rover is driving, giving what Nasa calls 'Thinking-While-Driving' capability. The Perseverance rover continuously drives while performing VO, generating a map of terrain geometry and using AI to autonomously blend drive arcs and select the safest and most efficient drive path. Currently, Nasa is just trying to drive as far as it can every day, relying more on Perseverance's self-driving capabilities than ever. That means 'Thinking-While-Driving' is a big deal, even though the rover has a top speed of only 0.1 miles per hour. That doesn't sound very fast, but it equates to 300 m of stone- and sand-filled deathtrap traversed per day.

In 2023 we arguably haven't reached conscious intelligence yet, but with DeepMind's AlphaGo dominance of the game Go, which requires experience, judgment and intuition (or 'gut instinct') to win, we are not far away. The examples of Figures 1 and 2 relate to the robot rovers on Mars today. Here the boundaries of artificial intelligence creating a sense of agency for the machine are being tested. On Earth this tech has manifested in driverless cars that can, like 'Percy' the Perseverance rover, sense, understand and navigate complex systems without human intervention. Nasa needs Percy to be independent because of the harshness and difficulty of putting a human on Mars, keeping them sane for years doing routine work and returning them in one piece. The first of Earth's clever monkey technology to leave the solar system (Voyagers 1 and 2) included a gold LP record – technology that's rarely even used on planet Earth now.

1 Searcher | 2 The Future
* Corresponding author, E-mail: n.hodgson@searcherseismic.com

DOI: 10.3997/1365-2397.fb2024017




Percy's 'independent' AI technology is a gateway to the stars, which will be explored and inhabited by robots. There is a metaphor here for how AI is being developed in 2023 to solve seismic processing problems. AI applications are doing many other useful tasks, from optimisation of products and supply chains to designing molecules in medical research. In the seismic arena, the majority of the effort is focused on improving the final data quality, using big data to jump some of the uncertainties in the physics, and understanding the uncertainties in the collected data so we can apply better approximations and algorithms. The goal is also to find ways to reduce the compute and speed up processing. All of this is good, but the worry is that as machines do the less interesting parts of the workflow quicker and better, humans like Sam Tyler in this narrative will get pushed out. No one knows if this will actually happen. The Redundancy Myth is discussed at length by those in the AI business, and the optimistic view is that more efficient and better seismic processing will simply lead to more seismic processing happening, and more humans in the system, just doing different and perhaps more interesting work. Empirical evidence from a number of industries shows that automation does not cut costs (so that should not be the goal). Yet studies show that automation increases inequality by removing low-status workers and increasing the number of high-status (high-salary) workers.

Figure 2 AI life on Mars today, in 2024. You, the reader, can now help teach an artificial intelligence algorithm to recognise scientific features in images taken by Nasa's Perseverance rover. All machine learning algorithms (currently) require training from humans, and through a project called AI4Mars, Nasa is asking humans to help Percy learn. The resulting algorithm, called SPOC (Soil Property and Object Classification), already identifies features in images from Percy on the Martian surface correctly 98% of the time. Percy's auto-navigation system, known as AutoNav, makes its own 3D maps and uses AI to decide how complex the terrain is and how conservative to be. If it is too conservative, it is too scared to progress; not conservative enough and it could get stuck, which would mean death and the end of the mission. At the same time, onboard AI is being used to plan which of the seven instruments can be operated with the energy supply available at any time, allocating resources and resolving priority conflicts. There are many capabilities that robots like Percy have that humans do not. For instance, human eyes on Mars can only see dimly lit, dusty red terrain stretching to the horizon, whereas Mars rovers with camera filters 'see' wavelengths of light in the infrared that humans cannot. Percy can analyse more data than a human can, more thoroughly and without bias. It is the right tool for the job – working alone, continuously, on a distant planet in hostile conditions. To track Perseverance's drive, visit mars.nasa.gov/mars2020/mission/where-is-the-rover/.


Alternatives to straight-out replacement are 'mixed autonomy' systems, where people and robots work together, for example operating in traffic alongside human drivers. Autonomy is 'mixed' because both humans and robots operate in the same system, and their actions influence each other. However, if humans merely curate or teach AI, this is 'ghost-work': mindless, piecemeal tasks that programmers hope machine learning will soon render obsolete.

The Redundancy Myth
So what does Sam Tyler think of the Redundancy Myth, stuck back in historical 2024? "Click, click, click. Here we go, I have just interpreted a horizon. It means something in my imagination, but I am literally making this up. Extrapolating from the data, I can't use what is 'known', only what 'I know'. Not the same thing at all. I think I'm doing it all wrong without my big-data AI pal to support me. I feel like I am living on another planet – Mars, for example. Yet what I do have is agency and independence, and it's interesting and impossible at the same time. I love the uncertainty and fighting with the other geologists who have different visions. Geologists beating up the wrong model – look at us cave men go (to quote David Bowie in Life on Mars)! In my future time I do not know what the processing algorithm is doing to the data, or how the evaluation is balancing up the uncertainties, but I do know that the computer knows best, and it does it without me. This is the trade-off. AI removes all of those thousands of mistakes and assumptions and does a better job, and the price to pay is that the human part in solving these routine puzzles is redundant. In 2050 the Feed woke me, used data streamed from my 'implant-com' devices to select a nutrient-appropriate breakfast, and comfortable and appropriate clothing for me. The data was fed into the Feed's dataverse with everyone else's data across all platforms and CCTV, referenced to my genome, which facilitates the Feed's bespoke, me-tailored analysis 24-7, making the best selections for me so I don't need to bother. If we all look similar and eat similar food, that's just because it's the similar solution that's best for all of us. In the 2024 'age of choice', the dilemmas arrive when I wake up. I have to choose my breakfast from a myriad of unsuitable and probably dangerous options without sufficient data review. I have to choose my clothes without knowledge of the conditions I will experience during the day, and then I have to do some meaningful work where my insight and experience make a difference. I am making the best selections for me because no one else is bothered. Some of those selections are terrible, but on balance – I'm good with that. Here in 2024, people struggle with my stories of their future, my past. They generally ask two questions. Firstly, 'What is the environment like in 2050?', to which I reply 'believe me, you don't want to know'. Then they ask, 'So, as a geologist in 2050 – what do they do?'. I tell them that my job was 'geophysical archivist of vintage datasets in storage facility 451'. I store the data that you used to use. At least I get to employ my geological training in a transitioned sense – I was one of the lucky ones. I suppose it was always going to happen, a human doing the box-stacking instead of a robot – a pile of 3290 cartridges collapsed under its own weight. I took a blow to the head and everything went black – until I woke up here in 2024."



Conclusion
So there are two things I am learning here in the past. The first is that I actually find solving geological puzzles to be more fun than being right. They are unique puzzles involving time and uncertainty and both strong and weak data. We don't have these problems in the future; the puzzles have all disappeared into the invisible machine mind. The second thing I'm learning is that sometimes in 2024 we make terrible mistakes that let us learn new things – mistakes can be good. We don't have that problem in 2050 either, or maybe we do but just don't know it. The AI will probably figure out that it has to get it wrong sometimes for serendipity and luck to work. If I ever get back – I'll ask it.

If I ever had any spare time, I'd work on a better auto-tracker that accesses the internet to find analogue interpretations and so make better auto-tracking choices. Yet I know it would be like designing an antler-bone soft-hammer to do better work on a flint arrowhead. But the stone age didn't end because they ran out of stones – it was disruptive technology that did for the flint-working franchise. The future is artificial – I've seen it. If you want a job done quickly, well and cheaply, or if you like an easy life where you don't need to understand or remember, or if you just want to follow the Feed, then choose the artificial life: it's a mistake-free philosophy. If you want to do the work and solve the geological problems yourself, choose agency, choose independence of thought, choose creativity. And you had better choose history too – you might as well. You're part of it.

Editor's note: No generative AI was harmed in the making of this article.

ADVERTISEMENT

First EAGE Data Processing Workshop 26-28 FEBRUARY 2024 • CAIRO, EGYPT

Exciting news for EAGE participants residing in Africa! The First EAGE Data Processing Workshop is bringing together professionals to develop connections that promise a future of collaboration and progress. EAGE is supporting a local registration rate for all participants who are resident in Africa, with a 50% discount on eligible registration fees!

Regular Registration Deadline: 13 February 2024.
Register Now! WWW.EAGE.ORG



8-10 APRIL 2024 I THE HAGUE, THE NETHERLANDS

4th EAGE Workshop on Distributed Fibre Optic Sensing
4th EAGE Workshop on Practical Reservoir Monitoring
Combining Multiple Technical Streams for One Amazing Multidisciplinary Programme!

REGISTER NOW TO CLAIM YOUR SPOT. REGULAR REGISTRATION ENDS ON 20 MARCH
DISTRIBUTED FIBRE OPTIC SENSING WORKSHOP MAIN SPONSOR
WWW.EAGEGEOTECH.ORG


SPECIAL TOPIC: DIGITALIZATION / MACHINE LEARNING

Large volume analysis of core and thin section images in the assessment of Brazil pre-salt reservoir distribution

Edward Jarvis1*, Haoyi Wang1, Jonathan Dietz1 and Thomas Van Der Looven1 discuss how machine learning and artificial intelligence methods can screen a large corpus of unstructured documents, locating and analysing core and thin section images for the purposes of porosity quantification, assignment of core and thin section scale sedimentary facies and the detection of hydrocarbon shows.

Introduction

Understanding the distribution of reservoir intervals in the subsurface requires the integration of various image and numerical datasets. These take time to locate and use, particularly over large areas and a high number of wells. Core and thin section images are common datasets, typically taken over zones of potential reservoir significance. These images are, however, very qualitative and are therefore under-utilised. Any associated data related to these images, such as core description data, for example, and the samples themselves are also typically generated in a semi-quantitative manner and are therefore slow to generate and prone to descriptor bias (Lokier and Al Junaibi, 2016). These same image-related datasets are also not a direct indicator of reservoir quality, requiring further integration with other datasets before conclusions on reservoir distribution can be drawn. Technology developments in image analysis techniques have allowed for the prediction of geological features across thousands

of images once a model has been trained. However, while techniques exist and have been applied to various geological image types (Rubo et al., 2019; Falivene et al., 2022; Dietz et al., 2023), it is apparent that fully integrated workflows linking various model results together for further analytical purposes are not as common. In this article, we discuss how various machine learning and artificial intelligence (AI) tasks have been utilised to efficiently and consistently achieve the following:
•  Identification and segmentation of target images from a larger corpus of documents;
•  QC of the images to identify those suitable for further analysis;
•  Analysis of the images in the quantification of porosity, identification of core-scale sedimentary facies (cm scale), hydrocarbon shows and thin section microfacies (mm scale);
•  Integration of the image analysis results across various scales and cross-data validation.

Figure 1 Map of the well locations, Campos and Santos Basins, Brazil.

1

CGG

*

Corresponding author, E-mail: Edward.Jarvis@cgg.com

DOI: 10.3997/1365-2397.fb2024018


Figure 2 Document classification and image segmentation pipeline.

Figure 3 CNN model for the prediction of image classes where each cube represents a stack of ConvNeXt blocks and the rectangle indicates fully connected layers. The actual number of ConvNeXt blocks, i.e., n, used in each stage depends on the architecture of the deployed model.

The process of integrating AI results can be a significant task, particularly as the various models generate large volumes of new data that can be difficult to utilise and time-consuming to relate and compare between the models and across scales. In the increasingly common scenarios that now involve significant automation in data generation and analysis, it is more important than ever to integrate across data types and scales to ensure the results of automated workflows are accurate as well as accessible. The dataset used in this approach was from the pre-salt stratigraphy of Brazil, with a focus on 228 wells from the Campos and Santos Basins (Figure 1). The data comprised 78,018


documents in variable formats with a target of locating and utilising the following data for reservoir screening purposes:
•  Thin section photomicrographs
•  White light and ultra-violet light (UV) core images
•  X-Ray Diffraction (XRD) mineralogy
•  Core analysis data, such as helium porosity measurements

Data and image identification, extraction and quality screening

An immediate challenge in working with any legacy dataset is the task of locating and extracting the required data from


SPECIAL TOPIC: DIGITALIZATION / MACHINE LEARNING

the source documents. In order to locate the data and images of interest, a first phase of document layout analysis and classification was conducted using a combination of techniques, including computer vision and natural language processing (Lun et al., 2022). Specifically, the authors proposed a directed acyclic graph (DAG) pipeline for automated classification, extraction, and curation of data from various types of documents, significantly enhancing data processing efficiency and accuracy. This approach resulted in the creation of 182,641 document labels, of which 23,494 were images, and enabled the removal of duplicates. Figure 2 illustrates the general workflow and pathways for targeted data types. A final classification step was then run on the identified images using a convolutional neural network (CNN) (Liu et al., 2022) to identify the required thin section and core images from the wider previously identified image corpus (Figure 3). From the total image dataset, 14,300 thin section images and 1856 core images were identified, with the latter covering 1,247.04 m of vertical section. The extracted core and thin section images underwent further segmentation to remove background and peripheral content from the images, that if not removed would compromise the image analysis at later stages. File lineage was maintained, linking images to source documents, and any available meta data, such as light source, magnification and depth, was also captured and attributed per image. This final processing step resulted in refined bounding box positions for the 14,300 thin section images and the generation of 6964 individual 1m-long core images that were then stitched into 70 composite images, one per cored interval. Each pixel row was depth-referenced through available meta-data acquired per image. Prior to any image analysis, a series of image QC steps were conducted to screen the suitability of each image for subsequent analytical steps. This included a further phase of image de-duplication based on hashing approaches, image size and aspect ratio outlier detection, extent of blur/poor resolution based on object boundary sharpness, image brightness and low information/degree of variance (Cleanvision model was utilised for this process: https://github.com/cleanlab/cleanvision). From this process, 365 images were identified to be of lower quality. Those images were labelled such that any results, while generated for comparative purposes, could be filtered out of the working datasets at a later stage. Comparison of

results from blurred versus non-blurred images may be useful should Generative Adversarial Network (GAN) (Goodfellow et al., 2014) techniques be applied in the future that could generate new, equivalent synthetic images where the effects of blurring have been removed. A process of image denoising was conducted to sharpen grain/pore boundaries, acting to either increase or decrease total porosity values for a given sample. The process was particularly effective in finer-grained samples where boundaries were less well defined even where image resolution was high. In this scenario, the denoising typically reduced the total porosity by a small amount. Figure 4 illustrates examples of blurred and dark images that were identified in the process and their scores according to the model results. The image quality screening model generally performed well across the image set, but some of the finer-grained lithologies or those containing a greater abundance of clay were incorrectly categorised as blurred, indicating that further modifications and training were required in certain areas of the model to improve performance.

Thin section images

Thin section images contain significant information on reservoir quality. An example is visual porosity, for which the abundance, size, shape, orientation and distribution of pores can be observed. Further characteristics of the imaged rock can also be determined at this scale, such as the presence and types of cements, grains and clays, all characteristics that convey information on why a rock is or is not porous. Using image segmentation and object classification on the 14,300 identified thin section photomicrographs, it was possible to rapidly predict the proportion and properties of grains, clay, cement and porosity, a task that would have taken many weeks to months to achieve using manual descriptive techniques. This preliminary result would also allow for further high-resolution models to be trained on specific stratigraphic intervals, reservoir zones and/or geographic areas to produce higher-confidence results. The first task was to identify pore space in the thin sections and determine the proportion in each image, using image segmentation. The second task focused on identifying grain type and pore space and, as a result, general image classification. Grain segmentation in the determination of grain textural properties was

Figure 4 Examples of image quality screening. Scores are on 0 to 1 scale, with a higher score indicative of greater image quality.
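The screening described above uses the open-source CleanVision package cited in the text. The snippet below is a minimal sketch of how such a run can be configured; the image folder path is illustrative, and the exact issue-column names in the results table should be checked against the installed CleanVision version.

```python
from cleanvision import Imagelab

# Point CleanVision at the folder of extracted core and thin section images
# (path is illustrative) and search for blur, darkness, duplicates,
# odd aspect ratios and low-information content.
imagelab = Imagelab(data_path="extracted_images/")
imagelab.find_issues()
imagelab.report()  # printed summary of flagged images per issue type

# Per-image results are returned as a pandas DataFrame with boolean
# "is_<issue>_issue" columns; flagged images are labelled rather than
# deleted, mirroring the approach taken in the article.
issues = imagelab.issues
low_quality = issues[issues.filter(like="_issue").any(axis=1)]
print(f"{len(low_quality)} images flagged for review")
```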


not conducted although it is an approach that can be conducted, particularly on clastic rock. Pore space in thin section images could be discriminated from background regions using colour channels. Since the pore space is a relatively consistent blue under plain polarised light (PPL) and an inconsistent dark blue/black under crossed polarised light (XPL), it was possible firstly to transform RGB images into the HSV colour space in which the hue colour channel controls the visual colour of each pixel. The corresponding hue range was then defined that best represented the pore space, resulting in a mask for each image, as shown with one image example in Figure 5. A filter was set to remove the smallest pores, with a threshold set to remove pores smaller than 5% of the length of the image. This filter could be adjusted by the user and allowed the generation of multiple iterations with different set thresholds. The proportion of pore space for the given image could then be computed by dividing the number of pixels in the pore space region by the number of pixels in the entire image. A further model was run to identify scale bars. Once identified, optical character recognition (OCR) was undertaken in the extraction of scale-related text, which, in addition to the scale bar, could be used to assign a width in microns for a given image, thereby enabling all pores in the image to be assigned with quantified dimensions. Post-processing of the segmented pores resulted in statistics for pore count, circumference, length, width, shape, long axis alignment and areal distribution over the slide. Pore dimensions were defined in number of pixels but also microns if a scale bar was present in the image. Finally, the segmented pore dataset was screened in the identification of outliers and errors using a series of rule logic. Fifty-five rule statements were run over the data resulting in the creation of 1.7 million QC records, flagging either errors or warnings related to individual pores or image and sample level concerns, such as absence of scale bars or no depth datum. The QC process resulted in 835 warnings and errors, identifying pores with very irregular aspect ratios, very large dimensions or regular shapes/straight edges. Such flags could be artificial features,

such as plucked grains, slide edges or sample delamination, and should therefore be removed from further analysis, or could be features of interest, such as vugs/areas of dissolution or natural fractures. One hundred and forty-nine images were flagged as high-magnification views and/or as having total porosity values exceeding 45%, likely relating to high-magnification images focused on individual large pores. Values derived from high-magnification images were flagged, with more representative values taken from lower-magnification images, if available. The applied rules were aggregated to provide an overall rating for a given image and its data, this rating then being used in assessing the image and its data application in various technical use cases, such as an input in petrophysical calibration or general reservoir quality assessment. All QC rules and associated use case terminology and definitions are aligned with the Open Subsurface Data Universe (OSDU) and are therefore universally recognised as an industry standard. The characterisation of rock samples goes beyond the quantification of porosity, with the assignment of rock facies representing a complex task involving the concatenation of details on porosity, grain and diagenetic phases. For the purpose of predicting facies for use in the petrophysical log space, a CNN model was trained in the identification of rock properties, including grain types, clay matrix, cements and, ultimately, microfacies (Dietz et al., 2022). Due to the limited number of samples in the dataset, it was necessary to leverage and fine-tune a pre-trained model to achieve a better classification accuracy. The thin section scans were also divided into smaller image patches to simultaneously increase the number of training samples, and a classifier with the corresponding number of output neurons was appended to the end of the model (a minimal fine-tuning sketch follows the class list below). Subject-matter experts initially identified five main rock classes or microfacies for which a training set was generated from 725 images. These microfacies classes were assigned over the training set:
1. Calcareous matrix with grains
2. Mudstone (Carbonate Dunham definition)
3. Porous limestone
4. Cemented
5. Bivalve floatstone
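As a hedged illustration of the transfer-learning step just described, the sketch below fine-tunes an ImageNet-pre-trained ConvNeXt backbone from torchvision on thin-section image patches, replacing the classification head with five output neurons for the microfacies classes. The backbone size, learning rate and layer-freezing strategy are illustrative assumptions; the deployed model's architecture and ConvNeXt block counts (Figure 3) will differ.

```python
import torch
from torch import nn
from torchvision import models

NUM_CLASSES = 5  # the five microfacies classes listed above

# Start from an ImageNet-pre-trained ConvNeXt and swap the final linear layer
# so the classifier outputs one neuron per microfacies class.
model = models.convnext_tiny(weights=models.ConvNeXt_Tiny_Weights.DEFAULT)
model.classifier[2] = nn.Linear(model.classifier[2].in_features, NUM_CLASSES)

# Freeze the early stages so, with few labelled samples, only the later
# blocks and the new head are fine-tuned (an assumed strategy).
for param in model.features[:6].parameters():
    param.requires_grad = False

criterion = nn.CrossEntropyLoss()
optimiser = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

def train_one_epoch(loader, device="cuda"):
    """loader yields (patch, label) batches of pre-cropped thin-section patches."""
    model.to(device).train()
    for patches, labels in loader:
        optimiser.zero_grad()
        logits = model(patches.to(device))
        loss = criterion(logits, labels.to(device))
        loss.backward()
        optimiser.step()
```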

Figure 5 Thin section pore space segmentation. Original image (A) and segmentation mask (B) highlighting individual pores and their long axis orientations as denoted by arrows.
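A minimal sketch of the hue-based pore segmentation and the pore-statistics extraction described in this section is given below, using OpenCV and scikit-image. The HSV bounds for the blue-dyed pore space and the interpretation of the 5% size filter as a long-axis threshold are assumptions that would be tuned per dataset; conversion from pixels to microns via the detected scale bar is omitted.

```python
import cv2
import numpy as np
from skimage import measure

def segment_pores(rgb_image, min_pore_frac=0.05):
    """Return a pore mask, area-% porosity and per-pore statistics for a PPL photomicrograph."""
    # Blue-dyed pore space is isolated with a hue threshold in HSV space;
    # OpenCV hue runs 0-179 and ~100-130 covers typical blue epoxy (assumed bounds).
    hsv = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2HSV)
    mask = cv2.inRange(hsv, (100, 60, 40), (130, 255, 255)) > 0

    # Drop pores whose long axis is below a fraction of the image width,
    # mirroring the user-adjustable 5% filter described in the text.
    labels = measure.label(mask)
    min_len = min_pore_frac * rgb_image.shape[1]
    pores = [p for p in measure.regionprops(labels) if p.major_axis_length >= min_len]

    kept = np.isin(labels, [p.label for p in pores])
    porosity_pct = 100.0 * kept.sum() / kept.size

    # Per-pore statistics (area, axes, orientation) feed the later rule-based QC.
    stats = [(p.area, p.major_axis_length, p.minor_axis_length, p.orientation) for p in pores]
    return kept, porosity_pct, stats
```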


Figure 6 Confusion matrices for thin section microfacies and grain and matrix/cement prediction models. (1) Calcareous matrix with grains, (2) Mudstone, (3) Porous limestone, (4) Cemented and (5) Bivalve floatstone.

A further model was trained in the identification of grain types and surrounding rock matrix. These included spherulites, shrubs, silica and dolomite cement and carbonate clay matrix, which are general terms and definitions associated with the pre-salt stratigraphy and aligning with schemes outlined by Wright and Barnett (2015). The CNN assigned a predicted microfacies class per image patch in addition to cement, clays and grain classes. Each prediction also included a confidence or F1 score as an indication of the performance of the model. Figure 6 outlines the confusion matrices for thin section microfacies, grain and matrix prediction models. Overall, the model performed well in the identification of most classes. The prediction of cement classes performed well, although differentiation of the type of cement was less clear, with silica cement frequently being incorrectly predicted as dolomite. This related to a paucity of training labels, particularly for dolomite cement.

Core images

The 70 stitched white light and UV core photographs were analysed with the purpose of identifying features such as core colour, facies and UV fluorescence. Pixel colour analysis was then conducted on white light images with assignment to the Munsell colour scheme at pixel level. An aggregation step was then conducted to select the dominant pixel colour per pixel row. Principal component analysis (PCA) indicated common colour clusters relating to different rock properties, such as the presence of clay, oil stain or cement, and therefore provided the means of classifying coarse rock properties or colour-defined facies classes via pixel RGB value (a minimal sketch of this per-row colour aggregation follows the class list below). Rock colouration in the white light images was driven principally by the degree of cementation (resultant white colouration due to low porosity and high silica and/or dolomite content), degree of oil stain (pale-orange to dark-brown colouration dependent on porosity distribution and/or associated fluid interaction with formation) and volume of matrix (matrix-rich intervals typically dark-grey to brown due to discolouration/oxidation of carbonates and a subtle increase in trace clays and oxides). From these criteria, the subject-matter experts defined four discrete rock classes, namely:
•  Non-oil stained, grain-dominated
•  Cemented
•  Oil-stained or calcareous matrix
•  Core plugs and gaps
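A sketch of the per-row colour aggregation and PCA clustering referenced above is shown below. The 32-level colour quantisation and two-component projection are illustrative choices, and the mapping of the resulting dominant colours to Munsell notation would require a separate lookup that is not shown.

```python
import numpy as np
from sklearn.decomposition import PCA

def row_colour_features(core_rgb, n_components=2):
    """Reduce a depth-registered core photo to one dominant colour per pixel row."""
    # Coarsely quantise to 32 levels per channel so a per-row mode is meaningful
    quantised = (core_rgb // 8).astype(np.int64)
    codes = quantised[..., 0] * 32 * 32 + quantised[..., 1] * 32 + quantised[..., 2]

    dominant = np.empty((core_rgb.shape[0], 3), dtype=np.float32)
    for i, row in enumerate(codes):
        values, counts = np.unique(row, return_counts=True)
        code = values[np.argmax(counts)]  # most frequent colour code in the row
        dominant[i] = [(code // (32 * 32)) * 8, ((code // 32) % 32) * 8, (code % 32) * 8]

    # Project the per-row colours to look for the colour clusters that relate
    # to cementation, oil stain and matrix content.
    scores = PCA(n_components=n_components).fit_transform(dominant)
    return dominant, scores
```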

The relationship between colour and facies was trained on a subset of wells, generating a corresponding core image mask indicating the facies assignment per pixel on the photo (Figure 7). A proportion of the wells were kept as a test set while the rest were used in training and validation. The selected model employed Naïve Bayes to predict the mask directly and, from that, identify the dominant facies (Webb et al., 2010). Naïve Bayes is a probabilistic modelling method that can generate the prediction as well as the probability without a sophisticated training process and can achieve superior performance when there is a strong correlation between the input and the output. Naïve Bayes is formulated as:

P(H|E) = P(E|H) P(H) / P(E)

where H indicates the facies profile that the method is trying to predict, and E is the evidence on which the prediction is based. In this case, the evidence corresponds to pixel values. P(H|E) is the posterior probability and is a conditional probability, which means the probability of H given E. In this case, it is the probability of the facies to be assigned to a certain pixel value. P(E|H) is called the likelihood, i.e., how different pixel values are associated with each facies type. P(H) is the prior probability, which is the proportion of each facies type in the dataset, and P(E) is called the marginal likelihood, which is the proportion of each pixel value. Specifically for pixel-level facies prediction, the above equation can be reformulated as:

P(litho|pixel) = P(pixel|litho) P(litho) / P(pixel)

where P(pixel) and P(litho) are computed based on core photos from a sample well, and P(pixel|litho) is computed based on human labelling. The posterior probability is generated for each pixel. Facies masks were then generated based on the predictions at each pixel, with further aggregation in order to assign a facies to each pixel row and depth increment. A similar approach was conducted on the UV core images to predict intervals with fluorescence and therefore potential hydrocarbon shows, with the known caveat that dolomite can also fluoresce. By combining the results of the white light and UV core image models it was also possible to further subdivide


Figure 7 Core image facies and fluorescence prediction.

those classes predicted in the white light model. The presence of a UV fluorescence flag confirmed an oil-stained class prediction over a clay matrix according to the white light model. Figure 7 illustrates the decision path.
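The pixel-level Naïve Bayes prediction described above can be sketched as follows. Quantised pixel colour codes stand in for the RGB evidence, and the Laplace smoothing term is an added assumption not mentioned in the text; per-row aggregation (for example a majority vote over each depth-registered pixel row) would follow the prediction step.

```python
import numpy as np

def fit_pixel_naive_bayes(pixel_values, facies_labels, n_bins=64, n_facies=4):
    """Estimate P(litho) and P(pixel | litho) from a labelled training core photo.

    pixel_values: 1D array of quantised pixel colour codes (0 .. n_bins-1)
    facies_labels: 1D array of human-labelled facies per pixel (0 .. n_facies-1)
    """
    prior = np.bincount(facies_labels, minlength=n_facies) / facies_labels.size
    likelihood = np.zeros((n_facies, n_bins))
    for f in range(n_facies):
        counts = np.bincount(pixel_values[facies_labels == f], minlength=n_bins) + 1  # Laplace smoothing
        likelihood[f] = counts / counts.sum()
    return prior, likelihood

def predict_facies(pixel_values, prior, likelihood):
    """Posterior P(litho | pixel) per pixel; the marginal P(pixel) cancels in the argmax."""
    posterior = likelihood[:, pixel_values] * prior[:, None]  # shape (n_facies, n_pixels)
    posterior /= posterior.sum(axis=0, keepdims=True)
    return posterior.argmax(axis=0), posterior
```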

Data integration and QC

The thin section and core image analysis models generated predicted facies classes over 1247.04 m of core, 14,300 microfacies and 85,800 grain and cement type occurrences. In addition, over 173,000 individual pores were identified. Six hundred and nineteen thin sections were identified as porous and classified as having ‘good’ reservoir properties based on a porosity threshold of greater than 10%. 907.42 m of potential reservoir facies were predicted over the cored intervals of the study wells, of which 282.27 m contained hydrocarbon indicators. Of the 6070 distinct depths with thin section-derived porosity, 4344 images had a corresponding helium porosity value for cross validation. Interrogation indicated that helium porosity values were on average 7.9% higher than segmented pore values which, in many cases, relates to the difference in measurement precision between the two methods. Helium porosity analysis

is a more precise technique, with means of measuring micropores that are not possible to observe and record in thin section images. Such a differential could be used to quantify the areal percentage of micropores in the samples. Irrespective of the inflated helium porosity values, the overall trends, when plotted on depth, were strongly comparable as illustrated in Figure 8 for one well. An overall conformance of 0.64 cosine similarity was recorded across all wells. After further investigation, a significant divergence beyond a 20% threshold was typically attributed to thin section image capture bias, where results from the images were accurate but were not representative of the entire plug sample. Such scenarios also provide an indication as to the degree of heterogeneity for a given sample. Further comparison also highlighted a positive relationship between predicted cemented, mudstone and calcareous matrix facies and lower porosities, highlighting a means of qualifying the controls on reservoir quality using the thin section microfacies prediction results. Table 1 illustrates the average porosity and selected XRD-derived mineral weight percentage values for the predicted microfacies classes.

Predicted Thin Section Microfacies     Pore segment    Helium porosity    Quartz           Dolomite
                                       (av. area %)    (av. volume %)     (av. volume %)   (av. volume %)
(1) Calcareous matrix with grains      2.54            5.46               13.92            21.04
(2) Mudstone                           1.6             7.68               29.25            59
(3) Porous limestone                   8.13            12.75              10.85            64.76
(4) Cemented                           2.59            3.71               29.58            31.89
(5) Bivalve floatstone                 5.42            5.6                24.67            37.8

Table 1 Average pore segment, helium porosity and XRD mineralogy values for predicted thin section microfacies.
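The 0.64 conformance quoted above is a cosine similarity between the thin section-derived and helium porosity profiles of each well. One plausible implementation is sketched below; how the published figure was actually computed (depth matching, interpolation) is not stated, so the details here are assumptions.

```python
import numpy as np

def porosity_conformance(image_porosity, helium_porosity, divergence_threshold=20.0):
    """Cosine similarity between two depth-ordered porosity profiles of one well.

    Both inputs are assumed to already be matched to the same sample depths.
    Samples diverging by more than the threshold (in porosity units) are flagged
    for review of possible thin section image-capture bias.
    """
    a = np.asarray(image_porosity, dtype=float)
    b = np.asarray(helium_porosity, dtype=float)
    similarity = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    divergent = np.abs(a - b) > divergence_threshold
    return similarity, divergent
```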


The results of the cross-examination between datasets illustrate trends that would be expected, helping to validate the results of the model. What is also evident is that the predicted microfacies provide a means of explaining trends in reservoir properties. For all available depth points with thin section imagery, it would be possible to ascertain both the reservoir quality and the controls on pore volume statistics. While it is possible to derive similar results and observations using manual methods, it is the speed with which the data demonstrated here was generated, from such a wide array of disparate datasets, that highlights the potential significance for subsurface workflows. The results of QC rule screening and cross-data validation highlighted that a proportion of the segmented pore space data (79.7%) was suitable, according to Open Subsurface Data Universe (OSDU) technical assurance terminology, for general reservoir screening purposes. Suitability was defined where individual pore dimensions and sample-scale total porosity values were within acceptable ranges, although sample datum parameters may be absent. 51.6% of the generated data was deemed appropriate for the purpose of formation evaluation calibration, where a greater degree of data confidence and precision is required. In this instance, there was a requirement for porosity values between the methods to conform within a 15% differential margin, for clear recorded depth datums to be present, and for pore size attributes and facies predictions to align across both thin section and core images. 20.3% of the image dataset was deemed unsuitable for any further analysis due to one or a combination of factors. Many images failed the image quality pre-processing screening steps, typically being too dark or blurred. Many images were taken at too high a magnification and so were not representative of the samples in general. Porosity extraction and screening indicated some images only contained artificial, oversized or extensively elongated pores through grain plucking or sample delamination, features

typically created during sample preparation and therefore of no value. When comparing the thin section microfacies prediction results against the core image model, there was conformance between the reservoir classes at both scales in 46.4% of the total instances, with conflicts typically relating to localised features only visible at the thin section pore scale, such as silica cements. Figure 9 illustrates such a scenario where two thin sections exist at the same depth with significantly contrasting porosity and microfacies predictions. Comparison with the core image model results and core analysis data indicates both thin section image predictions are accurate, but the lower-porosity image relates to a local cement that also highlights a degree of rock heterogeneity at the centimetre scale. Observations of this kind are quick to ascertain in low sample numbers, but this method of cross-data examination provides a means of quickly screening the heterogeneity of samples and the ability to assess the impact of features observed at the finest scale and their relation to larger-scale reservoir properties. A final comparison across all the model outputs illustrated a positive correlation between higher UV core fluorescence and typically higher helium porosity values and close correlation with the porous, grain-dominated and oil-stained rock classes as predicted from white light core images and thin sections (Figure 10). Areas of non-conformance seemed to relate to fluorescence occurring over lower-porosity, microporous zones.

Conclusions

This article discusses how the use of AI technologies sought to address the challenge of assessing reservoir quality and its distribution in the subsurface, where the data available for analysis exists in large volumes and unstructured formats. Starting with a corpus of 78,018 unstructured documents from 228 wells, the use of a series of interlinked AI technologies arranged within a pipeline led to identification of the required subset of data and

Figure 8 Correlation between thin section, core analysis and thin section microfacies predictions.


Figure 9 Correlation between thin section and core image model prediction results. A scenario where two thin section images exist at the same depth and have conflicting segmented porosity values reflecting sample heterogeneity. Comparison with the helium porosity measurement indicates which of the images is the most representative.

Figure 10 Comparison across models illustrating distribution of UV fluorescence in relation to porosity values and both predicted microfacies and core image facies.

then the generation of new quantitative porosity values and the prediction of microfacies from 14,300 thin section images. A separate process within the same pipeline led to the prediction of geological rock properties and oil show detection from 1856 core photos. Manual identification, extraction, analysis and integration of these same datasets would be considered a year-long exercise, mobilising various subject-matter experts. In the method outlined in this paper, it was possible to generate porosity values and predict microfacies for the 14,300 thin section images in five days of processing time, equivalent to 30 seconds per image, utilising one CPU and one GPU for inference, once the training model was in place. The processing time for core images was slower, since Naïve Bayes needed to


iterate over all the pixels in the input images, therefore taking dozens of hours to predict the facies for one well. To improve this, deep learning-based image-level models could be developed to reduce the processing time per image from hours to minutes. The various QC steps and cross-validation flagged errors in the model outputs that could be isolated and also identified geological explanations for variance in reservoir properties. However, it is still apparent that errors and false positives remain to be addressed after later iterations and models. Identifying rock properties in core from colour alone is not sufficient to differentiate between many rock classes, which means that further work is needed to improve the core facies prediction model through the use of a CNN model and labels. This alongside the results



of the current model would help in the identification of textural characteristics in the images. The thin section microfacies model performed well where sufficient labels existed within the training set, and application of the model outside of the Brazil pre-salt stratigraphy indicates a more universal application is possible, with classes such as ‘Mudstone’ and ‘Cemented’ being identified with high accuracy when the model is applied across global carbonate image libraries. A full screening of the reservoirs discussed in this article would consider additional data types, whether used in traditional workflows or in the methods discussed here. Future work will look to use the model outputs to infill depth sections of the subsurface where core analysis is absent and integrate downhole test data, such as Drill Stem Tests (DSTs) and wireline tool tests, to further validate the results. All results will be used and tested as additional input calibration points to image log and petrophysical interpretation models. In summary, the merits of the approach outlined in this article aim to demonstrate that such methods are a means of both efficiently extracting data from documents and generating large amounts of new geological data in timeframes that would not be possible using manual methods. As the results indicate, the models will never predict results with 100% accuracy. However, the models enable large volumes of data to be screened quickly, with QC flags and certainty scores in place to guide subject-matter experts to those wells of interest or needing further attention and targeted manual assessment. The approach outlined is intended to be a method utilised alongside manual methods to help tackle large datasets, enabling initial higher-level screening across the entirety of regional datasets prior to the implementation of manual techniques in specific areas once a greater regional understanding of data coverage and trends is available.

References

Dietz, J., Wang, H., Jarvis, E. and Hou, S. [2022]. Unlocking the Potential in Your Core, Thin Section, and Image Log Data Through Image Processing. Third EAGE Conference on Pre Salt Reservoirs, European Association of Geoscientists & Engineers, 1-7.

Dietz, J., Wang, H., Hou, S., Jarvis, E. and Sekti, R. [2023]. Borehole Image Logs: New Approaches to Automated Surface, Breakout and Facies Interpretation. Third EAGE Digitalization Conference and Exhibition, European Association of Geoscientists & Engineers, 1-5.

Falivene, O., Auchter, N.C., Pires de Lima, R., Kleipool, L., Solum, J.G., Zarian, P., Clark, R.W. and Espejo, I. [2022]. Lithofacies identification in cores using deep learning segmentation and the role of geoscientists: Turbidite deposits (Gulf of Mexico and North Sea). AAPG Bulletin, 106(7), 1357-1372.

Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A. and Bengio, Y. [2014]. Generative adversarial nets. Advances in Neural Information Processing Systems, 27, 2672-2680.

Liu, Z., Mao, H., Wu, C., Feichtenhofer, C., Darrell, T. and Xie, S. [2022]. A ConvNet for the 2020s. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition.

Lokier, S.W. and Al Junaibi, M. [2016]. The petrographic description of carbonate facies: are we all speaking the same language? Sedimentology, 63, 1843-1885.

Lun, C.H., Hewitt, T. and Hou, S. [2022]. A machine learning pipeline for document extraction. First Break, 40(2), 73-78.

Rubo, R.A., Carneiro, C.C., Michelon, M.F. and Gioria, R.S. [2019]. Digital petrography: Mineralogy and porosity identification using machine learning algorithms in petrographic thin section images. Journal of Petroleum Science and Engineering, 183.

Webb, G.I., Keogh, E. and Miikkulainen, R. [2010]. Naïve Bayes. Encyclopedia of Machine Learning, 15(1), 713-714.

Wright, P. and Barnett, A.J. [2015]. An abiotic model for the development of textures in some South Atlantic early Cretaceous lacustrine carbonates. Geological Society, London, Special Publications, 418.

ADVERTISEMENT

February 29 + March 1 EXHIBITION CENTER OFFENBURG www.geotherm-offenburg.de/en



12-13 AUGUST 2024 I PERTH, AUSTRALIA

SUBMIT YOUR ABSTRACT! SUBMISSION DEADLINE: 31 MARCH 2024


SPECIAL TOPIC: DIGITALIZATION / MACHINE LEARNING

A framework for mineral geoscience data and model portability

John McGaughey1*, Julien Brossoit1, Kristofer Davis1, Dominique Fournier1 and Sébastien Hensgen1 present a data structure for integration and storage of geological models, data, and metadata where dissemination, ease of access, and persistence are required without commercial encumbrance.

Abstract

We have developed a data structure called GEOH5 with the objective of integration and storage of geological models, data, and metadata where dissemination, ease of access, and persistence are required without commercial encumbrance. Our emphasis is on the needs of the mineral industry which, unlike the upstream oil and gas industry, otherwise lacks common data exchange formats with a scope encompassing most exploration and production data types. Although only a few years old, the data structure is already in use by many thousands of users with increasing acceptance across mineral geoscience and engineering. This includes industry, academia, and geological surveys that use GEOH5 as a documented, public, easy-to-use, vendor-neutral, and permanently accessible means of storage and communication. GEOH5 is open source and free to use. It is based on open-source HDF5 technology because of its many advantages: wide acceptance across numerous data-intensive industries, self-describing behaviour through integration of data and metadata, fast I/O, excellent compression, file merging, cross-platform capability, unlimited data size, and access to libraries in a variety of programming languages. It provides professionals, researchers, and the public at large with a robust means of managing, exchanging, and visualising large quantities of diverse mineral geoscience and engineering data.

Introduction

Barriers to interoperability, imposed by design or default by software vendors for commercial reasons, serve neither the interests of technology advancement nor the objectives of the data acquirers, interpreters, and researchers who need to exchange and disseminate their geoscientific data, metadata, and models. Geoscientists must often undertake complex and costly manual workarounds to share data and models among mutually non-interoperable systems, imposing costs as well as potential data loss and introduction of error. The result is loss of productivity, poorer decision making, and dissatisfaction with proprietary systems. We describe an open-format file structure called GEOH5 (Section 1) as a useful solution to the interoperability problem in

1

Mira Geoscience Ltd

*

Corresponding author, E-mail: johnm@mirageoscience.com

the minerals industry. We also describe an open-source Python API, GEOH5Py (Section 2), that provides a standard programmatic interface for reading from and writing to the GEOH5 format, and finally we describe a powerful, free-to-use viewer of the content of GEOH5 files called Geoscience ANALYST (Section 3). The open-source API and free viewer are what make the file structure easy to use for geoscientists and promote its acceptance as a standard. An inexact but useful analogy to GEOH5 is the ubiquitous Portable Document Format (PDF), an ISO standard that seeks to capture documents in a manner independent of application software, hardware, and operating system. In an analogous manner, GEOH5 provides an open, documented, extensible structure for storing and sharing geoscientific models, data, and metadata. The structure is aligned with the FAIR guiding principles for making data Findable, Accessible, Interoperable, and Reusable (Lightsom et al., 2022).

An open format for geoscience data and models

GEOH5 is a documented, public, open, easy-to-use, vendor-neutral, and permanently accessible data exchange and storage format for the general geosciences, whose scope is primarily intended to cover the needs of the minerals industry and research community. Its power lies in its capacity to handle many data types—from property-attributed points, curves, and wireframe surfaces to drillholes, geophysical data of many types, 3D models, and mine production objects and data. The format facilitates interoperability between different software applications, fostering a collaborative environment for geoscientists, researchers, analysts, and other stakeholders, including for public dissemination. It provides a unified format that bridges the gap between different software tools. GEOH5 has its design roots in the Hierarchical Data Format version 5 (HDF5), a universally accepted and widely used data model, library, and file format for storing and managing complex data. HDF5’s attributes make it an obvious choice as a foundation for an open geoscience data standard: wide acceptance across numerous data-intensive industries, self-describing behaviour through integration of data and metadata, fast I/O,

DOI: 10.3997/1365-2397.fb2024019


excellent compression, file merging, cross-platform capability, unlimited data size, and access to libraries in a variety of programming languages. It provides both professionals and researchers with a robust means of handling large quantities of diverse data. The content of the data structure’s files is readable and writeable by third-party software using scientific programming environments such as open-source HDFview, Python, MATLAB, Fortran, C, and C++. As an illustration of accessing GEOH5 content from C++, we have provided GEOH5 importers and exporters as add-ons to ASPEN-SKUA™ (ASPEN-SKUA is a trademark of Aspen Technology, Inc.), a commercial geological modelling and simulation application used in both the oil and gas and mining industries. GEOH5 is similar in objective to RESQML™ (RESQML―reservoir and earth modelling―is a trademark or registered trademark of Energistics Consortium, Inc.), the familiar oil and gas industry data interchange standard, which also uses the HDF5 data format (in combination with XML schemas for implementation in software applications). The objective of RESQML is to provide an industry-led, open, non-proprietary standard to support data and model interchange among proprietary software applications across the reservoir modelling, interpretation, and simulation workflow. Applications that implement RESQML can access data and models in the defined standard format. However, RESQML covers only a subset of minerals industry requirements―there are many data and model types used in the minerals industry but not in oil and gas, and vice versa. Examples include the many differences between an oil and gas industry ‘well’ object and a mining industry ‘drillhole’ object; the types and complexity of non-seismic geophysical surveys commonly used in the mining industry; and the different modes and characteristics of production of minerals. The differences in types of objects and associated data between the mining of an ore deposit and producing a reservoir are vast. In fact, there is minimal use of oil and gas reservoir modelling and simulation application software in the mining industry. Due to the fundamentally different nature of the upstream minerals and oil and gas industries and the long timeframes anticipated in investigating and potentially implementing mineral-industry requirements through an established oil and gas standard, we approached the problem of developing a minerals industry standard independently, yet remaining aware of the oil and gas industry experience in establishing the RESQML interoperability standard. We investigated the possibility of following nascent mineral industry collaborative initiatives such as the Global Mining Guidelines Group ‘Open Mining Format’ (OMF) (https://gmggroup.org/projects/data-exchange-for-minesoftware). Scope evolution of such standards has been slow and remains focused on very basic object types (e.g., points, lines, surfaces, meshes) to the exclusion of important upstream exploration data types such as geophysical surveys. More narrowly focussed geophysical survey data standards, similar to GEOH5 in conception, such as ‘GS’ (Geophysical Standard) have recently been independently proposed (James et al., 2022). GS is based on the NetCDF (https://docs.unidata.ucar.edu/netcdf-c/current) format, in turn also relying on HDF5 as its underlying data structure. It is a useful initiative in paving the way to breaking


the current, deplorable state of geophysical data interchange in the minerals industry, which is dominated by exchanging files in proprietary binary formats that require commercial licences in order to access content. In the end, however, open data standard initiatives like OMF and GS are limited in scope, requiring multiple standards to be deployed to describe integrated projects that combine geological, geophysical, and other data types. We believe that what is required is a minerals-industry-focused, open data standard that has the important qualities afforded by HDF5, can evolve rapidly, and is at least as broad in scope as the role that RESQML serves in the oil and gas industry, extending across the multitude of mineral exploration and production data types. GEOH5 was created to meet that need.

GEOH5 data structure

GEOH5 facilitates efficient data management and processing. Building upon the strengths of HDF5, it introduces an effective structure to encapsulate geological data, including spatial and attribute information. The format employs a compact and intuitive tree structure, ensuring quick access to data and simplified data processing. This feature reduces the time spent on data retrieval and manipulation, significantly enhancing overall productivity. The main structure of the format is shown in Figure 1, as displayed by the free HDFview program (https://www.hdfgroup.org/downloads/hdfview). Groups, objects and data entities are stored in flat structures and indexed by a unique identifier as specified by the RFC 4122 standard (https://datatracker.ietf.org/doc/html/rfc4122). Entities hold references to their own children for rapid navigation. At the top level, the Root container points to the full hierarchy of the file, providing the complete linkage between all entities and their dependents, ensuring a seamless and organised structure for efficient access and retrieval of information. Groups are simple containers for other groups and objects. They are often used to assign special meanings to a collection of entities or to create specialised software functionality. The current set of objects implemented in the data structure supports a range of geological, geophysical, geotechnical, and mining data and model elements that can be attributed with properties: points, curves, surfaces, volumetric domains, drillholes, drillhole targets, rectilinear 2D grids, 3D regular and ‘tartan’ grids, 3D octree grids, 3D VP (vertical parameterisation) grids, raster images, thin plates (to support electromagnetic modelling), airborne and ground EM transmitters and receivers, airborne and ground gravity and magnetic surveys, magnetotelluric surveys, tipper (ZTEM) surveys, mine geometry, microseismic events, ground deformation, plus various other minesite data types. Data are currently always stored as a 1D array, even in the case of single-value data. New data types can be created at will by software or users to describe object or group properties. Data of the same type can exist on any number of objects or groups of any type, and each instance can be associated with vertices, cells, or the object/group itself. Some data type identifiers can also be reserved as a means of identifying a specific kind of data. Data attributes include specification of the primitive type with optional descriptive metadata (e.g., units and text description) and display parameters to be used by a viewer. Primitive types include float,



integer, text, referenced or categorical, datetime, filename (which must correspond to a stored binary file as a data instance), and blob (which must correspond to a binary dataset as a data instance).

Figure 1 At left, main structure of the GEOH5 file format. At right, Data, Groups and Objects entities are stored in flat HDF5 containers, each indexed by a unique identifier. Pointers to the child entities are given for rapid navigation through the tree structure.
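Because GEOH5 is standard HDF5 underneath, the tree shown in Figure 1 can be inspected with nothing more than the h5py library. The short sketch below walks a file and prints each group or dataset together with its attributes; the file name is hypothetical and group names may vary between GEOH5 versions.

```python
import h5py

def print_geoh5_tree(path="example.geoh5"):
    """Walk the flat Groups/Objects/Data containers of a GEOH5 file using plain h5py."""
    def visitor(name, node):
        kind = "group" if isinstance(node, h5py.Group) else "dataset"
        print(f"{kind:7s} {name}  attrs={dict(node.attrs)}")

    with h5py.File(path, "r") as geoh5_file:
        geoh5_file.visititems(visitor)
```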

GEOH5Py: An open-source API

We created an open-source Python API to facilitate reading from and writing to the GEOH5 format. With GEOH5Py, it is simple to build an application to read and write GEOH5, or to conveniently add GEOH5 to the import and export types supported by other software platforms. For example, we have used GEOH5Py to provide a conversion between OMF and GEOH5. With the help of the API, users can easily create, modify, and remove objects and data programmatically. The main component is the Workspace class. It handles all read/write operations performed on the data structure with simple function calls, as demonstrated in Figure 2. This high-level interaction with the GEOH5 storage format allows practitioners to easily leverage the rich Python ecosystem to build their own custom processing routines. GEOH5Py itself relies on the open-source NumPy and H5py packages. Documentation describing the format (geoh5py.readthedocs.io), and its GEOH5Py API, are available online and updated with every release.
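A minimal sketch of the Workspace-centred pattern demonstrated in Figure 2 is given below, following the pattern in the geoh5py documentation; exact constructor and method signatures may differ slightly between geoh5py releases, and the file name and data values are purely illustrative. The resulting .geoh5 file can be opened directly in the free Geoscience ANALYST viewer described in the next section.

```python
import numpy as np
from geoh5py.objects import Points
from geoh5py.workspace import Workspace

# Create (or open) a workspace backed by a .geoh5 file, add a Points object
# with 100 random vertices, and attach one float property to it.
workspace = Workspace("my_project.geoh5")

vertices = np.random.rand(100, 3) * 1000.0  # illustrative x, y, z coordinates
points = Points.create(workspace, name="sample_points", vertices=vertices)
points.add_data({"random_values": {"values": np.random.rand(100)}})

workspace.close()  # flush to disk (recent releases also support a context manager)
```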

Geoscience ANALYST: a free GEOH5 viewer

The utility of the freely downloadable (https://mirageoscience.com/mining-industry-software/geoscience-analyst) Geoscience ANALYST reader is a principal motivation for geoscientists to adopt GEOH5. It is a powerful viewer that displays file data and metadata in tables, charts, documents, maps, cross-sections, and 3D visualisations. In the PDF analogy to GEOH5, Geoscience ANALYST plays the role of the freely downloadable Adobe Acrobat reader — the existence of which is a principal motivation for users to adopt the PDF document standard. However, in contrast to the Acrobat reader, the Geoscience ANALYST reader can also import additional data and save them back to the GEOH5 file. Because you can create a new GEOH5 file or add to an existing one with the free viewer, Geoscience ANALYST is much more powerful than the free Acrobat reader in this analogy. It is intended that Geoscience ANALYST preserves data it does not understand (and is generally very tolerant with regards to missing information) when loading and saving GEOH5 files. This will allow third parties to write this format easily, as well as to include additional information for their own purposes that is not included in the formal specification. In the current implementation, Geoscience ANALYST automatically removes unnecessary information when saving.

Figure 2 Example demonstrating the creation of a new GEOH5 file containing a Points object and associated data, with the file contents viewed by the Geoscience ANALYST reader.


Geoscience ANALYST presents data object and property names in a conventional tree structure. Currently supported object types are points, curves, triangulated wireframe surfaces, drillholes, 2D (map) grids, 2D geophysical grids (curves in X-Y, vertically oriented, and topographically draped), multiple types of 3D grids (regular cell size, ‘tartan’ grid, octree grid, vertical prism), and rasters. It provides multiple, linked object

and property visualisation modes: cartesian and spherical (for global views) 3D cameras, 2D map views, cross-sections, 2D data profiles, decay curves, drillhole monitoring, scatter plots, box-and-whisker plots, histograms, and tabular data displays. When one or more points are selected in any of the display panels, the same points are indicated in all open display panels. The Geoscience ANALYST interface is illustrated in Figure 3,

Figure 3 The Geoscience ANALYST interface, with navigable tree structure at left, 3D camera and data table at centre, and an external database query panel at right.

Figure 4 A Python processing routine in a Jupyter notebook (at left) provides its output as a new or updated GEOH5 format file, the 3D visualisation of which is refreshed with a click in the free Geoscience ANALYST viewer application at right.


Figure 5 A version of Geoscience ANALYST illustrating the embedding of open-source SimPEG (https://simpeg.xyz) Python geophysical inversion codes directly into the menu system.

depicting underground mine infrastructure and microseismic events in the example. The combination of GEOH5, GEOH5Py, and Geoscience ANALYST provides a dynamic environment for research and software prototyping for geoscientists because of how easily it connects open-source Python libraries, open-source GEOH5 and GEOH5Py, and a free and powerful 3D viewer into which a wide array of contextual data and model elements (drillholes, geophysical data, geological models, etc.) can be easily imported. Figure 4 demonstrates a simple example in which the output of a Python data processing code written in a Jupyter notebook is easily displayed in 3D in Geoscience ANALYST, using GEOH5 as the common data structure. This capability has enabled us to create geoapps (https://pypi.org/project/geoapps), a central repository of open-source geoscience interfaces and applications, including geological and geophysical data processing, modelling, and inversion codes. (We have also created paid versions of Geoscience ANALYST that fully encapsulate open-source and proprietary processing and modelling functions, and that permit users to add access to Python applications directly to the Geoscience ANALYST user interface—see Figure 5.)

Conclusions

Although only a few years old, the GEOH5 data structure is already in use by many thousands of users with broad acceptance across the minerals industry. This includes geological survey organisations that are using it as a convenient, compact, and permanently accessible means of disseminating models and data with embedded metadata. Anyone can build an application to read and write GEOH5, or conveniently add it to the import and export types supported by modelling platforms. In addition to portability, the freely available data structure, API, and visualisation system provides significant benefits to open-source geoscience modelling initiatives, allowing modelling researchers to focus on modelling technology rather than the creation of data structures, user interfaces, and visualisation systems to support their work. The Python API provides a convenient mechanism for immediately visualising the results of Python modelling and data processing routines in the Geoscience ANALYST viewer at no cost, relieving Python application developers of the need to reinvent geoscience domain interfaces and visualisation methods.

References

James, S., Foks, L. and Minsley, B. [2022]. GSPy: A new toolbox and data standard for geophysical datasets. Frontiers in Earth Science, 10, 907614. https://doi.org/10.3389/feart.2022.907614.

Lightsom, F.L., Hutchison, V.B., Bishop, B., Debrewer, L.M., Govoni, D.L., Latysh, N. and Stall, S. [2022]. Opportunities to improve

alignment with the FAIR Principles for U.S. Geological Survey data: U.S. Geological Survey Open-File Report 2022-1043, 23 p. https://doi.org/10.3133/ofr20221043.


CALENDAR OF EVENTS 25-27 MARCH 2024

4th EAGE Digitalization Conference & Exhibition Paris, France www.eagedigital.org

February 2024 12-14 Feb

IPTC 2024 www.iptcnet.org

Dhahran

Saudi Arabia

19-21 Feb

EGYPES 2024 www.egypes.com

Cairo

Egypt

26-28 Feb

First EAGE Data Processing Workshop www.eage.org

Cairo

Egypt

29 Feb 1 Mar

GeoTHERM Expo & Congress 2024 www.geotherm-offenburg.de/en

Offenburg

Germany

4-6 Mar

EAGE Sub-Saharan Africa Energy Forum www.eage.org

Windhoek

Namibia

25-27 Mar

4th EAGE Digitalization Conference & Exhibition www.eagedigital.org

Paris

France

EAGE GeoTech 2024 - 3 rd EAGE Geoscience Technologies and Applications Conference

The Hague

The Netherlands

First EAGE Workshop on Advances in Carbonate Reservoirs: from Prospects to Development www.eage.org

Kuwait City

Kuwait

1-2 May

Seismic2024 www.spe-aberdeen.org/events/seismic-2024

Aberdeen

United Kingdom

13-15 May

Fourth EAGE/AAPG Hydrocarbon Seals Workshop www.eage.org

Al Khobar

Saudi Arabia

13-15 May

6th Asia Pacific Meeting on Near Surface Geoscience and Engineering www.eage.org

Tsukuba

Japan

13-16 May

InterPore 2024 events.interpore.org/event/46/

Qingdao

China

20-23 May

EAGE/SUT Workshop on Integrated Site Characterization for Offshore Renewable Energy www.eage.org

Boston

United States

27-31 May

12 th Congress and Technical Exhibition of the Balkan Geophysical Society (BGS) www.ageserbia.org

Kopaonik Mt.

Serbia

85 th EAGE Annual Conference and Exhibition www.eageannual.org

Oslo

Norway

March 2024

April 2024 8-10 Apr

(including 4 th EAGE Workshop on Distributed Fibre Optic Sensing and 4th EAGE Workshop on Practical Reservoir Monitoring)

www.eagegeotech.org

23-25 Apr May 2024

June 2024 10-13 Jun


July 2024
30-31 Jul        EAGE Workshop on Advanced Petroleum Systems Assessments: In Pursuit of Differentiated Barrels, Kuala Lumpur, Malaysia (www.eage.org)

August 2024
12-13 Aug        3rd EAGE Conference on Carbon Capture and Storage Potential, Perth, Australia (www.eage.org)
14-15 Aug        4th EAGE Workshop on Fiber Optic Sensing for Energy Applications, Perth, Australia (www.eage.org)
14-15 Aug        EAGE/SUT Workshop on Integrated Site Characterization for Offshore Wind, Perth, Australia (www.eage.org)
18-23 Aug        Goldschmidt 2024, Chicago, United States (conf.goldschmidt.info/goldschmidt/2024/meetingapp.cgi)

September 2024
2-4 Sep          Fourth EAGE Marine Acquisition Workshop, Oslo, Norway (www.eage.org)
2-5 Sep          ECMOR 2024: European Conference on the Mathematics of Geological Reservoirs, Oslo, Norway (www.ecmor.org)
8-12 Sep         Near Surface Geoscience Conference and Exhibition 2024, Helsinki, Finland (www.eagensg.org)
12-13 Sep        First EAGE Workshop on The Role of AI in FWI, Cartagena, Colombia (www.eage.org)
16-18 Sep        Eighth EAGE High Performance Computing Workshop, Jeddah, Saudi Arabia (www.eage.org)
17-19 Sep        Fourth EAGE Conference on Pre-Salt Reservoir, Rio de Janeiro, Brazil (www.eage.org)
23-24 Sep        Asia Petroleum Geoscience Conference and Exhibition (APGCE), Kuala Lumpur, Malaysia (icep.com.my/apgce)

October 2024
3-4 Oct          Third EAGE Workshop on EOR, Buenos Aires, Argentina (www.eage.org)
6-8 Oct          EAGE Workshop on Naturally Fractured Rocks (NFR), Muscat, Oman (www.eage.org)
14-16 Oct        Third EAGE Conference on Seismic Inversion, Naples, Italy (www.seismicinversion.org)
15-16 Oct        EAGE Conference on Unleashing Energy Excellence: Digital Twins and Predictive Analytics, Kuala Lumpur, Malaysia (www.eage.org)
21-24 Oct        GEO 4.0: Digitalization in Geoscience Symposium, Al Khobar, Saudi Arabia (www.eage.org)
24-25 Oct        Third EAGE Workshop on Advanced Seismic Solutions in the Gulf of Mexico, Mexico City, Mexico (www.eage.org)
29-30 Oct        EAGE Workshop on Borehole Geophysics for CCUS and Energy Transition, Hangzhou, China (www.eage.org)
29-31 Oct        Fourth SPE/EAGE Geosteering and Well Placement Workshop, Al Khobar, Saudi Arabia (www.eage.org)

November 2024
4-6 Nov          First EAGE Workshop on Tectonostratigraphy of the Arabian Plate, Al Khobar, Saudi Arabia (www.eage.org)
4-7 Nov          Fifth EAGE Global Energy Transition Conference and Exhibition, Rotterdam, The Netherlands (www.eageget.org)
5-7 Nov          First EAGE Conference on Energy Opportunities in the Caribbean, Port of Spain, Trinidad & Tobago (www.eage.org)
12-13 Nov        2nd EAGE Workshop on Integrated Subsurface Characterization and Modeling, Kuala Lumpur, Malaysia (www.eage.org)



NEW BOOK RELEASED

Pre-Cambrian to Paleozoic Petroleum Systems of the Arabian Plate Edited by Thomas B. van Hoof

GET YOUR COPY NOW ON EARTHDOC: WWW.EARTHDOC.ORG
AVAILABLE IN EPUB FORMAT, compatible with various e-readers

Benefit from an additional journal subscription!
[Advert image: covers of EAGE scientific journals and the January 2024 issue of First Break]

As an EAGE member, you have free online access to a journal of your own choice. But did you know that it is also possible to subscribe to other EAGE scientific journals for an additional fee?

FOR MORE INFORMATION PLEASE VISIT EAGE.ORG/MEMBERSHIP



JOIN THE LARGEST MULTIDISCIPLINARY GEOSCIENCE & ENGINEERING EVENT

From June 10-13, 2024 in Oslo, experience cutting-edge events with a focus on: "Technology and talent for a secure, sustainable energy future." Be a part of thought-provoking discussions shaping our sustainable energy journey with innovative technology and talent development. See you there!

REGISTER NOW AT EAGEANNUAL.ORG
ALL ACCESS PASS: BUNDLE & SAVE
Technical & Strategic Programs | Exhibitions, Theatres & Networking | Workshops, Field Trips & Hackathons

HOST SPONSOR

The Association aims to advance geosciences and related engineering, encourage innovation and technical progress, and enhance communication, fellowship, and cooperation among individuals involved or interested in these fields.


25-27 MARCH 2024 I PARIS I FRANCE

DELIVERING BETTER ENERGY IN A TRANSFORMING WORLD

JOIN EAGE'S FLAGSHIP EVENT HIGHLIGHTING DIGITALIZATION AND TECHNOLOGICAL INNOVATION IN THE SUBSURFACE

TECHNICAL PROGRAMME: Delve into Diverse Oral and Poster Presentations
STRATEGIC PROGRAMME: Engage with Executive Speakers in Keynotes, Roundtables and Panel Discussions
FOCUSED EXHIBITION: Meet the Companies Pioneering Digital Solutions

REGISTER NOW!

TO CLAIM YOUR SPOT: REGULAR REGISTRATION ENDS ON 10 FEBRUARY

HOSTED BY

WWW.EAGEDIGITAL.ORG

