Next Generation Data Validation and Equation Based Reconciliation Software Technology for Offshore Installations

Collecting Data and Generating Information
Measuring Operational Complexity
Reconciling Errors
Meters and Metrics
Published by Global Business Media
SPECIAL REPORT: NEXT GENERATION DATA VALIDATION AND EQUATION BASED RECONCILIATION SOFTWARE TECHNOLOGY
John Hancock, Editor
Published by Global Business Media
Global Business Media Limited, 62 The Street, Ashtead, Surrey KT21 1AT, United Kingdom
Switchboard: +44 (0)1737 850 939
Fax: +44 (0)1737 851 952
Email: email@example.com
Website: www.globalbusinessmedia.org

Publisher: Kevin Bell
Editor: John Hancock
Senior Project Manager: Steve Banks
Business Development Director: Marie-Anne Brooks
Advertising Executives: Michael McCarthy, Abigail Coombes
Production Manager: Paul Davies

For further information visit: www.globalbusinessmedia.org

The opinions and views expressed in the editorial content in this publication are those of the authors alone and do not necessarily represent the views of any organisation with which they may be associated. Material in advertisements and promotional features may be considered to represent the views of the advertisers and promoters. The views and opinions expressed in this publication do not necessarily express the views of the Publishers or the Editor. While every care has been taken in the preparation of this publication, neither the Publishers nor the Editor are responsible for such opinions and views or for any inaccuracies in the articles.

© 2013. The entire contents of this publication are protected by copyright. Full details are available from the Publishers. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the copyright owner.

Contents

Foreword – John Hancock, Editor

Next Generation Data Validation and Equation Based Reconciliation Software Technology for Offshore Installations – Antonio MARTINO, Georges DE VOS, Belsim SA
Introduction; DVR Theoretical Background; Post-Reconciliation Consistency Check; DVR Application from Downstream to Upstream Areas of the Oil and Gas Industry; New DVR Applications – Virtual Flow Metering; Belsim’s VALI Data Validation and Reconciliation Solution; Conclusions; Belsim Vali Success Stories and References in the Upstream Oil and Gas Sector

Collecting Data and Generating Information – John Hancock, Editor
Why Data and Information?; Using the Measurement; Other Factors; Where to Measure

Measuring Operational Complexity – Peter Dunwell, Correspondent
Field Management; Looking After the Field and Equipment; A Changing Operating Environment; Further and Deeper; Life Stages; The Case for Reliable Measurement

Reconciling Errors – John Hancock, Editor
Sources of Error; Coping with Variances; Data Validation and Reconciliation; Data Validation; Data Reconciliation; Links in the Chain

Meters and Metrics – Francis Slade, Staff Writer
Algorithms – Complex but Necessary; Reducing Uncertainty; Flow Meters; Going Digital

References 16

WWW.OFFSHORETECHNOLOGYREPORTS.COM | 1
Foreword

A TERM THAT is often used around businesses where methodologies have been around for a while is ‘work smarter not harder’. It sounds a bit obvious but it is also an appropriate term for our times. After all, what has driven the second industrial revolution has not been some new mechanical wonder (the steam engine?) or even some new methodology for arranging a series of mechanical actions into a single system. It has been the improvement in, and availability of, permanent open communication capabilities combined with greater volumes of knowledge about everything, based on improved means to record, process and present data… measurement. The convergence of these two phenomena and the power of digital technology to compress them into ever smaller packages have made ‘work smarter…’ more than a slogan: it’s a business reality; even a necessity.

This Special Report opens with an article that looks at Data Validation and Reconciliation (DVR) and traces its development from the 1960s. While DVR was originally applied to downstream oil and gas processes, in the last 10 years it has also been applied in the upstream oil and gas industry. Nowadays it is widely used in many industrial installations all over the world. The article goes on to describe VALI, a process software developed and distributed by Belsim SA, whose aim is to determine and improve production efficiency, productivity and energy consumption, while reducing production and maintenance costs.

Data validation and reconciliation has been a real beneficiary of this latest revolution, with better and more effective ways of recording, manipulating and presenting data and, in particular, of handling inconsistencies in data so that the user gets information corrected to recognised and universally accepted standards. That has significant benefits for field management and output allocation.

Also in this Report we have an article on why data has to be collected and why the information generated from it is used in the oil and gas industry. Peter Dunwell then reviews how the sector is developing and what part good quality data has to play in that. We then look at how data might not always be accurate, some of the reasons why that can be so, and what can be done to resolve the problem. Finally, Francis Slade considers how developments in systems and methodologies are converging to improve information available to all parties involved in oil and gas exploration and production.
John Hancock, Editor
John Hancock joined as Editor of Offshore Technology Reports in early 2012. A journalist for nearly 25 years, John has written and edited articles and papers on a range of engineering, support services and technology topics as well as for key events in the sector. Subjects have included aero-engineering, testing, aviation IT, materials engineering, weapons research, supply chain, logistics and naval engineering.
Next Generation Data Validation and Equation Based Reconciliation Software Technology for Offshore Installations Antonio MARTINO, Georges DE VOS, Belsim SA
Introduction Data validation and reconciliation (DVR) is a technology that has gained importance as industrial processes have become more and more complex. DVR started in the early 1960s with the aim of closing material balances in production processes and of identifying and eliminating gross errors. In the late 1960s, unmeasured variables were also taken into account in the data reconciliation formulation. During the 1980s the field of DVR matured further by considering general nonlinear equation systems derived from thermodynamic models. The principle of DVR is to use process information and statistical and mathematical methods to automatically correct measurements in industrial processes, while closing mass and energy balances. DVR allows operators to obtain accurate and reliable information about their installation from raw measurement data. It produces a single consistent set of data representing the actual process operation. Historically, DVR has been applied in the downstream oil and gas industry, as well as in chemical and petrochemical sites. Over the last 10 years this technology has found its place in
the upstream oil and gas industry (offshore, onshore and subsea installations). As opposed to downstream applications, upstream DVR models have a much simpler process topology. On the other hand, the process itself is much more challenging due to the generally low quality of data and the high variability of the process. Indeed, especially in subsea installations, the initial quality of the flow meters is rather poor, and calibration of sensors is very costly. Moreover, the process is often dynamic and far from steady state, in particular during start-up and shutdown phases. All these factors make DVR a technology with very high added value, while the configuration of DVR models in upstream environments requires a high level of expertise.
DVR Theoretical Background Process measurements are never 100% correct. As a result, an operator has to face incoherencies in production balances when using raw measurements. Measurement errors are caused by sensor quality (intrinsic sensor accuracy, sensor calibration, sensor location) and by instabilities in plant operations. Typical errors are illustrated below:
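As an illustrative sketch (not taken from the report), the typical error modes just named can be simulated on a hypothetical true flow of 100 t/h: pure random noise, a constant calibration bias, and a drift that grows over time.

```python
import random

# Illustrative only: hypothetical magnitudes for three measurement error modes
random.seed(7)
true_value = 100.0                                           # true flow, t/h
noise = [random.gauss(0.0, 1.5) for _ in range(24)]          # intrinsic sensor accuracy
random_error = [true_value + e for e in noise]               # random noise only
systematic_bias = [true_value + 4.0 + e for e in noise]      # constant calibration offset
sensor_drift = [true_value + 0.3 * t + e                     # error growing with time
                for t, e in zip(range(24), noise)]
```

A reconciliation scheme must treat these differently: random noise averages out and can be corrected statistically, whereas bias and drift are gross errors that must be detected and handled separately.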
Data Validation and Reconciliation consists of two processes: 1. The Validation process denotes all verification actions before and after the reconciliation step, through various levels of filtering. This ensures a reasonable quality of the process information. 2. The Reconciliation process is a technique that simultaneously corrects each measurement as little as possible, taking into account the measurement uncertainty, so that all balances and constraints are satisfied. This ensures coherency of all data according to the process model. Reconciliation is mathematically expressed as an optimization problem of the following form (given n measurements y_i):

    min over y*, x of  Σ_{i=1..n} ((y_i* − y_i) / σ_i)²
    subject to  F(x, y*) = 0
    x_min ≤ x ≤ x_max,  y_min ≤ y* ≤ y_max

where y_i* and y_i are respectively the reconciled and the measured value of measurement i, x_j is unmeasured variable j, and σ_i is the uncertainty of the measurement at a given confidence level, e.g. 95%. F(x, y*) = 0 are the p process equality constraints, and x_min, x_max, y_min, y_max are the bounds on the unmeasured and measured variables. Each term ((y_i* − y_i) / σ_i)² is the penalty associated with measurement i; the reconciled values are those for which the sum of these penalties is the minimum possible.

The goal of the reconciliation is to minimize the correction needed to satisfy the system constraints. DVR uses mathematical models to describe a specific system (unit, plant) by a set of variables and a set of equations that establish relationships between the variables. This is shown in the diagram below.

Data reconciliation is linked to the concept of redundancy. The minimum number of pieces of information required to calculate all of the system parameters and variables defines the just-determined case: with no additional measured information, the system has a redundancy level of zero. If there are more measurements than this minimum, there is positive redundancy; the additional measurements can be used to cross-check and correct the other measurements and to increase their accuracy and precision. If there are not enough measurements available to satisfy the system, redundancy is negative.
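A minimal numerical sketch of this formulation, assuming a single linear mass-balance constraint (feed = gas + liquid) so the optimization has a closed-form Lagrange-multiplier solution. The measurement values and uncertainties are hypothetical, and this is not Belsim's VALI code:

```python
import numpy as np

# Separator mass balance as the single equality constraint: a . y* = 0
# with a = [1, -1, -1], i.e. feed - gas - liquid = 0.
y = np.array([100.0, 42.0, 55.0])     # raw measurements (t/h): feed, gas, liquid
sigma = np.array([2.0, 1.0, 1.5])     # measurement uncertainties (standard deviations)
a = np.array([1.0, -1.0, -1.0])       # balance coefficients

# Minimizing sum(((y*_i - y_i)/sigma_i)^2) subject to a . y* = 0 gives,
# via a Lagrange multiplier, the closed-form correction below.
lam = (a @ y) / np.sum(a**2 * sigma**2)
y_star = y - sigma**2 * a * lam       # reconciled values satisfy the balance exactly

penalties = ((y_star - y) / sigma) ** 2   # penalty of each correction
```

Note how the least accurate measurement (the feed, with σ = 2.0) absorbs the largest share of the 3 t/h imbalance, exactly as the weighting by uncertainty intends.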
The result of the reconciliation is shown in the opposite picture, where the measurement y is corrected to y* and its standard deviation is improved. The next figure shows a separator where the three measurements M give an imbalance (what goes in differs from what goes out). By using the above formulation, the VALI software can reconcile the information R by computing the penalty P and using the standard deviation σ.

Post-Reconciliation Consistency Check
Gross errors are measurement errors that may bias the reconciliation results. They are in strong conflict with the fundamental assumption of a steady-state process when applying data reconciliation. Therefore it is important to detect and eliminate these gross errors from the measurement data. Gross errors can be classified into two types: (a) performance-related gross errors, caused by sensor drift, wrong calibration of instruments, complete instrument failure and systematic sensor bias; (b) gross errors related to unaccounted-for losses, such as material or energy losses due to leaks. Often gross errors are detected by means of statistical tests such as the chi-square test, which combines all measurement penalties. Once the gross errors are detected and identified, they are handled separately, and the reconciliation can be done without these faulty measurements, which would otherwise spoil the reconciliation process. The chart above illustrates the Belsim VALI workflow for DVR.
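A minimal sketch of such a global chi-square test, with hardcoded 95% critical values and hypothetical penalty values; VALI's actual gross error remediation is more elaborate than this:

```python
# 95% chi-square critical values for small degrees of freedom (standard tables)
CHI2_95 = {1: 3.841, 2: 5.991, 3: 7.815, 4: 9.488, 5: 11.070}

def has_gross_error(penalties, redundancy):
    """Global test: with only random errors present, the sum of the
    reconciliation penalties follows a chi-square distribution whose
    degrees of freedom equal the redundancy level; exceeding the
    critical value suggests a gross error in the measurement set."""
    return sum(penalties) > CHI2_95[redundancy]

# Penalties ((y* - y)/sigma)^2 from two hypothetical reconciliation runs
clean_run = has_gross_error([0.03, 0.41, 0.17], redundancy=1)   # total 0.61
faulty_run = has_gross_error([0.10, 6.80, 0.25], redundancy=1)  # total 7.15
```

When the test fires, the individually largest penalties are the natural suspects to set aside before reconciling again.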
DVR Application from Downstream to Upstream Areas of the Oil and Gas Industry DVR was originally applied to downstream oil and gas processes, where the feasibility of the concept was proven to supply reliable information. Nowadays it is widely used in many industrial installations all over the world. A recent high-technology example is a brand new, highly complex refinery, where the Belsim DVR software VALI has been implemented to reconcile the whole refinery's process data on a daily basis. In the last 10 years DVR has found several applications in the upstream oil and gas industry. One of the main applications for the upstream industry is within subsea installations. One reason is the extremely high maintenance cost in case of failure in subsea installations. DVR provides backup of production data, especially in case of sensor drift or complete failure. From a financial point of view, in cases of facilities that are shared between two or more
operators, this technology will help in providing the most accurate possible calculation of the share of revenue based on actual oil produced.
New DVR Applications – Virtual Flow Metering DVR technology in the upstream oil and gas industry (offshore, onshore and subsea installations) can be used as a Virtual Flow Metering (VFM) system. By using the existing measurements, along with their accuracy, the oil, water and gas flow rates can be estimated in real time. Well flow rate metering and surveillance are essential for reservoir characterization and for optimizing oil production. However, accurate well flow rate estimation is difficult to achieve in old installations or where sensors have failed due to wrong calibration or problems with the meter itself. DVR-based VFM systems estimate well flow rates without any change to the instrumentation layout. Flow rate corrections are calculated where field-collected measurements are available, and flow rates are estimated when there are no (usable) measurements to hand. The physical model of the installation provides the flow rates using rigorous thermodynamic equations. The non-exhaustive table below lists the locations and parameters that define the various types of equations used to estimate the flow rates (oil, water and gas).
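As one illustrative example of such an equation, a generic liquid valve-flow relation can link flow rate to the pressure drop across the choke. The Kv value and operating data below are hypothetical, and a real VFM model uses rigorous thermodynamic equations rather than this simplification:

```python
import math

def liquid_rate_from_choke(kv, p_up_bar, p_down_bar, rho_kg_m3):
    """Generic metric valve relation Q (m3/h) = Kv * sqrt(dP (bar) / SG)."""
    sg = rho_kg_m3 / 1000.0          # specific gravity relative to water
    dp = p_up_bar - p_down_bar       # pressure drop across the choke
    return kv * math.sqrt(dp / sg)

# Hypothetical well: 24 bar drop across the choke, 850 kg/m3 crude
q_m3_h = liquid_rate_from_choke(kv=25.0, p_up_bar=42.0, p_down_bar=18.0,
                                rho_kg_m3=850.0)
```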
DVR-based VFM systems estimate the flow rate by simultaneously using all the available equations (similar to those mentioned above) to make a single estimation, taking into account the uncertainty linked to each measurement. This approach yields the best possible estimation of the flow rate thanks to the exploitation of information redundancy. The system also forces all the data to be coherent (measurements and flow models), ensuring that all the parameters used in the flow models (choke curve, pressure drops, etc.) are consistent with all the measurements in the installation. A screenshot of the VALI model for a pumped well is shown above. There are several benefits to applying DVR technology in an upstream oil and gas installation: • Allocation of individual well flow rates in real time • Less downtime due to well testing and early problem detection • Improved revenue control due to hourly calculation of well-produced volumes • Optimization of oil/gas/water flow rates due to reliable real-time data • Detection and monitoring of phenomena such as water breakthroughs • Flow rate determination on poorly instrumented wells • Early problem detection, e.g. loss of production
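The "single estimation from all available equations" idea can be sketched as inverse-variance weighting: each equation yields its own rate estimate with an uncertainty, and the fused estimate is more precise than any single input. The figures are hypothetical, and this simple fusion ignores the cross-equation coherency constraints that a full DVR model enforces:

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of (value, sigma) estimates."""
    weights = [1.0 / s**2 for _, s in estimates]
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return value, sigma

# Hypothetical rate estimates (Sm3/d) for one well from three equations:
# choke curve, tubing pressure drop, pump performance model
q_hat, s_hat = fuse_estimates([(1250.0, 80.0), (1180.0, 120.0), (1230.0, 60.0)])
```

The fused uncertainty comes out below the best single input's 60 Sm3/d, which is the redundancy benefit the text describes.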
Belsim’s VALI Data Validation and Reconciliation Solution VALI is a process modeling software developed and distributed by Belsim. Its aim is to improve production efficiency, productivity and energy consumption, while reducing production and maintenance costs.
VALI gives platform managers better insight into the actual performance and closed mass and energy balances. Thanks to advanced data coherency treatment, VALI provides reliable and accurate information based on installation and lab measurement data. VALI is the most powerful equation-based process performance DVR software on the market. It uses information redundancy and conservation laws to correct measurements and convert them into accurate and reliable information. VALI is used in upstream oil and gas, refineries, petrochemical and chemical plants, as well as power plants including nuclear power stations. VALI detects faulty sensors and identifies degradation of equipment performance (heat rate, compressor efficiency, etc.). The plant measurements, including lab analyses, are reconciled in such a way that mass (on a component-per-component basis) and heat balances are satisfied. L/V equilibrium and performance constraints are added when applicable. Unmeasured values are calculated, and VALI also quantifies the precision of validated values, in particular the KPIs (key performance indicators).

VALI features the following capabilities:

Most rigorous DVR implementation: VALI directly solves the Data Validation and Reconciliation problem using its dedicated internal solver, without simplification or linearization of the problem. This ensures that the results have the best possible quality.

Thermodynamics: VALI's calculation engine relies on the most powerful thermodynamic calculation package, which is necessary to represent the closest possible image of the process in the DVR model. Only by including thermodynamic calculations can one be sure that the mass and energy balance calculations are trustworthy.

Seamless integration of mass and energy balances: In VALI the user can easily choose which equations to utilize around the different pieces of equipment. Depending on the measurements available, one can apply a mass balance, a mass and energy balance, or any equipment-specific equations. No separate models or software packages are necessary for mass balancing and for energy balancing.

Gross errors: VALI includes unique gross error remediation techniques which efficiently identify and remove (or handle separately) gross errors found in the set of measurements. This not only guarantees convergence but also ensures that the results are trustworthy, without doubtful measurements.

Flexibility: VALI allows the easy introduction of any type of user-defined equations or KPI calculations in the model. For KPIs included in the model, VALI automatically flags questionable results, not only the KPI value.

CAPE-OPEN: in VALI, users may include external thermodynamic methods through the CAPE-OPEN standard. This extends the thermodynamic calculation capabilities even further.

High redundancy: Thanks to the possibility of including any type of energy or performance equations as well as composition information, VALI models have the highest possible level of redundancy, which means the highest possible quality in terms of trustworthiness.

Inequalities: VALI has the unique capability to include inequality constraints in the model. The solver in VALI has been designed to actively take these types of inequality constraints into account.

Solver: the embedded solver in VALI is a dedicated solver (based on a modern SQPIP algorithm) written to handle complex reconciliation problems and to solve them in a short time.
Non-standard compounds: VALI has the unique capability to easily define mixtures of standard compounds by calculating the characteristics (boiling point, etc.) of these non-standard compounds.

As the acknowledged leading DVR package on the market, VALI is perfectly suited as a VFM system for upstream oil and gas installations. As can be seen below in the plot coming from a Belsim customer, VALI is able to continuously provide flow rates per phase (oil, gas and water) with a maximum 5% deviation from the sporadic well test results. VALI can also detect faulty sensors and correct them, as shown below for a pressure sensor.
Collecting Data and Generating Information John Hancock, Editor Like any business activity, offshore oil and gas needs to know what is happening, how much product is the result and what it is worth
Why Data and Information? In very important ways, data and information are the lifeblood of any business. Offshore oil and gas is no exception to this rule, and it might be considered that, given the unstructured nature of its source at one end of the stream and the key economic importance of its final product at the other, data and information are even more important in this sector than in others. Data is usually any measurement, i.e. numbers, that can be derived from the process, whereas information is an arrangement of data to show trends and comparative realities which, in turn, can be used to support decision-making and much else besides. In the UK, the Department of Energy and Climate Change (DECC) in its paper ‘Oil and gas: field data’ has compiled a comprehensive compendium of data for oil and gas wells around the UK, the headings of which give a good insight into what has to be measured1. While there may be many volumes and rates that need to be measured, probably the most important one is production: as the introduction to OnePetro’s 2011 ‘International Petroleum Technology Conference’2 put it, “how can you effectively manage the wells if you do not continuously know what they are producing?”
Using the Measurement With modern offshore oil and gas operations, the challenge is not so much the information available as making good use of it. “The first key issue is integrating and interpreting reservoir monitoring data. In fact, one of the biggest challenges in reservoir monitoring today is managing the vast reams of data that monitoring and other subsea equipment generates.”3 One key application is to support ‘allocation’, or the need to distribute the costs, revenues and taxes among multiple players collaborating on field development and production of oil and gas. “Results from the allocation process are important feed into production reporting to governments and partners, and allocation results may also feed operators internal systems for product sales,
accounting, enterprise resource planning, data warehouse and management information.”4 Allocation helps determine ownership of hydrocarbons, because contributing elements to a commingled flow or in a shared storage facility may have different owners. The terms ‘allocation’ and ‘hydrocarbon accounting’ are sometimes used interchangeably, but hydrocarbon accounting has the wider scope, using allocation data in a petroleum management process by which ownership of extracted hydrocarbons is determined and tracked from a point of sale or discharge back to the point of extraction. In this way hydrocarbon accounting also covers inventory control, material balance, and practices to trace ownership. Back to Wikipedia (see 4 above): “Allocation is an on-going process based on flow or volume measurements, and gives the distribution of contributing sources, often with a final calculation per day which, in turn, provides the basis for a daily production report in the case of a field that produces hydrocarbons.” Data and information are important contributors to the industrial process and to the financial business that accompanies it. So the answer to the question ‘why measure?’ can include all of the above or could be summarised in the opening words of the Exploration & Production article ‘Going downhole to support production’5: “As operators’ reservoirs and subsea portfolios become ever more complex and as pressure increases to both maximize production and extend the field’s lifecycle, the need for accurate, downhole information has never been more important.” There is an increasing requirement for gathering data throughout the seabed and subsurface journey of oil and gas.
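The daily allocation calculation described above can be sketched as pro-rata back-allocation, one common approach; the well names and volumes here are hypothetical:

```python
def allocate(export_total, well_estimates):
    """Scale each well's estimated volume so the allocated volumes sum
    exactly to the fiscally metered export total (pro-rata allocation)."""
    est_sum = sum(well_estimates.values())
    return {well: export_total * v / est_sum
            for well, v in well_estimates.items()}

# Fiscal meter reads 9,800 m3 for the day; well estimates total 10,000 m3
daily = allocate(9800.0, {"A-1": 4000.0, "A-2": 3500.0, "B-1": 2500.0})
```

Each well's share, and hence each owner's revenue, inherits the quality of the well estimates, which is why validated and reconciled flow data matter so much for allocation.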
Other Factors But it isn’t only the operators and producers themselves who need to know and be able to allocate measurements from the operations of working hydrocarbon extraction. One reason why governments are keen to discover and exploit oil and gas reserves in their territory is to generate
massive volumes of tax revenue. Going again to the case of the UK, the Department of Energy and Climate Change includes a department for Licensing, Exploration and Development (LED) whose responsibilities include the regulation of fiscal oil and gas measurement, i.e. working out what tax is due. Taxes charged on oil and gas production include the same ones that are charged on any activity in the economy, but with a couple of special additions. One of those is Petroleum Revenue Tax (PRT), which seeks to tax a high proportion of the ‘economic rent’ (superprofits) from exploitation of the UK’s oil and gas. Another is Ring Fence Corporation Tax (RFCT), which ensures that corporation tax on profits from oil extraction activities is paid in full as the profits accrue. It can in fact become even more complex when not only do multiple owners share facilities, but facilities and infrastructure may also be shared across more than one field and even, in cases such as the North Sea, across two jurisdictions where fields and continental shelf territorial waters boundaries overlap.
Where to Measure Because of the high degrees of variability in the process and because of the impact of outside and natural forces on flows and production, measuring upstream activities has always been more problematic than in the more mechanical and systematic downstream activities, i.e. as the product moves from field to outlet it becomes increasingly manageable. For this reason among others, methods had to be generated to try to overcome the inconsistencies and variability of data from upstream activities. Data validation and reconciliation has been practised since the 1960s with regular improvements in the quality of results. For the foreseeable future, offshore oil and gas exploration and production will continue to see growth with Douglas-Westwood projecting more than 7000 fixed and more than 200 floating platforms, and with 190,000 km of pipeline currently installed plus a number of major modification programmes… in the next couple of years. The need for measurement leading to reliable information can only grow with that.
Measuring Operational Complexity Peter Dunwell, Correspondent And monitoring the myriad improvements and efficiencies that have evolved in recent times
MEASUREMENT OF data and output of information provide important windows onto oil and gas production. That is true throughout the life of a field, so that, “Key aspects for efficient and effective project start-up are well/reservoir surveillance and hydrocarbon accounting – it is important to know how much the wells are producing and the composition of the fluid streams, for maximum production, flow assurance, asset technical integrity and accounting purposes.”6 In fact, even during exploration, the flows of liquids and gases need to be measured every step of the way if the potential and exploitable value of the field being explored is to be properly assessed.
Field Management But it is particularly during the productive life of a field that data measurement will need to be validated and reconciled to meet the range of purposes for which it will be applied (see previous article). There are several values that arise from the application of data validation and reconciliation (DVR) techniques and systems to simplify data. It allows the accurate allocation of flow rates from each well and, because measurements are in real time, they can, to some extent, replace well testing procedures which means less downtime and earlier detection of problems to avoid loss of production. It will also improve revenue control, support the optimisation of flow rates and help monitor the condition of known flaws or issues in the system. Another and increasingly important aspect of oil and gas production anywhere, but especially offshore, is the welfare of that environment itself. The International Association of Oil & Gas Producers explains succinctly in the paper ‘Offshore environmental monitoring for the oil & gas industry’7: “the oil and gas industry conducts environmental monitoring in offshore areas where exploration, development, production and decommissioning activities take place. The information collected supports environmental management activities, assists in meeting
regulatory requirements and provides valuable data on the state of the marine environment.” These days, the environment and the closely related factor of safety have to be high on any operator’s priorities. Wherever possible, a proactive approach to both matters is best, and such an approach can only really happen if all of the conditions and outputs of a well are understood which, in turn, requires reliable measurement and data gathering.
Looking After the Field and Equipment So, “Measurements are needed to monitor process efficiency and equipment condition, but also to take care that operating conditions remain within acceptable range to ensure good product quality, avoid equipment failure and any hazardous conditions,” say Georges Heyen and Boris Kalitventzeff in their paper ‘Process monitoring and data reconciliation’8 “… operators are now faced with a lot of data, but they have little means to extract and fully exploit the relevant information it contains. Furthermore, plant operators recognize that measurements… are never error free. Using these measurements without any correction yields inconsistencies when generating plant balances or estimating performance indicators. Even careful installation and maintenance of hardware cannot completely eliminate this problem.”
A Changing Operating Environment One major development of recent years has been the growth in subsea processing facilities which further complicate the measurement challenge: subsea processing moves processing and the paraphernalia that accompanies it down to the sea bed so that some of the factors to be measured can only be captured within the subsea processing plant. Rigzone9 explains the attraction of subsea processing for an industry where all costs are high. “Originally conceived as a way to overcome the challenges of extremely deep water situations, subsea processing has
become a viable solution for fields located in harsh conditions where processing equipment on the water’s surface might be at risk. Additionally, subsea processing is an emergent application to increase production from mature or marginal fields.” Having installed such expensive plant at the bottom of the ocean and possibly, in the case of Norwegian Arctic fields, under sea ice for at least part of the year, measurement will be an essential component in the safe operation of that installation. And it is not just subsea processing that is creating challenges requiring measurement as part of their management. In his article ‘Dig Deep’, Norbert Dolle, reservoir engineer for Shell, explains: “Well-by-well production allocation in subsea clusters is challenging without separate test lines. Subsea and downhole measurement devices can help but may prove unreliable.” He continues by setting out the challenges that faced Shell in the early days following installation of its Penguin cluster in 2002. When its measurement equipment failed or produced erroneous readings, Shell found itself with little reliable information with which to optimise production allocation and field management. The solution was an integrated periodic testing-by-difference and data-driven model, which was able to create reliable data for the field.
Further and Deeper Another major development has been the drive to find reserves in increasingly distant and deep oceanic waters. Also, as oil and gas prices rise and as technology becomes more effective at exploiting reserves, marginal or technically demanding fields become more financially viable, and existing assets with low production are able to generate valuable profit margins once more. Again, the ability to measure these operations, and to be confident of the data that emerge, is an important contributor to commercial success.
Life Stages All subsea installations follow a similar pattern of life stages: exploration, proving reserves, building and installing structures, production and maintenance, decommissioning, and dismantling or making safe. At each of these stages, reliable data are important. But for many fields, an additional stage has been inserted: life extension. Life extension can be to continue exploiting a field where new technology, or simply the price of oil, has revived its profitability, or where new fields can only be profitably exploited by tying them back into the infrastructure of older fields. In this case, with a field and much of its infrastructure being operated beyond its original design life, reliable data become all the more important.
The Case for Reliable Measurement “… The growing geological complexity of offshore oil and gas fields, increasingly being found in remote deepwater locations in challenging and environmentally sensitive operating conditions, has increased the need to put effective reservoir monitoring processes in place. Consequently, operators today need an integrated and flexible reservoir monitoring system… They also need to be able to effectively visualise, manage and interpret the data, and ensure seamless flow of information from the field to the desktop.” This, from The American Oil & Gas Reporter10, sums up why exploration and production that pushes the physical boundaries also needs good-quality data to manage it. There are several reasons why measurements in upstream offshore oil and gas production environments might need validation and correction. The environment itself may generate a significant number of variables, but the properties of the product flowing through the measurement devices can also vary over the long term, and sometimes in the short term, as a reaction to geological conditions. Whatever the reason, reliable, validated, reconciled and/or corrected data are an absolute requirement of the modern offshore oil and gas industry.
Reconciling Errors John Hancock, Editor
Taking what is inconsistent and applying known logic to arrive at a reliable result
It can help to know what factors are causing measurement variations and there are several systems available to measure those outside influences in real-time
THERE ARE sound reasons why trustworthy measurements of oil and gas production and processing need to be available, not least of which is the value of reserves to local economies. However, extracting oil and gas from beneath the ocean, processing them and transporting the product to the surface (or vice versa) hardly takes place under laboratory conditions. So measurements taken are almost never 100% correct, which means that raw measurements are less likely to be accurate and therefore less likely to be useful. There are several factors that can affect accuracy: the sensors used to collect measurements, whose accuracy, calibration or location can affect readings; inconsistencies within the plant operation; and the impact of the surrounding environment.
Sources of Error The sources of error within sensors are well documented11:
• Intrinsic sensor precision is limited, especially… where robustness is… more important than accuracy;
• Sensor calibration is seldom performed as often as would be desired…;
• Signal converters and transmission add noise to the original measurement;
• Synchronization of measurements may also pose a problem...
Other errors arise from the sensor location or the influence of external effects.
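The cumulative effect of these error sources can be illustrated with a small simulation. This is a hypothetical sketch only: the bias, noise and drift figures are invented for illustration and are not taken from any real sensor specification.

```python
import random

random.seed(42)

TRUE_FLOW = 1000.0   # true flow rate, m3/d (illustrative)
BIAS = 8.0           # systematic offset from imperfect calibration
NOISE_SD = 5.0       # random noise from the sensor and signal chain
DRIFT_PER_DAY = 0.4  # slow calibration drift between maintenance visits

def reading(day):
    """One raw measurement: truth + bias + accumulated drift + noise."""
    return TRUE_FLOW + BIAS + DRIFT_PER_DAY * day + random.gauss(0.0, NOISE_SD)

# After a month without recalibration, the average reading no longer
# matches the true value: averaging removes the random noise but not
# the systematic bias and drift.
month = [reading(day) for day in range(30)]
avg = sum(month) / len(month)
print(f"true flow   : {TRUE_FLOW:.1f}")
print(f"month's avg : {avg:.1f}")
```

The point of the sketch is that no amount of averaging recovers the true value once systematic errors are present, which is why the correction techniques discussed below are needed.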
Coping with Variances In order to cope with errors and bring readings within acceptable parameters, increasing use is being made of data validation and reconciliation (DVR) technology. It can help to know what factors are causing measurement variations, and there are several systems available to measure those outside influences in real time. For instance, there are monitoring systems available to provide the means to measure ocean current profiles in ultra-deepwater, surface currents, waves, winds, and
structure positions and motions (all of which can affect process readings).12 It is also possible to establish reasonable parameters for measurements using a method called testing-by-difference, which “enables the determination of flow rates by reading a group of wells and subsequently retesting a subset of that group under different flow conditions.”13 The Offshore Technology article quoted above continues: “Determining the rate of a single well in a commingled production stream can be done by measuring the full stream, as well as the stream minus one well. The difference between these rates corresponds to the rate of the removed well. This method is particularly useful where test facilities are inadequate or where shutting some wells to test others is unacceptable. The industry frequently uses testing-by-difference but rarely in subsea tiebacks over 50 km.”
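The arithmetic behind testing-by-difference is simple, as this sketch shows. The well names and rates are invented for illustration; a real test must also hold flow conditions steady between the two measurements.

```python
# Testing-by-difference: each well's rate is the drop in the commingled
# stream's measured rate when that well alone is shut in.
# All figures are illustrative only.

full_group = 12450.0  # measured rate with all wells flowing (bbl/d)

# Measured group rate with one well shut in at a time (bbl/d)
tests = {"well_1": 8900.0, "well_2": 10320.0, "well_3": 9780.0}

# The difference between the two rates is the shut-in well's rate
rates = {well: full_group - reduced for well, reduced in tests.items()}
for well, rate in rates.items():
    print(f"{well}: {rate:.0f} bbl/d")
```

Because each inferred rate is the difference of two measurements, it inherits the uncertainty of both, which is one reason the method complements rather than replaces dedicated metering.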
Data Validation and Reconciliation In Georges Heyen’s and Boris Kalitventzeff’s paper referenced earlier (see 11 above), we are told that data validation and reconciliation (DVR) is a technology that has developed over the last 50 years to improve the quality of measurements from increasingly complex industrial processes. DVR applies known information about the process, alongside statistical and mathematical calculations, to correct deviant readings, delivering reliable information from variable measurement data. Using a mathematical model of the plant in which the process is being conducted, DVR identifies measurement errors against the long-term averages and target values of the process operation.
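For a single linear balance, the correction step at the heart of data reconciliation reduces to a weighted least-squares adjustment that can be sketched in a few lines. This is a minimal illustration, assuming a steady-state separator where feed = oil + gas and all three streams are metered; the stream names, rates and accuracies are invented.

```python
# Minimal steady-state data reconciliation around one mass balance:
# feed - oil - gas = 0. Each raw measurement is corrected in proportion
# to its variance, so the least reliable meter absorbs most of the error.

measured = {"feed": 1520.0, "oil": 980.0, "gas": 505.0}  # t/d (illustrative)
sigma = {"feed": 15.0, "oil": 5.0, "gas": 8.0}           # 1-sigma accuracy

# Balance residual: how badly the raw readings violate feed = oil + gas
residual = measured["feed"] - measured["oil"] - measured["gas"]

# Closed-form weighted least-squares solution for a single constraint
# a.x = 0 with a = (+1, -1, -1): correction_i = -sigma_i^2 * a_i * r / S
s_total = sum(v**2 for v in sigma.values())
reconciled = {
    "feed": measured["feed"] - sigma["feed"]**2 * residual / s_total,
    "oil":  measured["oil"]  + sigma["oil"]**2  * residual / s_total,
    "gas":  measured["gas"]  + sigma["gas"]**2  * residual / s_total,
}

check = reconciled["feed"] - reconciled["oil"] - reconciled["gas"]
print({k: round(v, 2) for k, v in reconciled.items()})
print(f"balance after reconciliation: {check:.6f}")
```

Real DVR systems such as those described in this Report solve the same kind of problem over hundreds of measurements and many simultaneous balance equations, but the principle is the one shown here.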
Data Validation Having long been an academic research topic, data validation now attracts more interest since the amount of measured data collected by digital control systems and archived in process information management systems exceeds what can be handled by operators and plant managers… The economic value of extracting consistent information from raw data is recognised. Data validation thus plays a key role in providing coherent and error-free information to decision-makers. Data validation is also sometimes described as the process of ensuring that a program operates on clean, correct and useful data. It uses routines, often called ‘validation rules’ or ‘check routines’, that check the correctness, meaningfulness and security of data input to a system. If that seems a little opaque, Heyen and Kalitventzeff (referenced above) offer a good practical guide to data validation, categorised in several steps which I have edited for space reasons: measurement collection; conditioning and filtering; verifying the process condition and the adequacy of the model; gross error detection; assessing the feasibility of data reconciliation; where feasible, data reconciliation; result analysis; and editing and archiving the results.
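The ‘validation rules’ or ‘check routines’ mentioned above can be as simple as range and rate-of-change tests applied to raw readings before any reconciliation is attempted. The following is a hypothetical sketch; the limits and readings are invented, not taken from any real instrument datasheet.

```python
# Simple pre-reconciliation validation rules: flag readings that are
# out of range or jumping faster than the process can plausibly move.
# All limits are illustrative.

RULES = {
    "min": 0.0,         # flow cannot be negative
    "max": 2000.0,      # above the meter's rated span
    "max_step": 150.0,  # implausible change between samples
}

def validate(series):
    """Return a list of (index, value, reason) for suspect readings."""
    flags = []
    for i, value in enumerate(series):
        if not RULES["min"] <= value <= RULES["max"]:
            flags.append((i, value, "out of range"))
        elif i > 0 and abs(value - series[i - 1]) > RULES["max_step"]:
            flags.append((i, value, "implausible step"))
    return flags

readings = [1010.0, 1015.0, 1012.0, 2400.0, 1018.0, 800.0]
flags = validate(readings)
for i, value, reason in flags:
    print(f"sample {i}: {value} -> {reason}")
```

Only readings that pass checks of this kind are passed on to the gross error detection and reconciliation steps listed above.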
Data Reconciliation “Data reconciliation (DR) is usually defined as a means to resolve inconsistencies between plant measurements and mass balances (and other balances, such as energy or component balances) throughout the plant.” The inlibra article14 quoted continues to tell us that data reconciliation is usually used in five ways: to provide accurate information to monitor and optimise the operation of process units; to identify the sources and magnitudes of losses; used on a regular basis, to reduce the weight per cent of losses, especially in plants with multiple owners where the products are distributed among those owners; to help prioritise instrument maintenance; and to provide
economic justification and show optimal locations for instruments. Even after reconciliation, it makes sense to conduct a consistency check. Mike Arnoldly at Topdown Consulting15 also highlights another matter to consider: “Most organizations severely underestimate what it takes to reconcile data. They often don’t consider how much historical data is needed and other items that need to be accounted for. The best way to make sure data reconciliation is smooth and mostly uneventful is to take a planned and considered approach.” His blog continues to set out such an approach.
Links in the Chain At the bottom line, data measurement, validation, reconciliation and application are all steps in the data processing chain, which means that errors or biases in any of them will at least be carried through the chain and, depending on the weight attributed to individual results, might even be magnified through the remaining steps. So, for the whole operation, it is very important that measurements and their resulting data are as reliable as possible.
Meters and Metrics Francis Slade, Staff Writer
Ways of measuring output and systems to make sure it’s accurate
The next stage up from a simple flow meter is a multiphase flow meter which is able to deduce the proportionate flow rate of each fluid phase in order to support the process of allocation
Algorithms – Complex but Necessary This is not the place to start investigating the intricacies of the complex algorithms and formulae that can be used in the validation and reconciliation of collected measurements: if nothing else, they take up a lot of space, and readers may be more interested in how they came about and what they can achieve than in the intricate mechanics of how they work. It is probably useful, though, to say that the earliest algorithm applied for this purpose was proposed in 1960. The algorithm in question was designed to analyse steady-state processes but, in practice, is also used to handle measurements obtained from processes operated close to steady state with only small disturbances. This approach is acceptable when monitoring performance parameters that vary only slowly with time. There are lots of things that get measured in offshore oil and gas, but the one on which the financial viability of a field depends, and from which the values to be attributed to each participant in the process can be calculated, is flow rate.
Reducing Uncertainty The oil and gas industry is used to regulatory authorities in the jurisdiction where any facility operates setting requirements for measurements of produced hydrocarbons, particularly where those measurements are needed for the calculation of taxes or royalties payable to the government; these are known as fiscal measurements. A problem with this is that, as previous articles have suggested, there is a degree of uncertainty in the measurement and allocation systems used in the often hostile environments where offshore oil and gas operations take place. In general, uncertainty in the allocation system is related to a contributory uncertainty in the measurement of each input. Therefore, investment in improved metering systems and
technologies to reduce the uncertainty will be valued not only by the operator or producer but by all parties with a financial interest in the output. Data validation and reconciliation is a function embedded in many process analysis and simulation software packages; it can also be packaged as a stand-alone software solution. Much of the success (or otherwise) of data validation and reconciliation can be traced to the quality and robustness of the known values and parameters to which the algebraic equations are applied, and against which measurements are tested for the likelihood of their accuracy.
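The link between input metering uncertainty and allocation uncertainty can be made concrete with a short sketch. Assuming the inputs are independent, their one-sigma uncertainties combine in quadrature; the well names and figures below are invented for illustration.

```python
import math

# How metering uncertainty on each input feeds through to the total
# on which allocation is based. For independent inputs, uncertainties
# combine in quadrature, so upgrading the worst meter gives the
# biggest payoff. All figures are illustrative.

inputs = {  # well: (measured rate t/d, 1-sigma uncertainty t/d)
    "well_A": (820.0, 25.0),
    "well_B": (640.0, 8.0),
    "well_C": (410.0, 6.0),
}

total = sum(rate for rate, _ in inputs.values())
u_total = math.sqrt(sum(u**2 for _, u in inputs.values()))
print(f"total: {total:.0f} +/- {u_total:.1f} t/d")

# Upgrading well_A's meter from 25 t/d to 10 t/d uncertainty:
u_improved = math.sqrt(10.0**2 + 8.0**2 + 6.0**2)
print(f"after upgrading well_A: +/- {u_improved:.1f} t/d")
```

In this invented example the single worst meter dominates the combined uncertainty, which is the arithmetic behind the claim that better metering benefits every party with a financial stake in the output.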
Flow Meters The piece of equipment on which all this depends is the flow meter. Its quality and robustness, as well as the accuracy and frequency with which it is calibrated and set, are all important contributors to the quality of measurement and data. The results will also depend on the medium to be measured: its viscosity, the velocity, pressure and temperature of its flow, its chemical content and a number of other factors. The next stage up from a simple flow meter is a multiphase flow meter, which is able to deduce the proportionate flow rate of each fluid phase in order to support the process of allocation. In locations where it is too costly or impractical to use flow rate metering, a set of methods and techniques has been devised to provide estimates of flows in order to complete allocation. Flow meters may not seem very exciting pieces of equipment, and most of them function out of sight, which means that it is easy to think of them as being more or less the same. In reality, however, they comprise a broad group of devices, from simple mechanical models to IT-integrated models. To describe them all would take more than the space available in even this Report and would not really inform the reader any more about the quality of measurement. But it is worth mentioning that
the quality of a meter can make an enormous difference to its effectiveness. Meters themselves need to operate in increasingly hostile conditions as resources such as subsea or Arctic oil and gas fields are exploited. Such conditions might make traditional mechanical flow meters less attractive, given their need for maintenance. In short, moving parts need to be kept moving, and some of the conditions in which meters now operate are not conducive to that. There is also a simple cost factor: maintenance requires labour (never cheap) and other resources whose use businesses try to minimise. Also, mechanical devices have to be physically read, whereas modern practice prefers devices that, wherever located, can communicate to a central point. And devices that can communicate one way can be communicated with (two-way), which makes tasks such as recalibration much more straightforward, more frequent and less costly. In fact, separate flow metering devices can become redundant where an operator installs data validation and reconciliation (DVR) technology, which can be used as a virtual flow metering (VFM) system.
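The idea behind virtual flow metering is to infer a rate from measurements that are already available, such as the pressure drop across a choke, using a model calibrated during periodic well tests. The sketch below uses a deliberately simple square-root choke relation with invented numbers; production VFM systems rely on full thermodynamic and multiphase models rather than this one-line formula.

```python
import math

# Virtual flow metering sketch: calibrate a simple choke relation
# Q = C * sqrt(dP) against one well test, then estimate rate from
# routine pressure readings alone. Model form and figures are
# illustrative only.

# Calibration point from a well test: known rate at a known pressure drop
test_rate = 2400.0  # bbl/d, measured at the test separator
test_dp = 36.0      # bar across the choke during the test
C = test_rate / math.sqrt(test_dp)  # calibrated choke coefficient

def virtual_rate(dp_bar):
    """Estimate flow rate (bbl/d) from a measured pressure drop (bar)."""
    return C * math.sqrt(dp_bar)

# Routine operation: no test separator needed, just pressure readings
for dp in (25.0, 36.0, 49.0):
    print(f"dP = {dp:4.1f} bar -> estimated rate {virtual_rate(dp):.0f} bbl/d")
```

Periodic well tests are still needed to recalibrate the coefficient as the well's fluid properties change, but between tests the virtual meter delivers continuous rate estimates from instrumentation that is already installed.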
Going Digital In line with many other industries, oil and gas, and especially offshore installations, are making increasing use of technology-driven digital processes. In the January 2012 issue of Offshore magazine, Richard Heddle, John Foot and Hugh Rees of BP, writing about BP’s ‘Field of the Future’ programme, explained that, “Most of the technologies developed under the programme are digital in nature and focus on enhanced surveillance, alerts and control.”16 As well as improving the quality of measurement data and control, using digital IT-based systems will substantially alter the architecture of subsea operations. Again, this is not a subject that could even be lightly covered in this Report but, as with digital processing in any area (for instance, digital manufacturing), the result will be a cleaner and more easily operated process. Much of the viability of offshore exploration and production for everybody involved depends on price and quantity, so a more effective means of measurement should always be a welcome addition to the process.
UK Department of Energy & Climate Change, ‘Oil and gas: field data’, https://www.gov.uk/oil-and-gas-uk-field-data
OnePetro, 2011, International Petroleum Technology Conference, http://www.onepetro.org/mslib/servlet/onepetropreview?id=IPTC-14518-MS
The American Oil & Gas Reporter, July 2010, ‘System Integrates Reservoir Monitoring’
Exploration & Production, ‘Going downhole to support production’
OnePetro, ‘Virtual Measurement Value during Start Up of Major Offshore Projects’, http://www.onepetro.org/mslib/servlet/onepetropreview?id=IPTC-14518-MS
International Association of Oil & Gas Producers, ‘Offshore environmental monitoring for the oil & gas industry’, http://www.ogp.org.uk/pubs/457.pdf
Georges Heyen and Boris Kalitventzeff, ‘Process monitoring and data reconciliation’
Rigzone, ‘How Does Subsea Processing Work?’, www.rigzone.com/training//insight_pf.asp?i_id=327
The American Oil & Gas Reporter, July 2010, ‘System Integrates Reservoir Monitoring’, http://www2.emersonprocess.com/siteadmincenter/PM%20Roxar%20Documents/Flow%20Metering/American%20Oil%20and%20Gas%20Reporter%20July%202010.pdf
Georges Heyen and Boris Kalitventzeff, ‘Process monitoring and data reconciliation’
Woods Hole Group, ‘Real-Time Systems for Offshore Oil & Gas Operations’, http://www.whgrp.com/project-descriptions/ADCP.pdf
Offshore Technology, ‘Dig Deep’, http://www.offshore-technology.com/features/feature1807/
inlibra, ‘What Is Data Reconciliation and How Is It Used?’, http://dearwater.com/inlibraweb/downloads/What_is_Data_Reconciliation.pdf
Topdown Consulting, http://blog.topdownconsulting.com/2011/07/reconciling-data%E2%80%94what-is-the-best-approach/
Offshore Magazine, ‘Virtual flow metering improves field data’
Offshore Technology Reports… the leading specialist combined online research and networking resource for senior upstream oil and gas industry professionals.
• Up to the minute Industry and Technology information available to all site users on a free of charge open access basis.
• Qualified signed up members are able to access premium content Special Reports and interact with their peers using a variety of advanced online networking tools.
• Designed to help users identify new technical solutions, understand the implications of different technical choices and select the best solutions available.
• Thought Leadership – Advice and guidance from internationally recognised upstream oil and gas key opinion leaders.
• Peer Input – Contributions from senior upstream oil and gas industry professionals.
• Independent Editorial Content – Expert and authoritative analysis from award winning journalists and leading industry commentators.
• Unbiased Supplier Provided Content.
• Designed to facilitate debate.
• Written to the highest professional standards.
Defence Industry – Special Report on Next Generation Data Validation and Equation Based Reconciliation Software Technology
Published on Nov 14, 2013