Foundations: Journal of the Professional Petroleum Data Management Association
Volume 1 | Issue 3 | Fall 2014

10 THINGS TO KNOW ABOUT WELL LOGS
A set of rules for checking the quality of well logs

Foundations: The Journal of the Professional Petroleum Data Management Association is published four times per year by JuneWarren-Nickle's Energy Group.

PPDM Association
CEO Trudy Curtis
Vice-Chair Robert Best
Secretary Janet Hicks
Treasurer Peter MacDougall
Directors Trudy Curtis, Rusty Foreman, Paul Haines, David Hood, Allan Huber, Yogi Schulz, Joseph Seila
Head Office Suite 860, 736 8th Ave SW, Calgary, AB T2P 1H4
Email: Phone: 403-660-7817

JuneWarren-Nickle's Energy Group
CEO Bill Whitelaw
President Rob Pentney
Editor, Special Projects Rianne Stewart
Contributors Gordon Cope, Jim Crompton, Danette Flegel, Jay Hollingsworth, Jess Kozman, Gary Masters, Harry Schultz, Martin Storey
Editorial Assistance Laura Blackwood, Sarah Miller, Sarah Munn
Creative Lead Cath Ozubko
Graphic Designer Ginny Tran Mulligan
Ad Traffic Coordinator Lorraine Ostapovich
Advertising Nick Drinkwater, Account Manager
Calgary 2nd Flr-816 55 Ave NE, Calgary, AB T2E 6Y4, Tel: 403-209-3500
Edmonton 220-9303 34 Ave NW, Edmonton, AB T6E 5W8, Tel: 780-944-9333

Table of Contents | Volume 1 | Issue 3 | Fall 2014

10 Things to Know About Well Logs: A set of rules for checking the quality of well logs. By Martin Storey, senior consultant, Well Data Quality Assurance Pty Ltd

Defining the Geotech: The role of a geotech is one rarely set to paper. So what does a geotech do, and for whom? By Gordon Cope

A Better Measure: The Units of Measure project aims to introduce a new efficiency for doing business. By Jay Hollingsworth, Gary Masters and Harry Schultz

All's "Well" in Saskatchewan: Saskatchewan pragmatism leads the way to adopting the Canadian Well Identification System. By Danette Flegel, Government of Saskatchewan

The PPDM Association and industry participants are finalizing the process to establish professional accreditation for petroleum data analysts. By Gordon Cope

Guest Editorial: Standards for what, when and who. By Jim Crompton, data management and analytics consultant

DEPARTMENTS
Standards & Technology: Industry news and updates
Upcoming Events: Find us at events and conferences around the world in 2014 and 2015

Join the discussion on LinkedIn


The Professional Petroleum Data Management (PPDM) Association is a global, not-for-profit society within the petroleum industry that provides leadership for the professionalization of petroleum data management through the development and dissemination of best practices and standards, training programs, certification programs and professional development opportunities. For 25 years, the PPDM Association has represented and supported the needs of operating companies, regulators, software vendors, data vendors, consulting companies and management professionals around the globe.


Guest Editorial
Standards for what, when and who
By Jim Crompton, data management and analytics consultant


I always seem to be talking about standards. Throughout my years of giving presentations and writing articles, standards and architecture have been my most-discussed topics. I don’t think that I am a standards fanatic; as a matter of fact, I usually find myself arguing the position that we have gone too far with standardization. So why do I keep coming back to this topic?

Standards are important, and standardization does not have to be the enemy of innovation. Adoption of the right standards should enable a company to operate more efficiently and effectively at a global or regional scale, and even focus investment in innovation on the things that can bring the industry the most value. But despite efforts from corporate executives and the focus of internal IT departments, I don’t think we have the messaging around the role of standards correct. So, pardon me, I am going to try this one more time.

Many companies have a culture of central policy making but distributed decision making. It’s obvious that before we set standards, we have to understand and align them to business needs. We often oversimplify those needs for the sake of two drivers: efficiency and cost reduction. Saving a little bit of expense by limiting options can be a good or a bad thing. Reducing options on a technology or service that only has utility value is a good thing. It can help keep investments focused on the greatest value and can clarify the utility foundation upon which more valuable solutions can be built. I may get grumpy that someone is taking away my favourite way to do something, but I need to recognize the modest value of that choice and get on board.

However, when a standards program oversimplifies a critical work process—say, if we think a reservoir-management suite of tools should be the same for a waterflood project as for a heavy oil steam flood, or that drilling a complex deepwater subsalt well is the same as drilling a shallow shale gas well—we may be overstepping our role. If IT attempts to limit the applications suite for the sake of IT efficiency, we can end up providing an 80 percent solution to a valuable process and limiting our ability to create a competitive advantage. Knowing when one standard is the right answer and when a guideline offering several solutions is the right answer is an art form and a critical balancing act for IT. We need domain knowledge and buy-in to make many of those choices.

In addition, in order to identify the business process or capability to which a standard applies, we need to determine the appropriate responsible organization and individual roles (a responsibility assignment matrix, also known as a RACI chart, is a good tool for this). A functional group, like drilling or reservoir management, or a regional operating unit, like the United States onshore or Canadian heavy oil, needs to define its standards and invest in data-quality work and end-user literacy with the new standard.

Establishing clear data-ownership responsibility is critical. When you have a new standard for "What is a Well" established in a drilling database, does that mean that drillers own the information object of the well? What about the operations and maintenance roles? What about workovers and redrills? In many business units, these small capital projects and sometimes operation-expense jobs are run by operations—not drilling. Can you have an effective system of record if no function will take ownership—or if more than one function claims ownership—of data definition or quality, driving the right user behaviour, understanding how the data flow and life cycle align to the desired work process, technology refresh cycle, vendor alliance relationship or internal support? All of these issues are part of the expectations of the asset life cycle management plan for that function and have to be moderated by the IT function, not owned by it.

Correction: A table on page 21 of the summer issue of Foundations contains incorrect information. The corrected table appears here. We apologize for any inconvenience.

In setting standards, we also need to appreciate the maturity level of the organization. Some standards are more applicable when a group has reached a higher level of integration and design maturity. A master data management solution plays a critical role for an organization tackling cross-functional workflows, but could be an unnecessary


















complication for a function still working on domain-specific applications and closely coupled data stores. So is standardization necessary? It depends. Our important attempts to rationalize and simplify our application inventory are filled with important choices. While one goal is to reduce IT expense, we also aim to provide a foundation that our business can use to leverage digital technology for a competitive advantage. Consider applying standards from an information, not a technology, perspective. Some choices are obvious: older versions of the same software and applications that are no longer used should go. Other choices

are more complicated and require careful discussion with the user community and thinking about the consequences of losing functionality. Another choice is to just leave some diversity alone when the requirements for specific tools create more opportunity than the rationalization brings savings. At other times, standardization needs to wait until the community is ready for it. Don’t just give me a shorter list of standards. Give me a menu of the right number of solutions, well supported and integrated for all the value-creating opportunities that we want to pursue. Then I can stop writing about standards.



Want to take part in the action? Here’s how:

Learn and Enjoy – Register to Attend
Get Recognized – Become a Sponsor
Spread your Message – Exhibit at our Tradeshow
Share your Expertise – Submit an Abstract

Visit to register now!

To complete your experience, also register for the Houston Data Management Workshop and Field Trip on March 9. For more information contact us at:


Standards & Technology

NEWS
Diverse topics covered at Brisbane and Perth symposia
By Jess Kozman

They came; they talked Big Data and professional certification; they covered history from the mastery of fire to the latest hackathon; and they referenced everything from clay tablets to petaflop computing. During a two-week period in August, over 100 professional data managers from the Asia-Pacific petroleum and resource industry met for a day of workshops and two days of symposia in Brisbane and Perth, both in Australia. Organized by the Professional Petroleum Data Management (PPDM) Association, the meetings brought together industry experts and practitioners to address technical standards and how data, information and knowledge add value to high-tech and capital-intensive oil and gas projects. The Brisbane workshop featured a presentation on the visualization and analytics of business data from a senior business intelligence architect at Santos Ltd., discussions on the fragility of storage media and an update on the status of digital well data in Australia. Interactive workshop sessions covered topical themes in petroleum data management, including the mix of personality,

training, education and experience needed for successful data management, the most important tools for data managers, and the future of the profession in two, five and 10 years. In Perth, the theme was “Making Today’s Vision Tomorrow’s Reality,” and the event highlighted collective action and community-building. The agenda included technical presentations from operators, vendors, government regulators, consultants and academics, along with a 20/20 session (20 minutes of presentation and 20 minutes of audience conversation) on conservation of data quality, a workshop on data types and metadata, and three sessions around professional certification, job descriptions and testing. The highlight was a presentation from Resources Innovation through Information Technology about their open data innovation event, or hackathon, at which developers and coders had 54 hours to prototype solutions to industry problems using open data from the industry and well-defined problem sets. The PPDM Association is actively pursuing a similar event for the petroleum industry.


Society for petroleum data managers to be created

A memorandum of intent to establish a global professional society for petroleum data managers has been agreed to by representatives from Common Data Access Limited (CDA), the Expert Community for Data and Information Management (ECIM) and the Professional Petroleum Data Management (PPDM) Association. The aim of the society, which will be not-for-profit, international and independent, is to be the place where professional data managers go for community, knowledge and professional development.

“The society will be available to all individuals involved or interested in petroleum data management, whether from oil companies, NOCs, service companies, regulators, academia or anywhere else. It promotes the value of data management as a career destination with professional qualifications and regular certification. An open, transparent society will enable a community to develop where knowledge is shared and the industry as a whole can benefit from this initiative,” says Rusty Foreman of BP p.l.c.

CDA, ECIM and PPDM all have pre-existing work in this area, including competencies, education, training, certification, accreditation, events, publications, standards and bodies of knowledge. The society aims to bring that experience and breadth of understanding together to create a firm foundation for industry acknowledgement of the need for professionalization and recognition of the business value of good data management.

Workgroups have been set up to take this initiative forward, and they comprise several key threads, including budget, legal, governance, organizational and membership. Participants will consist of the executive leadership from CDA, ECIM and PPDM, as well as BP, Royal Dutch Shell plc, A.P. Moller-Maersk Group and Halliburton.


Saskatchewan adopts new Canadian well ID system

The Professional Petroleum Data Management Association and its industry partners are pleased to announce that Saskatchewan will be the first jurisdiction to implement the new Canadian Well Identification System (CWIS). Responding to requests from oil companies, industry vendors and regulators (the former Energy Resources Conservation Board was a co-author of the project charter), the CWIS project was undertaken to upgrade the system of well identification in western Canada, which is outdated due to advances in drilling and completions technologies and demands for information.

After widespread consultation and expert volunteer participation, the first draft of a new system was released in mid-2013. Essentially, the system comprises three related identifiers that recognize every well, every wellbore and every well-reporting stream. CWIS meets the modern database needs of operators, regulators, vendors and other stakeholders. It maintains legacy unique well identifiers, but also incorporates a standard system of coded identifiers to allow business processes to leverage accurate information in a timely manner. Every data item in every well can be managed from creation to delivery to archive. The latest CWIS version incorporates refinements gleaned from input on the first draft. Visit for more information.



Component Type        Component Value (well code)
Well ID               AB 1597532
Wellbore ID           AB 1597532B001
Well Reporting ID     AB 1597532V001
1. The well ID identifies a well. Most information is filed and retrieved according to the well it came from or relates to.
2. The wellbore ID identifies a wellbore. All downhole measurements and construction, including tests and completions, can be located by depth and time intervals within the wellbore.
3. The well reporting ID identifies a well reporting stream for which the regulator requires information.
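To illustrate how the three identifiers relate, here is a minimal sketch that splits a CWIS-style identifier into its parts. The format is inferred solely from the published examples above (a two-letter jurisdiction code, a well number, and an optional B### wellbore or V### reporting suffix); the actual CWIS specification is the authoritative source and may differ.

```python
import re

# Hypothetical pattern inferred from the example identifiers
# "AB 1597532", "AB 1597532B001" and "AB 1597532V001".
CWIS_PATTERN = re.compile(
    r"^(?P<jurisdiction>[A-Z]{2})\s+"
    r"(?P<well>\d+)"
    r"(?:(?P<kind>[BV])(?P<sequence>\d{3}))?$"
)

def parse_cwis(identifier: str) -> dict:
    """Split a CWIS-style identifier into its components."""
    match = CWIS_PATTERN.match(identifier)
    if match is None:
        raise ValueError(f"not a recognized identifier: {identifier!r}")
    # No suffix means the identifier names the well itself.
    kind = {"B": "wellbore", "V": "well reporting", None: "well"}
    return {
        "jurisdiction": match.group("jurisdiction"),
        "well": match.group("well"),
        "type": kind[match.group("kind")],
        "sequence": match.group("sequence"),
    }
```

Under this reading, "AB 1597532B001" resolves to wellbore 001 of well 1597532, so all three identifiers stay linked through the shared well number.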

Energistics publishes PRODML V1.3 Standard

Energistics announced the publication of the PRODML V1.3 Standard. Included in this release is a new version of the Distributed Temperature Sensing (DTS) data-object, which represents a major step forward in DTS reporting. The original DTS standard has been enhanced for up-to-date practices and technologies. Other key capabilities include:
• Multiple facilities on one optical path that can be mapped onto the fibre “as measured” length;
• Logging and other forms of conveyance;
• Controlled lists of curves that eliminate previous log curve ID ambiguity;
• Keeping measured and interpreted data together; and
• Support for tracking equipment changes over time, such as additions or removals of segments from the optical path.
Additional information is available at

Divestco launches updated mapping product

Divestco Inc. has launched GeoCarta Version 3.8, a data access and analysis application that connects Environmental Systems Research Institute, Inc. (Esri) mapping technology with well and land databases from the Professional Petroleum Data Management Association. The software boasts powerful data browsing, data source flexibility and robust exports, all of which make exploring a simple, stress-free process. Version 3.8 introduces features such as:
• Split rights display: the ability to generate split polygons to better visualize any overlapping map layers in an area of interest;
• Simpler processes to update user map layers: update the results set layer or rerun the layer query with the click of a button;
• Additional facility production data: includes facility production stats, battery production, production daily average stats and production monthly stats; and
• Proprietary surface land agreements: users can import proprietary surface land agreements from LandRite.
For more information on GeoCarta, please visit Divestco online at

ETL Solutions expands development team

ETL Solutions Ltd has welcomed two new recruits to the company’s development team. Lee Grubb and Zhipeng Chang both join the ETL Solutions team to help deliver Transformation Manager 5.12, an update focusing on stability and improved performance. The company is actively seeking consultants to work in the exploration and production domain. Details of the vacancies can be found at


Defining the geotech

The role of a geotech is one rarely set to paper. So what does a geotech do, and for whom?

By Gordon Cope

The following article is based on an online discussion initiated by oil and gas data managers to explore the responsibilities and challenges facing geoscience technicians. Foundations would like to thank Giorgio Drei, Vanessa Johnson, David Lloyd, Stephen Lord, Scott Stephens, Martin Storey, Scott Tidemann and others for contributing their knowledge and insights. Our intention is to stimulate further productive discussion about this subject and help share the many viewpoints that exist about this important function. Your responses and ideas are always welcome at

Finding and producing hydrocarbons involves many complex steps, from the initial identification of a prospective hydrocarbon region to the acquisition of geological and geophysical information; the interpretation of that information in order to complete high-grade prospects; the planning and drilling of those prospects; the subsequent production of commercial discoveries and the eventual disposal or abandonment of the assets.

Increasingly, companies are recognizing that data and information are resources that need as much care throughout the life of an asset as a well or pipeline. Master data management strategies are becoming commonplace along with recognition of the role of data management to steward master data for all stakeholders. The role of the geotech, on the other hand, is focused on the technical data requirements of a specific set of users (geoscientists) at specific points in the life cycle of a well.


Geotechs work with geoscientists to find data, put it in a form suitable for analysis by specialized interpretation software and assist with tasks such as research, validation and verification, data loading, analytics and working with contractors and vendors. In small companies, the geotech may also be responsible for regulatory approvals and submissions, contract management and much more.

The scope of data handled by a geotech is broad and complex. This data can include seismic, well, survey, core, production and land information as well as joint-venture responsibilities and more. Corrections or additions to the data should be captured, validated and managed in an appropriate interpretation or analysis tool and ultimately made available to a trusted system of record or master data store so that all key stakeholders have access to the best possible data over the life of the asset.

Currently, there is no single recognized title or job description for these workers in the oil and gas sector. Titles such as geological technician (or geotech), geoscience technologist, geotechnical assistant, petroleum technician (petrotech), geological data manager, technical assistant, technical aid and data librarian are often used. However, their function is much the same: they are responsible for making the right data available to the geoscientists with the appropriate technical software and for making sure that the data is properly stewarded for the future.

Commonly, master data managers have an overall concern with corporate data stores, and geotechs have a concern for active technical works in progress. While the geotech is primarily focused on a set of data related to a specific interpretation project, the master data manager is responsible for a corporate database that is typically used to share key data among many user communities and as the foundation for business intelligence systems.
Clearly, there are many points of intersection between the two environments, and, in some companies, the same individual may be assigned both tasks. In any case, it is essential that data transfer processes be coordinated between the two functions so that data can flow as seamlessly as possible in order to honour the needs and workflows of all users.




RESPONSIBILITIES

A geotech is responsible for all of the geological, geophysical and engineering data needed to meet the objectives of a particular project. Essentially, this involves searching for the data, checking the quality, creating a working data store (usually in a vendor-provided interpretation software application) and collating and presenting it in a form, such as a map, that allows team members to use and interpret the data effectively and efficiently.

Typically, the geotech works under the supervision of the geoscience and engineering team but is not directly involved in the interpretation of the data. Rather, the value a geotech contributes is through finding and loading missing data, identifying and correcting data problems and managing data so that geoscientists and engineers can concentrate on interpretation and adding value.

The level of skill that a geotech brings to the project varies due to a number of factors. If the geotech comes from an IT background, those skills will include a good working knowledge of what databases contain, how to find specific data and how to validate the data. If the geotech comes from a geoscience or engineering background, those skills will include a good working knowledge of the relationship between different data sets, how to ensure good data quality and the information needed to effectively achieve the objectives of a project. Experience also plays a significant role.

The geotech occupies a complex and demanding function for which many diverse skills are needed. Depending on the size and scope of the company, a geotech may be obliged to assume many different roles with regard to not only research, quality validation, geosciences, engineering, software functionality and mapping, but also regulatory requirements and contract obligations. Because formal training opportunities are limited or lacking altogether, the principal method of learning is on-the-job training.

GEOTECH CHALLENGES AND NEEDS

Many issues are common to all professions. Exploration, for instance, has an accelerated work environment often related to deadlines (like land sales), so everything has to be done quickly. Information is often incomplete or has unintentionally been corrupted. Data that was not gathered when it should have been is often lost for good, and new ways of managing and interpreting data are constantly emerging, requiring the continuous reevaluation of work that has already been done. Explorers tend to adopt a range of assumptions and caveats that are often not made explicit, thereby engendering misconceptions. Geotechs face a unique challenge in that there are no widely recognized industry standards for producing well-managed geoscience data sets, so any conditioning tends to be driven by the objective of the project and is specific to that project only. This means that work that has been done rarely adds permanent value to the master database and can actually create a situation where perfectly competent preparation for one

application has the potential to reduce the value of the data for another use.

The list of challenges faced by the geotech is long and daunting. Many companies are making great strides toward solving these problems internally; however, no single company can hope to solve each of these problems for the entire industry. These challenges include:
• When gathering data, the geotech may not be able to ascertain what data actually exists, what data is still available either in-house or through a vendor, and where the data is.
• If the geotech has a reason to think that data does exist, then finding, transcribing and checking data quality can take weeks. The data may be incomplete or it may lack control items, raising issues about its usefulness and value. It is difficult to rationalize to management the time and effort spent if, in the end, the value of the data is so questionable as to be unusable.
• Older legacy data is often not available in digital form. Data entry is expensive, time-consuming, error-prone and, if the original prints are too faint to transcribe, useless.
• Sometimes data has been processed with algorithms that are no longer documented, or with parameters that never were documented, or it cannot be validated.
• Contextual information that is essential to interpretation, such as the pressure and temperature conditions of core samples, may not be available. While general assumptions can be made, doing so introduces avoidable uncertainty.
• Data can be degraded through the course of preparation when, for example, a well log is under-sampled while being converted from one format to another.
• Several versions of the data may be available without any information regarding what they are and why they are different.
• Prior data work may be available, but there may be no documentation of how it was produced, or even what original data was used.
• Organizations differ in size and complexity and have varying abilities for addressing such issues.
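The under-sampling hazard in the list above is easy to demonstrate. This toy sketch uses synthetic numbers, not real log data: decimating a finely sampled gamma ray curve onto a coarser grid silently discards a thin bed.

```python
# Synthetic log: 0.1 m sample interval, one thin "hot" streak at 2.3 m.
depths = [round(i * 0.1, 1) for i in range(51)]  # 0.0 m .. 5.0 m
gamma = [50.0] * 51
gamma[23] = 150.0  # the thin bed

# Naive conversion to a 0.5 m grid by keeping every 5th sample.
decimated = gamma[::5]

# The anomaly survives in the original but vanishes from the decimated copy.
print(max(gamma), max(decimated))  # prints: 150.0 50.0
```

Nothing in the coarser file hints that information was lost, which is exactly why such conversions need to be documented.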

ADDRESSING CHALLENGES

Currently, most challenges are addressed on a corporate level with in-house standards. Unfortunately, these can deteriorate over time or change as data managers retire or leave. Universal standards specific to the entire oil and gas sector would help address many of these challenges. Today, data management professionals recognize the need to work together so that useful and practical solutions can emerge and be adopted. Some of these solutions include:
• Standards for the design, maintenance and upgrading of master databases.
• Standards for cataloguing all data that is acquired by a company, including catalogues of project and master data repositories.


• Standards for the way data in master databases is loaded, updated and made available to users.
• Standards for storing derivation information associated with data, such as algorithms and parameters.
• Standards for permanently recording metadata about master data indicating its provenance and trustworthiness and whether or not it is up to date.
• Standards for the way data from master databases is downloaded into project data stores.
• Standards for the way alterations to data are integrated into the master data store.
• Standards for the way geotechs and master data managers are trained.
• Standards for keeping audit trails so that changes can be traced.
• Industry-set certification for geotechs and master data managers.
The Standards Leadership Council organizations are currently focused on working with industry to develop standard and best practice solutions that work for everyone.

GEOTECH AS DATA MANAGER

Unfortunately, the data management role of the geotech is often overlooked, with little formal acknowledgment of the importance of making data available in a timely manner, in the correct formats and double-checked for quality, purpose and accuracy.

While master data management is emerging as a recognized discipline, the geotech may be seen as neither data manager nor geoscientist; in truth, the geotech is a bit of each and needs training and support in order to satisfy the demands of both sides of the job. Fortunately, there are now efforts underway to begin to overcome these challenges.

The Professional Petroleum Data Management (PPDM) Association and Common Data Access Limited (CDA) are working together to professionalize data management. CDA has worked closely with data managers from oil companies and service contractors in Europe to complete a detailed description of domain competencies as part of a broader program to professionalize petroleum data management. The site provides data managers with the opportunity to build their own competency profiles by assessing their own level of competence against the defined competencies. Data managers can add evidence to each competency level that they claim.

In addition, the PPDM Association is working with industry to develop certification programs for data management professionals. Industry experts from around the globe contributed to a question bank for the certification exam, and nearly 100 participants from a wide variety of data vendors, operators and regulators with between three and five years of experience in petroleum data analysis have now written the pilot exam as part of the exam validation process. The certification process will be released in late 2014.

Caught the Photobug?

Enter the Foundations Photo Contest to get your photo published! Catch an amazing nature shot? Got perfect lighting on that rig? Get ‘exposed’ with Foundations! Submit your three best creative photos each quarter for a chance to be published in the next edition of Foundations. Winners will be selected by open vote each quarter. Winning photos will be featured on the cover of Foundations (First Prize) and inside Foundations (Second Prize). Visit to learn more about submitting your entry, including Rules and Regulations. You do not need to be a member of the PPDM Association to submit a photo or vote. Winning photos will be published in Foundations and may be used in other PPDM materials.

Photo courtesy of Ernest Harrison, 2014.



A better measure

The Units of Measure project aims to introduce a new efficiency for doing business By Jay Hollingsworth, Gary Masters and Harry Schultz


Without question, oil and gas data is very technical. Through the long life cycle of an asset, well or facility, vast amounts of estimated, calculated and measured data are created. Many organizations are involved in creating and using the data, and the data may be contained in dozens—or even hundreds—of systems across multiple companies. Much of the data represents measurements associated with various physical properties like depth, fluid density and concentration. And while there are many different measurement systems in existence today, they do not all necessarily agree with each other.

For example, “ft” can stand for foot or femtotonne. A foot can be abbreviated as “ft” or “F.” And a gallon can be a different volume depending on whether it’s metric or American. This contradictory behaviour makes it difficult to successfully write applications that extract business intelligence from diverse sets of data or to integrate multiple data silos into one.

Standardization can make communication more consistent and training easier, and can improve clarity. Once consolidated, shared units of measure can promote interoperability and provide the foundation for data loading and improved data quality.
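The "ft" ambiguity above can be made concrete with a small sketch. The symbol table and the idea of disambiguating by quantity class are illustrative assumptions for this article, not the contents of any real registry:

```python
# Hypothetical symbol registry: one symbol, several candidate meanings,
# keyed by the quantity class in which the symbol appears.
SYMBOL_MEANINGS = {
    "ft": {"length": "foot", "mass": "femtotonne"},
    "gal": {"volume (US)": "US gallon", "volume (imperial)": "imperial gallon"},
}

def resolve(symbol: str, quantity_class: str) -> str:
    """Resolve an ambiguous unit symbol using its quantity class, the kind
    of context a quantity-class specification can supply."""
    meanings = SYMBOL_MEANINGS.get(symbol, {})
    if quantity_class not in meanings:
        raise ValueError(
            f"cannot resolve {symbol!r} for {quantity_class!r}; "
            f"candidates: {sorted(meanings.values())}"
        )
    return meanings[quantity_class]
```

The point of the sketch is that the symbol alone is not enough: software needs the surrounding quantity class (depth, mass, volume) before a conversion can even be attempted.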

A HISTORIC OVERVIEW
Energistics (formerly the Petrotechnical Open Standards Consortium [POSC]), the global upstream oil and gas open standards consortium, has traditionally been the repository for several units-of-measure-related specifications. Although they all have roots in the International System of Units (SI), they are not completely compatible.

For instance, the Energistics Unit of Measure dictionary defines a unique symbol for each concept and assigns each symbol a conversion to a base unit. Conversely, the API RP66 specification defines elemental symbols and a grammar for deriving more complicated unit expressions. Each elemental symbol has an underlying definition, which allows a conversion to base units to be recursively derived. The Energistics quantity class specification provides a way to organize the Energistics unit of measure symbols into coherent sets. Unfortunately, the unit of measure symbols do not always conform to the RP66 grammar, and there is no tie between RP66 and the quantity classes. The Energistics Unit of Measure Standard, which includes the Unit of Measure dictionary, was created to merge the best concepts contained within these older specifications and provide important enhancements. The RP66 grammar was patterned after the SI’s best practices for constructing a unit. This is very flexible, which means that all possible combinations are allowed. For example, the symbols “m.s” and “s.m” are equivalent. Likewise, the symbols “km/s” and “m/ms” are considered to be equivalent by SI because the multiplier implied by the prefixes is the same (1E3 = 1/1E-3). By extension, any combination of prefixes that results in the same multiplier is equivalent, such as km/s, m/ms, Mm/ks, Gm/Ms, mm/us, etc. Furthermore, many mechanical variations are possible such as km.s-1, s-1/km-1 and (1/s)/(1/km). This increased complexity can be confusing if the symbols are to be used within the same user interface.
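The prefix-equivalence point above can be sketched in a few lines of Python. This is a minimal illustration, not Energistics code, and the prefix table is deliberately abbreviated:

```python
# Minimal sketch of SI prefix equivalence: "km/s", "m/ms", "Mm/ks" and
# "Gm/Ms" all imply the same net multiplier, so SI treats them as the
# same unit. The prefix table is deliberately abbreviated.

SI_PREFIXES = {
    "G": 1e9, "M": 1e6, "k": 1e3, "": 1.0, "m": 1e-3, "u": 1e-6,
}

def speed_multiplier(length_prefix: str, time_prefix: str) -> float:
    """Net multiplier implied by a <prefix>m/<prefix>s speed symbol."""
    return SI_PREFIXES[length_prefix] / SI_PREFIXES[time_prefix]

# km/s, m/ms, Mm/ks and Gm/Ms all reduce to a multiplier of 1E3:
assert speed_multiplier("k", "") == 1e3   # km/s
assert speed_multiplier("", "m") == 1e3   # m/ms
assert speed_multiplier("M", "k") == 1e3  # Mm/ks
assert speed_multiplier("G", "M") == 1e3  # Gm/Ms
```

Because every one of these symbols reduces to the same multiplier, a grammar alone cannot pick a single preferred spelling, which is exactly the confusion described above.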

Foundations | Fall 2014 | 11


UNIT DIMENSIONS
Unit Dimensions are simple expressions that are used to convey fundamental concepts of measurement. Each expression is constructed from building blocks using a few simple rules and symbols. There are 125 Unit Dimensions provided in this release. The 10 building blocks are:
• Angle - A
• Length - L
• Temperature Difference - D
• Mass - M
• Electrical Current - I
• Amount of Substance - N
• Luminous Intensity - J
• Solid Angle - S
• Thermodynamic Temperature - K
• Time - T

For example, the Unit Dimension L2M/T2 uses length, mass and time (Length squared * Mass / Time squared).

UNITS OF MEASURE
This is the set of units that are available to use. Units may be simple (ft or degF) or complex (Btu[IT].in/(h.ft2.deltaF)).

QUANTITY CLASS
A quantity class represents a set of units with the same dimension and the same underlying measurement concept. Obviously, many units of measure belong to the same quantity class (e.g. ft and m both belong to the length class). The quantity classes “energy” and “moment of force” both use the Unit Dimension L2M/T2; however, these are not the same class of quantity and are used for different kinds of measurements. It is less obvious that some units of measure may be used to represent more than one quantity class. For example, J (joule) can be used to indicate energy or moment of force.

UNIT QUANTITY
There is a many-to-many relationship between Units of Measure and Quantity Class.
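A small Python fragment makes the many-to-many relationship concrete. The class and unit sets below are illustrative samples, not the full Energistics dictionary:

```python
# Illustrative sample of the many-to-many relationship between units of
# measure and quantity classes; these sets are examples only, not the
# real dictionary content.

QUANTITY_CLASSES = {
    "length":          {"m", "ft"},
    "energy":          {"J", "Btu[IT]"},
    "moment of force": {"J", "lbf.ft"},
}

def classes_for_unit(symbol: str) -> set:
    """Every quantity class in which the unit symbol may be used."""
    return {name for name, units in QUANTITY_CLASSES.items() if symbol in units}

assert classes_for_unit("ft") == {"length"}   # many units, one class
assert classes_for_unit("J") == {"energy", "moment of force"}  # one unit, two classes
```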

THE UNITS OF MEASURE PROJECT
Managed by Energistics, the Units of Measure project is an effort that encompasses participants from both the Professional Petroleum Data Management (PPDM) Association and Energistics member bases. It has evolved into a Standards Leadership Council project, where it draws its requirements from the PPDM Association, the Society of Exploration Geophysicists and the archival history of Energistics/POSC. Units of measure data has been pulled from the following organizations:
• Energistics
• TIBCO OpenSpirit by TIBCO Software Inc
• The PPDM Association
• The U.S. National Institute of Standards & Technology
• Bureau International des Poids et Mesures
• The International Organization for Standardization
• The American Petroleum Institute
• The Society of Petroleum Engineers
• The Society of Exploration Geophysicists
• The International Association of Oil and Gas Producers
• The European Petroleum Survey Group
• Schlumberger Limited
• The International Gemological Institute
• API RP66, V1 and V2


Additional input from other organizations and a team of interested experts has also been incorporated. The final deliverables of the project include units of measure sets in both XML and Excel, the coalescing of units into quantity classes, unit symbol grammar specifications, guidance for creating and managing unit dictionaries, and mappings to POSC 2.2, EPSG 8.1, OpenSpirit, and RP66 V1 and V2.

THE FUTURE OF UNITS OF MEASURE The new Energistics Unit of Measure Standard uses a combination of the techniques previously discussed. It defines a unit symbol grammar specification that allows computer parsing of the symbols, but it pre-defines the supported set of symbols that conform to that grammar specification. That is, the full combinatorial explosion of possibilities is not supported. Pre-defining the supported unit set is similar to the approach taken by ISO 15926 where a metre is indicated by “9561” and by the United Nations Centre for Trade Facilitation and Electronic Business units, where a metre is indicated by “MTR.” If we think of the symbol as a unique code, then the grammar specification allows us to create derived codes from basic codes. This combination provides the flexibility of the ISO technique with the more focused support of the code technique while using a more user-friendly code.
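The pre-defined-set approach can be sketched as a simple membership test plus a code mapping. The external codes follow the examples in the text; the supported symbol set here is abbreviated for illustration:

```python
# Sketch of the pre-defined symbol set: a symbol is accepted only if it
# appears in the dictionary, even when it would be grammatically
# derivable. The supported set here is abbreviated for illustration.

SUPPORTED_SYMBOLS = {"m", "km", "s", "ms", "km/s", "m/ms"}

# External code systems mentioned in the text, mapped onto unit symbols:
# ISO 15926 uses "9561" for metre; UN/CEFACT uses "MTR".
EXTERNAL_CODES = {"9561": "m", "MTR": "m"}

def is_supported(symbol: str) -> bool:
    """A symbol is valid only if it belongs to the pre-defined set."""
    return symbol in SUPPORTED_SYMBOLS

assert is_supported("km/s")
assert not is_supported("Mm/ks")   # equivalent to km/s, but not pre-defined
assert EXTERNAL_CODES["MTR"] == "m"
```

Treating each symbol as a unique code, as the text suggests, is what allows "Mm/ks" to be rejected outright even though it is mathematically equivalent to the supported "km/s".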


The Energistics conversion data specifies a precision of about seven digits, which is deemed adequate by most authorities focused on written documentation (the user interface). However, the underlying assumption about conversion factors is that they fundamentally represent unity. For example, the factor “2.54 centimetres per inch” represents a ratio of two identical physical lengths, and therefore no error or change in semantics will be introduced if an inch value is multiplied by that ratio. The new Energistics Unit of Measure Standard takes the view that all conversion data should be captured to the maximum precision possible. This will allow very precise conversions on modern (64+ bit) architectures using double-precision calculations. Any presumption about rounding to a reasonable measurement precision should be reserved for the user interface. Conveniently, the international community has been working for many years to redefine the most common unit concepts so that they have an exact conversion to SI, and the new dictionary explicitly captures that knowledge.

Because the dictionary uses a consistent grammar for constructing the symbols, the conversion data for each derived symbol can be tested against the conversion data of its underlying components. In addition, prefixed symbols are explicitly flagged so that the conversion data can be tested against the prefix’s multiplier. As an additional validation, the dictionary captures underlying definitions such as an inch being 2.54 centimetres.

The quantity class is used to group all units that represent the same underlying measurement concept, such as mass, length, mass per length, mass per mass and volume per volume. This knowledge can be used to constrain the units that are allowed to be specified for a property. For example, a relational database management system column that is categorized as a mass can use triggers to constrain an associated unit symbol to be in the set defined for the mass class.
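One of the validation ideas above, checking a derived symbol's stored conversion factor against the factors of its components, might look like this in Python. The dictionary entries are a tiny illustrative sample; the inch, foot and hour definitions are the exact SI conversions:

```python
import math

# Conversion factors to SI base units; these three are exact by definition.
BASE_FACTORS = {
    "in": 0.0254,     # exact: 1 in = 2.54 cm
    "ft": 0.3048,     # exact: 1 ft = 12 in
    "h":  3600.0,     # exact: 1 h = 3600 s
}

# Factor recorded in the dictionary for the derived symbol ft/h.
STORED = {"ft/h": 0.3048 / 3600.0}

def derived_factor(numerator: str, denominator: str) -> float:
    """Recompute a quotient symbol's factor from its components."""
    return BASE_FACTORS[numerator] / BASE_FACTORS[denominator]

# The stored factor should agree with the recomputed one to full
# double precision, per the maximum-precision policy described above.
assert math.isclose(STORED["ft/h"], derived_factor("ft", "h"), rel_tol=1e-15)
```

Running this check over every derived symbol in a dictionary is a cheap way to catch transcription errors in conversion data before users ever see them.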
The inclusion of dimensional analysis makes it easier to map external concepts to a class. There are 11 quantity classes that are categorized as being “dimensionless.” Ten of these classes represent ratios of alike quantities, such as volume per volume, that represent a special case of dimensionless and use base units that explicitly reflect their underlying nature (m3/m3). This special handling of ratios is consistent with SI recommendations and assists in understanding differences in the ~100 units, which would otherwise just be dimensionless within the context of something like RP66. While the dictionary supports SI recommendations, such as using the symbol m3/m3 for volume ratios, it also recognizes the need for legacy viewpoints such as allowing an unqualified “%” symbol for all classes that represent a ratio of alike quantities. In addition, it adds some specialized percentage units such as %[vol] for “percent by volume”, where the square bracket notation indicates a variation of the preceding concept.
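The ratio-class handling can be sketched briefly; the class entries below are illustrative, not the full list of 11 dimensionless classes:

```python
# Sketch of dimensionless ratio classes: each keeps a base unit that
# reflects its underlying nature (m3/m3, kg/kg), and an unqualified "%"
# maps to the plain fraction 0.01. The class entries are illustrative.

RATIO_CLASS_BASE_UNITS = {
    "volume per volume": "m3/m3",
    "mass per mass": "kg/kg",
}

PERCENT = 0.01   # "%" is allowed for any class that is a ratio of alike quantities

def percent_to_base(value_in_percent: float) -> float:
    """Convert a % value to the dimensionless base (e.g. m3/m3)."""
    return value_in_percent * PERCENT

assert RATIO_CLASS_BASE_UNITS["volume per volume"] == "m3/m3"
assert abs(percent_to_base(25.0) - 0.25) < 1e-12
```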

IMPLEMENTATION AND ADOPTION
Today’s technology benefits us the most when we can integrate different systems into a unified data source. Business intelligence tools, geographic information systems and Big Data analytics all perform more consistently and accurately if they are working from the same underlying framework. Units of measure are an essential part of the technical and semantic framework upon which integration is built. Plans for implementation and adoption of the new units of measure are already in the queue. The PPDM Association will incorporate the standards into its data model reference set, making it easier for companies to deploy units of measure accurately and consistently. The PPDM Association had a units of measure dataset in existence; however, the organization has determined that this new set is better, more consistent and complete, and much more useful. Implementing a standardized set of units of measure correctly is not an easy thing to do, but sharing ideas and best practices will help the transition occur more easily. Losing the ambiguity that is currently in many existing data sets will be a significant step forward for the industry as a whole. However, many companies may have difficulties with their legacy systems. Anticipating that, specialized software and consulting services will emerge to help manage this problem effectively.


The Units of Measure project has produced new unit sets and received and reviewed public comments, and it is in the process of producing new sets with new requirements. The workgroup has already seen take-up from the Society of Petroleum Engineers’ Drilling Systems Automation Technical Section, the OPC Foundation’s Unified Architecture and Oilware Inc., with other organizations, such as the PPDM Association, ready to implement. As the Units of Measure Standard matures, there is real value for the industry in adoption—it will provide a solid foundation for key modules in PPDM; validate Energistics data in the context of an existing relational data model; improve the functionality of the PPDM Association database; promote interoperability between PPDM databases; enable improved data sharing with non-PPDM Association systems; and promote the implementation of a high-level SQL function interface for all unit of measure operations, including data import, conversion, query, display and data export. On a more human level, adoption of standardized units of measure encourages what we all want—a smarter and more productive way to do business.


Saskatchewan’s upstream oil and gas industry is a major driver of the province’s economic engine, providing about 36,000 person-years of direct and indirect employment. The industry is expected to contribute approximately $1.6 billion in revenue to the provincial economy in 2013-14.

All’s “well” in Saskatchewan
Saskatchewan pragmatism leads the way to adopting the Canadian Well Identification System

By Danette Flegel, Director of PRIME, Ministry of the Economy, Government of Saskatchewan


As the province that pioneered ATMs and debit cards in Canada and founded universal health care, it’s no surprise that Saskatchewan is leading the way once more. Saskatchewan is the first jurisdiction and regulator in the country to adopt components of the Professional Petroleum Data Management (PPDM) Association’s Canadian Well Identification System (CWIS). “Saskatchewan is known in the oil and gas industry for its pragmatic, common-sense regulatory approach,” says Ed Dancsok, assistant deputy minister responsible for the petroleum and natural gas division of the Government of Saskatchewan’s Ministry of the Economy.

“Adopting parts of CWIS, and particularly adopting PPDM’s well-established terminology to identify well and facility infrastructure components, are examples of the province taking practical steps to make it easier for industry to do business in Saskatchewan.” The Prairie province traditionally associated with agriculture is also Canada’s second-largest oil producer, accounting for 15 percent of total crude oil production. Saskatchewan is the sixth-largest oil-producing jurisdiction in North America behind Alberta, Texas, North Dakota, Alaska and California. An estimated 29,900 operating oil wells in the province produced 177.9 million barrels (28.3 million cubic metres) worth $13.7 billion in 2013.


VISION
The importance of the petroleum and natural gas industry to Saskatchewan’s prosperity is one of the reasons the province undertook a major revitalization project to overhaul its major business processes and systems, says Dancsok. Known as the Process Renewal and Infrastructure Management Enhancement (PRIME) Project, the multi-year, multimillion-dollar program will deliver an integrated information system that will enable industry to complete regularly performed business activities with the province online. It’s this system—the Integrated Resource Information System (IRIS)—where the CWIS standards come into play, says Dancsok. “The ministry has a clear vision of what it wants to achieve through PRIME, including how data in IRIS will be used in the development and regulation of the province’s resources,” says Dancsok. “Core to that vision is the desire to implement industry best practices and to standardize with other jurisdictions where practical. CWIS was developed by, and in consultation with, industry and regulators, and is based on PPDM’s globally accepted standards. Adopting CWIS for IRIS simply made sense.” Implementing CWIS requires wholesale changes to established data systems. Dancsok admits Saskatchewan was in an enviable position when it kicked off PRIME’s Well and Facility Infrastructure Project (WFIP), the project that’s developing the ministry’s well and facility infrastructure module on IRIS. Since it was already in the process of renewing its business support systems, the ministry could also consider other enhancements.



“Along with modernizing our business processes and replacing our legacy technology systems, we took the opportunity to look at our well identification system,” says Debby Westerman, WFIP’s lead subject matter expert. “Our legacy Well ID had so much embedded meaning attached to it that it became clumsy and difficult to use it for its primary purpose: to quickly and unambiguously identify and retrieve information related to a well or a well component.” Compounding Saskatchewan’s challenges with its well identification system is the prevalence of horizontal drilling in the province, adds Westerman. Of the 3,371 oil wells drilled in 2013, 2,433, or 72 percent, were horizontal. The ministry’s current well information system (WIS), which has been in use since 1985, was originally designed to support vertical wells. Functionality for horizontal wells was added to WIS in the mid-1990s, and other functionality was patched onto the system as time progressed and the industry changed. One of the first steps the ministry took to begin standardizing its well identification system was adopting the Unique Well Identifier (UWI) in December 2011 as part of PRIME’s Registry Saskatchewan Inclusion Project, which oversaw the implementation of Saskatchewan production reporting onto Petrinex. “While the UWI is a useful identifier, it has some limitations to the effective management of well information,” says Westerman, adding that UWI will continue to be an attribute in IRIS. CWIS is being adopted to complement UWI because it provides a unique, permanent identifier that does not imply temporal order or meaning. “The ministry wants to ensure the system it uses to identify a well and its components does not have any embedded meaning that could impact how the data is interpreted now or in the future.”

DETOUR Westerman notes that even though the ministry is revamping its well and facility infrastructure business systems and is able

to make significant improvements to how wells are identified and how its data is structured, it could not adopt the full-scale PPDM data model—at least not yet. “One of the reasons we looked at PPDM’s data model is because it’s comprehensive,” says Westerman. “We have to move over 100,000 well files into IRIS, and changing all of their attributes to PPDM’s just wasn’t practical at this point in time. What the data model did do, however, was help guide how we, as a regulator, structure our IRIS data to ensure that it’s PPDM-friendly, so full adoption of the data model may be possible over time.” The ministry is implementing PPDM’s What is a Well terminology identified in the CWIS standards into IRIS, which both Dancsok and Westerman call an excellent step forward in its internal and cross-jurisdictional standardization efforts. Saskatchewan, British Columbia and Alberta are partners in the New West Partnership Trade Agreement. Along with creating Canada’s largest barrier-free, interprovincial market, the agreement also commits the partners to “mutually recognize or otherwise reconcile unnecessary differences in their standards and regulations.” Dancsok believes the adoption of the CWIS standards aligns well with the intent of the New West Partnership. “If others follow suit, the adoption of common well terminology and identification may mitigate miscommunication, solidify the soundness of the data collected and thereby increase confidence in that data,” says Dancsok. “Regardless of where the data comes from—from a data vendor, a regulator or a company—everyone will have confidence that the data is of the highest calibre and value possible. It will save government and industry time and will likely avoid costly errors due to ambiguity.” Trudy Curtis, chief executive officer of PPDM, states it’s that spirit of collaboration and cooperation that makes Saskatchewan a leader and ambassador for the adoption of CWIS in Canada. Besides endorsing the initiative from the start, Saskatchewan also provided subject matter experts like Westerman to the standards working group. “Saskatchewan recognizes that transparent, consistent and accurate data is core to managing the development of its significant natural resources,” says Curtis. “Their common-sense approach to adopting CWIS makes it a role model for other jurisdictions. For Saskatchewan, it’s the principles of the standards that matter; how they are implemented can be tailored to each organization’s business needs.” True to his Saskatchewan roots, Dancsok is reluctant to hang the trailblazer mantle on his province’s role in CWIS. “Our province and our people are notoriously resourceful. Throughout our history, we’ve led monumental advancements in Canada, but we did so not with the intent of being a leader,” says Dancsok. “It was an unintentional consequence of our citizens’ practicality. Our adoption of CWIS, and our desire to develop the province’s oil and gas resources responsibly using the best data possible, is a demonstration of Saskatchewan’s pragmatism.”




Certified
The PPDM Association and industry participants are finalizing the process to establish professional accreditation for petroleum data analysts By Gordon Cope


Total S.A., based in Paris, operates in 130 countries around the world. In Canada, its wholly owned subsidiary, Total E&P Canada Ltd., has comprehensive holdings in the oilsands, including partnerships in the Surmont, Fort Hills, Joslyn and Northern Lights assets. Exploring, developing and producing oilsands assets requires collecting and analyzing huge amounts of data. Data managers must be familiar not only with how data is qualified, stored and retrieved, but also with such oil and gas basics as well logs, geology, seismic, land, business practices and geospatial systems. Finding the right personnel for the job can be a daunting task. “We have from four to eight staff in the GeoInformation group,” explains Marc Nolte, geoinformation manager with Total E&P Canada. “Recently, we advertised an opening and we had over 50 applications.” Data managers come from all walks of life, from engineering and computer science to geology and geophysics; essentially, they are people who show an affinity for data. Unfortunately, there is no formal process to certify them as data management professionals. “From my perspective, if I can look to see if someone has a certificate, it makes my life easier,” says Nolte. “[A system of formal certification] would have tremendous value to the industry.”

WHAT IS CERTIFICATION?
The professional certification process is quickly moving toward reality. “One of the major goals of the PPDM [the Professional Petroleum Data Management Association] and petroleum industry members is to professionally certify petroleum data management professionals,” says Trudy Curtis, director and chief executive officer of the PPDM Association. “To that end, the PPDM Association has established the Certified Petroleum Data Analyst (CPDA) certification program as the first offering in a number of petroleum data management professional certifications.” Generally, professional certification involves several standardized factors. The Association of Professional Engineers and Geoscientists of Alberta (APEGA), for instance, is an association established by the Government of Alberta to act as a regulatory agency and oversee certification. Those wishing to gain initial APEGA membership must show a bachelor’s degree in science or engineering in order to qualify as a member-in-training. They must then accumulate four years of experience in the oilpatch in order to apply for full membership. APEGA oversees the National Professional Practice Examination for each discipline; those who pass receive certification as a professional engineer in Alberta. Throughout subsequent years, members are expected to continue their professional advancement by attending relevant industry courses taught by post-secondary institutions and accredited private and public institutions.

Some data management certification programs already exist, like DAMA International’s Certified Data Management Professional certification program and the Association for Information and Image Management’s Certified Information Professional exam, but nothing yet exists specifically for data management in the exploration and production space. An exploration and production certification program would have many benefits. “From an oil company perspective, human resources will have something that they can formally use to help compensate staff,” says Nolte. “From the data manager’s perspective, they will have formal recognition of their skills and capabilities.”

Several years ago, the PPDM Association and industry members organized the Petroleum Education Task Force (PETF) to catalogue the training that a petroleum data manager needs and to identify and advance all the necessary aspects of certification. The PETF determined that, rather than having a specific requirement regarding a degree or diploma, candidates for certification would have to demonstrate competency, experience and knowledge. “We have found that competent data managers come from many different educational backgrounds,” says Megan Sutherland, project coordinator at the PPDM Association. “There will be a minimum time period requirement—three to five years—in which they [must] demonstrate experience, however.” Candidates must also demonstrate competency and knowledge by passing an exam that focuses on eight core areas: data governance, data analysis, data quality management, data security, knowledge of spatial location data, knowledge of exploration and production, master data management and communication.

In the oil and gas industry, data management professionals generally fall into four main disciplines: data analysts, geospatial analysts, business analysts and records analysts. “We discussed launching exams for all four disciplines at the same time, but eventually decided to do the entire process for data analysts first,” says Nolte. Starting in 2013, PETF began working with Yardstick, a testing and training consultancy, to develop an exam. “We put together over 360 questions relating to eight core competencies and 26 sub-core competencies,” says Nolte. In April and May 2014, volunteers underwent pilot exams in 10 locations globally (Calgary; Houston; Paris; The Woodlands, Texas; Doha, Qatar; Jakarta, Indonesia; Oklahoma City; Perth, Australia; and two locations outside of London). Feedback from the pilot exams has allowed PETF and Yardstick to refine the testing process. “We are still working through the creation of two exams,” says Nolte. “You have to make sure that there are no natural enemies—questions that essentially ask the same thing in different ways. Our goal is to have the exams ready by the fourth quarter of 2014. Data analysts with three to five years’ experience can then take the exam and, if they pass, receive PPDM’s CPDA designation.”

WHAT STILL NEEDS TO BE DONE
The PETF is putting together a certification handbook to outline the policies and procedures that will eventually allow the creation of an independent, accredited body with the powers to formally certify and register professional petroleum data analysts, as well as administer oversight and governance of the profession. “There are dozens of procedures that need to be created,” says Sutherland. “They include how to go through the application process, professional development, appeals, privacy, renewals and re-certification.” In the near future, PPDM and PETF anticipate a flurry of activity. “We expect there will be a lot of pent-up demand in the first six months to take the test,” says Nolte. “We’ll then be getting a lot of feedback from data analysts and employers in order to refine the process.” In the longer run, much work remains to be done. “Launching the CPDA certification process is a means of demonstrating to the sector the value and competency of data analysts,” says Curtis. “We look forward to extending the process to include geospatial analysts, business analysts and records analysts.”




Technical Article

10 THINGS TO KNOW ABOUT WELL LOGS
In today’s fast-everything society, I am often asked for a short set of rules for checking the quality of well logs. Since you asked…
By Martin Storey, senior consultant, Well Data Quality Assurance Pty Ltd


How does one distill several decades of learning on rigs and in tape libraries, data rooms and offices into a few snappy points? As with many questions, the answer depends on the circumstances. The first rules must therefore have to do with framing the situation.


1. Distinguish between historical data and incoming data.

With historical (also known as legacy, old or pre-existing) data, what is available is generally all there is, and its quality control consists of verifying that it is what is expected for the particular data set. With incoming (also referred to as new or future) data, the quality control should include verifying that all the technical specifications of the contract for the product are met and, if they are not, engaging with the supplier until they are. In practice, however, contractual specifications are rarely detailed or exhaustive, and the incoming data quality control is shared between the operations and petrophysics departments, without either side following a precise script.
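The contractual verification described above can be reduced to a checklist run against the delivery. The specification contents below (required curves, maximum depth step) are hypothetical, chosen only to make the idea concrete:

```python
# Hypothetical contract specification for an incoming log data delivery.
CONTRACT_SPEC = {
    "required_curves": {"GR", "RHOB", "NPHI"},
    "max_depth_step_m": 0.1524,   # half-foot sampling
}

def qc_incoming(delivered_curves: set, depth_step_m: float) -> list:
    """Return a list of issues to raise with the supplier (empty = pass)."""
    issues = []
    missing = CONTRACT_SPEC["required_curves"] - delivered_curves
    if missing:
        issues.append("missing curves: %s" % sorted(missing))
    if depth_step_m > CONTRACT_SPEC["max_depth_step_m"]:
        issues.append("sampling coarser than contracted: %s m" % depth_step_m)
    return issues

assert qc_incoming({"GR", "RHOB", "NPHI"}, 0.1524) == []
assert qc_incoming({"GR"}, 0.5) == [
    "missing curves: ['NPHI', 'RHOB']",
    "sampling coarser than contracted: 0.5 m",
]
```

A non-empty result is the trigger for engaging with the supplier until the specifications are met; historical data, by contrast, can only be checked against expectations, not renegotiated.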


2. Distinguish between original data and derived data.

Original (also known as raw, acquisition or received) data is the record from an operation done once and reflects the circumstances at that one particular time, never to be repeated exactly.

Derived (often referred to as evaluated, processed or interpreted) data is produced from original data, either within the organization or by an external party, and could conceivably be recreated. Derived data almost always has an interpretative character, so it could have more than one version. In real life, the distinction between original and derived data is not always clear-cut, particularly in the past decade when much original data has already been subjected to various forms of processing prior to its initial delivery. Log data interpreters tend to use mainly original data, while other users of log data tend to use derived data. Original data is often voluminous and raw, while derived data should have been conditioned and validated, or generated from data that has been conditioned and validated, and is therefore presumed safer. However, different applications may require the logs to be conditioned differently.


Examples of sources for the different log data categories are shown in the table below, which is not exhaustive; the distinctions between categories are not absolute. As subtle as these distinctions may seem to some, they largely determine the activities required to control the quality and manage the non-quality of the log data. It is therefore important to keep data sets of different categories distinct and readily identifiable. The original query has now morphed into four distinct questions, and the first two rules do not actually serve to accept or reject data sets. We are getting to that, but we are not yet done with fundamental principles.


3. Data must be managed for the short and the long term concurrently.

The point of this rule is that managing only for the immediate requirements and initial workflows is insufficient and ultimately damaging.




Original data, historical sources:
• Own records or databases
• Data vendor
• Provider of public domain data
• Merger or acquisition

Original data, incoming sources:
• Data acquisition company for own or joint-venture-project ongoing operations
• Data processing company
• Asset swap

Derived data, historical sources:
• Own records or databases
• Processed data vendor
• Provider of public domain interpretative data
• Merger or acquisition

Derived data, incoming sources:
• Data acquisition company for own or joint-venture-project ongoing operations
• Data processing company

Technical Article


Original logs normally come in two forms—print and tape—and both are required.

» The print (or film or image) is frequently delivered as a digital raster or vector image file, such as a TIFF or PDF.
» The tape (or CD or digital data) is generally a digital file in Digital Log Interchange Standard (DLIS) or Log ASCII Standard (LAS) format, but historically other formats have been used, such as the Log Information Standard (LIS) and TIF.

Both are needed because each may contain information that is essential for the valid exploitation of the data and that is found nowhere else. The image below shows a barely legible section of a log print that contains critical information about the data and the depth reference. This information is not on the tape, and ignoring it could result in overlooking a prospective interval or perforating the wrong formation in a well.


Original data must be obtained and stored in the original format, and that format is usually DLIS or LIS, not LAS.

DLIS frequently contains contextual information found nowhere else, while LAS generally contains very little contextual information. DLIS and LAS are among the most common digital formats for log data. DLIS and its predecessor, LIS, are binary formats, while LAS files are written in ASCII and can be read using any text reader. LAS was first introduced in the late 1980s by the Canadian Well Logging Society and was originally intended for exchanging basic digital log data on floppy disks between personal computers in a quick and easy-to-use format. It succeeded, and 25 years later it remains a convenient format for moving log data around.

The specifications of LAS have been enhanced, and LAS now has the potential to contain almost any log data. In practice, however, LAS files are never complete and generally contain only the main curves. DLIS files are much more likely to contain the ancillary curves, acquisition parameters and contextual information required for the exploitation of the data. DLIS files can be large, with numerous curves that can disorient the casual user; however, these complete files are required in the record to ensure that the data can be evaluated and interpreted in the future.

If you must also receive data deliveries in LAS files, the consistency of the DLIS and LAS data sets needs to be checked systematically. The LAS set should be a subset of the DLIS set, and the curve names, data sampling, depths and data values should be exactly the same. As an empirical rule, any conversion from one format to another results in the loss of data required by the technical experts for exploitation or reprocessing. In the absence of a single digital log data exchange standard, many export programs have produced corrupted files, which reside incognito in our archives.
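Such a consistency check lends itself to partial automation. The sketch below is illustrative only: it assumes the curves from each delivery have already been loaded (for example with the dlisio and lasio libraries) into plain name-to-samples mappings, and it verifies that the LAS curve set is a subset of the DLIS set with identical sample counts and values.

```python
def check_las_against_dlis(dlis_curves, las_curves, tol=0.0):
    """Compare curve sets loaded from a DLIS and an LAS delivery.

    Each argument maps curve mnemonic -> list of values (including the
    depth curve). Returns a list of discrepancies; an empty list means
    the LAS set is a consistent subset of the DLIS set.
    """
    problems = []
    for name, las_vals in las_curves.items():
        if name not in dlis_curves:
            problems.append("curve %s in LAS but not in DLIS" % name)
            continue
        dlis_vals = dlis_curves[name]
        if len(dlis_vals) != len(las_vals):
            problems.append("curve %s: %d samples in DLIS, %d in LAS"
                            % (name, len(dlis_vals), len(las_vals)))
            continue
        # With tol=0.0 the values must match exactly, as recommended here;
        # a small tolerance can be allowed if the deliverables round differently.
        if any(abs(a - b) > tol for a, b in zip(dlis_vals, las_vals)):
            problems.append("curve %s: values differ" % name)
    return problems
```

A non-empty result is grounds for querying the data supplier before either file is accepted into the record.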


For the exploitation of any log data, context is king and must be preserved.

Data without context can only be used by making arbitrary assumptions that increase the organization's risk unnecessarily. According to a Society of Professional Well Log Analysts article, "Current Status of Well Logging Data Deliverables and a Vision Forward," the first contextual feature of a log is its print header, including the reference metadata, the remarks section, the description of the logging tool string configuration and, hopefully, more. There is a lot of other contextual information relating to logs, including the operations sequence, the detailed drilling history, the mud composition and the tidal conditions, and any of these may have a critical impact on data exploitation.

This is not an anecdotal point. Good interpreters should verify that their conclusions are consistent with other relevant information, and the more thoroughly that consistency can be verified, the more confidence there will be in the recommendations used to support the organization's decisions. In many countries, the combination of the original log prints, the log tapes and the operator's end of well report (sometimes called a final well report or a completion report), complete with attachments and appendices, can normally provide all the information required for the present and future exploitation of the original log data.


The completeness of the data sets should be assessed and captured using a data inventory.

[Figure: Example of essential information found exclusively on a log print.]

Data sets vary in form and content due to their companies of origin, their locations, the time when the data was generated, the vintage of the tools and software used, the operator's specific requirements, and so on. There is therefore no standard delivery. Some data sets contain several different prints, log quality control prints, image prints, processing results, time-based data, raw and corrected data, special tool documentation and several successive deliverables. A good way to stay organized is for the main people involved (operations geologist, petrophysicist, reservoir engineer) to compile an inventory of the expected data beforehand. This is not always done, but it helps everyone clarify the content and timing of deliverables and plan their work accordingly. Later, the inventory should become a central feature of the data management system, so that a prospective user can establish what a data set contains and whether its various components are available.

It is worth recalling here why the long term matters. Log data is generally "hot" for a short period after it is acquired or generated, but the initial demands on the data are often quite light: for most operational decisions or rush evaluations, high-quality data is not required and decisions can be made on the basis of preliminary data. The data becomes hot again at various and largely unpredictable times later on, and eventually the demands on it may become more and more specific, requiring higher and broader data quality. People are often surprised to hear that old data can be as good as modern data, but it is; the main impediments to the use of old data are not a lack of intrinsic value but poor availability and incompleteness. The lifespan of data is longer than our own, and as temporary custodians, we must preserve its value for future exploitation.
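The expected-data inventory described above can be made machine-checkable. The sketch below is illustrative (the deliverable names are hypothetical, not a standard list): each expected item is registered up front and ticked off on receipt, so outstanding deliverables can be chased.

```python
# Hypothetical expected-data inventory for one logging run.
inventory = {
    "main log print (1:200)": False,
    "repeat section print": False,
    "DLIS tape, run 1": False,
    "LAS file, run 1": False,
    "end of well report": False,
}

def receive(item):
    """Mark a deliverable as received; reject unknown items so typos surface."""
    if item not in inventory:
        raise KeyError("unexpected deliverable: %r" % item)
    inventory[item] = True

def outstanding():
    """List deliverables still missing, for follow-up with the supplier."""
    return sorted(k for k, got in inventory.items() if not got)
```

In practice the same idea would live in the data management system rather than a script, but even this much makes gaps visible to a prospective user.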


Validate the metadata first.

The validation of the prints' metadata is essentially the confirmation that the header data is correct, clear and coherent. Bad metadata is a major contributor to data loss, according to Guy Holmes in "Search and Rescue: The Data Lost and Found Initiative." Not finding a data item when it is needed can result in severe opportunity loss and additional costs; although it may not appear in the record, more than one well has been drilled to re-acquire data that could no longer be found.

It is often said that data acquisition companies are very good at doing the hard stuff and not so good at doing the easy stuff. In fairness, metadata is often provided to data acquisition companies by the operating company. Getting the reference information right is basic quality assurance and everyone's responsibility. Well coordinates and elevations are notoriously unreliable on logs but, more disconcertingly, so are wellbore names, which are often the main handle of a data set. Data sets are surprisingly often registered against the wrong wellbore (Laurel-1 instead of Laurel-1ST1) or against the wrong well name (Hardy 1 instead of Hardy-1), rendering the data difficult to locate. Validating the metadata may be feasible with programmatic rules, provided this does not interfere with the data users' day-to-day work, and it may be best driven by the data management team. The PPDM Association's What Is A Well is a valuable reference for this.
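Some of these name checks do lend themselves to simple programmatic rules. The sketch below is purely illustrative (naming conventions vary by company and jurisdiction, and the sidetrack suffix convention assumed here is hypothetical): it normalizes formatting slips such as "Hardy 1" versus "Hardy-1" and flags a parent well registered in place of its sidetrack.

```python
import re

def normalize(name):
    """Uppercase and unify separators so 'Hardy 1' and 'Hardy-1' compare equal."""
    return re.sub(r"[\s_]+", "-", name.strip().upper())

def parent_well(name):
    """Strip a trailing sidetrack suffix such as 'ST1' (assumed convention)."""
    return re.sub(r"ST\d+$", "", name)

def check_wellbore(registered, on_header):
    """Compare the wellbore a data set is registered against with the name
    on the log header; return a list of human-readable issues."""
    a, b = normalize(registered), normalize(on_header)
    if a == b:
        return []
    if parent_well(a) == parent_well(b):
        return ["sidetrack mismatch: %r vs %r" % (registered, on_header)]
    return ["different wellbores: %r vs %r" % (registered, on_header)]
```

Rules like these only help if their findings are routed back to whoever can correct the register, which is why the data management team is the natural owner.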


Perform basic checks on the original data prints.

Once the metadata has been verified, the data itself should be checked. From the data management perspective, the main items to consider are:
» Prints should have a header and a tail, so that it is clear that no section of the log print has been torn out or is otherwise missing.
» Between the header and the tail, in approximate top-to-bottom order, most prints should include:
• a full log header, including remarks;
• a tool sketch with the sensor measure points;
• a well sketch;
• one or several log sections, preferably clearly labelled (e.g., main log; repeat section; correlation pass; main log, 1:200 scale);
• a repeat section, in the case of formation evaluation logs;
• parameter listings for each log section; and
• a tool calibration section.
» Modern logs should also contain:
• a job chronology;
• a depth information box;
• a survey listing; and
• quality-control plots.
» Some logs are presented on logarithmic scales. If the grid scales are not consistent with the data scales, serious interpretation errors can occur, so it is wise to check these, too.
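A checklist like the one above can be encoded so that gaps are reported mechanically. The section names below are shorthand for this sketch, not an industry vocabulary:

```python
# Items most prints should include, per the checklist above (names illustrative).
BASIC = {"header", "tool sketch", "well sketch", "log section",
         "parameter listing", "calibration section", "tail"}
# Additional items expected on modern logs.
MODERN = {"job chronology", "depth information box", "survey listing",
          "quality-control plots"}

def missing_sections(found, modern_log=False):
    """Return the checklist items absent from the sections observed on a print."""
    expected = BASIC | (MODERN if modern_log else set())
    return sorted(expected - set(found))
```

The human judgment (is the repeat section plausible? are the grid and data scales consistent?) still belongs to the petrophysicist; the script only confirms that the expected pieces are present.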


Perform basic checks on the original data tape.

Many things can go wrong with log data, and there is no set list of rules to check, nor a certain way to assure quality. Even though several competent people are involved in producing the log data (the logging engineer, a wellsite witness, line managers, a petrophysicist), errors often slip in. Moreover, each set delivered is liable to have new errors that were not in the previous delivery: perhaps a wholly or partially missing curve, missing vector data, or a sampling interval set slightly incorrectly at 0.152 metres instead of 0.1524 metres. Every tape delivered must therefore be verified.

Aspects that can easily be checked by the data management group include the tape's legibility using the organization's normal readers and the presence of at least one file per log section. Some logs are acquired and put on the tape but not presented in print, for instance logs recorded while running in the hole, or a gamma ray log recorded in casing up to ground level. Again, metadata is a frequent problem with tapes: is the tape correctly labelled, and is each file on the tape clearly identified? Both the logging date and the tape creation date must be specified on the label. When several versions of a tape exist, the latest one normally supersedes all previous ones, which should be destroyed to minimize future confusion, even if all are labelled "final."

The checking of derived data is even less generic, although similar principles apply: the data must be fully identified and complete, it must contain all necessary contextual information or references, and it should preferably be stored in a future-proof format. Actual checks and processes will depend on the organization's systems.

Well log data quality assurance is not straightforward, because no one person in the chain of custody is at once fully informed, competent and enabled to do it all. From the perspective of data management, and with the aim to "provide the right data at the right time to the right people," it is essential to work with all stakeholders to maximize the quality of the incoming data, as well as to minimize quality erosion during its life cycle. Most of these rules also apply to other categories of well data, for instance mud log data or even core analysis data. The likelihood that data will be exploited, for its original purpose as well as for initially unforeseen purposes, is directly related to its quality and availability: the better these are, the more easily the data can be exploited and the more value can be obtained from it, even decades after it was acquired.
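The sampling-interval slip mentioned above (0.152 m recorded instead of the half-foot 0.1524 m) is exactly the kind of error a small script can catch. The function below is a sketch assuming the depth index has already been read from the tape into a list of numbers:

```python
def irregular_steps(depths, expected=0.1524, tol=1e-4):
    """Return the depth increments that deviate from the expected sampling
    interval (0.1524 m, i.e. half a foot) by more than the tolerance."""
    steps = [b - a for a, b in zip(depths, depths[1:])]
    return [s for s in steps if abs(s - expected) > tol]
```

An empty result means the index is regular at the expected step; any returned increments point at exactly where the tape deviates.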
Martin Storey started in the oil and gas industry as a logging engineer, then became a wellsite petroleum engineer and a petrophysicist. He is now an independent practitioner, consultant and trainer in petrophysics and data management based in Australia.


Upcoming Events

[Calendar, October 2014 through February 2015.] PPDM events are scheduled at locations including Midland, Houston, Dallas and Dallas/Fort Worth, Texas; Oklahoma City and Tulsa, Oklahoma; Denver, Colorado; Calgary, Alberta; and Perth, Australia. Schedule subject to change.

VISIT PPDM.ORG FOR MORE INFORMATION

Become a PPDM member

One membership fee; over $100 million worth of knowledge. Now that's ROI.

Members of the Professional Petroleum Data Management Association get immediate access to a wealth of tools and knowledge in the field of oil and gas data management standards, best practices and education, including:

Standards
• PPDM 3.8: the leading open standards data model in the industry
• What is a Well?
• Well Status and Classification

Business rules
• Best practices for creating and managing master data stores containing the most trusted data for an organization

Knowledge sharing
• Training and certification
• Events: workshops, conferences and user group meetings
• Forums, wiki pages and blogs

• A quarterly subscription to Foundations

Membership in the PPDM Association is one of the best investments you can make. For more information on membership packages, visit us online.

Suite 860, 736 8th Ave SW, Calgary, AB T2P 1H4, Canada
Phone: 403-660-7817
