Foundations, Volume 4 Issue 1


Foundations Journal of the Professional Petroleum Data Management Association

Print: ISSN 2368-7533 - Online: ISSN 2368-7541

Volume 4 | Issue 1

1st Place Foundations Photo Contest Winner: “Encountered Many Defeats But Still Not Defeated.” (Vikrant Lakhanpal)

Ready In Advance: A data manager should ask a string of questions about any data, including contracts and JVAs. (Page 4)

From National Data Repository to Digital Energy Cloud: Working with a Regulator. (Page 23) PLUS Photo Contest: This issue’s winners and how to enter (Page 20)


Make Better Decisions. EnergyIQ creates information certainty in an uncertain E&P world:
• Rapid and accurate analysis with certainty
• Geoscience Well Master to enable consistent data-driven decisions
• Corporate Well Master to drive operational excellence
• End-to-end planning, execution and performance
• Integrated workflows based on real-time information, events and activities
Contact EnergyIQ to learn how you can work with the most trusted E&P data available.
www.EnergyIQ.info | info@EnergyIQ.info
© 2017 EnergyIQ, LLC. All Rights Reserved.


Table of Contents

Foundations

Volume 4 | Issue 1

Foundations: The Journal of the Professional Petroleum Data Management Association.

COVER FEATURE

Ready in Advance: Questions to Ask, By Ted Wall (Page 4)

From National Data Repository to Digital Energy Cloud: Working With a Regulator, By Jamie Cruise (Page 23)

GUEST EDITORIALS

Data Storytellers: Telling Insightful Stories, By Jim Crompton (Page 6)

Q&A With Joseph Seila, By the Foundations Editorial Team (Page 27)

FEATURES

Hands On With The PPDM Association Board of Directors: First Experiences, By Peter MacDougall (Page 9)

Achieving Success in Developing Regulatory Standards: Directive PNG017 Case Study, By Yogi Schulz (Page 10)

Global Technology Trends: And Their (Potential) Impact on the Oil Industry, By Guy Holmes (Page 13)

Things to Know About Digital Data Storage Types, By Monica Mills (Page 16)

Electronic Data Interchange: The Importance of Standards, By Joseph M. Suarez (Page 18)

DEPARTMENTS

Photo Contest: This issue’s winners and how YOU can get your photo on the cover of Foundations (Page 20)

Calgary Symposium Celebrates Anniversary: 2016 Calgary Data Management Symposium, Tradeshow & AGM (Page 22)

Thank You To Our Volunteers: Featuring Mike Skeffington, Paula Jennings and Art Boykiw (Page 26)

Upcoming Events, Training and Certification: Join PPDM at events and conferences around the world in 2017. Learn about upcoming CPDA Examination dates and training opportunities. (Page 31)

CEO: Trudy Curtis
Senior Operations Coordinator: Amanda Phillips
Senior Community Development Coordinator: Elise Sommer
Article Contributors/Authors: Jim Crompton, Jamie Cruise, Trudy Curtis, Guy Holmes, Peter MacDougall, Monica Mills, Joseph Seila, Yogi Schulz, Joseph Suarez, and Ted Wall
Editorial Assistance: Emma Bechtel, Beci Carrington, Dave Fisher, Monica Mills
Graphics & Illustrations, Graphic Design: Jasleen Virdi

BOARD OF DIRECTORS
Chair: Trevor Hicks
Vice Chair: Robert Best
Secretary: Lesley Evans
Treasurer: Peter MacDougall
Directors: Amii Bean, Robert Best, Brian Boulmay, Jeremy Eade, Lesley Evans, Trevor Hicks, David Hood, Allan Huber, Peter MacDougall, Christine Miesner, Shashank Panchangam, Paloma Urbano

Head Office: Suite 860, 736 8th Ave SW, Calgary, AB T2P 1H4, Canada
Email: info@ppdm.org
Phone: 1-403-660-7817
Publish Date: February 2017

ABOUT PPDM The Professional Petroleum Data Management Association (PPDM) is the not-for-profit, global society that enables the development of professional data managers, engages them in community, and endorses a collective body of knowledge for data management across the oil and gas industry.



Ready In Advance

By Ted Wall, Ted Wall Consulting

Your life as a data manager in an oil company might be simpler if every project were 100%-owned from exploration permit to production operations. But that is a rarity today, except in a few National Oil Companies (NOCs). The days of the Seven Sisters are long gone. Even the largest multinationals take partners in some of their operations. Partnerships can expand opportunities, reduce risks and leverage specializations, all with the aim of lowering costs and increasing returns on capital employed. Joint Venture Agreements (JVAs) also allow companies to benefit from expertise they may not have in-house. In this era of cost cutting and specialization, JVAs are proliferating.

Any sort of partnership relationship, including a JVA, requires extra attention to the ownership of information. Who owns what? Who is responsible for what? What is the impact of selling or acquiring an interest? The correct answers are necessary for corporate success and risk management, but knowledge and understanding of the complexity of JVAs may be declining as a result of staff reductions.

As a Data Manager you will likely not be able to give detailed answers to all the questions in this article. The following questions do, however, need to be asked, and the organization needs to determine who is responsible for obtaining and establishing the correct answer to each of them.

A data manager should ask a string of questions about any data, including contracts and JVAs. For example:
• Is the data complete, clean, accurate and up-to-date?
• Do the records, both active and archived, comply with my governance procedures for retention, security, backup and retrieval?
• For any data that must be exchanged among partners or with a government authority, is the format and content acceptable to all parties?

Every aspect of the Exploration and Production (E&P) business also has specialized data that can only be properly handled by someone with intimate knowledge of its peculiarities. JVAs are no exception. For example:
• Beyond the normal rules, do you understand the unique data requirements for each contract?
• Do all divisions or departments agree on who is partnered in the JVA? Do they all use the correct ownership percentages for every partner? Do they agree on who the partners are?
• Do all divisions or departments agree on the status of agreement conditions that may change according to variables such as performance obligations and anniversary dates, e.g. farm-ins, penalty wells, and area of mutual interest (AMI)?

If there are differences in the data used or stored by different divisions or departments, these differences need to be understood, explained and reconciled, as sketched below. It is not the Data Manager’s responsibility to fix discrepancies, but any discrepancies need to be addressed.
• Are the files in paper format, an electronic format, or both? How accessible is the unstructured information in the paper records?
• Have all changes to the agreement been applied to the records, whether paper or electronic? Have the previous versions been retained and appropriately archived?
• Are the specific contractual terms around how notices are served, and when the changes being sent out become effective, known and understood for every agreement? Rights of First Refusal can have unexpected impacts on purchases and sales.
• Is there a full, auditable history documenting all changes, additions and corrections to the records?
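To make that reconciliation concrete, here is a minimal Python sketch, with invented division names and a hypothetical ownership table, of the kind of automated check a data manager might run to flag disagreements; it illustrates the idea rather than prescribing a tool.

```python
# Hypothetical example: compare the JVA ownership splits carried by different
# divisions and flag any division whose interests do not sum to 100% or that
# disagrees with the reference set of partners and percentages.
from math import isclose

OWNERSHIP_BY_DIVISION = {
    "Land":       {"Operator Co": 45.0, "Partner A": 35.0, "Partner B": 20.0},
    "Accounting": {"Operator Co": 45.0, "Partner A": 35.0, "Partner B": 20.0},
    "Production": {"Operator Co": 45.0, "Partner A": 40.0, "Partner B": 20.0},
}

def reconcile(ownership_by_division):
    issues = []
    for division, splits in ownership_by_division.items():
        total = sum(splits.values())
        if not isclose(total, 100.0, abs_tol=0.01):
            issues.append(f"{division}: interests sum to {total}%, not 100%")
    reference_division, reference = next(iter(ownership_by_division.items()))
    for division, splits in ownership_by_division.items():
        if splits != reference:
            issues.append(f"{division} disagrees with {reference_division}: {splits}")
    return issues

for issue in reconcile(OWNERSHIP_BY_DIVISION):
    print("DISCREPANCY:", issue)
```

The output does not say which division is right; as noted above, that determination belongs to the business owner of the agreement, not to the data manager.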

QUESTIONS THAT ARE ESPECIALLY RELEVANT TO COMPANIES THAT ARE THINKING OF BUYING OR SELLING CONTRACTUAL INTERESTS

The data management issues, and their financial implications, relating to JVAs become critical if the company is considering whether to divest or acquire assets. Ideally, these issues should be resolved before the decision is taken. Otherwise, the company may be committing to a sale or purchase that cannot be fully supported by the necessary information. For starters, consider these:
• Do you have a clear and accurate picture of all the agreements that will be impacted by this transaction? The impact will be even more complicated if you are only selling a portion of your interest.



• Do any of the agreements have terms that limit or restrict your ability to make a deal?
• If your purchaser defaults on the agreement, are you liable to meet the obligations of the existing contract?
• Can you be assured that all environmental liabilities will be transferred and stay with the purchaser? Will these liabilities revert to you if the purchaser goes bankrupt?
• How will you identify all the data that must be delivered with the sale of the asset?
• Can you prevent the accidental transfer of proprietary information that is not subject to the sale agreement? This is especially critical if your partners have shared ownership of the data. Seismic data is often not transferable or saleable. One issue often arises when company A acquires assets from company B: company B may have licenses from a data vendor (for seismic or well related data as well as geologic and petro-technical analysis) with conditions that prevent transfer of that data without re-licensing.
• Are your existing staff able to effectively assemble the data for the asset to be sold? Are the cost and time of data assembly significant enough to be factored into the sale price and closing date? What liability or penalty is associated with failure to fully deliver on time?
• How will you receive, validate and integrate the data for the asset you are purchasing? Will you need to add staff numbers and expertise? Can your systems handle the increased load and diversity? Will you need new systems and software?

In an ideal world, Data Managers are involved early in the process of any potential acquisition, and are involved in determining and scoping out its implications. This would include determining whether systems need to be expanded or upgraded, and whether new software would need to be acquired. Upgrading the training

of existing staff and the potential need for additional personnel should also be examined.
• If your company has public shareholders, can you ensure compliance with the laws around the disclosure of information and that insiders do not have an illegal advantage?
• Do all the ownership interests you are selling or buying have a clear chain-of-title in respect to ownership? If not, have the discrepancies been addressed by a working interest clarification agreement?
• Are any of the interests or ownerships subject to a right of first refusal?
• For mineral leasehold interests, is the lease current and held by production?

A typical revenue-generating asset has a lot of information that is used for business analysis. As a data manager, can you ensure that the necessary data is available to answer questions like the following?
• Can the realizations be examined on a well-by-well basis, and by individual financial identifier?
• Are the revenues, expenses and realizations reported at an intermediate or more aggregate level? Does the recorded data make sense at each and every level of aggregation?
• Do the values booked in the financials make sense?
• Do the prices realized per unit make sense?
• For a royalty agreement, are the royalty rates appropriate for the various products produced and marketed? Are the royalties,

paid or received, spelled out by the agreement(s)?
• What royalties is the buyer responsible for or entitled to? If you are selling your entire interest in a well or facility, will your royalty agreements be assumed (taken over) by the purchaser?
• Will your company retain a royalty interest even though your working interest is sold? Many companies will retain a royalty on farmouts or sold mineral interests.

Are you selling or acquiring an entire company? Perhaps it is only an interest in one geographical area. Perhaps the proposed deal is for a specific portion of an interest or for a specific type of interest? The possibilities are extensive and can be complicated and even confusing. Again, the question that needs to be fully answered is “What is really being sold or acquired?” Perhaps:
– Natural gas rights only
– Oil rights only
– Rights to all formations, surface to basement, or just a specific formation
• Are any of your assets pledged as collateral?
• Are your assets in close proximity to the assets of other companies?
• Are there right-of-way issues with facilities that are close to the subject asset?
• Do these issues impact the value of the proposed transaction?
• Will you need agreement from other parties or stakeholders to sell, acquire, develop, expand or move the product(s) to market?




• Are all the relevant agreements available? For a partial sale, is the subject part clearly and accurately described?
• Are the rights being sold or purchased definitively and properly associated with geographic locations, including subsurface locations and formations? Are all locations referenced within the same coordinate reference system?
• If many different contracts relate to an asset, does one specific contract govern? If so, which one? Can the governing contract change over time or in differing situations?

In any situation, as seller or buyer, you need accurate, timely and complete data. Good data standards can help your business and your stress level. Data that meets PPDM standards will increase the effectiveness and efficiency of sharing, evaluating and transferring data. Good data can also improve the merging or setting up of data in your own systems or in newly acquired systems. If your data does not already meet PPDM standards, perhaps now is the time to implement better data standards. When it comes to agreements and sales, it pays to be ready in advance.

This is the first of many lists of questions that can be asked about oil and gas data. Would you like to contribute to this endeavour? If you have a question or questions to add, or topics that you would like addressed around oil and gas data, please email the Foundations Editorial Team at foundations@ppdm.org.

About the Author
Ted Wall is an Oil and Gas professional with extensive knowledge of joint venture agreements, mineral rights, data evaluation and verification as they relate to ongoing operations, acquisitions and divestitures.

Guest Editorial

Data Storytellers

By Jim Crompton, Reflections Consulting

With the advent of “Big Data,” we are all suffering through the hype of all the new digital technologies. In the back of our minds we understand that there is a “people” element to all this as well, and we are finally seeing new job descriptions arriving to recruit the talent that will help your company deal with all the new data that you now need to manage. Data Managers and Data Analysts have been with us forever, even if their job titles are production engineer, financial

analyst or operations specialist. Companies with effective data governance processes already know that roles (and even full-time jobs) around data stewards and data champions can add value. Not many companies have Chief Data Officers yet, but some of the new “attention economy” start-ups are experimenting with this position. Gartner’s annual CDO survey estimates there are 2,000 positions with “chief data officer” in the title. Gartner’s prediction: by 2021, the Office of the CDO will be seen as


a key business operational function comparable to IT, HR, and Finance. A number of firms have started “centers of analytics excellence,” and job postings for data scientists are now easy to find. I even read an article reporting that the number of MBA degrees is starting to stabilize or even decline, while specialized degrees in finance and even big data are becoming more popular in business schools. The job I always wanted is that of “data storyteller.” A data storyteller doesn’t tell stories about data, but instead tells insightful stories about key business issues using data. Data stories are more than visualizations. Many of those who talk about data stories seem to confuse the two. While graphics can help to tell the story, they are not the story. Granted, storytelling morphs to fit every new technology. The visualizations are better than ever; regardless, few are more than


charts on parade. Longtime storytellers look at them and wince, laugh, or just turn away, and so does most of the audience. I have come across several good articles from TDWI about storytelling. For this article, I have pulled together some highlights from three of them to describe the power of good storytelling. I would guess that most of you know a good data storyteller when you hear one. For those who don’t, read on.

A good data story is often a detective story. Some stories invite people to discover the patterns before they’re revealed. Something has happened here, but you don’t know what it is. You have a mass of data, much of it irrelevant. Your job is to find the best narrative that makes sense of it. A story with a human angle is often the best. A story’s trappings cast light on the data. Data is data, but which data we choose to ignore and which we value can be affected by stories we hear and choose to believe. Starting and ending points make all the difference, yet they are easily forgotten in other contexts.

Here are a few key elements of a good story:
• Authenticity and Meaning: Walter Fisher described his theory about authenticity in the 1980s. People use “narrative rationality” to judge stories based on two criteria. Narrative coherence is a measure of a story’s

logic and consistency and how well it makes sense. Narrative fidelity is a comparison of the story with our own beliefs and knowledge.
• A story has to be meaningful: Stories that grip us have high stakes. A story about annual sales makes a stronger story than one about quarterly sales, but neither can match one about the CEO keeping his job.
• Explanations versus Stories: Jock Mackinlay breaks the “data story” into two types: stories that “speak to the human condition” and stories that are mere explanations of data discovery. He wrote, “A story involving data is stronger than an explanation involving data” because of the story’s emotional connection.

Data by itself, no matter how pretty the chart or how much time you spent on the PowerPoint graphics, does little for most people. Contrast that to watching a good data analyst cycle through data, getting results and asking new questions with each pass and describing any thoughts along the way. That can actually be exciting. Think of the data as a rock that’s been carved by water. It is easy to see the rock, but the real story is the water. That is, there’s the data and there’s the story about how the data got here and how it was formed. Data sits; stories move.

Here is some advice from good data storytellers:

• Focus on the audience. Stop thinking about the data and think of what the audience wants to know: what it means, what’s new, or what’s different. Think of what story you would tell a member of the audience over coffee.
• Is anything significant in your data? Events become newsworthy with timeliness, proximity, novelty, or impact.
• Look for anomalies: “Paying attention to apparent anomalies is one of the reasons that we have survived as a species.” (Stephen Denning, The Leader’s Guide to Storytelling, Jossey-Bass, 2011)

Deciding what data to show, lose, or summarize has to be guided by audience and medium. What does the audience really want to know? What does it know already? What incomplete stories can you support or question? Some stories do end, so it is about time to end my story about data storytelling. See you next edition.

About the Author
Jim retired from Chevron in 2013 after almost 37 years with a major international oil and gas company. After retiring, Jim established Reflections Data Consulting LLC to continue his work in the area of data management, standards and analytics for the exploration and production industry.

REFERENCES

1) How to Find a Story in Data: Tips for data storytellers who struggle to find stories in data, TDWI, by Ted Cuzzillo, December 15, 2015
2) Data Storytellers: Making a Story Meaningful, TDWI, by Ted Cuzzillo, December 1, 2015
3) 6 Principles of Data Storytelling, TDWI, by Ted Cuzzillo, October 27, 2015
4) https://en.wikipedia.org/wiki/Chief_data_officer

Here is another great source for would-be storytellers:
1) Stories That Move Mountains: Storytelling and Visual Design for Persuasive Presentations, Martin Sykes, A. Nicklas and Mark D. West, Wiley, 2013

Horror Stories about Data Disasters can be exciting too. See Page 31.



Are trust issues affecting your relationship with your data?

Poor-quality data costs the average company $14.2 million annually.*

From business decisions based on bad data. From bad data replicated across multiple systems. From manual data compilation and massaging for reporting.

Bad data is bad business. EnerHub™ from Stonebridge Consulting is a cloud-based enterprise data management solution for oil & gas. EnerHub validates and syncs data across your organization, connects source systems, and delivers the information that matters to your people. EnerHub’s open architecture plugs into your existing systems and enables you to elevate data quality to a corporate competency.

Get your relationship with your data back on track. Contact us to schedule an EnerHub™ demo. Business advisory and technology solutions for next-gen oil and gas

www.sbti.com | info@sbti.com | 800.776.9755
* Gartner, The State of Data Quality: Current Practices and Evolving Trends, Dec. 2013


Feature

Hands On With The PPDM Association Board of Directors

By Peter MacDougall, IHS Markit

My first foray into data was the creation and verification of it. I “sat wells” in Western Canada as a wellsite geologist. That position is responsible for all the geological information that is created and collected for the well. Tasks included collecting hard information like cuttings samples and cores and then making informed interpretations based on my geological knowledge. A critical mission for a wellsite geologist is to ensure that the well information is correctly captured on the logs and reports (e.g. elevations, location, well name, etc.). The transition from geologist to data manager is quite simple. You are already managing information, whether you know it or not. From wellsite to the office, you need to make decisions to find targets or optimize existing production. How are these decisions made? You need to gather as much information as you can in order to make informed decisions. It is easier when you can get that information in a recognized and consistent standard and have the assurance that it has been vetted by recognized information experts. This is where PPDM comes into play. My first PPDM experience was working with well information housed

in a PPDM 2.x compliant model and loading the information from 9-track tapes. To ensure that I was using the right principles for the model, I signed up for a PPDM Model Overview course. The presentation was given by Trudy Curtis and Yogi Schulz. Trudy is the current CEO and Yogi was a long-time Board Member. Their enthusiasm for PPDM ensured that the course was a great introduction to the model, which was the main focus of PPDM at that time. Twenty-one years later I am still working with a PPDM data model, albeit a newer version! Beyond the technical content, the other advantage of going to a PPDM course is the people you meet. There are always industry-oriented and data-passionate people to network with to expand your circle and horizons. The Association has so much more to offer than just the data model. One of my favourites is the community that PPDM has created over the 25+ years that it has been in existence. There are so many intelligent people that work within our oil and gas data management industry and so much to learn from them. Through Data Management Luncheons and the Symposiums/Trade Shows, PPDM has established a great framework to learn and network. I still remember the

first Symposium that I attended, where Trudy stated, “You need to meet at least three new people over the next few days.” This statement has stuck with me, and PPDM events provide me with a great chance to reach out to new like-minded people. There is always someone new to chat with, and they will bring a new perspective that you can learn from. Another way to touch base with new individuals is to spend time in a Work Group or Committee. These can take up time but have been rewarding for me and make me more valuable to IHS Markit. I like the idea of working on industry issues with the best oil and gas data managers. There is always something to learn at these sessions, and you can expand your knowledge while creating a standard or adding to PPDM’s body of knowledge. Currently, I am on the Regulatory Data Standards Committee and we are looking at establishing some standards for the regulatory community. Again, the committee is populated with experienced industry people and there will be great opportunities for me to learn and network. Someone asked me once why I have stayed so long at IHS Markit – twenty-one years and counting! I told them it was because IHS Markit was always changing and there were constantly new challenges and new things to learn. The same guiding principle applies to the PPDM Association, and I think that is why I like being involved. No matter what you want to do, there are opportunities that will appeal to many different people, and there is always the chance to learn something new or meet a new person. Whether you want to volunteer for a Work Group or run for the Board of Directors, you can use the PPDM Association to expand your horizons and networks. Get involved and enjoy the chocolate!

About the Author
Peter MacDougall is a Professional Geologist who is passionate about oil and gas information and putting the customer first.



Achieving Success in Developing Regulatory Standards By Yogi Schulz, Corvelle Consulting

The Regulatory Data Standards Committee of the PPDM Association stands before a huge opportunity to enhance and harmonize the data standards for regulatory reporting to, and data management practices of, oil & gas industry regulatory agencies. The development of Directive PNG017: Measurement Requirements for Oil and Gas Operations provides an excellent case study of how representatives from regulators, operators, vendors and service suppliers can collaborate to enhance and harmonize all types of standards. The regulatory standards work was sponsored by the Ministry of the Economy (ECON) of the Government of Saskatchewan, a province in Canada, and was completed earlier in 2016. A branch of ECON is responsible for the regulation of the oil and gas industry in Saskatchewan. These new requirements have been introduced without noisy conflict and have been welcomed, or at least accepted, by all stakeholders.

The measurement requirements described in PNG017, a 400-page document, elaborate on the following topics:
1. What and how volumes must be measured or calculated.
2. What, where and how volumes may be estimated.
3. Accounting procedures related to calculated volumes.
4. What data must be kept for audit purposes.
5. What volumes must be reported.
6. Standards of accuracy for measured and estimated volumes.
7. Under what circumstances deviations from base requirements are allowed.

Here is a summary of the lessons learned from collaboratively developing PNG017 that will help the Regulatory Data Standards Committee achieve success in moving toward its ambitious regulatory data standards goals.

ADDRESSING MEASUREMENT SHORTCOMINGS

ECON wanted to introduce new measurement requirements to:
1. Reduce long-standing measurement noncompliance.
2. Improve accuracy and completeness of production volumes being reported.
3. Correct under-reporting of flare and vent volumes.
4. Raise the level of assurance over accuracy and completeness for crown royalty purposes.


5. Respond to environmental and safety concerns.

BUILDING ON EXISTING STANDARDS

The first decision of ECON was to build on an existing, well-established standard. For the PNG017 development that was the Alberta Energy Regulator (AER) Directive 017: Measurement Requirements for Oil and Gas Operations. AER generously contributed this document and actively participated in the consultation process. All the stakeholders easily agreed that harmonizing the Saskatchewan requirements with the Alberta requirements offered the following significant benefits:
1. Easiest, fastest and lowest cost standards development for ECON, the regulator.
2. Easiest and lowest cost implementation and adoption for operators.
3. Proven, established measurement requirements that:
   a. Minimize ambiguity.
   b. Simplify achieving reasonable compliance for operators.
   c. Reduce the cost of monitoring compliance for ECON, the regulator.
4. A long list of small improvements to the AER Directive 017 requirements that AER was pleased to adopt.

As a result, the PNG017 standards development process:
1. Leveraged widely-adopted and proven best practices for measurement.
2. Demonstrated the ability of typically cautious regulators to collaborate.
3. Avoided the introduction of yet another, slightly different standard.
4. Produced the first ever multijurisdictional, harmonized set of oil & gas regulatory requirements in Canada.

MAINTAINING MOMENTUM

Standards development initiatives are at risk of losing momentum and bogging down. ECON recognized that:
1. The longer the standards development process takes, the more it will cost and the more likely some participants will drop out due to fatigue or changing circumstances.
2. Operator and supplier participants are constrained in how much effort they can allocate to the consultation process.
3. They did not have the subject matter expertise to manage the diverse content of the discussion.
4. They did not have the capacity to facilitate the discussion and manage the development process.
As a result, ECON engaged a consultant to manage the standards development process to ensure project momentum, the planned schedule and deliverable quality were maintained.

CONSULTING WITH INTEGRITY

At the outset, the participating ECON staff announced that they were genuinely interested in comments and viewpoints of other stakeholders. Initially, that statement was received with polite skepticism by some stakeholders. However, the ECON staff adhered to that statement of principle for the duration of the standards development process. This ECON behavior quickly built trust among all the participants. The trust led to many candid exchanges about the implications, costs and benefits of many detailed technical standards proposals during the series of consultation meetings. These frequent candid exchanges by all the participants demonstrated that no one was holding back thinking that raising questions might turn into a disadvantage later. The combination of technical expertise and trust ensured that the new requirements met operator and regulator needs. As a result, the standards development process avoided conflict and produced a high-quality deliverable that is being widely adopted.

ENCOURAGING SUPPORTERS

To ensure reasonableness and clarity in the new requirements, representatives were requested from AER, Canadian Association of Petroleum Producers (CAPP), Explorers and Producers Association of Canada (EPAC), Industry Measurement Group (IMG) and Saskatchewan Headquartered Oil Producers (SHOP) to participate in the consultation process. These representatives became supporters for the new requirements in their own organizations and at industry events such as the Canadian School of Hydrocarbon Measurement (CsHm), IMG meetings, CAPP committee meetings and Petrinex committee meetings. Petrinex is the organization that operates the production reporting system for Alberta and Saskatchewan. As a result, the organizations that would be impacted by the new requirements became well aware of what was required of them and the timeline during which they needed to take action.

DRIVING TO A CONSENSUS

Every consultation encounters objections from specific individuals and stakeholders that can place the standards development project at risk and hinder driving to a consensus. During this project, the facilitators ensured that objections and different perspectives were heard thoroughly. Frequently, the requirements were tuned or clarified to address the concerns. In most cases the ECON staff made decisions to accept or not accept the consensus of the participating representatives in real time in the consultation meetings. In rare cases the ECON staff wanted to consult with other Ministry staff before accepting a consensus. On these occasions, the topic was always resolved during the next consultation meeting. This decisiveness kept the standards development project on schedule and made the government position unambiguous throughout the project.

OVERCOMING RESISTANCE TO CHANGE

Not surprisingly, many Saskatchewan operators, large and small, were happy with the previous situation of little or no measurement requirements with sporadic enforcement at best. In particular, most small, local operators were oblivious to:
1. The low compliance of their operation.
2. ECON’s unhappiness with their poor compliance.
3. The higher expectations of industry performance desired, or sometimes loudly demanded, by external stakeholders.
To these small operators, the new measurement requirements constituted a huge change to current processes and will require some capital investment to improve measurement. To build awareness of the new measurement requirements and compliance expectations, and to overcome resistance to change, ECON:
1. Held multiple communication events for all stakeholders and for their own staff.
2. Created new web pages for measurement on its website.
3. Provided regular updates at measurement community meetings and conferences.
4. Formally communicated the availability of the new requirements with their effective date to every operator in Saskatchewan.
As a result, awareness of the new measurement requirements became high, adoption is growing and publicly voiced opposition by Saskatchewan operators is non-existent.

ADDRESSING IMPLEMENTATION COSTS

Naturally, Saskatchewan operators were concerned about implementation costs for the new measurement requirements and about subsequent operating costs. The participants in the standards development process were highly aware of this concern and addressed it by:
1. Harmonizing the new requirements with existing AER requirements to leverage Alberta implementation expertise and experience among operator staff, vendors and consultants.
2. Offering the Compliance Reporting Tool for Directive PNG017 to help operators plan their implementation and report progress in a comprehensive way.
3. Providing a four-year implementation period to allow time for orderly change management and to spread costs.
As a result, implementation costs for the new standard have been contained while measurement accuracy and completeness are improving.

CONCLUSION

The experience gathered from the successful development and ongoing implementation of the Directive PNG017 standard can help the Regulatory Data Standards Committee address these same concerns that will undoubtedly arise during its standards development work. More specifically, the benefits include:
1. Harmonizing standards across jurisdictions produces lower, shared implementation and operating costs.
2. Comprehensive consultation produces a superior quality standard.
3. A higher quality standard leads to wider adoption and therefore more value.
Most regulators can experience these benefits by participating in the Regulatory Data Standards Committee.

About the Author
Mr. Schulz has over 30 years of Information Technology experience in various industries, including serving on the PPDM Board of Directors for 20 years.


For more about regulatory standards, visit the Committees menu at www.ppdm.org.

CPDA™ - An Industry Differentiator

Distinguish Yourself. Now is the time to earn your Certified Petroleum Data Analyst (CPDA™) credential:
• Signify your expertise
• Demonstrate comprehensive knowledge and skills
• Promote respect and recognition
• Give your contributions more weight
• Stand out from the crowd in job applications

More Info: www.ppdm.org/certification


Feature

Global Technology Trends and Their (Potential) Impact on the Oil Industry

By Guy Holmes, Katalyst Data Management

The technologies that are changing the IT landscape around us often take time to get traction in the oil sector. Sometimes it appears we, as an industry, are so heavily invested in yesterday that tomorrow is too much to think about. But sometimes revolution does not start within the disquiet; rather, it starts next door and spreads into the open arms of disquiet as though it had been waiting for it all the time. Some ground-breaking disruptive technologies that are appearing on the landscape within an array of other industries will at some point seek to pass on their benefits to the oil sector. How long it takes to adopt them is up to us. This article speaks at a high level to the impacts of a small but important group of these technologies, including Hadoop, Blockchain, the Internet of Things (IoT) and IBM’s Watson, and how these technologies will likely one day be an important part of our industry – open arms or not.

Several years ago, I attended a course put on by the Australian Institute of Company Directors (AICD). The course focuses on the areas that all company directors need to be aware of in order to be well-rounded, responsible and professional. It covers areas like financial literacy, legal responsibilities, strategy, etc. A few months ago, I attended

the course again as a refresher so that I could get myself up-to-date on the latest developments in company director issues and responsibilities. One of the major differences between the first course and the latest update was that it included a significant section on business risk in the area of “Disruptive Technology.” It is a review of the risks that companies face when it comes to emerging technologies that can make sweeping changes to an entire market with little to no notice. It reviews ways to identify these technologies, participate in these developments, and protect a company’s interests from these disruptions. Like the car for the horse, or the Uber for the taxi, disruption development has become an industry in its own right, with developers hunting for ways to make tools and services that will cut through an industry and grab market share in massive chunks. What follows is a review of new technologies that stand to improve (and potentially disrupt) current oil industry technologies and practices.

HADOOP

What is it? Hadoop is a programming framework that provides tools for the processing and storage of extremely large datasets in a distributed computing environment. The framework is open source and a product of The Apache Software Foundation (www.apache.org).

Why is it important? Hadoop has made significant impacts in cost savings and efficiencies where (in particular) large datasets are involved. The method used by Hadoop to store very large files is a distributed file system that can map data wherever it sits. Hadoop can also use the local resources where the data sits, so that the processing can be done in the same location. In addition, Hadoop can handle data that is unstructured, meaning that data does not necessarily need to be in rigid tables within a database. Hadoop effectively lets you see everything you have from one spot, process it from many, and removes the need to “overmanage” data through databases, tables, etc.

What are some example adaptations? Hadoop should not be looked at as a replacement for your current technology, but rather a toolset to augment and improve it. There are many uses for Hadoop in the oil sector. I am sure that some of these have already been put to good use by the more innovative companies in our industry. But for those that have not, it is worth giving this technology a good hard look.

An example: An explorer wants to look at some seismic data from the Gulf of Mexico to answer a question about a particular geology. The dataset required is 20 TB in size and is currently on tapes that need to be read to




disk to be processed. In a typical workflow, the 20 TB will be read and processed sequentially. In the end, the explorer creates about 200 MB of new data from this work and spends two weeks doing so. Hadoop, in the first instance, makes it practical to keep volumes of data (like this 20 TB) in a highly available state, and it reduces the traditional costs of storage because it removes a lot of the overheads of managing that storage. Hadoop then allows the dataset to be processed in situ. In many cases in the oil industry, data has to be copied to where the processing will take place. Hadoop, however, allows the processing to take place wherever the data is (i.e. instead of bringing the data to the processing, the processing can come to the data). With the right tools in place, the example above would be reduced from weeks (most of which are spent waiting for data to be made available and processed) to just several hours. Hadoop can work with structured data like massive databases or unstructured data like miscellaneous file types, including log files from web sites, spreadsheets, social media interactions, incoming sensor data, and streaming field and production data.
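As an illustration of the map-and-reduce style that Hadoop encourages, here is a minimal sketch written for Hadoop Streaming, which lets plain scripts act as the mapper and reducer; the trace-summary input format and field names below are invented for the example, not part of any real survey.

```python
#!/usr/bin/env python3
# Sketch of a Hadoop Streaming job (mapper and reducer in one file for brevity).
# Hypothetical input: one trace-summary record per line, formatted as
# "survey_id,trace_no,max_amplitude". The mapper emits key/value pairs; Hadoop
# sorts them by key and streams them to the reducer, which reports the peak
# amplitude seen in each survey.
import sys
from itertools import groupby

def mapper(lines):
    for line in lines:
        parts = line.strip().split(",")
        if len(parts) != 3:
            continue                                  # skip malformed records
        survey_id, _trace_no, max_amp = parts
        print(f"{survey_id}\t{max_amp}")

def reducer(lines):
    # Hadoop guarantees the reducer sees its input grouped and sorted by key.
    keyed = (line.rstrip("\n").split("\t", 1) for line in lines)
    for survey_id, group in groupby(keyed, key=lambda kv: kv[0]):
        peak = max(float(value) for _, value in group)
        print(f"{survey_id}\t{peak}")

if __name__ == "__main__":
    stage = sys.argv[1] if len(sys.argv) > 1 else "map"
    (mapper if stage == "map" else reducer)(sys.stdin)
```

Submitted through the hadoop-streaming jar, the same script is shipped to the nodes that already hold the data blocks, which is what lets the processing come to the data rather than the other way around.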

BLOCKCHAIN

What is it? A blockchain is essentially a distributed database that continuously maintains a list of ordered records called blocks. Each block contains a timestamp and a link to a previous block as well as other metadata about that block. A distributed database is a database which is not attached to a single common processor and, in fact, may be stored on multiple computers, located in dispersed locations across the globe, spanning an array of interconnected computers. Blockchains are inherently protected from modification after a block has been written: once recorded, the data in a block cannot be retroactively altered. Blockchain is the backbone of the management of the digital currency Bitcoin.

Why is it important? Aside from blockchain’s use in the expanding market of digital currency, it has numerous applications where the need to

track (without error or possible altering of information) is required. Just as blockchain works within the currency market, so can it be used to track physical items like art, music, a barrel of oil or a software license. Although PPDM does not use a blockchain model, PPDM should consider the implications of blockchain and how this could impact the next generation database model.

What is an example adaptation? In the seismic acquisition market, speculative acquisition companies are constantly acquiring new data and licensing this data to oil companies for use in their exploration programs. In many cases, the licensing of the data they acquire has contractual requirements that prevent the purchaser from transferring the data to another party, or if transfer is permitted, it usually has uplift or transfer fees attached. Currently, acquisition companies rely on the oil companies, press releases or market intelligence to notify them of data transfers. Vast sums of money go unrecovered by acquisition companies every year because they are not aware of data transfers that have occurred. A blockchain could be used to track every seismic trace in a survey acquired by an acquisition company, the licensing of each trace in that survey, and any transfer of that trace from one party to another. Embedding blockchain technology into a “smart contract” for the sale of the trace data, for example, would allow an oil company to transfer the trace data to a new oil company and the blockchain, in turn, would automatically track that change, automatically debit the bank account of the oil company and automatically credit that of the original licensor.
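The chaining idea itself is small enough to show in a few lines. The toy Python sketch below records hypothetical licence transfers for a single seismic trace; the survey and company names are invented, and a real system would be distributed across many nodes rather than held in a single in-memory list.

```python
# Toy, single-process illustration of hash chaining: each block records a
# (hypothetical) seismic-licence transfer plus a timestamp and the hash of the
# previous block, so any later edit to history breaks the chain.
import hashlib
import json
import time

def block_hash(block):
    # Hash the canonical JSON form of the block contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, transfer):
    prev = chain[-1]
    block = {
        "index": prev["index"] + 1,
        "timestamp": time.time(),
        "transfer": transfer,              # who licensed which trace to whom
        "prev_hash": block_hash(prev),
    }
    chain.append(block)
    return block

def verify(chain):
    # Valid only if every block still points at the true hash of its
    # predecessor, which is what makes retroactive edits detectable.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = [{"index": 0, "timestamp": time.time(), "transfer": None, "prev_hash": ""}]
add_block(chain, {"trace": "GoM-1234/0001", "from": "AcquisitionCo", "to": "OilCo A"})
add_block(chain, {"trace": "GoM-1234/0001", "from": "OilCo A", "to": "OilCo B"})
print(verify(chain))                        # True
chain[1]["transfer"]["to"] = "OilCo C"      # tamper with history...
print(verify(chain))                        # ...and verification now fails: False
```

In the licensing scenario above, the same chain would be replicated among the acquisition company, the licensees and perhaps the regulator, so no single party could quietly rewrite who holds which trace.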

INTERNET OF THINGS (IoT)

What is it? The Internet of Things is broadly considered the networking of physical devices in a way that enables these objects to collect and exchange data. To better imagine what the term might encompass, picture a series of temperature sensors located in the ocean around a gas production platform. Information is collected by these devices in real time and


is continuously sent back to the platform to monitor ocean temperature around the platform. Now connect that platform’s sensor array to all of the other platform arrays in the Gulf of Mexico, and then connect the Gulf of Mexico’s array to those deployed globally by a weather forecasting agency. Then connect all of these to the International Space Station, which is monitoring average ocean temperatures in the northern hemisphere in real time to analyse global warming trends. As the IoT field grows and the use of sensor technologies increases, sensors are becoming more cost-effective to purchase, and more companies are finding ways to embed sensors into their technology.

Why is it important? IoT is all around us now in our everyday lives. From its use in smart buildings to monitor air conditioning, to getting traffic alerts on a smartphone when driving, IoT is providing life-changing intelligence on scales large and small.

What is an example adaptation? IoT is already in use in the oil sector but has a lot more to offer as the technology (and our imagination) improves. Imagine an oil company that has both upstream and downstream investments like a producing oil field, a refining operation and 50 retail gas stations. As petrol is sold at the retail gas stations, the plant is notified in real time of consumption trends for unleaded, leaded, high-octane and other products by sensors on the petrol bowsers. The plant can increase or decrease production of the various products to meet the demands of its clients based on this input. In return, the plant can communicate with the production field on its requirements for raw material in an effort to keep production at an optimum level to service the whole supply chain. The efficient use of the IoT can eliminate the need for the expensive storage of refined



output until it is consumed, it can prevent lost revenue from gas stations selling out of a product, and can significantly improve the efficiency of an entire supply chain.
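A stripped-down sketch of that sensor-to-platform loop appears below. The sensor IDs, temperatures and alert threshold are invented, and the direct method call stands in for whatever transport (MQTT, OPC UA, a satellite link) a real deployment would use.

```python
# Minimal, self-contained sketch of the IoT flow described above: simulated
# ocean-temperature "devices" publish readings and a platform-side monitor
# aggregates them, flagging readings that drift from the recent rolling mean.
import random
import statistics
from collections import deque

class TemperatureSensor:
    def __init__(self, sensor_id, base_temp=18.0):
        self.sensor_id = sensor_id
        self.base_temp = base_temp

    def read(self):
        # Simulated measurement with a little noise.
        return {"sensor": self.sensor_id,
                "temp_c": round(self.base_temp + random.gauss(0, 0.3), 2)}

class PlatformMonitor:
    def __init__(self, window=50, threshold=1.5):
        self.readings = deque(maxlen=window)   # rolling window of recent temps
        self.threshold = threshold

    def ingest(self, reading):
        self.readings.append(reading["temp_c"])
        if len(self.readings) > 10:
            mean = statistics.mean(self.readings)
            if abs(reading["temp_c"] - mean) > self.threshold:
                print(f"ALERT {reading['sensor']}: {reading['temp_c']} C "
                      f"deviates from rolling mean {mean:.2f} C")

sensors = [TemperatureSensor(f"platform-7/sensor-{i}") for i in range(4)]
monitor = PlatformMonitor()
for _ in range(200):                           # one polling cycle per iteration
    for sensor in sensors:
        monitor.ingest(sensor.read())
```

The same pattern scales by pointing more sensors, or other platforms’ aggregates, at the monitor, which is the step that turns isolated gauges into the networked arrays described above.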

IBM WATSON

What is it? Watson is a question-answering computer system that was invented by IBM as part of its DeepQA project. The Watson system is capable of answering questions posed in natural language by accessing structured and unstructured content that has been provided to it in advance to build its knowledge base. IBM’s goal is to develop computers that interact in natural human terms across a range of applications such as the fields of medicine and law. To achieve this, IBM needed to program the computer system to understand the questions that humans ask, in the many ways that humans can ask them, and provide answers that humans can easily understand. Many of us probably think that this technology is already all around us in systems like Siri or voice-activated navigation systems. But the reality is that most of these systems only respond to a limited array of questions that have to be phrased in a certain way on a very specific subject matter. Watson, on the other hand, has been designed to understand complex questions and phraseology posed to it in a wide range of grammatical contexts. It does this by parsing questions into sentence fragments to identify phrases that are statistically related and then searching its knowledge base to locate possible answers to the questions. It then ranks all of the possible answers and uses algorithms to determine which one is the best possible fit as the answer to the question. It may not seem like a significant advancement, but in many ways it is ground-breaking.

As an example, there is a well-known joke by Groucho Marx that goes something like “One morning I shot an elephant in my pyjamas.” The sentence could easily be interpreted in several different ways. For example, who was wearing the pyjamas, the person or the elephant? When he shot the elephant, did he use a gun or a camera? Fortunately, Marx finishes the joke with “How he got into my pyjamas I’ll never know,” so we then understand the context. Watson has been programmed to break down these phrases and create context to allow it to interpret the sentence in a way a human might.

Why is it important? Aside from the fact that the world has been trying to replace humans with machines for a very long time, the impact of Watson can and will be profound in many other ways. Whether it is a doctor or nurse asking Watson about a cancer treatment plan, or an investor trying to understand what influences the price of oil, Watson is already a ground-breaker in the area of artificial intelligence.

What are some example adaptations? In most companies, the knowledge base for a specific area of the business often rests inside the head of the last person who worked on a project. Watson is changing that dynamic by ingesting any and all data that can be obtained from that specific area of the business, organising it, remembering past questions and the feedback you provide it, and not only presenting its answers in a meaningful way, but also refining the answers as new information becomes available. Watson can be used in exploration, for example, to monitor the progress of a drilling project and help prevent expensive downtime on the rig. By feeding Watson live inputs from the drilling progress like current lithology, current depth, current temperature, type of drill bit, and the rotational speed of the bit, Watson could compare all other wells drilled in that basin, or through that type of geology, with that type of bit and at that speed to determine if maximum drilling efficiency is being maintained. It could alternatively be used to determine if there is the danger of a looming mechanical failure by monitoring

trends and historical failures for tell-tale signs of impending breakages. Watson can do this by consistently reviewing the inputs it is provided, looking at historical drilling projects and reports from the past, analysing the drilling contractor’s experience and the drill rig’s specific toolsets, and providing meaningful intelligence about the project that a human simply would not be able to provide. Woodside Petroleum in Australia implemented Watson in 2015 to help on a range of fronts, including infrastructure construction, production monitoring, and predicting future trends from historical data. After a year of use, Woodside is increasing its use of the technology across a broad range of business units.
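IBM’s internals are not described here, so the sketch below only illustrates the general pattern the article describes, generating candidate answers, scoring them and ranking them, using simple bag-of-words overlap against an invented drilling knowledge base; a real system would rely on far richer language models and evidence scoring.

```python
# Toy question-answering loop: tokenize the question, score every candidate
# fact in a (fictional) drilling knowledge base by term overlap, rank, answer.
import re
from collections import Counter

KNOWLEDGE_BASE = [
    "PDC bits drilled fastest through the shale interval at 120 rpm",
    "Stick-slip vibration preceded the last two downhole motor failures",
    "Average rate of penetration in this basin is 25 metres per hour",
]

def tokens(text):
    return Counter(re.findall(r"[a-z]+", text.lower()))

def rank_answers(question, knowledge_base):
    q = tokens(question)
    # Score each candidate by shared terms (Counter & Counter keeps the minimum
    # count per term), then sort best-first.
    scored = [(sum((q & tokens(fact)).values()), fact) for fact in knowledge_base]
    return sorted(scored, key=lambda pair: pair[0], reverse=True)

best_score, best_fact = rank_answers(
    "What warning signs came before downhole motor failures?", KNOWLEDGE_BASE)[0]
print(best_score, best_fact)   # picks the stick-slip vibration fact
```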

COMBINING THESE TECHNOLOGIES

Aside from using each of these tools independently, it won’t be long before most or all are used in conjunction with one another. It is not hard to imagine an array of sensors in an IoT creating unique records that are tracked in a blockchain, processed by an implementation of Hadoop and then streamed into a data repository that forms the knowledge base for Watson to ingest. With a combination like this, you could get real-time, highly qualified and useful data, processed at high speed, emanating from a remote device that has full provenance and ownership tracked. It will take considerable vision to come up with solutions that combine these tools in an end-to-end system. But whoever does will be onto something quite special.

About the Author
Guy has enjoyed a successful career in the international oil and gas industry for 17 years, having held professional roles overseeing logistics, data management, information technology, data preservation and recovery.

Foundations | Vol 4, Issue 1 | 15


Things to Know About Digital Data Storage

By Monica Mills

What is data storage? The term can refer to anything with information recorded on it (e.g., reference books, online data-sharing websites, data servers).

TYPES OF DIGITAL DATA STORAGE

There are as many data storage types as there are ice cream flavors. Some of the most common are ON-SITE, REMOVABLE, and OFF-SITE. Depending on your company’s retention requirements, you are likely using several of these.

On-site data storage – Any storage device that is made to remain with the computer, such as a hard disk. This is a low-cost storage solution: a one-time investment that only requires extra space as needed, and it does not require internet access. The disadvantages of this storage type are numerous – risk of theft, vulnerability to catastrophic disasters, susceptibility to human error, and limited storage size.

Removable data storage – A common everyday type of data storage that includes such things as CD-ROMs, flash drives, and external hard drives. This data is portable, useful for storing information you don’t want anyone else to access, and for creating back-up copies of information. The removable storage idea dates back to

punch cards. The punch card manufactured by IBM is a card measuring just 7 ¾ inches by 3 ¼ inches. This card can be used to contain digital data represented by the presence or lack of holes in predefined positions. In the 1890s, voting machines used punch cards to store voting results. Removable devices fall into three categories:
• Magnetic Storage Devices are coated with iron oxide. The data is created using magnetized dots. The dots are created, read and erased using magnetic fields created by electromagnets. Floppy disk drives and zip drives are examples of magnetic storage devices; both are obsolete, rapidly replaced by CD-ROMs, DVDs and USB memory sticks.
• Optical Storage Devices store huge amounts of data. The most common optical device most of us are familiar with is the compact disc (CD). The data is arranged in a dot pattern; these dots are read using a beam of light. The light beams which reflect off the dots are interpreted as bits of data. The data is burned onto a disc using a laser beam to mark the surface of the CD-R (creating a series of dots). CDs are being replaced by Blu-ray, HD-DVD, and DVD-RAM.
• Solid State Storage Devices include digital cameras and memory sticks. Solid state storage is defined by many as “no moving


parts” (no spinning or laser beams). In many enterprises, solid state technology is used for primary storage, replacing the traditional hard drive, compact disc and disc RAM methods. Since there are no moving parts, a solid state storage device performs faster, longer, and more reliably.

Off-site data storage – One of the most popular types of data storage these days is also known as “cloud storage.” Information is stored away from your computer and maintained by a third party. The data can be accessed remotely from virtually anywhere at any time. Depending on the amount of data you are storing and the security protection you are looking for, the price and value vary by provider. The common uses for the cloud are accessibility, reliability, and data backups. Providers offer clients security capabilities, data encryption and authentication for their services. Here are just four types of cloud storage – personal, private, public, and hybrid:
• Personal Cloud Storage – Individual data storage (e.g., Apple’s iCloud).
• Private Cloud Storage – Enterprise internal data storage. A computing infrastructure privately held by a corporation that has capabilities similar to a cloud but more secure.
• Public Cloud Storage – Data is accessible over the internet; the user pays



for the storage capacity being used (e.g., Google App Engine).
• Hybrid Cloud Storage – Stored data is a combination of critical data kept in a private cloud and other data accessible through public cloud storage. Hadoop uses distributed storage of very large data sets on computer clusters.

Global demand for data storage is rapidly growing. Data storage technologies will continue to evolve to accommodate these demands at global, national and corporate levels down to individual, personal levels.

About the Author
Monica Mills is a recent CPDA graduate. She has over twenty years of experience in the Oil and Gas industry. She is no stranger to data, having worked for nine years as a Data Coordinator for the Information Management Group and seven as a Petrophysical Technician, which taught her the importance of data management.

Write the Articles, Be the Authors
> Letters to the Editor > Guest editorials > Technical papers and much more!
Interested in being published in Foundations? Visit ppdm.org/publications to find out how!



Electronic Data Interchange: The Importance of Standards

By Joseph M. Suarez

Shortly after the end of the Second World War, the devastated country of Germany was divided into administrative zones by the allied powers. The United States, Great Britain and France occupied the west, while the Soviet Union controlled the east. About 100 miles within the eastern part of Germany, the city of Berlin was similarly divided up into sectors amongst the allies. The spirit of mutual cooperation did not last long. On June 24, 1948, due to philosophical and economic differences, the Soviet Union blockaded road, rail and barge access to the American, British and French sectors of West Berlin. The leader of the Soviet Union, Joseph Stalin, hoped that by depriving West Berlin of basic necessities, the citizens would rebel and the entire city of Berlin would be under his control. The western allies quickly decided not to withdraw from Berlin in order to keep Communism in check. Since there were open air corridors over East Germany, cargo planes would be used to fly in supplies to West Berlin. The United States military dubbed the initiative “Operation Vittles,” but it is known in history as the Berlin Airlift. With the population of West Berlin at approximately two million people in 1948, the Berlin Airlift was an immense logistical undertaking. Impressive amounts of cargo were hauled in the beginning of the project (approximately 5,000 tons per day), but nowhere near the amount necessary for the approaching winter. The task of eliminating bottlenecks and maximizing the speed with which cargo was delivered

and distributed was delegated to U.S. Army Master Sergeant Edward A. Guilbert. One major source of delays Guilbert and his logistics team identified was that shipping manifests were in an inconsistent format and often in different languages. In order to solve the problem, a standard manifest was developed that could be transmitted via telex, teletype or telephone ahead of the cargo, enabling the crews to operate at optimal speed. Throughput improved to approximately 8,000 tons of cargo delivered per day, a 60% increase. The most productive 24-hour period was from April 15-16, 1949, during which 1,398 flights (almost one per minute) delivered 12,940 tons of cargo. The Berlin Airlift had proven a success by the time it ended on September 30, 1949. In total, 278,228 flights delivered 2,326,406 tons of freight into the western sectors of Berlin. The use of standard manifests during the Berlin Airlift is considered the first application of Electronic Data Interchange (EDI). The traditional working definition of EDI is the computer-to-computer exchange of business documents in a standard format amongst business partners, and in many quarters Edward Guilbert is considered to be the “Father of EDI.” As an employee of DuPont in the early 1960s, Guilbert leveraged the lessons learned during the Berlin Airlift by creating electronic messages to send cargo information between DuPont and Chemical Leaman Tank Lines. By 1968, many airlines, railroads and freight companies were using electronic manifests, resulting in the formation of the Transportation Data Coordinating Committee (TDCC).


In what may be an interesting side note for people in the oil and gas industry, Donald Johnston of the Phillips Petroleum Company is credited with introducing the term “electronic data interchange” during a speech he gave at a TDCC conference sometime in the middle of 1975. The term became official when the TDCC published the first American public EDI standards in July of 1975. After TDCC published the first EDI specifications, the food and grocery industry launched its own initiative to explore the benefits of EDI. In an effort to coordinate these nascent standards, the American National Standards Institute (ANSI) created the Accredited Standards Committee X12 (ANSI ASC X12) in 1978. In 1985, the United Nations sponsored the EDI for Administration, Commerce and Transport (EDIFACT) organization for the purpose of developing international EDI standards. Today, there are two mature sets of EDI standards: ANSI ASC X12 predominates in North America and EDIFACT in the rest of the world. There is an old saying in the EDI milieu that standards are like toothbrushes: everyone wants one, but nobody wants to use someone else’s. Concurrent with



the rise of the internet and related technologies, particularly of eXtensible Markup Language (XML), several vertical (industry specific) standards have been spawned in order to fill perceived needs. Some popular examples are ACORD (insurance), RosettaNet (electronics), CIDX (chemicals), and PIDX (petroleum). That brings us to the present day, where there are many standards with more to come. It is important to know the history of EDI in order to understand the mindset behind the vast benefits it brings to a business. Data managers probably will never have to supply a city of two million people with food, fuel and goods for 15 months, but the lessons learned from almost 70 years of traditional EDI technologies are valuable for creating new technologies and tools that will similarly withstand the test of time and, perhaps equally important, secure management endorsement.

BENEFITS OF EDI EDI has been the linchpin of many businesses simply because it is a mature technology that works well. In spite of the constant refrain of the denizens of the internet, “Is EDI dead?”, many large companies have a significant investment in their EDI processes and they are not about to change unless there are extremely compelling reasons to do so. The financial benefits of EDI are well-known and demonstrable. According to a whitepaper published by GXS, a leader in the integration services industry, EDI:

• Lowers transaction costs by at least 35%. • Improves data quality, delivering at least 30% to 40% reduction in transactions with errors. • Speeds up business cycles by 61% by eliminating high error rate processes such as rekeying and time-consuming corrective actions. In the same whitepaper, GXS customers reported the following benefits: • An electronics manufacturer reported that the cost of processing an order manually was calculated at $38.00 USD compared to $1.35 USD to process via EDI. • An automobile manufacturer reduced a key cycle time from 30 days to 24 hours – a 97% reduction. • A major retailer reduced order cycle time from 24 days to six days – a 75% reduction. Data professionals should be looking for similar quantifiable gains when looking for management buy-in of PPDM tools. The tremendous benefits realized by full EDI implementation include not only significant financial savings, but also substantially shorter business (and therefore payment) cycles. One of the first tasks of an EDI analyst when approaching a new project is to perform a gap analysis. A gap analysis compares the actual performance of a system to the desired performance. The bulk of the analysis focuses on mapping from the EDI standards to existing business systems. Some of the greatest gains from EDI are derived from analyzing and optimizing existing business processes, automating repetitive manual tasks and increasing volume by maximizing the number of participants (called “trading partners” in EDI parlance). One of the stated objectives of the PPDM Model Design is to focus on business-driven requirements, not IT requirements. Another area in which there is common ground between EDI and the PPDM Data Model is that both improve data quality. The savings from addressing this one problem alone can be significant.
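To make those figures concrete, here is a minimal back-of-the-envelope sketch in Python using the GXS numbers quoted above. The 50,000-orders-per-year volume is a made-up example, not a benchmark from the whitepaper.

```python
# Illustrative calculation using the GXS figures cited above.
# The per-order costs and cycle times are the whitepaper's examples,
# not measurements from any specific implementation.

MANUAL_COST_PER_ORDER = 38.00   # USD, manual processing (GXS example)
EDI_COST_PER_ORDER = 1.35       # USD, EDI processing (GXS example)

def annual_savings(orders_per_year: int) -> float:
    """Estimated annual savings from moving order processing to EDI."""
    return orders_per_year * (MANUAL_COST_PER_ORDER - EDI_COST_PER_ORDER)

def cycle_time_reduction(before_days: float, after_days: float) -> float:
    """Percentage reduction in a business cycle time."""
    return (before_days - after_days) / before_days * 100

if __name__ == "__main__":
    print(f"Savings on 50,000 orders/year: ${annual_savings(50_000):,.2f}")
    print(f"30 days -> 24 hours: {cycle_time_reduction(30, 1):.0f}% reduction")
    print(f"24 days -> 6 days: {cycle_time_reduction(24, 6):.0f}% reduction")
```

Quantifying the savings this plainly is exactly the kind of evidence that secures management buy-in.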

According to a 2002 report from The Data Warehousing Institute (TDWI), data quality issues cost U.S. businesses more than 600 billion dollars per year. Consistent with its heritage of automating manual processes and removing opportunities for human misinterpretations, EDI improves data quality by eliminating manual input errors and omissions. Aside from the problems caused by bad data, such as overpayments, chargebacks or late payments, further delays are caused by the corrective actions themselves. Good data models and rules take data quality improvement one step further in that appropriate constraints can be enforced on the data and database tools can be leveraged to enhance data quality.

CONCLUSION Many lessons can be learned from the rich history of electronic data interchange regarding the creation of business processes that stand the test of time. As demonstrated by the Berlin Airlift and the day-to-day operations of numerous successful companies, business efficiencies can be enhanced when everyone speaks a common language in the form of standards. The PPDM Rules and Data Model also provide a common language and framework which enhance the management of data and add value to a business. Data managers should always seek to quantify that value. REFERENCES

http://www.history.com/topics/cold-war/berlin-airlift
http://www.apnewsarchive.com/1998/Statistics-About-Berlin-Airlift/idf21c2385ba4e11eb92aedfd8598818de
Notto, Ralph W. Challenge and Consequence: Forcing Change to eCommerce. Tucson, Ariz.: Fenestra Books, 2005. Print.
GXS, “The Benefits of EDI,” 2011.
Eckerson, Wayne W. Data Quality and the Bottom Line: Achieving Business Success through a Commitment to High Quality Data. The Data Warehousing Institute, 2002.

About the Author Joseph M. Suarez is a Senior Independent Electronic Data Interchange Consultant and holds a Master of Science Degree in Instructional Technology.



Photo contest

Foundations photo contest

“PERSEID METEOR SHOWER ATOP MOUNTAIN LOOKOUT” BY CHRISTY TURNER 2nd Place in the Volume 4, Issue 1 Foundations Photo Contest “Laying on top of the lookout at Moraine Lake watching the Perseid Meteor shower was a pretty amazing experience with visiting friends from Iceland. There is something quite magical about the night sky and it’s why I am obsessed with photographing it.” – August 16, 2016 Born and raised in Calgary, Christy Turner is a world traveller and night sky photographer. Her travels have taken her to more than 70 countries and she enjoys doing photographic expeditions where she receives training and instruction while vacationing in highly scenic places. She lives in Calgary with her boyfriend and they eagerly anticipate the return of jobs within Calgary’s respected oil industry.




On the cover:

“ENCOUNTERED MANY DEFEATS BUT STILL NOT DEFEATED.” BY VIKRANT LAKHANPAL 1st Place in the Volume 4, Issue 1 Foundations Photo Contest

“You may encounter many defeats, but you must not be defeated. In fact, it may be necessary to encounter the defeats, so you can know who you are, what you can rise from, how you can still come out of it. This rock at Lake Powell is symbolic of standing still through different seasons and coming out more beautiful and strong. Our industry is a similar warrior. It has gone through various downturns but each time comes out stronger in an innovative way.” – August 10, 2015. Vikrant Lakhanpal graduated from the University of Houston in May with a Master’s in Petroleum Engineering.

Enter your favourite photos online at photocontest.ppdm.org for a chance to be featured on the cover of our next issue of Foundations!




Industry News

Calgary Symposium Celebrates Anniversary

Twenty-five years might seem like a long time to some people. To those who have been part of the exciting work of the Professional Petroleum Data Management (PPDM) Association, the years have flown by in a whirlwind of activity. In 2016, one hundred people attended the Calgary Data Management Symposium to enjoy a varied and interesting agenda, a vendor tradeshow, and two networking socials; thanks to our sponsors and exhibitors for helping us enjoy all these activities! In 2016, we recognized a few of the many people who have been involved with PPDM since its inception, and continue to be engaged in our activities. The Pioneer Award for Dedication was presented to Wes Baird, Art Boykiw, Dave Fisher, Mel Huszti, Pat Rhynes, Yogi Schulz, and Chuck Smith. The unique contributions made by each of them helped build the comprehensive program that is available to our members today. Visit our website to learn more about these outstanding people. Regulatory data strategies were discussed from many perspectives in a panel discussion; inputs from Australia, the US and Canada were offered. Continuing to drive out standards and best practices will add value to regulatory processes, and along the way, significant advantages to

operators, service companies and other stakeholders will also be realized. Jim Crompton talked about how standards are making headway into our industry; his frank disclosure of both opportunities and challenges helped attendees understand why standards are difficult, even as they add value. Jim also challenged us with a viewpoint on data governance strategies. Other talks focused on practical implementations of data management strategies, such as the well header master implementation at Devon Energy, and a proposal for file naming standards that may offer improvements in managing unstructured data. Justin Glatz offered some ideas on understanding how business value is realized. Yogi Schulz, Roger Milley and Paula Jennings provided interesting and varied perspectives about how good data can be leveraged in analytics tools in order to add tangible business value. Harry Schultz discussed the importance of comprehensive reference data, particularly for data that is used by many. Tom Greaves both entertained and informed the audience as he revealed some challenges related to construction around existing utilities. Wes Baird discussed the emergence of self-serve data, and how it fits into the data management role. Gabriela Rico talked about the importance of how data is collected, consolidated, classified and

analyzed in support of business functions. Technical details on accessing PPDM using in-memory analytics were offered by Gautam Pamula. Some practical opportunities for managing data quality in a data lake environment were offered by Ketan Puri and Rama Manne. A fascinating review about how industry GIS experts visualized the devastating fires in Ft McMurray was presented by Jon Connick and Brad Connatty. Non-technical presentations focused on personal and professional development, as Adam Czarnecki discussed quirky human behavior and what gets people interested in your messages and goals. Michelle MacPherson talked about building your professional brand, and offered specific tips that many in the audience implemented immediately! Chelsea Briggs outlined ideas for bridging the knowledge gap between generations as the great crew change comes. Overall it was a well-attended event with an interesting agenda. Presentations are available at www.ppdm.org for members.
About the Author
Trudy Curtis is the CEO of the PPDM Association.

Pioneer Awards: Trudy Curtis, Wes Baird, Art Boykiw, Pat Rhynes, Yogi Schulz, Mel Huszti, Chuck Wilson, Dave Fisher, Trevor Hicks



Cover Feature

From National Data Repository to Digital Energy Cloud

By Jamie Cruise, TARGET Oilfield Services

INTRODUCTION

Ever since the 1990s, national oil and gas regulators have been building National Data Repositories (NDRs). Often described as digital ‘databanks,’ the main objectives of a typical NDR are to preserve the data acquired by oil and gas companies during their work in country, and to promote investments by using the data to reduce exploration, production and transportation risks. From our work over the last 10 years with regulators, we know that early stage NDRs are typically passive, with a focus on capturing data and building up the repository as a national asset. However, as the NDR matures, there is a natural drive for the regulator to move to a more proactive operating model, where the systems and content provided by the repository are used directly to assure sustainability, safety, growth and good governance in the industry. This shift is often associated with moving to a different funding model for the NDR – away from purely government-funded, towards a collaborative funding approach with participation from the operating companies who consequently expect to get value by using advanced NDR services. In this article, we describe how TARGET worked with the regulator in a major producing country to support their innovative vision to transform an existing passive National Data Repository into a proactive ‘Digital Energy Cloud’ for their country. The starting point for the project was to construct a new standards-based master data repository to replace the previous databank implementation. The new repository service forms the hub at the centre of a growing collection of business services in the regional Digital Energy Cloud, which are designed to reduce the cost of, and raise the overall levels of, compliance with monitoring and reporting requirements set by the regulator.

THE ROLE OF STANDARDS

The regulator specified that their new service would be based on open standards, to ensure that the national data assets would be easily and cost effectively accessible for the many decades of the NDR’s expected operating lifetime. The use of open standards allows the repository content to be accessed, managed and analysed using tools from any suitably skilled supplier. This has the dual advantage of both increasing the data owner’s choice of suppliers, as well as reducing supplier costs to develop new tools. It also drastically reduces the cost of changing supplier when the contract is renewed, driving the vendor community to compete on business value, and preventing the customer from becoming ‘locked-in’ to a proprietary application.

BUILDING A PPDM-BASED MASTER DATA REPOSITORY

The new repository stores a wide variety of data types. This includes well-related data: headers, deviation surveys, formation markers, field and processed logs, well documents etc. It also includes seismic-related data: field and processed 2D/3D, gravity and magnetic surveys, VSP, etc. The service also holds information about operators, entitlements, data transactions, and linkages to geospatial data. The volume of wells and seismic data directly managed by the previous service was on the scale of hundreds of terabytes, with the new service expected to scale to around 10 petabytes (largely due to the increasing volume of modern seismic surveys). The previous data bank solution was based on a proprietary application that archived the data submitted by the operating companies using a combination of industry standard formats (e.g. SEG-Y,




SEG-D, DLIS, LAS, PDF, etc.) and a proprietary data storage engine for well log data. The previous master database schema was based on an older data model standard that is not being actively developed or widely used in other products. The support for industry standards allowed TARGET to export the data and metadata in non-proprietary format, and cost-effectively migrate the previous data bank contents to the PPDM data model. PPDM is a good choice for long term sustainability as it is widely used in the industry and actively maintained. In the new service, we use PPDM 3.9 as a common data model for storing or indexing all data. It provides a broad data model that fully covers the scope of data types needed by the NDR. TARGET’s technology includes a dynamic data access engine that maps the subset of PPDM needed for the data bank to a high-level business object layer. As the service expands to cover new data types, we add new metadata to the system to incorporate the new data structures and relationships. The user interface and data management workflows are also driven by this same metadata, creating a living system that can be adapted and extended onsite without requiring new software to support each change. Keeping the deployed solution flexible is essential to keeping the total cost of ownership of the system low, whilst ensuring that it remains aligned with the ever-changing business environment. Geospatial data access is another area where the new solution takes a non-proprietary approach. Map services are connected using OGC spatial data access standards – WFS (for vector data) and WMS (for map image tiles) – rather than being dependent on a specific spatial infrastructure product. The use of open standards in the new service drives down the cost of implementing the storage, database and data access architectural layers

of the master data repository. This ‘commoditised’ vendor-neutral approach delivers a common data access infrastructure that is driven by the needs of the data owner and controlled by the data owner, rather than the application vendor.
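As a rough illustration of what that vendor neutrality looks like in practice, the sketch below requests well-location features from a WFS endpoint using only standard OGC query parameters and a generic HTTP client. The endpoint URL, layer name and attribute names are hypothetical placeholders, not details of the actual service.

```python
# Minimal sketch: fetch vector features from an OGC WFS endpoint.
# The endpoint URL and layer name are hypothetical placeholders; the query
# parameters themselves are standard WFS 2.0.0 key-value pairs.
import requests

WFS_ENDPOINT = "https://example-ndr.example.com/geoserver/wfs"  # hypothetical

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "ndr:well_locations",   # hypothetical layer name
    "outputFormat": "application/json",  # request GeoJSON where the server supports it
    "count": 100,
}

response = requests.get(WFS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()

for feature in response.json().get("features", []):
    props = feature["properties"]
    print(props.get("well_name"), feature["geometry"]["coordinates"])
```

Because the request is plain OGC, the same client works unchanged against any compliant map server, which is the point of the vendor-neutral approach.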

TRANSFORMING THE OPERATING MODEL FOR THE NDR The previous NDR service was funded, staffed and operated by the regulator. Data loading and unloading/delivery operations were performed at a dedicated bank operations centre, which also hosted all the IT infrastructure. Incoming data was sent to the bank operations centre by the customer. Once received, the data package was checked, fixed and loaded by dedicated Ministry operations staff. Conversely, retrieving data from the bank meant making a request to the operations centre, where bank staff operating the desktop software running the service would retrieve the data. The requested data would then be unloaded, formatted and sent to the original requestor. In an era of low oil prices/revenues and big advances in IT, this kind of traditional, high-cost operating model is a prime candidate for transformation. The regulator’s vision for the new service directly addresses the issues of streamlining bank operations and reducing IT costs by specifying that the new service is outsourced, self-service and cloud-based.

OUTSOURCED, SELF-SERVICE DATA MANAGEMENT OPERATIONS As the outsourced service provider, we at TARGET use our MEERA DX™ Platform software and its Master Data Repository (MDR) module to provide a web-based interface to the new repository. This interface is directly accessible from the operating company’s location and provides a secure portal for company users to interactively search and browse their entitled data. The interface is highly


modular and includes online tools for visualising geospatial, wells, seismic data and information in reports and documents. Wherever possible, contents of the repository can be previewed online without downloading to reduce the traffic to the bank and enable interactive access to bulk data (such as seismic datasets) without needing to transfer and load into a desktop application. The same modular web interface includes self-service data submission workflows that allow companies to comply with their data reporting obligations by submitting data to the service interactively online. Dynamic business rules are applied at the point of data submission to validate the incoming data packages against national standards, ensuring that only compliant data is loaded to the repository, reducing the costs of fixing bad data further down the line. Large datasets that cannot be transferred online are shipped on physical media to the customer’s incoming data area, where they can pick up the data submission workflow from their company location. Using a centrally-hosted web interface further reduces the cost of data management operations, as no software needs to be installed on the customer company desktop. The application itself is a pure HTML web application (using Google’s open Material Design framework), with interactivity provided using JavaScript rather than proprietary plugins. Avoiding plugins means that the service can be operated from any device in any location rather than just company PCs. This universally accessible interface



promotes greater usage of the system and supports mobile, remote and home working, which is becoming increasingly important as companies move to flexible resourcing to optimise the way their teams are organised and distributed geographically.
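The sketch below illustrates the general idea of a submission-time business rule as described above; the mandatory fields, ranges and record values are invented for the example and are not the national standards applied by the service.

```python
# Illustrative sketch of validating an incoming well-header record at the
# point of submission. The fields and ranges are invented for the example;
# a real service would load its rules from the applicable national standards.
from typing import Dict, List

RULES = {
    "uwi": lambda v: bool(v),                          # identifier must be present
    "spud_date": lambda v: bool(v),                    # date must be present
    "surface_latitude": lambda v: -90 <= float(v) <= 90,
    "surface_longitude": lambda v: -180 <= float(v) <= 180,
}

def validate_submission(record: Dict[str, str]) -> List[str]:
    """Return a list of rule violations; an empty list means the record loads."""
    errors = []
    for field, check in RULES.items():
        value = record.get(field)
        try:
            if value is None or not check(value):
                errors.append(f"{field}: missing or out of range ({value!r})")
        except (TypeError, ValueError):
            errors.append(f"{field}: could not be parsed ({value!r})")
    return errors

incoming = {"uwi": "100/01-02-003-04W5/0", "surface_latitude": "51.05",
            "surface_longitude": "-114.07"}   # spud_date missing on purpose
print(validate_submission(incoming))
```

Rejecting a non-compliant package at this point is what keeps the cost of fixing bad data from being pushed further down the line.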

CLOUD HOSTING UPSTREAM DATA SERVICES

The physical IT infrastructure for the existing service is a collection of high-end servers, storage area network, and tape robotics (along with the ancillary data centre equipment, e.g. uninterruptible power supplies, networking and security appliances). To eliminate the high capex costs of owning and maintaining this hardware, the IT is outsourced to TARGET by the regulator as part of a bundled monthly service cost. In this project, the private cloud infrastructure for the new service is hosted at an in-country commercial IT service provider who provides operational support of all equipment. Across two physical data centres, the new service uses mirrored high density hyperconverged servers that provide the virtualization needed for our cloud-based application. It also implements a tape tier for bulk storage using high capacity LTO7 media (6TB per tape) in the datacentre. Use of an in-country service provider is necessary for this project, as there is a legal requirement for the data not to leave the country. However, the new application architecture is fully optimized for public cloud service providers as well (or for that matter on-premises application hosting for corporations). It abstracts away the differences between the various virtualisation providers (e.g. VMWare, AWS, Azure) by using application containerization (e.g. Docker), and is decoupled from the tape storage layer using the S3 cloud standard object storage API.

We expect to see more use of public cloud service providers in the long run as data owners balance their fears about losing control and sovereignty of their data against the benefits of truly operationalised low-cost ‘elastic’ IT infrastructure as a service.
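As a rough sketch of what decoupling from the storage layer via the S3 object storage API can look like, the example below archives and retrieves a bulk file through an S3-compatible endpoint. The endpoint, credentials, bucket and file names are hypothetical, and boto3 is assumed as the client library rather than being part of the platform described here.

```python
# Minimal sketch: writing and retrieving bulk data through an S3-compatible
# object storage API, decoupled from the physical disk/tape tier behind it.
# Endpoint, credentials, bucket and file names are hypothetical placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example-ndr.example.com",  # in-country provider
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

BUCKET = "ndr-bulk-data"  # hypothetical bucket

# Archive a processed seismic volume (path is illustrative only).
s3.upload_file("survey_2017_3d.segy", BUCKET, "seismic/survey_2017_3d.segy")

# Later: stream it back without caring whether it lives on disk or tape.
s3.download_file(BUCKET, "seismic/survey_2017_3d.segy", "restored.segy")
```

Because the application only speaks the S3 API, the same code runs against an in-country object store today and a public cloud provider tomorrow.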

BEYOND DATA BANKING

The regulator in this project is using the new cloud-hosted services as a launching point for other services related to the industry that go beyond data banking and provide a common digitalised environment for a variety of stakeholders across the business lifecycle. The first additional service launched was the ‘Living Data Room’ (LDR). This is an online service used to support data promotion as part of a licensing round. Using the same user interface and IT infrastructure as the national data repository, master data from the repository is supplemented by added-value data, interpretation and reports, which are compiled into commercial data packages. These are published online to help investors assess new opportunities in the country. The LDR service is available globally, and provides full online automated workflows for registering new subscribers, checking and approving their credentials, previewing data packages, purchasing and invoicing, secure digital data distribution, and analytics for the regulator to monitor the success of the bid round. The latest suite of extended services being deployed on the platform for the regulator includes a variety of workflow packages that automate interactions between the regulator and the operating company community. The business goal of these online workflows is to provide the regulator with all the information they need, in a directly actionable format, to enable them to monitor and govern the national hydrocarbon asset base. This package moves beyond technical data to include work program and

budgeting, online permitting, regulatory reporting and production allocation. It also includes a module for publishing new regulations and gathering feedback from the operator community.

PROMOTING INNOVATION AND GENERATING NEW INSIGHT The new platform includes an application programming interface (API) that allows third parties to write independent software modules that analyse the data in the repository. Having a simple API rather than a complex SDK means that a broad community of users – including service companies, operators, professional researchers and students – can collaborate on the data very easily using their preferred analytical tools (e.g. R, python, perl). In the long run, it is expected that opening the service to third parties for analytics may generate new insight into the country’s hydrocarbon assets.
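For illustration only, the sketch below shows how a third party might pull data from such an API into a standard analytical tool; the URL, parameters and field names are assumptions rather than the platform’s documented interface.

```python
# Illustrative only: pulling well production records from a (hypothetical)
# repository REST API and summarising them with pandas. The URL, parameters
# and field names are assumptions, not the platform's documented API.
import pandas as pd
import requests

API_URL = "https://api.example-ndr.example.com/v1/wells/production"  # hypothetical

resp = requests.get(API_URL, params={"basin": "EXAMPLE", "year": 2016},
                    headers={"Authorization": "Bearer <token>"}, timeout=30)
resp.raise_for_status()

df = pd.DataFrame(resp.json()["records"])
# Simple analysis: total and average monthly oil volume per operator.
summary = df.groupby("operator")["oil_volume"].agg(["sum", "mean"])
print(summary.sort_values("sum", ascending=False).head(10))
```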

CONCLUSIONS Regulators are starting to construct online digital services that transform their traditional business model, primarily to reduce the cost of hydrocarbon asset promotion whilst at the same time improving the level of compliance. Innovative IT and open standards-based data access are being combined to establish sustainable, integrated working environments that improve operational efficiency, stimulate collaboration between stakeholders, support good governance and safe compliant working, as well as opening the door for new entrants to identify new opportunities that generate additional in-country investment.
About the Author
Jamie Cruise is the President of Digital Transformation at TARGET Oilfield Services, a company which helps organisations around the world to innovate to improve their business.



Community Involvement

Thanks to our Volunteers

NOVEMBER 2016 Mike Skeffington The November Volunteer of the Month was Mike Skeffington. Mike is the Chair of the PPDM Denver Leadership Team, leading this group through not only their first Denver Data Management Workshop together, but also the creation and formalization of new processes and policies for PPDM events. Vice President of Business Development at EnergyIQ for nearly five years, Mike has also been a strong supporter of PPDM and our continued efforts to build the Data Management Community. Prior to joining EnergyIQ, Mike held a variety of Business Development, Marketing, and Managerial roles with Accenture, StrataWin, Qlip Media, Eplance, ClickCommerce, and others. Mike holds a Bachelor of Science in Computer Science from Colorado State University. “Mike is an amazing individual to work with. Wherever he goes, Mike brings his intelligence, experience, sense of humour, and friendly manner, making him easy to engage with and a pleasure to see. PPDM is incredibly lucky to have Mike as a supporter and Chair of our Denver Leadership Team,” says Pam Koscinski, Community Relations Coordinator with the PPDM Association.

DECEMBER 2016 Paula Jennings PPDM’s December Volunteer of the Month was Paula Jennings. Partner and co-founder at Fuzeium Innovations Inc., Paula and her team

co-design, build, and deliver interactive dashboards. Paula is a professional geophysicist with a geoscience and IT background and has worked in the Calgary oil and gas industry since 1985. Formally trained as an engineer, Paula has a variety of experience, spending time as a technical data management analyst at Shell, President of TriloByte Exploration Consulting, and in varying roles at other companies, including Suncor, Burlington Resources/ConocoPhillips, Landmark Graphics, and more. “Paula joined our Rules Committee earlier this year and has since brought her enthusiasm and excitement to this group as it continues to build this important foundation for the profession,” said Ingrid Kristel, Project Manager with the PPDM Association.

exploration, operations, corporate, and financial systems, to director roles for Petro-Canada’s East Coast, U.S., and North American oil and gas and oil sands business units. An industry leader, Art is also actively involved in the PPDM Association including as a PPDM work group participant in the development of the PPDM Data Model. For more than 13 years, Art served on the PPDM Board of Directors in various capacities, including Chairman for 10 years and treasurer. Currently, Art is actively engaged in promoting the importance of the regulators’ role in establishing and supporting standards, including data values, to promote industry adoption. The global appetite for shared information requires standards, and Art is pursuing this as co-chair of the Regulatory Data Standards Committee.

JANUARY 2017 Art Boykiw The January Volunteer of the Month is Art Boykiw, Vice President - Information Services at Alberta Energy Regulator (AER). At AER, he is responsible for establishing the IS strategy and organizational capability to meet AER’s integrated Information and systems vision, enabling AER’s regulatory mandate. Art is a key driver behind AER’s transition to a risk-based regulatory environment; as this project continues, it will result in faster and more effective regulatory management in Alberta. Art’s career in the petroleum industry began over 30 years ago with Dome Petroleum and Petro-Canada, progressing from programming regulatory production and royalty reporting systems in the 80s, through project management in


Want to see your name here? Volunteer now with the Professional Petroleum Data Management (PPDM) Association volunteer@ppdm.org


Guest Editorial

Q&A with Joseph Seila By The Foundations Editorial Team

The Petroleum Data Management industry is continually growing and changing as time and technology progresses. Through these challenging times especially, projects are often started and abandoned, processes are created and ignored, and people are left confused and frustrated. The Foundations Editorial Team would like to take some time to celebrate some of the achievements that we experience in our industry. In December 2016, the Foundations Editorial Team sat down with Joseph Seila who recently stepped down from the Professional Petroleum Data Management (PPDM) Association Board of Directors after four years of service. Joseph, who is currently an E&P Enterprise Data Manager in Midland, Texas, discussed some of the achievements he has had over the last few years. Joseph has encountered both success and failure through his current and past roles, and was generous enough to take the time to share his experiences with us, for the benefit of our readers:

monitors and the workflow was to open up the geoscience interpretation application on one monitor, bring up the data vendor’s website on the other and then type in what was in the data vendor’s website into the geoscience interpretation application. At the time there was a tremendous amount of new wells being permitted which could range anywhere from 25 to 90 new permits every week. This was basically what one person did every Tuesday. After a while, I noticed the data vendor’s website had an export option, and the geoscience interpretation application had an import option. I asked the other geotechs if it would be okay to try to automate this and export new wells out of data vendor’s website and then import them into the geoscience interpretation application. I took what used to be a half-day to all-day task down to five minutes. This was the first time I realized that there was a lot of room to improve data management tasks and workflows within our department. After that, a lot of the data management tasks were given to me, since I liked them. The second or third week at Devon I had a request to modify the bottom hole locations for our project by one of the geophysical techs. Without asking or thinking about the consequences I did as asked and messed up all the bottom holes for the entire project. In an effort to fix this, I accidentally deleted all the locations

out of the geoscience interpretation application project. Being so new I thought, “Well this was a good job while it lasted, wonder what I’m going to do after this.” Thankfully, I made a backup of all the geoscience interpretation application locations and was able to import them back in and fix it. That was when I realized how important good Data Management is and how easy it can be to mess up. After a couple months my boss, Herb Magley, asked me to find a better way to manage our data outside of geoscience applications and projects. I wrote a couple proposals and our VP bought into it. Long story short, I got some funding and the OKC Data Management Program was started. This was a multi-year program implemented to address three key areas of Data Management: Well Master Data Management, Data Governance, and Data Quality and Integrations. Like a lot of Data Managers, I didn’t set out to be one. I just had some great opportunities and a knack for it. Mike Rowe, host of the Discovery Channel show Dirty Jobs, likes to say, “Find something you’re good at and then develop a passion for it.” Even though I never set out to become a Data Manager, I enjoy the work and have developed a passion for it. Q: What was the most interesting project that you’ve worked on? The OKC Data Management program was the most interesting program I have ever worked on. It had several interesting projects and since it was the first major program I was ever involved in, it really taught me a lot. As I mentioned before, this threeyear program addressed three key areas of Data Management: Well Master Data Management, Data Governance, and Data Quality and Integrations. I’ve given a few talks about this at prior PPDM Data Management Symposia. (The PowerPoints may still be on PPDM’s site, for those interested). For each area our primary objectives were in essence: • Well Master Data Management



o Implement a PPDM based Enterprise Well Master solution that would manage well identity across all Enterprise applications • Data Governance o Stand up Data Governance Councils to provide guidance on data management activities and implement Enterprise data standards • Data Quality and Integrations o Develop data rules and start measuring data quality against objective standards o Integrated several geoscience applications and key systems with the Well Master Database Q: This sounds like a huge program. Was it a success in your mind? Yes, the overall program was a success. It lasted longer than we thought it would and not every individual project was a success, but overall the entire program changed the way we managed data across the company. It was the first program to successfully implement a Well Master at Devon and the success of the program generated other programs across the company. Ultimately Devon created a Data and Analytics Division within the company, there are a lot of people other than me who should get credit for that, however I think it’s fair to say that the OKC Data Management program was the tip of the spear. Q: What would you say you learned the most from doing both of those implementations? Data Management is more about people and process than technology. Any technology can be implemented but if you don’t have the right processes in place and the employees aren’t bought into the new workflows then the technology will fail. Q: When you say people, who do you mean? Do you mean Executives? You definitely do need Executive buy in, but that’s usually not that hard to get. The hard thing to get is the people on the ground floor to buy in - the asset teams and business users, the people you really want to see benefit. They need to

see the big picture and make sure they are okay to go through a bit of pain to get to a better place long term. It’s really that component – getting people to start understanding the need to steward the data. It may add a little bit of work on your end, but it saves a lot of work downstream. It’s about people and processes. The integrations won’t work right because people won’t put the data in the right place - how do you compensate for that and improve the processes? If you need the data to flow one way but the people are not willing to do it, then you need to get creative. Most of your root causes for data issues are process issues or people. Q: What was the most challenging project that you’ve been involved in? I think it was trying to get our well master management system to do a near real time integration with our main geoscience interpretation application for well data. This was where, when you updated the master, the data would just funnel down into the geoscience projects. That was the most challenging because we were trying to make a technical solution where, frankly, the technology in the geoscience interpretation project just couldn’t do it – it wasn’t built for that. Even though we thought we had created components to make it work, in the end we were trying to make the application do something it wasn’t designed for. Ultimately that failed; it didn’t work at all. Q: So if it wasn’t a success, did you give up on it or do something different? We eventually gave up on it and changed our direction. We decided to pursue a flat file ETL option instead, which the geoscience application could handle. It didn’t get us the near real time integration we wanted but it was a step in the right direction. Q: Is there any advice you’d like to pass along to someone undertaking a similar challenge? For anyone that wants to build up their Data Management systems and capabilities, realize that it is a journey the entire company is going to have to


go through. It is going to take a long time – longer than you want – and everyone in the company has to grow with the new data management capabilities and tasks; how you manage that will determine whether you are successful or not. Stay the course: a lot of Well Master and other data management initiatives are set up for failure by unrealistic delivery dates, inadequate funding, or a lack of the proper resources and expertise. Leave room for failure and realize that not everything will be successful the first time you try it. As Thomas Edison said, “I have not failed, I just found 10,000 ways that won’t work.” Don’t rush the integrations, especially writing data back to the business applications. If your data isn’t good enough you will lose the business quickly by writing bad data back to their applications. I like to use a process I call Read, Clean, Write. Read the data into the Master, Clean it up there, and Write back only after you can prove it is better than or equivalent to the other systems. Data Governance should be first; it is the most important piece. 95% of all Master Data Management projects fail because the data governance is not in place when it needs to be. How does the company make hard decisions about data standards when no one can agree? How do you enforce data quality rules? What if you need a team to handle data a certain way so everyone downstream can utilize it, but that team won’t manage it the way you want them to? All of these issues and more will come up during your projects, and if you don’t have the right people in place to help with the hard decisions and change management then your projects are likely to fail. Pick the technology after you know what the solution should look like. I’ve seen way too many IT-initiated projects where a technology gets picked and only after trying to implement it do people start asking what we are trying to do with it.
The Foundations Editorial Team would like to thank Joseph for sharing his experiences with our readers.
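As an editorial illustration of the “Read, Clean, Write” sequence Joseph describes, the sketch below shows the three steps as separate functions with a quality gate before anything is written back; the source systems, cleaning rules and threshold are invented for the example.

```python
# Sketch of the "Read, Clean, Write" sequence described above. The sources,
# cleaning rules and quality gate are invented for illustration; the point is
# that nothing flows back until the master measurably meets or beats the source.
from typing import Dict, List

def read_from_sources(sources: List[str]) -> List[Dict]:
    """READ: pull well headers from each business application into the master staging area."""
    records = []
    for source in sources:
        records.extend(fetch_well_headers(source))  # placeholder extract step
    return records

def clean(records: List[Dict]) -> List[Dict]:
    """CLEAN: apply standards inside the master before anything flows back out."""
    cleaned = []
    for rec in records:
        rec["well_name"] = rec.get("well_name", "").strip().upper()
        if rec.get("uwi"):                      # drop records with no identifier
            cleaned.append(rec)
    return cleaned

def write_back(records: List[Dict], quality_score: float, threshold: float = 0.95):
    """WRITE: only push to business applications once quality is proven."""
    if quality_score < threshold:
        raise RuntimeError("Master data not yet better than the source systems; keep cleaning.")
    for rec in records:
        push_to_applications(rec)               # placeholder load step

def fetch_well_headers(source: str) -> List[Dict]:   # stub for the example
    return [{"uwi": "100/01-02-003-04W5/0", "well_name": " example well "}]

def push_to_applications(rec: Dict):                  # stub for the example
    print("writing", rec["uwi"])
```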


Poorly managed data costs the industry dearly. Band-aid solutions are not an option a business or the environment can tolerate. We know you will not either.

Certis values the importance of well-designed and permanent data solutions. We understand the complete life cycle of Oil & Gas assets. Because our engineers and geoscientists contribute to every project, we know what matters and what works. Effective data management is our strength and passion. Excellence and sustainability are our standards. Call us for a presentation, case studies review or project references.

Address: 1301 Fannin St. S #2440, Houston, Texas 77002—USA. Email: info@certisinc.com



One Person Can Do So Much

Together we can do so much more Now It’s YOUR Turn The PPDM Association needs volunteers like YOU to continue to identify professional development opportunities, grow certification programs, advance data management standards, and more. We have immediate opportunities for volunteers with a variety of skill sets and experience in both our Professional Development Committee and Petroleum Data Management Certification Committees.

Apply Now Contact volunteer@ppdm.org to find out how you can make an impact.


Upcoming Events


LUNCHEONS

APRIL 25, 2017 DALLAS/FORT WORTH Q2 DATA MANAGEMENT LUNCHEON
Frisco, TX, USA

APRIL 26, 2017 CALGARY SPRING DATA MANAGEMENT LUNCHEON
Calgary, AB, Canada

MAY 9, 2017 DENVER Q2 DATA MANAGEMENT LUNCHEON
Denver, CO, USA

MAY 12, 2017 PERTH DATA MANAGEMENT LUNCHEON
Perth, WA, Australia

MAY 23, 2017 HOUSTON Q2 DATA MANAGEMENT LUNCHEON
Houston, TX, USA

JUNE 1, 2017 BRISBANE Q2 DATA MANAGEMENT LUNCHEON
Brisbane, QLD, Australia

WORKSHOPS & SYMPOSIA

FEBRUARY 28 – MARCH 1, 2017 HOUSTON DATA MANAGEMENT SYMPOSIUM & TRADESHOW
Houston, TX, USA

MAY 11, 2017 OKLAHOMA CITY DATA MANAGEMENT WORKSHOP
Oklahoma City, OK, USA

AUGUST 10, 2017 PERTH DATA MANAGEMENT WORKSHOP
Perth, WA, Australia

SEPTEMBER 2017 DENVER DATA MANAGEMENT WORKSHOP
Denver, CO, USA

OCTOBER 23-25, 2017 CALGARY DATA MANAGEMENT SYMPOSIUM, TRADESHOW & AGM
Calgary, AB, Canada

CERTIFICATION - CERTIFIED PETROLEUM DATA ANALYST
NEW DATES ANNOUNCED

MAY 10, 2017 CPDA EXAM
(Application Deadline March 29, 2017)

NOVEMBER 8, 2017 CPDA EXAM
(Application Deadline September 27, 2017)

ONLINE & PRIVATE TRAINING OPPORTUNITIES Online training courses are available year round and are ideal for individuals looking to learn at their own pace. For an in-class experience, private training is now booking for 2017. Public training classes are available on demand.

All dates subject to change.

VISIT PPDM.ORG FOR MORE INFORMATION Find us on Facebook Follow @PPDMAssociation on Twitter Join our PPDM Group on LinkedIn


Do you have a Data Management Horror Story to share? Names/Companies will be removed. Submit your story to foundations@ppdm.org to see it in our next edition.


Without knowledge action is useless and knowledge without action is futile. – Abu Bakr

Power your upstream decision-making with customer-driven data, integrated software and services from geoLOGIC. At geoLOGIC, we help turn raw data into actionable knowledge. That’s a powerful tool to leverage all your decision making, whether it’s at head office or out in the field. From comprehensive oil and gas data to mapping and analysis, we’ve got you covered. Get all the knowledge you need, all in one place with geoLOGIC. For more on our full suite of decision support tools, visit geoLOGIC.com

geoSCOUT | gDC
Upstream knowledge solutions

