Foundations Journal of the Professional Petroleum Data Management Association
Volume 2 | Issue 1 | Winter 2015
More than managing: Five things every oil and gas company should know about geospatial data
PLUS: Photo contest. This month's winners and how to enter
Foundations: The Journal of the Professional Petroleum Data Management Association is published four times per year by JuneWarren-Nickle's Energy Group.

TABLE OF CONTENTS | Volume 2 | Issue 1 | Winter 2015

COVER FEATURE
GEOSPATIAL DATA MANAGEMENT
10 | More than managing: Five things every oil and gas company should know about geospatial data. By Tarun Chandrasekhar

GUEST EDITORIAL
4 | In the eye of the beholder: Data quality can mean different things to different people. By Jim Crompton

FEATURES
13 | The missing ingredient: When discussing standards, we sometimes forget the most important part: people. By Jim Crompton, data management and analytics consultant
15 | All together now: Transforming oil and gas operations and maintenance with collaboration and innovative IT and industry standards. By Cliff Pedersen and Alan Johnston

TECHNICAL ARTICLE
18 | Value added: Data rules are the key to ensuring information brings value to a company. By David Fisher and members of the business rules work group

DEPARTMENTS
6 | Standards and technology: Industry news and updates
20 | Photo contest: This month's winners and how to enter
22 | Upcoming events: Find us at events and conferences around the world in 2015

PPDM ASSOCIATION
CEO: Trudy Curtis
Senior Operations Coordinator: Amanda Phillips
Events Coordinator: Bryan Francisco

BOARD OF DIRECTORS
Chair: Trevor Hicks
Vice-Chair: Robert Best
Secretary: Janet Hicks
Treasurer: Peter MacDougall
Directors: Mohamed Akoum, Trudy Curtis, Rusty Foreman, Paul Haines, David Hood, Allan Huber, Adam Hutchinson, Joseph Seila, Paloma Urbano

Head Office: Suite 860, 736 8th Ave SW, Calgary, AB T2P 1H4
Email: info@ppdm.org | Phone: 403-660-7817

JUNEWARREN-NICKLE'S ENERGY GROUP
President & CEO: Bill Whitelaw
Editor, Special Projects: Rianne Stewart
Contributors: Tarun Chandrasekhar, Gordon Cope, Jim Crompton, Alan Johnston, Cliff Pedersen
Editorial Assistance: Sarah Maludzinski, Sarah Miller, Sarah Munn
Creative Lead: Cath Ozubko
Graphic Designer: Linnea Lapp
Ad Traffic Coordinator: Lorraine Ostapovich
Calgary: 2nd Flr-816 55 Ave NE, Calgary, AB T2E 6Y4, Tel: 403-209-3500
Edmonton: 220-9303 34 Ave NW, Edmonton, AB T6E 5W8, Tel: 780-944-9333

Join the discussion on LinkedIn

ABOUT PPDM
The Professional Petroleum Data Management (PPDM) Association is a global, not-for-profit society within the petroleum industry that provides leadership for the professionalization of petroleum data management through the development and dissemination of best practices and standards, education programs, certification programs and professional development opportunities. For 25 years, the PPDM Association has represented and supported the needs of operating companies, regulators, software vendors, data vendors, consulting companies and management professionals around the globe.
Guest editorial
In the eye of the beholder
Data quality can mean different things to different people
By Jim Crompton, data management and analytics consultant
How do you get one version of the truth? Without one operational data source that every person within a company agrees is reliable, employees spend a lot of time looking for the right answer, debating over different versions, arguing about whether or not data quality is a problem, and if it is, how big of a problem and whose responsibility it is to fix it. Having a known gold standard for trusted data is becoming more important than ever. But do we even have a common understanding of what data quality means?

One method for finding a single truth is to simply declare one source to be trustworthy. If the appropriate governance picks a particular source to use for performance reviews, critical investment decisions and business planning, the attention of the community will begin focusing on making sure the most accurate data is available from this source. To avoid selecting a problematic data source, find an area where you know the limits of the data and make clear to everyone what those limitations are. With this starting point, the journey to good data quality can begin. The alternative is for competing teams to spend a lot of time collecting, cleaning and perfecting what they think is the right data, all while trying to convince others to embrace their version of the truth.

The topic of data quality isn't exactly the same as “one version of the truth,” but there are many similarities. Some
have a specific definition for data quality, built on dimensions such as completeness, validity, consistency, timeliness and accuracy. From a data-centric perspective, data quality should be addressed at the source and throughout the data's life cycle.

Another method is to let technology do the job. Uncertainty is everywhere: in measured data, in processing and in modelling parameters. Instead of using person-hours to qualify data, we can use a computer to find all of the different versions of the truth and determine which data most affects the outcomes. This enables the interpreter to focus on the most relevant information. Probabilities need to be analyzed during the review process, and the resulting probabilistic maps, tops and log computations need review as well. Technology does have a role to play in understanding uncertainty.

Not all data has to be perfect to be useful in making good decisions. Knowing about the quality of the data is sometimes all you need, and there are good tools to help automate the process of moving data from point to point and to develop scorecards specifically about data quality.

Sometimes the problem with data quality is one of poor communication, often between management and the technical community. Management generally wins these disagreements, even when it means low-quality data is used to make key decisions. A little more business knowledge from technical experts and a little more technological savvy by
management can go a long way toward bridging this gap. With all the attention on information quality and systems of record these days, our goal should not be the creation of a Fort Knox, where we spend considerable effort making sure no one can get in. Data is meant to be used, and it can be traded for other things that add value to a business, like good decisions.
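The scorecard idea is easy to make concrete. Below is a minimal sketch in Python of two of the quality dimensions named above, completeness and timeliness; the field names, the sample record and the 30-day threshold are illustrative assumptions, not part of any PPDM specification.

# Minimal data-quality scorecard sketch; field names are hypothetical.
from datetime import datetime, timezone

def completeness(record, required):
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for f in required if record.get(f) not in (None, ""))
    return filled / len(required)

def timeliness(record, max_age_days=30):
    """1.0 if the record was refreshed recently, 0.0 otherwise."""
    updated = record.get("last_updated")
    if updated is None:
        return 0.0
    age = datetime.now(timezone.utc) - updated
    return 1.0 if age.days <= max_age_days else 0.0

well = {
    "uwi": "100/04-21-046-12W5/00",  # hypothetical well identifier
    "surface_lat": 52.95,
    "surface_lon": -115.75,
    "last_updated": datetime(2015, 1, 5, tzinfo=timezone.utc),
}

scorecard = {
    "completeness": completeness(well, ["uwi", "surface_lat", "surface_lon", "kb_elev"]),
    "timeliness": timeliness(well),
}
print(scorecard)  # kb_elev is missing, so completeness reports 0.75

A scorecard like this fixes nothing by itself, but it tells users how far they can trust a source, which, as argued above, is sometimes all you need.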
2015 HOUSTON DATA MANAGEMENT SYMPOSIUM & TRADESHOW | MARCH 9–11
Want to take part in the action? Here’s how:
Learn and Enjoy: Register to Attend
Get Recognized: Become a Sponsor
Spread your Message: Exhibit at our Tradeshow
Visit ppdm.org to register now!
To complete your experience, also register for the Houston Data Management Workshop and Field Trip on March 9. For more information contact us at: events@ppdm.org
Standards & technology
NEWS

PIDX 2014 fall conference takes place in Houston

The Petroleum Industry Data Exchange (PIDX) U.S. Fall Conference was held in Houston on Sept. 11, 2014, and centred on the theme “How standards accelerate the realization of value in oil and gas.” Hosted by Baker Hughes at its Houston Technology Center, the sold-out gathering attracted a host of speakers from oil and gas operators, oil service companies and solutions providers.

Archana Deskus, chief information officer at Baker Hughes, kicked off the conference by challenging the oil and gas industry to take the journey to standardization. Having spent a good part of her career in the aerospace and finance industries, Deskus has witnessed the path to standardization and pointed out that costs, regulations and competition are the primary driving forces behind it. For the oil and gas industry, operational efficiencies, the digital oilfield, and safety and environment are the key value propositions for developing and adopting standards. Regulatory compliance,
competitive pressures and innovation drive the realization of this value. When individuals in the oil and gas industry object to standardization due to the uncertainty of exploration and production projects, Deskus is quick to point out that such projects are no more uncertain than space flight.

Chris Bien of Noble Energy shared an account of making the transformation from manual paper-based processes to electronic invoicing. Paper invoices pose the challenges of limited visibility into spend across the organization, difficulty in paying suppliers on time and a high volume of inquiries from suppliers. Noble's phased approach to automation focused first on enabling suppliers with electronic invoicing, second on ensuring validation and compliance with negotiated contracts, and finally on expanding the solution footprint across the globe. In the course of this program, Noble successfully migrated 96 per cent of
its domestic invoices from paper to electronic. Consequently, the approval cycle dropped from 27 days to two days, which resulted in the ability to negotiate additional discounts with suppliers. Later phases of this program enabled Noble to create price books for high-end suppliers to eliminate pricing errors and monitor compliance for over- and under-charging for goods and services procured.

David S. Schorlemer, chief financial officer of Stallion Oilfield Services, co-presented a paper titled “Transformative technologies for improving cash flow and operations in the oil patch” with Amalto Technologies and Salesforce. Schorlemer pointed out that, with accelerating oilfield activity in the U.S. onshore market, the current process of “controlled chaos” is unsustainable. Service companies are constantly pressuring operator resources to sign and approve field tickets. The inability to get a relief worker to approve services ordered during a prior rotation is one example that leaves documents in limbo, causing delays in invoicing. Such processes result in inaccuracies and audit exceptions during end-of-month accounting. Schorlemer argues that, nowadays, operators seek to improve efficiencies and, thus, lower cost, while supply chain professionals require higher standards for safety and asset management. For operators, accelerating supplier invoicing helps enable quicker joint-interest billing and cost recovery from working interest owners. After upgrading its order-to-cash systems, Stallion turned to PIDX
members Amalto and Salesforce for their PIDX-enabled solution. Stallion now uses the Amalto-Salesforce solution for B2B eBilling and price book management with its customers, and the company is now evaluating electronic field ticket approval solutions.

PIDX holds two conferences (spring and fall) in Houston and Europe. ConocoPhillips will host the U.S. spring conference at the Westin Memorial Hotel in Houston on April 9, 2015. For a full copy of the conference proceedings, visit pidx.org.
PPDM welcomes the new board of directors at the 2014 Calgary Data Management Symposium and Tradeshow in Kananaskis

The Professional Petroleum Data Management (PPDM) Association is proud to announce that Mohamad Akoum of the Abu Dhabi Company for Onshore Oil Operations (ADCO), David Hood of geoLOGIC systems, Adam Hutchinson of Stonebridge Consulting, Peter MacDougall of IHS, Joseph Seila of Concho Resources and Paloma Urbano of ConocoPhillips have been elected to the PPDM Association's board of directors. The election took place at the annual general meeting (AGM) held during the 2014 Calgary Data Management Symposium and Tradeshow in Kananaskis, Alta. The newly elected members join the existing directors: Trevor Hicks of Noah Consulting, Robert Best of PetroWEB, Janet Hicks of Landmark Software & Services, Paul Haines of Noah Consulting, Rusty Foreman of BP and Allan Huber of Shell. Trudy Curtis, chief executive officer of the PPDM Association, is delighted at the board's additions.

“Each of the incoming board members brings a valuable perspective to the
PPDM Association’s community. Adam is a champion of best practices in petroleum data management, while Paloma brings a wide range of experience in strategy, having worked for both operators and service companies. Mohamad, in his role at ADCO in Abu Dhabi, will enable the PPDM Association’s community to widen its base in the Middle East. We are so pleased to welcome new people on board
and look forward to working with them in the future.” The 2014 Calgary Data Management Symposium and Tradeshow and the AGM were held at the Delta Lodge at Kananaskis Oct. 21–23, 2014. Over 140 attendees met to participate in informative presentations and panel discussions, visit exhibitors at the tradeshow and enjoy networking opportunities.
Recap of NDR 2014 By Nicholette Ross, marketing and communications manager, Energistics
Oil regulators from around the world are challenged with managing growing volumes of data generated by the industry. Establishing national data repositories (NDRs) has typically been the solution, and NDRs were the topic of the 12th annual NDR work group meeting (NDR 2014), held in Baku, Azerbaijan, in early October. The event was hosted by the State Oil Company of Azerbaijan (SOCAR) and was held in view of the world's first commercially drilled oil well. Established more than 20 years ago, NDR work group meetings have been held in nine countries around the world so far, including the U.K., Norway, Canada, the U.S., Colombia, South Africa, India, Brazil and Malaysia.

At this year's meeting, regulators from 22 countries, plus delegates from major oil industry service companies and software suppliers around the world, met to share best practices and updates on progress made on collaborative projects since the last meeting. Most importantly, attendees discussed how to improve the quality of data delivered to regulators, how to improve the profile of NDRs and how to enhance the value of the data. Best practices discussed included increasing efficiency, leveraging new technology developments and improving industry compliance while promoting economic development and protecting the environment.

New to NDR 2014 were reports from each country, which provided a quick update on each region with a focus on a specific topic, such as a successful project or a problem that needs resolving. For countries new to the conference, the reports focused on information about their region's number of wells, use of seismic, status of NDRs, problems and issues.

Several breakout sessions took place, including a discussion on production data that led to an active work group getting very close to refining a regulatory standard for the exchange of production data. This work group is hoping to identify other topics that will emerge as future areas of collaboration.

The executive committee has recommended that future conferences take place as part of an enhanced Energistics regulatory special interest group, which will formalize the running of the conferences and increase involvement.

The chair of the 2014 meeting was Tirza van Daalen from TNO – Netherlands. Van Daalen said post-event that she felt the meeting exceeded everyone's expectations and saw a real desire for regulators to cooperate in various data standards areas. Progress in that area is expected before the next North American meeting in 2016. Dr. Lee Allison, director of the Arizona Geological Survey, was appointed as the new chair for the next 18 months. Allison said his first goal is to expand on interactions and collaborations among the data repositories between meetings. Secondly, he would like to get involvement from more U.S. state and federal oil and gas regulatory agencies in advance of the spring 2016 North American conference. Information, including location, for the 2016 event will be announced in early 2015.

For more information on the NDR work group and NDR 2014, please visit energistics.org/regulatory/national-data-repository-ndr-work-group.
BECOME A PPDM MEMBER
One membership fee; over $100 million worth of knowledge. Now that's ROI.
Members of the Professional Petroleum Data Management Association get immediate access to a wealth of tools and knowledge in the field of oil and gas data management standards, best practices, and education, including:

Standards
• PPDM 3.9: the leading open standards data model in the industry
• What is a Well?
• Well Status and Classification
• Business rules: best practices for creating and managing master data stores containing the most trusted data for an organization

Knowledge sharing
• Training and certification
• Events: workshops, conferences, user group meetings
• Forums, wiki pages, blogs
• Quarterly subscription to Foundations
Membership in the PPDM Association is one of the best investments you can make. For more information on membership packages, visit us online at ppdm.org.
Suite 860, 736 8th Ave SW, Calgary, AB T2P 1H4, Canada | Email: info@ppdm.org | Phone: 403-660-7817
More than managing
Five things every oil and gas company should know about geospatial data
By Tarun Chandrasekhar
As a petroleum geographic information systems (GIS) professional, I am often asked about the intricacies of geospatial data management, as though it is some mysterious, exotic realm. Truthfully, while the tools that GIS professionals use are different than those used by managers of other types of data, the principles are quite similar. In fact, geospatial data is an important part of every aspect of an oil company's activities, and although its successful management requires a broad base of knowledge and experience, its value to the bottom line is significant. It should, therefore, be considered a critical part of the company's data management strategy. Here are five questions about geospatial data every oil company should know the answer to.
1. WHAT IS GEOSPATIAL DATA?
Generally speaking, geospatial data is information about things that are in or on Earth, referenced by geographic (or spatial) coordinates, whether above ground, at surface and sea level, or subsurface. Geospatial data represents the specific locations of wells, seismic lines, leases, or infrastructure and assets, such as pipelines and storage tanks. Geospatial data also includes contextual background layers representing cultural, topographic, bathymetric, basin,
field, prospect, play and lead outlines, economic, demographic, geologic and interpretive output. It can be in many formats, including points, lines and polygons (vector data) or lidar, grav/mag, and satellite and aerial imagery (raster data). If the positional integrity of geospatial data or derived information becomes unreliable, it puts business decisions and operational safety at risk. Therefore, geospatial data needs to be managed in enterprise geospatial or GIS systems.
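As a simple illustration of the vector side, the sketch below shows, in Python, how a single feature is commonly represented: a geometry, a set of attributes and an explicit coordinate reference system. The structure follows the familiar GeoJSON convention; the well and its coordinates are hypothetical.

# A vector feature: geometry + attributes + an explicit CRS (GeoJSON-style).
well_feature = {
    "type": "Feature",
    "geometry": {
        "type": "Point",                 # vector data: points, lines, polygons
        "coordinates": [-103.10, 31.90], # lon, lat of the surface hole
    },
    "properties": {
        "uwi": "42-000-00000",           # hypothetical well identifier
        "status": "producing",
    },
    "crs": "EPSG:4326",                  # without this, positional integrity is lost
}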
2. WHY SHOULD WE MANAGE GEOSPATIAL DATA?
There are three primary reasons for enterprise-wide geospatial data management:
• Risk mitigation: The accuracy of location data and geodesic integrity (meaning the correct coordinate reference system is known and used properly) are critical to avoiding million-dollar mistakes (see the sketch at the end of this section). When planning well pads or complex wells in which wellbores will be in close proximity, knowing the exact 3-D location is critical to avoiding collisions. Pipeline rights-of-way can be optimized to reduce environmental impact and maximize integrity. Disputes over land boundaries can be effectively resolved. If an emergency arises, response teams can accurately pinpoint the location. Data will also help companies comply with the required operational regulations and environmental reporting.
• Process simplification: Geospatial data is not limited to a particular
function or discipline. Every discipline uses geospatial data, and 70 per cent of the data each discipline uses is shared across the organization. This means that every discipline uses wells, leases, pipelines, cultural data, and health, safety and environment (HSE) data. Enterprise-wide management of geospatial data will ensure that each discipline in the organization has access to the most up-to-date spatial data without needing additional resources in each team to gather, prepare, ensure quality control and load spatial data.
• Increased efficiency: Geospatial data increases the efficiency of prospect analysis, field operations, portfolio management and a host of other processes. A detailed network model for pipelines allows users to look at disruption in connectivity by modelling the impact of shutting down a valve. Engineering and HSE teams can run simulations showing the impact of an incident on population, structures and environment and can help design safer systems for operations.
In the sub-surface discipline, exploration teams can leverage updated leasing data from the land department and performance data for active wells from the operations department to enhance their prospect analysis.
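The geodesic-integrity point flagged under risk mitigation above can be demonstrated in a few lines. The sketch below assumes the pyproj library is available; it repositions a point recorded against the NAD27 datum into WGS84, the kind of silent datum mismatch that can shift well locations by tens of metres on the ground.

# Why the coordinate reference system must be known: the same lat/lon pair
# refers to different ground positions in NAD27 (EPSG:4267) and WGS84
# (EPSG:4326). Requires the pyproj package.
from pyproj import Transformer

lon, lat = -103.10, 31.90  # a hypothetical Permian Basin surface location

to_wgs84 = Transformer.from_crs("EPSG:4267", "EPSG:4326", always_xy=True)
lon84, lat84 = to_wgs84.transform(lon, lat)

print(f"NAD27 : {lat:.6f}, {lon:.6f}")
print(f"WGS84 : {lat84:.6f}, {lon84:.6f}")
# The difference looks tiny in degrees but is large enough on the ground
# to matter when wellbores are planned in close proximity.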
3. WHERE SHOULD GEOSPATIAL DATA BE MANAGED?
Two different points of view exist in the data management community regarding where geospatial data should be managed. One school of thought considers geospatial data as nothing more than location attributes (x, y or lat/lon) that should be part of a record describing other attributes of the well, seismic line, lease or pipeline. Recent advances in storage types in Oracle Spatial and SQL Server Spatial allow the spatial geometry to be stored as part of the initial record, allowing many mapping systems to directly read the record. Traditional data architects and data managers tend to prefer this approach. GIS analysts, specialists and architects who have dealt with geospatial data
all their lives generally have a different mindset: they believe that geospatial data is more than just location coordinates, that the spatial relationship between objects, such as adjacency, connectivity and clustering, needs to exist in defined geospatial data structures within GIS systems. There are three types of geospatial data—mastered, spatialized and derived— and the differences are based on where they are managed. • Mastered data sets are fully managed in a GIS database. This is applicable to data that is inherently spatial in nature and for which topological relations need to persist. Examples include pipeline data and road networks. Culture data has traditionally also been managed in GIS databases because it is geospatial in intent and historically has not belonged elsewhere. • Spatialized data sets are geospatial representations of data that is mastered and managed outside of the GIS system in traditional oil and gas data repositories. Examples include all
aspects of well locations, such as surface hole, bottom hole, wellbore path, well pad, first take and kick-off point. GIS databases then present the spatial representation. • Derived information is information that is mined or analyzed from spatialized or mastered data sets. This information is then collected in a GIS database to allow quick access to built-in answers to commonly asked questions. These could be value-added attributes calculated based on business rules or summarized information that is frequently accessed. An example would be an interpreted play fairway.
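One hypothetical way to picture the spatialized category in code: the master record lives in a traditional well repository, and a thin projection step publishes it as a GIS-ready feature. Nothing here is mastered in the GIS; the GIS merely presents the representation.

# Sketch: "spatialized" data is mastered outside the GIS and projected in.
master_well = {                       # mastered in a well data repository
    "uwi": "42-000-00000",            # hypothetical identifier
    "surface_lat": 31.90,
    "surface_lon": -103.10,
    "status": "producing",
}

def spatialize(well):
    """Project a master well record into a GIS-ready point feature."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point",
                     "coordinates": [well["surface_lon"], well["surface_lat"]]},
        "properties": {"uwi": well["uwi"], "status": well["status"]},
    }

gis_layer = [spatialize(master_well)]  # the GIS presents, but does not master, this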
4. WHEN SHOULD GEOSPATIAL DATA BE MANAGED?
One of the biggest challenges all data managers face is helping the rest of the company understand the time estimates for data management projects. Some organizations have multi-year projects to clean up existing data and move it to a managed system. Others have a more agile, iterative approach. The truth is that managing geospatial data is more like maintaining a healthy weight: it is a lifestyle change, not a quick fix.

Oil and gas companies spend millions of dollars a year on data subscriptions because they understand the need for data. However, many companies fail to notice the hours wasted in accessing this data or the duplication in data preparation and quality control that comes from an ill-managed system. Spatial data is a very important asset for an organization, and it needs to be managed all the time. Spatial data should
be handled with the same amount of care and attention as every other type of data. When a company embarks on a geospatial data management initiative, there is a defined initial investment (akin to capital expenditures) that most data managers are good at estimating and receiving. However, we should be spending an equal amount of energy establishing an annual data management budget (allocated as operating expenditures) to continue to maintain and enhance this data. Geospatial data changes over time just as any other type of data.
5. HOW SHOULD GEOSPATIAL DATA BE MANAGED?
The hardest question of all has no silver-bullet answer. Many approaches have worked for companies. In BP's Lower 48 onshore business, we began by defining the enterprise geospatial organization, its reporting structure and its relationship to IT, petrotechnologists and geoscientists, and then defined the processes of gathering, managing, improving and disseminating geospatial data. Only then did we focus on the technology and the tools for information management and dissemination. We are not married to a particular data model or a rigid approach. The only rules we follow are:
• All processes put in place need to be sustainable, long after the people responsible have moved into different roles;
• If it can be automated, it should be automated; and
• If a data management process is not geared towards reducing risk or
improving efficiency, it needs to be challenged.

Like all aspects of corporate data management, geospatial data has hurdles. When it comes to the adoption of new standards or applications, for instance, poor estimation of effort and inaccurate allocation of time and resources can lead to problems. The actual schedule of a typical 18-month program balloons out to three years; additional resources must be allocated, or worse, the scope must be reduced. Another common mistake is scope creep, or what I like to refer to as “let's rope the moon” syndrome. As a result, multiple critical processes are affected, making the management of change very onerous. One last pitfall to avoid is allowing technology to be the priority; in effect, implementing, upgrading and enhancing the technology before working on processes, workflows and people. When you take into account the delays that are already inherent in process change, by the time you can begin to work on the people, the technology is already outdated and needs updating all over again!
Tarun Chandrasekhar is co-chair for the Geospatial Petroleum User Group and Geospatial Capabilities Team Lead for BP, supporting the U.S. Lower 48 onshore region. Chandrasekhar has over 15 years of diverse experience working in GIS and sub-surface data management. He has a master’s degree in urban planning and GIS from the University of Illinois.
The missing ingredient
When discussing standards, we sometimes forget the most important part: people
By Jim Crompton, data management and analytics consultant

Most articles about standards focus on definitions and taxonomies. They discuss models, architectures and technologies. We worry about the hard stuff and firmly believe that if we get the facts right, it will be smooth sailing for the adoption of all of our great work. But if we want to get serious about the adoption and use of standards, we cannot forget the role of people in our organizations, people who often don't behave according to a rational, full life cycle view of the industry. Let's take a look at the challenges that make up the human factor for standards adoption.

First, what about the people who define the standard? The standard should be developed by the key stakeholders, who are subject matter experts. Standards organizations facilitate the process through vendor-neutral organizational processes. This ensures that the standard is driven by the business. This is the aim of all standards groups, but let's look a little closer. Do the representatives of the business community ever show up? With weak sponsorship from their organizations and little recognition for the work they donate, only a precious few of these voices of the business can participate. But when they do, their contributions are invaluable.
Many of the standards committees are composed of a few consultants and representatives of technology companies. Their stake in the game is also important, but are these experts thinking about the advantage of the full community or the advantage that can be gained by their firm? This is a difficult balance. There are many who rise above the commercial temptations, and we are thankful for them. But it is difficult to avoid the pressure that falls on most of us to look for a short-term sales advantage over a benefit to the full value chain and the whole industry. I lived through an experience where a large technology company joined my standards program just to torpedo it from the inside.
What about the people in the technology companies who develop and sell products and services based on the standards? Do they make more money from functionality enhancements or from integrating standards to make their products easier to use? How many of their customers really ask for a standards-compliant version over one with the latest bells and whistles? Standards processes tend to move slowly, so the temptation to get ahead of the standards and create a differentiated position is strong. The power of platform lock-in by de facto standards that dominate the market has been a real force in software adoption, probably a stronger one than standards compliance. Microsoft, Oracle, IBM and Google do it, so why can't I?
What about consultants and service companies working with oil and gas operators and the challenge they have in recommending a standards-based solution versus a custom-fit one? Most consultants will look to see if a standards-based approach will work, so this discussion is held. But what if the standards approach is only 80 per cent of what the client needs? What if the standards version is several years old and doesn’t incorporate a new technology development? What if the standards approach doesn’t fit the clients’ existing processes and adopting it would require a large change-management program? The temptation to give them what they want, rather than lobby for the industry standard, can be overwhelming.
What about the managers who have to make the decisions to adopt or deploy standards? Can subject matter experts convince their employers to make the investment
to adopt standards? Do they have a loud enough voice in their own organizations? Does the idea of a proprietary approach to provide a competitive advantage or a differentiating value proposal ring stronger than a standards-based approach? Does a business case for a system upgrade in order to comply with a new standard create enough direct bang for the buck? How many managers are really standards champions in their organizations and how many just pay lip service to the idea?
And finally, what about the people who use the technology? How much of a pull is there to use standards-based solutions? How much discipline to follow the standards exists? Especially if it means a change in how a person has always done a task or if the cost of implementing a standard falls to one person, but the benefit falls somewhere else in the organization? I guess what I am saying is: the spirit is willing, but the flesh is weak. Even when
we get the hard stuff right, there is still a change-management challenge we face to get the standard adopted. There will always be commercial, political and competitive forces at work. The perspective that standards are good for all is certainly true, but altruism is not always the most powerful force in the trenches of our businesses. For standards committees: think about people. Think about what drives them and how standards will benefit their daily lives. If it has to be a top-down mandate, think about the manager and give him or her a reason to risk some political capital on backing it. Think about the developer and the marketing person of the technology and the service companies who build the products that should embody our standards. Think about how standards enable these people to build a better product or service. Get the definitions right, get the data architecture right, get the business process right, get the technology right, and then get the human factors right. Wow, this isn’t so easy after all.
All together now
Transforming oil and gas operations and maintenance with collaboration and innovative IT and industry standards
By Cliff Pedersen and Alan Johnston

As recent major incidents in both the upstream and the downstream sectors of the oil and gas industry have illustrated, the need for near-real-time data access and exchange has now become paramount. But the same system-integration solutions have been touted for the last three decades, and the need largely remains unfulfilled. System vendors come and go, and, while computer-based technologies and software continue to evolve at a manic pace in the consumer products industry, the heavy industries remain encumbered with fragmented, isolated systems that are fraught with onerous manual record keeping and reporting.

Owner-operators continue to suffer from the highly chaotic, inefficient and often ineffective handover of information from hugely expensive capital projects that still use paper and digital documents or spreadsheets that are not machine interpretable for finding, accessing and processing core asset information. Exacerbating the situation further, demands to monitor and comply with ever-increasing safety, health and environmental regulations increase the pressure to deliver maximum results while minimizing risks, improving reliability and lowering costs. What is a beleaguered manager to do?

At first, the implementation of computer-based systems and integration of software applications seemed to provide the answer. However, that panacea has proven to be massively expensive and fragile, with high costs to both maintain and upgrade the proprietary legacy systems that are currently the status quo. To survive, the operation has to be changed to deliver maximum results with minimum resources in near real time, and quickly. But how?

EXPECTATIONS FOR GREAT CHANGE
The Internet has created an expectation of transformation in all industries. Industry standards have the potential to enable similarly positive transformations within the oil and gas industry, but there are many complicating factors, including the silo-centric nature of the oil and gas industry, the long-lived physical assets that are already in the field and competition between standardization activities.

The Internet relies on standards, which enable thousands of suppliers to build hundreds of thousands of products that work together without the need for individual suppliers to integrate their products with each other. This basic principle has led to massive efficiency gains in industries closely tied to IT. Meanwhile, consumer devices of almost every imaginable type are becoming connected and interoperating with each other, opening up many new business opportunities. The oil and gas industry has already applied many Internet standards to improve a variety of its IT functions.

Of course, a small problem with a website or an internet query does not normally have the potential to result in a health, safety or environmental disaster, while core operations and maintenance (O&M) activities in plants, complex platforms and capital facilities do present these sorts of risks. As a result, the people, processes and systems responsible for O&M are much slower to adopt new technology that may seem unproven in their domain. Oil and gas companies also do not generally have the ability to rip out and replace significant amounts of existing plant, platform and facility infrastructure simply because they want to gain new capabilities.

Since the oil and gas industry is facing significant challenges to bring about needed transformations, it is useful to see
how other industry groups facing similar challenges solve their problems.

[Figure: The OpenO&M Initiative brings people, processes and systems together, linking enterprise business systems (enterprise resource planning, enterprise risk management), maintenance, operations and physical asset control (real-time systems).]
OTHER INDUSTRIES
The aerospace and defence industry, like the oil and gas industry, has been under intense pressure to improve productivity and operational risk management while at the same time reducing operating costs, due to the pressures of commercial competition and regulation on a global basis. Additionally, the complexity and scale of the physical assets used in aerospace and defence and the globally distributed nature of core O&M are similar to those of the oil and gas industry. Given the similarities, it is interesting to consider how the aerospace industry has gained business value with innovative approaches to help it manage its major operating assets on a full life cycle basis.

In response to similar business pressures, the aerospace and defence sector and its associated systems engineering community have developed a paradigm called System of Systems (SoS). SoS leverages a system architecture providing a set of rules that define both the roles of the individual systems and how each system interacts with every other system, while the systems work together to perform complex operations.
Aerospace and defence platforms may incorporate thousands of systems from hundreds of suppliers into a single program and often must show the ability to replace a system from one supplier with a system from another supplier without degrading required platform capabilities. The SoS paradigm is now well established in the aerospace and defence sector, and the methodology provides a basis for interoperability that meets the complex O&M needs of many asset-intensive industries, including the oil and gas industry.
OPPORTUNITIES AND PROGRESS
Given the scale, complexity and heterogeneous nature of plants, platforms and capital facilities in the oil and gas industry, a supplier-neutral effort focused on solving well-defined industry problems that have been captured and prioritized by owners and operators seems to offer the best path forward. Leveraging the standards already used by the industry while embracing innovations like the SoS from similar industry groups is critical if the desired degree of transformation is to be achieved. Fortunately, this opportunity was recognized a number of years ago and has resulted in a series of interrelated industry
standardization activities, starting with The OpenO&M Initiative.

The OpenO&M Initiative started in late 2004, when leaders of the ISA-95 Committee, MIMOSA and the OPC Foundation, among others, began discussing collaboration on enabling standards-based interoperability for O&M-related processes, systems and applications. This combination of standards had been recognized by key owner-operators in asset-intensive industry groups, which encouraged cooperation between leaders of the standards organizations to enable combined use of the standards. The resulting dialogue led to the OpenO&M Initiative, which was formalized in 2007 through a memorandum of understanding between ISA, MIMOSA, OAGi, the OPC Foundation and WBF/B2MML.

The OpenO&M Initiative has been driven by a core set of industry use cases, which were provided and prioritized by participating owner-operators from the oil and gas, chemical, public utilities, and aerospace and defence industries. The initial portfolio of use cases was selected because the cases appeared to be common across the participating industry groups and solvable based on a common set of methods and standards. The use cases were then documented and functionally decomposed into scenarios, which are ultimately associated with service definitions. The resulting scenarios and service definitions can now be reused in other cases and have a substantial focus on physical asset management, as these activities were very similar across the multiple industry groups, while operational models and associated markup languages were substantially more diverse.
INITIAL OPENO&M USE CASES
Use Case 1: Information Handover from Engineering/Procurement/Contractor to Owner/Operator
Use Case 2: Recurring Engineering Updates to O&M
Use Case 3: Field Changes to Plant/Facility Engineering
Use Case 4: Online Product Data Library Management
Use Case 5: Asset Installation/Removal Updates
Use Case 6: Preventive Maintenance Triggering
Use Case 7: Condition-Based Maintenance Triggering
Use Case 8: Early Warning Notifications
Use Case 9: Incident Management/Accountability
Use Case 10: Information Provisioning of O&M Systems
Once the industry use cases had been documented, a systems analysis process was used to standardize as many factors as possible. It was important to ensure that the required adaptability was provided, largely via metadata, and a separation was maintained between the business process definition, orchestration and governance layers and the standardized application program interfaces, service definitions and data models. This resulted in a systems architecture covering the key layers of the Purdue Model, where the operational objectives of reference standards, such as ISA-95 and ISA-88, were implemented using the implementation standards included in The OpenO&M Initiative.
Substantial work took place from 2007 through 2010, during which time the need for two more key specifications, the Information Service Bus Model and the Common Interoperability Register, was recognized; both were subsequently developed through a collaboration of the OpenO&M Initiative team members. The portfolio of standards associated with The OpenO&M Initiative provides a core for the key standards required by O&M systems in asset-intensive industries, including the oil and gas industry. The development of this portfolio approach to standardization also enables a new industry solutions methodology, based on SoS, standards-based interoperability and an open, supplier- and application-neutral ecosystem, rather than one based on traditional systems integration. Perhaps as important, the cooperation between the standards bodies holds real promise for helping suppliers deliver more business value to owner-operators on a more collaborative, adaptable and
sustainable basis that is truly supplier and product neutral.

LOOKING AHEAD
Starting in late 2010, the Oil and Gas Interoperability (OGI) Pilot, managed by MIMOSA in cooperation with POSC Caesar Association and Fiatech, has mapped out a pragmatic path forward for the oil and gas industry. The OGI Pilot uses the next generation of the open systems architecture derived from the OpenO&M Initiative and incorporates reference data standardization based on ISO 15926, providing a full life cycle approach to standards-based interoperability for the oil and gas industry. A follow-up article will focus on how the OGI Pilot provides key enablers for the transformation of the oil and gas industry and will discuss increasing alignment with the Standards Leadership Council. More information about the OGI Pilot and the associated elements of the OpenO&M Initiative can be found at mimosa.org.

About the authors: Cliff Pedersen is a chemical and systems engineer who has spent his professional career helping to plan, build and operate computer process control and IT systems in a wide assortment of oil and gas facilities, ranging from refineries in Ontario to bitumen upgraders in northern Alberta. A substantial portion of his career has involved industry standards activities. Pedersen has a thorough understanding of the business value that owners and operators derive from standards, which enables them to implement more effective operation and maintenance solutions in plants, platforms and facilities.

Alan Johnston is a physical asset management and industrial IT industry standards expert with a lengthy career in developing and implementing life cycle asset management solutions in the integrated energy, petrochemical, public utilities, aerospace and defence industries. He currently maintains leadership roles in a variety of industry standards activities and is convener of ISO TC 184/WG 6, which is developing the ISO Oil and Gas Interoperability (OGI) Technical Specification.
Value added
Data rules are the key to ensuring information brings value to a company
By David Fisher and members of the business rules work group
The exponential growth of data in the oil and gas sector in the last several years has transformed the very nature of how we acquire, store and use information. Data is now a foundation stone for virtually every aspect of the modern oil company; any deviation from the highest quality or accuracy of that data can quickly evolve from a minor nuisance into a major impediment. As an industry, we spend hundreds of billions of dollars to acquire data each year. While it may seem obvious to data managers that developing and adopting open, standardized data rules is key to good quality data, it is not always readily apparent to those who use and rely upon that information. Communicating what is important, and why, to the entire corporation is critical.

An important first step for data rules is checking that the correct data is acquired right at the beginning. For example, a well log header should include the date-of-run and the wellbore depth. These are typically specified in the service contract, but data rules can check that this information was actually delivered. There are many more complex rules that are also important but not as obvious.

One way to ensure complete data is collected is to specify within the contract
the type of data (header data, log curves, scales, etc.) and the data rules that must be passed before the customer approves payment. While it would be difficult for a service company to adhere to a different set of data rules for each client, an industry-standard set of rules establishes consistent expectations of data quality across operators. Enforced, it will yield more consistent and higher quality data from the time of creation. To aid this process, the Professional Petroleum Data Management (PPDM) Association is working to build a set of rules that would serve as an industry standard, available online at rules.ppdm.org.
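To make the delivery-time checks concrete, here is a minimal sketch in Python; the field names are hypothetical stand-ins, not rules taken from the PPDM library.

# Sketch of delivery-time data rules for a well log header.
REQUIRED_HEADER_FIELDS = ("date_of_run", "wellbore_depth")

def check_header(header):
    """Return a list of rule violations; an empty list means the header passes."""
    violations = []
    for field in REQUIRED_HEADER_FIELDS:
        if header.get(field) in (None, ""):
            violations.append(f"missing required field: {field}")
    depth = header.get("wellbore_depth")
    if isinstance(depth, (int, float)) and depth <= 0:
        violations.append("wellbore_depth must be positive")
    return violations

header = {"date_of_run": "2014-11-03", "wellbore_depth": 0}
print(check_header(header))  # ['wellbore_depth must be positive']

Run at the point of delivery, checks like these give the operator an objective basis for approving, or withholding, payment.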
FIRST THINGS FIRST
Once data is acquired, it has no value unless it is used; the more and the longer it is used, the better. Some well logs are decades old and still in use. One cannot assume that data is “good enough for now”; data managers have to keep in mind that any discipline might be using it in the future. Data rules ensure that the information is complete and usable from the start.

Every time we transform or convert our data, we risk losing some of the context or meaning. As an empirical rule, any conversion
from one format to another results in the loss of data required by technical experts for exploitation or reprocessing. For instance, because no single digital-log data-exchange standard exists, many data-export programs have produced corrupted files, which then reside incognito in our archives. But we do not have to transform or convert data if it is in a standard definition and format in the first place. This can be achieved by using an industry-standard data model (such as the PPDM Association's) and ensuring that data passes certain quality tests.

It is often said that a geoscientist spends 70 per cent of the time finding and assuring the quality of data and 30 per cent of the time interpreting it. Even worse, not finding a data item when it is needed can result in severe opportunity loss and additional costs. Although it may not appear in the records, more than one well has been drilled to re-acquire data that could no longer be found. Any effort that reduces time spent finding and qualifying data increases the time available for analysing and interpreting it. This improves the capacity of a company. Adopting data rules increases everyone's ability to get things done better, and faster.

Data rules and business rules capture knowledge in a systematic way. A set of
rules is a strategic means to train young data managers with clear knowledge on how to recognize high-quality data and processes. Part of the PPDM Association’s mission is to build professional competence in data management, thus making data managers more effective and enhancing their career paths. Data rules can also improve the competency of a company. Because we are dealing with remote subsurface locations, there is never 100 per cent confidence in what’s down there. The best decisions are based on the full extent of available information. Just as a high-resolution camera shows greater detail on a subject, data of the highest available quality enhances the resolution of the subsurface. This can lead to greater success.
INDUSTRY-WIDE BENEFITS
Finally, universal, standardized data rules benefit the entire sector and not just one company. The days of guarding all data in the name of competitive advantage are gradually disappearing. Instead, cooperation in improving the vast storehouse of oil and gas knowledge will, like an incoming tide, lift the fortunes of the entire industry. Today's competitor is tomorrow's partner; the best companies
will not be those with the most data, but those who leverage knowledge and collaboration to make the best decisions. How do we let others know about the importance of data rules? Communicating the above points in clear and concise terms is a good start. More importantly, formally involving the geoscience and engineering professions in the data process right from the initial act of acquiring and qualifying data builds a better understanding and appreciation. When it comes to raising awareness about the value of data rules, there are many roads to travel, but without a doubt, the journey is worth it.
More information about data rules and business rules is available at ppdm.org/ppdm-standards/business-rules.
DEFINITIONS

DATA RULES
A data rule is a statement that provides the opportunity to validate compliance to expected conditions or to identify and review exceptions. It always resolves to either true or false. Data rules are intended to ensure that a program operates on accurate and useful data, providing higher user confidence in the quality of the data. Data rules apply to data and information, not workflows and processes. They provide a method to define specific tests, validations or constraints associated with data items. Data rules are atomic: they each verify one item. An example of a well data rule is that the kelly bushing (or KB) should always be higher in elevation than ground level.

BUSINESS RULES
A business rule is a statement that defines or constrains some aspect of the business. Business rules describe the operations, definitions and constraints that apply to an organization and are put in place to help the organization achieve its goals. Business rules can apply to workflows, policies, procedures, regulatory compliance, computing systems, individual behaviour or corporate behaviour. An example of a business rule is that a well must not be spudded without a valid drilling permit and compliance to all regulatory requirements and permissions.
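The KB example in the sidebar translates directly into code. The sketch below, with hypothetical field names, shows the atomic character of a data rule: it verifies exactly one item and resolves to true or false.

# An atomic data rule: verifies one item, resolves to True or False.
# The kelly bushing (KB) should be higher in elevation than ground level.
def kb_above_ground(well):
    return well["kb_elevation_m"] > well["ground_elevation_m"]

well = {"kb_elevation_m": 948.2, "ground_elevation_m": 943.7}  # hypothetical
print(kb_above_ground(well))  # True: the rule passes; False flags an exception to review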
Foundations photo contest
“WORBARROW BAY, DORSET” BY JACQUELINE SPALDING
Second place, January 2015 Foundations photo contest
Whilst sailing along the World Heritage Jurassic Coast, we were fortunate enough to be able to anchor one night in this beautiful spot. This end of the bay is chalk; the other is sandstone.
On the cover:
“GHOSTS OF WINTER” BY JEREMY CALOW
First place, January 2015 Foundations photo contest
Overlooking the Sheep River valley in early winter, prior to the annual road closure. The area is normally quite busy with wildlife but has now gone quiet, with only a few birds reminding me of their presence.
Enter your favourite photos online at ppdm.org for a chance to be featured on the cover of our next issue of Foundations!
Upcoming events
SYMPOSIUMS
• March 9–11, 2015: Houston Data Management Symposium & Tradeshow, Houston, Texas, U.S.
• August 5–6, 2015: Perth Data Management Symposium, Perth, Australia
• October 19–21, 2015: Calgary Data Management Symposium, Tradeshow & AGM, Calgary, Alta., Canada

WORKSHOPS
• March 9, 2015: Houston Data Management Workshop, Houston, Texas, U.S.
• April 2015 (TBA): Calgary Data Management Workshop, Calgary, Alta., Canada
• May 5, 2015: Denver Data Management Workshop, Denver, Colo., U.S.
LUNCHEONS
• January 20, 2015: Midland Q1 Data Management Luncheon, Midland, Texas, U.S.
• January 27, 2015: Denver Q1 Data Management Luncheon, Denver, Colo., U.S.
• January 29, 2015: Houston Q1 Data Management Luncheon, Houston, Texas, U.S.
• February 3, 2015: Oklahoma City Q1 Data Management Luncheon, Oklahoma City, Okla., U.S.
• February 10, 2015: Dallas/Fort Worth Q1 Data Management Luncheon, Dallas/Fort Worth, Texas, U.S.
• February 17, 2015: Tulsa Q1 Data Management Luncheon, Tulsa, Okla., U.S.
• April 14, 2015: Oklahoma City Q2 Data Management Luncheon, Oklahoma City, Okla., U.S.
• April 21, 2015: Midland Q2 Data Management Luncheon, Midland, Texas, U.S.
UPCOMING PUBLIC TRAINING SESSIONS
• January 19–20, 2015: Dallas Public Training, Dallas, Texas, U.S.
• January 21–23, 2015: Houston Public Training, Houston, Texas, U.S.
• February 2–4, 2015: Calgary Public Training, Calgary, Alta., Canada
• April 13–14, 2015: Midland Public Training, Midland, Texas, U.S.
• April 15–17, 2015: Denver Public Training, Denver, Colo., U.S.
• May 18–20, 2015: Houston Public Training, Houston, Texas, U.S.
All dates subject to change. ONLINE TRAINING COURSES AVAILABLE ALL YEAR ROUND! If your company is interested in private training, we are booking for 2015–16 now.
VISIT PPDM.ORG FOR MORE INFORMATION
Join PPDM on LinkedIn
Follow @PPDMAssociation on Twitter
Warning: Our data has gone mobile
Now, get geoLOGIC's value-added data almost any place, any time, any way you want it. Available through gDCweb on your tablet, smartphone or computer. With 30 years of data experience behind it, the gDC is the source for high-quality, value-added well and land data from across Western Canada. Another plus: our data is accessible through an expanding range of industry software utilizing our own easy-to-use gDC GIS and our geoSCOUT software. View, search, import and export well, land and production data, documents, logs and more from almost anywhere. For more information, visit our website at www.geoLOGIC.com.
Leading the way with customer-driven data, integrated software, and services for your upstream decision-making needs.
geoSCOUT | gDC | petroCUBE at www.geoLOGIC.com