The 21st PNEC conference in Houston, Texas - the International Conference and Exhibition on Petroleum Data Integration and Information Management - has just wrapped up.
As ever, it felt part-conference, part-family reunion - petroleum data management is a small and specialised discipline. A few hundred of us keep meeting, year after year, to share our learnings and stories, successes and failures, in the hope that we can improve data management practices in oil and gas.
No pain, no gain
Oil and gas data is not easy to work with. Only Life Sciences offers a comparable number of problems (sorry, opportunities) to our industry's data management: massive data volumes; data kept (and remaining relevant and important) for generation after human generation; measurement data that is only useful when coupled with the all-important units of measure and quality indicators, and that can only be trusted when every conversion and transformation is understood and verifiable. And it doesn't help that, as Peter Black stated succinctly, "all oil and gas software is crap".
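To make the "verifiable conversions" point a little more concrete, here is a minimal sketch of my own (nothing to do with any PNEC presentation) using the open-source Python library pint to keep units attached to measurements; the depth and pressure values are made up for illustration.

```python
import pint

ureg = pint.UnitRegistry()

# A measurement is only meaningful with its unit of measure attached.
depth = 3048.0 * ureg.metre       # hypothetical measured depth
pressure = 5000.0 * ureg.psi      # hypothetical reservoir pressure

# Conversions are explicit and auditable, not buried in a spreadsheet.
print(depth.to(ureg.foot))            # 10000.0 foot
print(pressure.to(ureg.kilopascal))   # converted pressure

# Mixing incompatible quantities fails loudly instead of silently corrupting data.
try:
    depth + pressure
except pint.DimensionalityError as err:
    print("caught:", err)
```

The point is simply that the conversion step is visible and checkable, which is exactly what makes measurement data trustworthy downstream.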
The technical and scientific software market is too niche to spark the kind of revolution you see in modern software tools like Slack, Salesforce, and IFTTT, which means those who don't want to write their own applications are held to ransom by a few large oil service companies concerned only with retaining market share at the lowest cost to themselves. We're still relying on 10- or even 20-year-old software development techniques – and, in fact, on 10- or even 20-year-old software products, because there's not a lot of new software (or SaaS) coming to market in this space.
Some things are changing
Cloud is finally being embraced. It seems the oil industry has learned that the cloud is probably more secure than its own data centres. Or maybe it’s just that with oil at $40 a barrel, they don’t care so much about keeping this part of the business in-house. Data lakes are being discussed as an alternative to offsite tape archive. Agile development techniques are creeping in.
Analytics featured more prominently in presentations this year. Oil companies are reducing their reliance on poor software products and starting to work the data themselves. The sustained low oil price has helped, I'm sure, but big data technologies are maturing to the point where they can deliver on their promise, too. And open source helps - we can find open-source code to manipulate common industry file formats, and numerical analysis methods are readily available in NumPy, SciPy, and other standard packages.
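As a rough illustration of what that looks like in practice (a sketch of my own, not code from any presentation): the open-source segyio library reads SEG-Y seismic files straight into NumPy arrays, and from there the standard numerical stack takes over. The file name and the simple RMS-amplitude check below are purely hypothetical.

```python
import numpy as np
import segyio

# Open a (hypothetical) SEG-Y file without assuming a regular 3D geometry.
with segyio.open("survey.sgy", ignore_geometry=True) as f:
    traces = segyio.tools.collect(f.trace[:])   # array of shape (n_traces, n_samples)
    dt = segyio.tools.dt(f) / 1e6                # sample interval in seconds

# A simple per-trace quality indicator: RMS amplitude.
rms = np.sqrt(np.mean(traces.astype(np.float64) ** 2, axis=1))

print(f"{traces.shape[0]} traces, sample interval {dt * 1000:.1f} ms")
print("median RMS amplitude:", np.median(rms))
```

Nothing exotic - but a few years ago this kind of exploration meant waiting for a vendor product, and now it's a short script.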
And some things stay the same
It’s not all about change though. PNEC has many traditions. One of my favourites (introduced by Jess Kozman a few years back) is The Business Case.
You see, petroleum data managers tend to see themselves as librarians or custodians - separate from the business; an undervalued support function – complainers, basically. Jess’s stance was that if we want our work to be appreciated, it’s our job to explain how what we do directly adds business value.
That means no longer talking in terms of fuzzy concepts like quality and completeness, or saying something should be done because it's 'the right thing to do'. Instead, we should be shouting about the things we enable oil companies to do differently.
So, when someone presents their project/study/piece of work at PNEC and articulates, clearly, the business benefit - how their work added value through cost savings and new revenue - Jess presents them with a 'Business Case' (actually, a case of beer – 'Probably the Best Business Case in the World').
This year, I cracked Jess’s Business Case
I presented a case study detailing a 6-month project worked by six Teradata consultants, and six geoscientists, data scientists, and processing geophysicists from our favourite Norwegian oil company. We used analytical techniques to study the operational aspects of a seismic source, and I explained how the analytical results will:
- Bring OPEX savings through reduced vessel time.
- Eliminate the need for expensive engineering research projects to improve aspects that do not need improving.
- Increase the quality of the seismic data so that 4D interpretation will be quicker and better, allowing for more timely interventions to injection regimes (harder to quantify but definitely valuable).
Of course I did. Because at Teradata, demonstrating the value of analytics is central to the way we work - if you’re asking people to change the way they work, you need to be able to show that it makes good business sense.
I mean, if analytics failed to deliver business value, well, we wouldn’t be in business.
Would we?
Originally from an IT background, Jane McConnell specialised in Oil & Gas with its specific information management needs back in 2000, and has been developing product, implementing solutions, consulting, blogging and presenting in this area since.
Jane did time with the dominant market players – Landmark and Schlumberger – in R&D, product management, consulting, and sales before joining Teradata in 2012. In one role or another, she has influenced information management projects for most major oil companies across Europe. She chaired the Education Committee for the European oil industry data management group ECIM, has written for Forbes, and regularly presents internationally at oil industry events.
As Practice Partner for Oil and Gas within Teradata’s Industrial IoT group, Jane is focused on working with Oil and Gas clients across the international region to show how analytics can provide strategic advantage, and business benefits in the multi-millions. Jane is also a member of Teradata’s IoT core team, setting the strategy and positioning for Teradata’s IoT offerings, and works closely with Teradata Labs to influence development of products and services for the Industrial space.
Jane holds a BEng in Information Systems Engineering from Heriot-Watt University, UK. She is Scottish, and has a stereotypical love of single malt whisky.