Transforming Subsurface Science



Data science meets traditional science Getting seismic off tape How meteorologists do it Mapping how people work Using analytics to save on proppant Developing your own software modules Event Report, Transforming Subsurface Science, Apr 18, 2016, London

Special report

Transforming Subsurface Science Apr 18, 2016, London

Event sponsored by:


Transforming Subsurface Science

New techniques subsurface people can use to help their companies drill profitable wells

This is an Event Report from our forum in London on April 18, 2016, “Transforming Subsurface Science”

Event website: www.findingpetroleum.com/event/c8621.aspx
Some presentations and videos from the conference can be downloaded from the event website.

Report written by Karl Jeffery, Editor of Digital Energy Journal
[email protected]
Tel +44 208 150 5292

Sales manager: Richard McIntyre
[email protected]
Tel +44 208 150 5291

Finding Petroleum’s forum in London on April 18, “Transforming Subsurface Science,” looked at better ways that subsurface people can help their companies to drill profitable wells. This means predictable and repeatable wells.

We had case studies including using analytics to reduce the amount of proppant you need on a well, analysing public domain New Zealand data, mapping rate of penetration against weight on bit, understanding drillbit wear, and cleaning up old oil and gas data in Nigeria.

We also covered learning from how meteorologists do it, getting seismic data off tape, working with broadband seismic, seismic interpretation without ‘a priori’ knowledge, mapping how people work with information, indexing old information, and developing your own software modules.

Topics covered include how (conventional) scientists can best work together with data scientists, where analytics has been proven to add value, and the best techniques for getting the most from data analytics.

Common themes included the importance of data scientists working together with domain experts (people who understand the oil and gas industry), and the importance of setting your objective before you start.

Note: the event web page has the full agenda and links to some of the videos and slides of the talks – see www.findingpetroleum.com/event/c8621.aspx

Conference produced by David Bamford
Conference chairman: Duncan Irving, Teradata
Layout by Laura Jones, Very Vermilion Ltd
Cover art by Alexandra Mckenzie

Digital Energy Journal, www.d-e-j.com
Future Energy Publishing, 39-41 North Road, London, N7 9DP, UK, www.fuenp.com


Duncan Irving – the challenges of science and analytics

Bringing 'traditional' science, such as geoscience, together with data science can be difficult, particularly when data scientists try to do something which is traditionally in the science domain. Teradata's Duncan Irving has some ideas for how to do it

There can be enormous value from bringing 'traditional' science together with data science, or data analytics. There can also be some friction, said Duncan Irving, oil and gas consulting team lead for EMEA / APAC with Teradata.

Mr Irving has a PhD in glacial geophysics, and first worked in the oil and gas industry on subsurface projects, which led to consulting work in data management and workflows, and then 'big data' projects. He has been with Teradata since 2008.

Mr Irving's company, Teradata, has been involved in 'big data' for 20-30 years, much of it in the consumer / retail sector, helping companies understand customer behaviour.

For people trained as scientists, the 'scientific method' is well understood. "Scientists are very good at looking at data and developing a hypothesis that leads to some understanding of the system they are studying," he said. As scientists, "we like to capture data, and the more data we have the more robust our scientific insight is."

Geoscientists use data to make predictions about how a well will flow and for how long. The oil and gas industry has complex workflows which use 'high science' in understanding the subsurface. Geoscientists also study 'geostatistics' in their university courses, although not many use it in their day jobs.

Now, we have a new breed of scientist called the 'data scientist', he said. Data science can also be described as "playing around with data to see what's in it," he said.

Much of the development in data science came from Silicon Valley, where it was used by internet companies to try to get some understanding from the enormous amounts of data they had about how people behave on websites. For example, eBay uses data mining to try to work out which colour buttons on its website lead to the most sales. This is quite a different environment to oil and gas subsurface, he said.

Meanwhile, there has also been a lot of criticism of data driven approaches, especially when they come into conflict with areas traditionally understood using 'hard science'.

An example is when Google claimed to be able to predict where the next flu outbreak in North America would be, based on studying people's searches. It looked very exciting, "until it was proven that they got it wrong," he said. "They were just showing when it was winter". Statisticians, which is what data scientists really are, "don't get it right all the time," he said. A lot of people thought there was a degree of arrogance in some of the data analysis, and were pleased to say, "Hey, you Silicon Valley guys, you don't get it right all the time."

Perhaps, when looking to bring together science and data analytics on a large data set, the right question is, "what is the least amount of physics you need to put in there to impart scientific understanding," he said. For example, geoscientists have some very elaborate models, "but do you need all of that to explain your production forecast? Do you actually need all the degrees of freedom, all the petrophysical parameters in there?"

"I don't think geoscientists are particularly good at statistics, particularly when compared to a room full of bio scientists," Mr Irving said. "Geoscientists are very good at domain insight but as an industry we are too narrow."

Also, "I don't think we [geoscientists] are very good at scale out computing, [working out] how to put it together in a way that scales at speed, scales at size, with the complexity of the system we are trying to understand."

It may be useful to compare oil and gas subsurface with weather forecasting (meteorology). There are many similarities between the two. Like subsurface people, meteorologists run big computer simulations on high performance computers, and use these together with real time observations (such as, it is raining in Cardiff now).

Weather

However meteorologists might be better than subsurface people at putting everything together with other data, including historical data. For example, a meteorologist could work out that there have been several storms in the same region in the past few weeks, and so the ground is likely to be already waterlogged, and the next storm might cause a flood in a certain city.

Meteorologists might be better than subsurface people at learning from the past, for example tracking what happened the last 30 times a weather front came through like the one which is forecast to come through tonight. "It is about marrying the science with big data and real time observations, using data mining and understanding what the impact will be," he said.

Reservoir engineers could use similar thinking, if they were asking questions like, "When did I last see de-pressurisation of this scale," he said.
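That "when did I last see this" question is essentially a similarity search over a table of historical events. A minimal sketch in Python, assuming a hypothetical event table; the fields, values and outcomes below are invented for illustration:

```python
import pandas as pd

# Hypothetical history of reservoir de-pressurisation events; the schema
# is illustrative, not from any specific operator or Teradata system.
events = pd.DataFrame({
    "field": ["A", "B", "A", "C"],
    "date": pd.to_datetime(["2009-03-01", "2011-07-15",
                            "2013-02-20", "2014-11-05"]),
    "dp_bar": [12.0, 35.0, 33.5, 8.0],   # pressure drop over the event
    "outcome": ["water influx", "compaction",
                "compaction", "aquifer support"],
})

def similar_events(dp_bar, tolerance=0.15):
    """Return past events whose pressure drop is within +/- tolerance."""
    lo, hi = dp_bar * (1 - tolerance), dp_bar * (1 + tolerance)
    return events[events["dp_bar"].between(lo, hi)].sort_values("date")

# "When did I last see de-pressurisation of this scale?"
print(similar_events(34.0))
```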

Oil and gas industry

Analytics of this kind can help the oil and gas industry, which is acquiring data faster than it can process it, and working with many more data types.


For example, in the unconventional oil and gas sector, companies might have just 30 days to try to get insights from data, which will satisfy engineers, scientists and business people.

A problem is that the oil and gas industry, like many heavy industries, keeps data in silos. This makes it hard to bring it together and do cross-functional analysis. "It is difficult to get trustable data from one business silo to the next," he said.

For example, it would be useful if you could gather all of the data from all of your compressors in the North Sea, and also compressors operated by other companies, so you could see if one was operating in a way different to the others (giving an early indication of a problem).

But this is hard because typically the compressor on an offshore platform will send data back to a data centre which gathers all of the data from that specific platform, but does not have data about any other compressors.

There are many sensors in the oil and gas industry, some of which have been in place for decades, for example measuring flow and enabling better control. The industry might get more interesting insights if it had ways to use data from different sensors together, for example combining sensor data with production historian data.
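As a sketch of what using different sensor feeds together can look like, the pandas `merge_asof` join below aligns a fast vibration feed with a slower historian feed on time. The feeds, rates and column names are invented for illustration:

```python
import pandas as pd

# Illustrative only: a high-rate vibration sensor and a lower-rate
# production historian feed for the same compressor.
sensor = pd.DataFrame({
    "time": pd.date_range("2016-04-18 00:00", periods=6, freq="10s"),
    "vibration_mm_s": [2.1, 2.2, 2.0, 3.8, 4.1, 4.0],
})
historian = pd.DataFrame({
    "time": pd.date_range("2016-04-18 00:00", periods=2, freq="30s"),
    "discharge_pressure_bar": [95.0, 88.0],
})

# Align each sensor reading with the most recent historian value, so the
# two data types can be analysed together rather than in separate silos.
combined = pd.merge_asof(sensor, historian, on="time")
print(combined)
```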

The oil and gas industry also has structured data, unstructured data, and data which is a mix of both (sometimes known as ‘multi-structured’). The various ‘tribes’ in the industry speak different languages.

However there are various languages and tools which can be used to delve into data in different formats, for example Python and the statistical language R. "They let you get into the data and find things out you weren't expecting to see," he said.

Many people start on a data analytics project by trying to get the company data as organised as possible. But this can be an expensive project with no obvious business benefit. "That doesn't fly in the current climate," he said.

A better approach is to just try to find something useful out of the data, working through it using languages like Python and R on an analytics system.
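What a first pass of "finding something useful" looks like varies, but it often starts with a few lines of profiling. A minimal Python sketch, assuming a hypothetical export file wells.csv:

```python
import pandas as pd

# First-pass exploration of a well data export; the file name and
# columns are hypothetical stand-ins for whatever a company holds.
df = pd.read_csv("wells.csv")

print(df.shape)                    # how much data is there?
print(df.dtypes)                   # what types did we actually get?
print(df.isna().mean().round(2))   # fraction missing per column
print(df.describe())               # ranges, outliers, suspect units

# Numeric-only correlations: unexpected relationships are leads to chase.
print(df.select_dtypes("number").corr().round(2))
```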


Case studies

Mr Irving presented six case studies of Teradata projects, where the company has "taken stuff [data] and thrown it into a big bucket, and found something useful," he said.

The first case study was a project where Mr Irving, together with a Masters student at the University of Manchester, took a whole basin's worth of publicly available data from a New Zealand government website. The aim was to find out how much useful new insight could be gained from the data in just six weeks of work. The basin had 2,500 wells, each with about 12 logs.

There were many words describing the same thing, so the first task was to do text analytics to change the terminology on the data so that the same formation was always described with the same term. "That was a good start, a lot less words were required," he said. Next, the headers from the well logs were replaced by a standard system describing the well location and what kind of well it was. The next step was to analyse the logs themselves, putting all of the data in a single analytics system.

The experiment was to see if it might be possible to find some hot shale rock which had not been previously identified. The researcher found about 24 of them. It was possible to automatically classify what sort of rock each section of the well log referred to, such as interbedded sand and siltstone, or interbedded mud and siltstone. The well logs had previously been classified manually, and many of them had been interpreted wrongly, Mr Irving said.

After this work, the researcher had a much clearer and simpler model of the whole basin. Using this, it became clear that one reservoir was actually a continuation of another reservoir, not a reservoir on its own. If work like this can be done by a PhD geophysicist (Mr Irving) together with an MSc student, it indicates what an oil company could do with a larger experienced team.

The second case study was to look at the relationship between "rate of penetration", "weight on bit" and "borehole calliper" (diameter of the well) for 1,900 wells in the UK North Sea. It generated a picture which is useful to drilling engineers, perhaps to support something they already believe but do not have data to back up, Mr Irving said.

The third case study is from a US unconventionals driller which wanted to reduce the number of unscheduled 'trips' it needed to make (taking the drill bit out of the hole) in order to change the drill bit, while drilling the horizontal section of wells. The rate of drillbit wear is related to both the choice of drillbit and the geology it is drilling through. The drillers were making a visual inspection of the drillbit every time they took it out of the hole, recording its condition with various codes. The analytics work mapped these codes against the drill logging curves. Certain patterns became apparent, for example that some drillbits showed faster wear when being used in a certain way. It was possible to put numbers onto a qualitative assessment, i.e. demonstrate that what people believe is correct. It was possible to draw a 'path' of the various torque settings and limits of ROP (rate of penetration) which will lead to the drill bit wearing out faster than usual. This insight led to an indicator system on the drilling software which would light up when drilling operations were going to wear the drill bit out faster. The light would basically mean, "You're doing this wrong, read the rule book," he said.

The fourth case study was in 4D seismic (a repeated 3D seismic survey to understand how a reservoir is changing). There was a dispute between geophysicists over the reason for an unexpected change in the data. One geophysicist thought it was due to pressure changes, another thought it was because of a velocity effect in water flood. The analytics showed who was right. It also showed the client an effect which was dominant in their reservoir which they had been unsure about, thus helping de-risk field development.

A fifth case study is working with GPS data from the recordings from seismic streamers. This could be used to evaluate the sea state (if the sea is rough, the GPS sensor will move up and down more). Knowledge of the sea state could then be used to help in the seismic interpretation, for example understanding what happened last time a survey was made with a sea state like this, or how the sea state affects the 'ghost' recording (when the seismic waves are bounced up to the sea surface and down again). GPS data is normally only used to check that the streamers are in the right place.

The sixth project was to see if it was possible to detect a collapse of well casing from passive seismic data (a recording of a seismic wave field created naturally, not from a seismic source). This was taken from several thousand sensors on the sea floor, with a fairly high amount of noise in the data. Teradata consultants were told that at some point in the recording there was a failure of casing in a well, but were not given any further information.

The consultants were able to find the casing failure and its location in the passive seismic. They could also see changes to the seismic wave field in the period before the casing failed. They could see that the seismic velocity was changing, which indicated something was happening in the fluid. A system like this could possibly be used to predict a casing which is about to fail, as an early warning system.
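To make the automatic log classification in the first case study concrete, here is a minimal sketch of the general technique, a supervised classifier trained on labelled log samples. The data, labels and model are stand-ins; the report does not say which algorithm was actually used:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-in log measurements (e.g. gamma ray, density, sonic) per depth
# sample; in the real study these would come from the basin's logs.
X = rng.normal(size=(600, 3))
# Stand-in manual labels, e.g. 0 = interbedded sand/siltstone,
# 1 = interbedded mud/siltstone, 2 = hot shale.
y = rng.integers(0, 3, size=600)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
# Cross-validation gives an honest accuracy estimate before trusting the
# classifier to re-label sections previously interpreted by hand.
print(cross_val_score(clf, X, y, cv=5).mean())
```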

Variety of skills

To get insights from data you need a variety of different skills, including statistics / mathematics, science, and computer skills (how to work with the software and program in SQL), Mr Irving said. You need someone who can understand the domain itself (subsurface / offshore operations), to make sure the project has a good scope and will lead to a value proposition. "You have to have someone who's going to present the business impact to someone with a budget."

You may need to work together with other companies. For example, Teradata works together with analytics company Tessella. "I haven't seen a service company that can do more than 2 or 3 of these things particularly well," he said.

Useful techniques

It helps if you can start with the data in the most granular form, with every single measurement with its own time stamp. There is no need to simplify the data to make it easier to store or process, because data storage and computer processing are very cheap.

You need to write down what you are doing as you go along. Mr Irving calls the process of understanding how data moves from one step to the next during analytics 'data lineage'. In some companies, staff change jobs very frequently, and there may be someone else who needs to understand what you did.

You want to get the data into some kind of data system. "Don't play around with it in Excel, because it stays in Excel, you'll never find your way out of Excel," he said.

You need to understand if you are trying to find something which happens as a one-off, or if it is something the company should look out for continuously. Once you have the insight from data science, you then need to 'operationalise' it so the company does it all the time.
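A minimal sketch of both habits at once, moving data out of spreadsheets into a queryable store and recording a lineage note as you go, using Python's built-in sqlite3. The file, table and step names are hypothetical:

```python
import sqlite3
import pandas as pd

# Move the working data out of spreadsheets into a queryable store;
# file and table names are illustrative.
df = pd.read_csv("raw_measurements.csv", parse_dates=["timestamp"])

con = sqlite3.connect("analytics.db")
df.to_sql("measurements", con, if_exists="replace", index=False)

# A minimal 'data lineage' record: write down what was done, when, and
# from which source, so the next person can retrace the steps.
con.execute("""CREATE TABLE IF NOT EXISTS lineage
               (step TEXT, source TEXT, run_at TEXT)""")
con.execute("INSERT INTO lineage VALUES (?, ?, datetime('now'))",
            ("loaded raw CSV, no resampling", "raw_measurements.csv"))
con.commit()
```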

Download Duncan Irving's slides at www.findingpetroleum.com/event/c8621.aspx

Getting seismic data off tape

Managing seismic data would take a big step forward if people used disk drives rather than tape, communicated it electronically rather than by physically transporting disk and tape, or better still, did not move data at all. Alan Smith explained

Companies still manage data in very similar ways to how they did it 20 years ago, and subsurface staff might spend more than half of their time looking for it, or reformatting it, said Alan Smith, data management consultant with Luchelan Ltd.

Dr Smith is a former principal consultant who has worked with Shell and the Brunei Prime Minister's Office, and has held a number of interim management roles including as CIO for E&P with oil company OMV. He was speaking at the Finding Petroleum forum in London on Apr 18, "Transforming Subsurface Science".

In 1991, an article was published in the Oil and Gas Journal, saying that geophysicists and geologists were spending 60 per cent of their time looking for data (or getting it ready to work with), 18 per cent in 'useful work', 5 per cent in 'meetings and presentations', and the rest in training and coffee breaks. Since then, not much has changed. This might be because our systems to manage data also haven't changed very much.


Consider that in the 1990s, data was delivered on big tapes, and lots of data was stored in boxes, or on tape on shelves. People searched for data by typing in codes on a text based screen. Data could be transported using 'Exabyte' tapes, which held 60 to 150 GB, but only had a 6 month life.

In 2015, data is still stored in boxes and tape is still in racks. There are a few higher density tapes available and data is sometimes delivered on disk. Companies still search for data using text based systems, or sometimes with a GIS [geographical information system] front end to help find data. Many companies still rely on people who know where to go to find data.

There are some parts of the world where data can be directly accessed over the internet, for example with Norway's "Diskos" system. "It's the exception rather than the rule," he said.

Data challenges

Managing data is getting more and more complicated. As an example, consider seismic company PGS, which made 60 per cent of its 2015 revenue from selling 'multiclient' data, where the same seismic data is sold to different clients.

Historically the data starts its life on a seismic vessel, which has multiple streamers recording seismic data onto tape. The tape is sent to a processing centre, which puts the data onto a data interpretation system. There are interim products in the seismic interpretation process, which get stored on tape somewhere. "There's lots of tape handling, lots of scope for errors," he said. The final interpreted seismic data ends up getting stored on a shelf somewhere, probably both by PGS and by the oil company customer.

PGS had used the same data management system since the 1990s, provided by an outside company. "Getting data out of that system was quite difficult. It didn't store data in industry standard formats."

"If you want to re-process you have to get the tapes back from store. There are lots of points where things could go wrong." It was really a 'trace handling' system for storing and handling seismic recordings, not a data processing system, he said.

PGS is moving to an advanced data handling system, based around disk drives and electronic data transfers. The latest seismic vessels today have streamers of the order of 8km in length, recording large amounts of data. The data will be stored on a disk drive on the vessel. "The data is generated so fast, you can't actually put it onto tape fast enough," he said. The data may be spooled onto a tape to send it to the processing centre, or it might be transferred by disk. "In theory it doesn't need to touch tape," he said.

Once the seismic processing is finished, the data is stored in a storage system, on a disk drive, without any tape being used. The data can be delivered to oil and gas companies electronically, if they want. There are software tools which allow data transfer at 2 to 5 times faster than File Transfer Protocol (FTP), he said. However many oil and gas companies do not have the capability to receive large files electronically.

Data is only put onto the standard SEG-Y format tapes for long term storage, or it is kept forever on disk. All the tapes are registered into a system with a 'check-out' process, so people can see what is available and where the tapes are. When tapes are checked out, backups are made. The offices have robotic systems which can take tape from shelves and send the data online.

Quality control (QC) is part of the process, making sure that data is correct as it comes in, and making sure it is correct as it leaves the processing centre. This is a big change from the past. "Often in the early days the only time data got quality controlled is when it got delivered to the client. The client had the problem of correcting mistakes, to allow headers to be loaded properly."

PGS works together with Ovation Data to manage the network. It has a network connecting PGS offices in Houston and London, and Ovation Data offices in Houston and London, with 10 Gbps data communication links. Altogether 120 terabytes a day can be moved around, he said. Data can be backed up in multiple locations. "The chances of loss are quite small," he said.

Clients today

Today, PGS' oil company customers are not only interested in 'post stack' seismic (where you group together all of the seismic wave fields which have passed through a single point in the subsurface). Companies are asking for 'pre-stack' as well (the seismic data before this task has been completed).

Typically clients will ask for seismic data covering a slightly different geographical area, and data at varying stages of processing, and PGS will need to be able to provide it, for example prestack data for a specific polygon. This means that the data needs to be cut to the correct co-ordinates. Historically, this meant a lot of manual handling and intervention. Now, it is virtually all automatic.

A few years ago, it could take from days to months for a data request to be delivered, depending on the complexity, and there was lots of room for error, and a lot of staff time involved. Now, if you order post stack data it can be spooled to disk within a few minutes of ordering. "It has far fewer errors, it is significantly cheaper. Customers are very pleased with that type of service," he said.
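The core of cutting data to a polygon is a point-in-polygon test over trace coordinates. A minimal sketch using the shapely geometry library; the polygon, coordinates and trace IDs are invented, and a production system would read them from navigation and trace headers:

```python
from shapely.geometry import Point, Polygon

# Hypothetical request polygon (map coordinates) and trace midpoints.
aoi = Polygon([(0, 0), (1000, 0), (1000, 800), (0, 800)])

traces = [
    {"trace_id": 1, "x": 120.0, "y": 350.0},
    {"trace_id": 2, "x": 1500.0, "y": 200.0},   # outside the polygon
    {"trace_id": 3, "x": 980.0, "y": 790.0},
]

# Keep only the traces whose midpoints fall inside the requested area.
selected = [t for t in traces if aoi.contains(Point(t["x"], t["y"]))]
print([t["trace_id"] for t in selected])   # -> [1, 3]
```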

Leave data where it is

In the future, it would be better if the amount of moving of large data files was reduced. When data is moved between systems and transferred into different formats, information is often lost. It should be possible for computer software to interpret data which is stored somewhere else.


"Large volumes of data can be 'live' on the internet, they don't have to be sitting on your laptop or workstation," he said. Data may need to be converted to a different format for a different computer software package to work with it, but the reformatting can be done in place (as the information is required). "I'm not saying it's simple and straightforward, but it's one option," he said.

This would mean that geoscientists could start work interpreting data immediately after the survey. This would avoid problems we see today, where it can take 3-6 months to process a new seismic survey.

In "4D seismic" surveys, which show how the subsurface is changing over time as a reservoir is depleting, a new survey is done every few months. If it takes 3-6 months to process seismic data, it means that you are acquiring the next survey before anyone has been able to look at the previous survey.

A challenge with working on data stored on the other side of the world is latency (the time to communicate instructions and receive a reply). If you are interpreting data on one side of the Atlantic with the data stored on the other, "it takes time for my mouse to move." But "people are looking for ways to get over that, it's not such a big issue," he said.

Automated interpretation

In future, seismic interpretation will probably become more automated, because there are fewer and fewer people in the industry who are capable of doing it manually, he said. "Analytics will play a far larger part in what we do and say, not just the scientific side of building models." "It will come, there will be luddites who don't like the idea."

However we will still need a few people with a geological understanding. You need domain expertise to be able to tell which of the correlations the computer discovers might be helpful, and which are just chance, he said. "You've got to ground everything in reality at the end of the day, otherwise you end up with statistics," he said.

PGS – working with broadband seismic

Seismic service company PGS is encouraging oil and gas companies to record seismic over a wide range of frequencies, or 'broadband', because it enables them to get a much better understanding of the subsurface

Recording more high frequency seismic makes it possible to understand the subsurface with more accuracy. Recording low frequency seismic can be used to calculate actual reservoir properties, said Cyrille Reiser, reservoir characterisation director with PGS. He was speaking at the Finding Petroleum forum in London on April 18, "Transforming Subsurface Science".

If you look at a real outcrop, you will see it has a very complex structure, with rock layers centimetres thick and with various rock properties. But seismic data can get a maximum resolution of tens of metres, and usually about 25-30m depending on the reservoir depth.

The higher the seismic frequency, the more detail with which you can understand the subsurface, because the distance between peaks of the wavelet becomes shorter.

With interpreted high frequency seismic data, you can start to pick out the exact top and base of the reservoir, he said. You can also get a better understanding of rock properties, for example the sandiness or the porosity of a reservoir.

Broadband seismic data (rich in both the low and high frequencies) helps significantly in making absolute measurements of subsurface properties, such as elastic properties, with less ambiguity and more predictability away from a hard control point.

This enables you to reduce the amount of 'a priori' or previous information required to do a quantitative seismic interpretation. Usually, seismic interpreters use data recorded from well logging tools to help with seismic interpretation. Also, the more you can directly calculate the reservoir properties from the seismic data, the less subjective interpretation is required. Subjective interpretation takes a great deal of geoscience understanding, which is becoming less and less available these days, Mr Reiser said.

Ideally, with broadband data (if acquired and processed adequately) you could get all the information you need to make a decision mainly based on the seismic survey, he said. PGS believes that broadband seismic (and more specifically the multi-component, dual-sensor streamer) can provide more insight into the subsurface without leaning too heavily on whatever 'a priori' information a company might have, and so increase its probability of success.

The data workflow is usually to develop a simple 'low frequency' model of the subsurface (a simple model based on low frequency seismic), and use this as a basis for the pre-stack seismic inversion and for understanding the lithology (rock structure) and the fluids which might be contained in it.

More data

High bandwidth seismic means recording much more data, he said. You now have two sensors (a hydrophone and a vertical particle velocity sensor) recording the up-going and down-going wave field, where previously there was just one (recording the combination of the two).

The data is recorded with many more sensors, located on longer streamers, and you have more streamers (more than 12 streamers). The recording sampling rate can be 2ms instead of 4ms.

Altogether, you can be recording twenty times more data than 10 years ago, he said.

The seismic processing is much more complex, with different products generated at different stages of the processing.

Notches

Another problem PGS has solved is the "notches" which are found in seismic recordings at specific frequencies, perhaps every 50 Hz depending on the streamer and source depth. This means you get zero amplitude response at these frequencies, due to the seismic wave bouncing up to the sea surface and back down to the streamer.

PGS has come up with one solution to this, a dual-sensor or multi-component streamer (a hydrophone and a vertical particle velocity sensor) which allows the separation of the up-going and down-going wavefields. The vertical particle velocity sensor is combined together with the usual hydrophone for seismic recording. With a little processing, you can use this to separate the seismic waves which are reflected from the subsurface from all the other recording, he said.
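The notch spacing follows from the extra two-way travel between the streamer and the sea surface: at vertical incidence the notches sit at multiples of v/2d. A minimal calculation, assuming an illustrative 15 m streamer depth:

```python
# Ghost notch frequencies for a flat sea at vertical incidence:
# f_n = n * v / (2 * d), where v is the speed of sound in water
# and d is the streamer (or source) depth. Values are illustrative.
v = 1500.0   # m/s, typical speed of sound in sea water
d = 15.0     # m, assumed streamer depth

notches = [n * v / (2 * d) for n in range(1, 4)]
print(notches)   # -> [50.0, 100.0, 150.0] Hz, i.e. a notch every 50 Hz
```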

North Sea example

Mr Reiser presented an example of how broadband seismic was used to try to de-risk a possible reservoir prospect off the West of Shetland Islands, UK. The information most important to geoscientists is the size of the reservoir, its porosity, and the 'net to gross' (how much of the reservoir actually contains oil).

In the seismic interpretation, it was possible to compare the P wave velocity with the S wave velocity (Vp/Vs) for different areas of the prospect. The Vp/Vs ratio gives a good indication of the lithology and fluid, and from your understanding of lithology you can assess porosity. The low frequency part of the data (0-2 Hz) can be used to build an acoustic / shear impedance low frequency model. The absolute properties can be used to make some "simple cross plots", which will show, for example, which of the acoustic impedance and Vp/Vs calculations are for shale and which are for sand. The lowest acoustic impedance and Vp/Vs is probably hydrocarbons. "It's a very efficient way to do it," he said.

Using methods like this, geoscientists can get useful data as quickly as possible from 3D seismic, without introducing too much interpretation or, more importantly, 'a priori' or preconceived ideas into their decision making process.

Mr Reiser also showed the workflow being used on a tertiary reservoir in the Forties Sand (North Sea). One well had already been drilled in the area, and did not find the hydrocarbons which had been expected, an outcome the broadband pre-stack seismic inversion would have predicted. PGS' seismic processing "predicted very nicely the presence of hydrocarbons" in other locations, he said, and it also did not show any hydrocarbons in the reservoir which the company had previously drilled. "Based on this type of analysis you probably would think twice about drilling this one, you get a flag, there might be a problem." Once you have all the results, you can use the well data for final refinements, he said.
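A sketch of the kind of simple cross plot described, using synthetic (invented) inversion results; a real plot would use the actual acoustic impedance and Vp/Vs volumes:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# Synthetic points for illustration only: shales tend to plot at higher
# acoustic impedance (AI) and Vp/Vs than hydrocarbon-bearing sands.
ai_shale = rng.normal(7.5, 0.4, 200)
vpvs_shale = rng.normal(2.2, 0.1, 200)
ai_sand = rng.normal(6.0, 0.4, 200)
vpvs_sand = rng.normal(1.7, 0.1, 200)

plt.scatter(ai_shale, vpvs_shale, s=8, label="shale")
plt.scatter(ai_sand, vpvs_sand, s=8, label="hydrocarbon sand")
plt.xlabel("Acoustic impedance")
plt.ylabel("Vp/Vs")
plt.legend()
plt.show()   # the lowest-AI, lowest-Vp/Vs cluster is the likely pay
```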


Andrew Zolnai – how people work with information

Consultant Andrew Zolnai is offering a service to map how people in your company actually work with information, and how the information flows, which he calls the 'information supply chain'

Consultant Andrew Zolnai is offering a service to help companies understand the processes of how they actually work with information, which he calls the 'information supply chain'.

The concept of an 'information supply chain' has been used in many industries. You can look at the systems which support this information supply chain, and what the people do, and map all the complexity of how the company gets from input to output.

"Work" in the oil and gas industry basically comes down to creating information, acting on that information, and in doing so tying in people and processes, he said. This builds up to form a workflow of our industry's complex processes.

For example, consider a company deciding whether to drill in a certain region. The work process might start with a technician searching all the data the company has about that region, and putting it together in a 'master map'. Then a geologist starts doing a play fairway analysis, to see if there might be the components which lead to an oil reservoir (a source rock, charge, reservoir and seal). At the end, the exploration manager has various prospects that make up the portfolio, which the oil company takes to potential partners or permit submissions.

The oil and gas industry is often not very good at understanding what its processes really are. As an example, consider that oil companies might spend a great deal of time choosing and implementing a software system to manage their procurement, but then no time at all working out how the different suppliers will work together. This can lead to poor communication systems and duplication of work, he said.

And the work oil and gas people do has been getting more and more complicated over the past few years, and there is less time and money to do it. "There are ways that can help us organise ourselves," he said.

Mr Zolnai’s service involves visiting the company, and sitting down with people who actually do the work, to try to find out exactly what they do. Mr Zolnai maps the processes using online software made by a company in Wellington, New Zealand, called LINQ Ltd. (www.linq.it). Once you have a map of the process, it becomes easier to see what bottlenecks might be, where the whole process is held up by one time consuming task. You can also make better predictions of how long it might take to do a complex task, and see if it might be possible to do it in time for an upcoming licensing round.

This also means that the company can see which parts of its overall process need technology investment to get the work done better. It is much better to implement technology to satisfy a known business need, rather than buying technology and working out how to use it afterwards, he said. This can also help companies make sure their overall 'digital asset' works properly, he said.

Mr Zolnai did a project for Italian oil and gas company AGIP KCO, which wanted to better understand its ‘digital asset’ for the Kashagan Field, offshore Kazakhstan. It turned out to be a “very complex data flow, with a large portfolio of systems”, he said.

Mr Zolnai did a similar task for Aera Energy in the US, helping the company understand data flows. In a six month project, his team spent the first half of the time doing interviews with staff members, to make sure they really understood the processes. Understanding how people work with data can be more important than the technology itself, he said. It is important that people feel that the person trying to map out the processes is on the same ‘side’ as the employees, he said.

Download Andrew’s slides and view his talk on video at

www.findingpetroleum.com/event/c8621.aspx


OAG Analytics – workflows for core planning functions

OAG Analytics of Austin, TX has developed a number of 'workflows' for oil companies that can help them achieve a specific objective from their data, for example valuing an area of interest or optimizing the completion design for a new well.

OAG Analytics of Austin, Texas, has developed analytics workflows to extract meaning from complex data, which can help companies improve core planning functions like optimizing proppant in an unconventional well and rapidly understanding the value of their acreage.

In less than one hour, OAG's Insights Workflow helped a firm identify an opportunity to save $400,000 per well by using less proppant to achieve virtually the same level of production, said Luther Birdzell, founder and CEO of OAG Analytics. This insight alone would produce a material return on investment in OAG's software for this customer. He was speaking at the Finding Petroleum forum in London on April 18, "Transforming Subsurface Science".

The company has also developed workflows which can help companies value acreage in 4 hours, where it previously took 5 days.

It can take as little as an hour for an oil company to get value from the software, analysing its own data in combination with public data and subscription data, he said. OAG Analytics is helping companies do automated data management, particularly of regularly updated data such as production logs and pressure data.

Mr Birdzell has a background in electrical engineering, and previously worked in the software industry, helping companies optimise corporate IT.

He started Oil and Gas Analytics in 2014. OAG started building a customer base with onshore North America and has expanded to international onshore and offshore.

Management, analysis and activation

To get value from analytics, you need good data management, data analysis, and 'analysis activation' – doing something with the results of your analysis, he said.


Your data management strategy must include providing access to the data, managing the quality of the data, and data storage. To scale, firms usually need more automated data quality control than they typically have, he said.
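A minimal sketch of what automated data quality control can look like, with hypothetical column names and rules; a real rule set would be far richer:

```python
import pandas as pd

def qc_report(df: pd.DataFrame) -> dict:
    """Illustrative automated checks; not OAG's actual rules."""
    return {
        "rows": len(df),
        "duplicate_well_ids": int(df["well_id"].duplicated().sum()),
        "missing_by_column": df.isna().sum().to_dict(),
        # Domain range checks catch unit mix-ups and typos early.
        "negative_production": int((df["oil_bbl_month1"] < 0).sum()),
        "implausible_lateral_ft":
            int((~df["lateral_ft"].between(500, 20000)).sum()),
    }

wells = pd.DataFrame({
    "well_id": ["W1", "W2", "W2"],
    "lateral_ft": [7500, 250000, 9100],      # 250,000 ft is implausible
    "oil_bbl_month1": [12000, -5, 9800],
})
print(qc_report(wells))
```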

Right-sized data analysis means analytic techniques with the minimum level of 'complexity' required to extract reliable signals from the data. If you just have a big cloud of complex raw data, with weak correlations between different aspects, it is very difficult to use it to reduce costs and increase profitability, he said. The analysis then needs to be used together with appropriate software, so it can deliver value to the company's decision makers.

High and low consequence

It may be useful to note that much of the risk management in the oil and gas industry involves relatively low volumes of high consequence decisions, he said. In this, it is similar to mining, medical, pharmaceutical, and some areas of finance. But it is different from many of the industries that have led the acceleration of big data and advanced analytics, such as web search engines, social networks, and online book and video stores (Netflix, Amazon). They typically have the opposite, a large volume of low consequence decisions, which often lends itself more readily to statistics-centric analysis.

Statistics based techniques are often easier to operationalize in industries with a high volume of low consequence decisions, he said. However, make no mistake, there is a huge amount of untapped value in the data that most oil companies already have, but most firms need help to unlock it.

Higher consequence decisions require a different approach to advanced analytics, most notably more collaboration with domain experts. Effective collaboration requires more transparency in the analysis so that domain experts can validate that the results make sense.

Workflows

The company has an "Insights Workflow" aimed at finding ways to optimise your project, by maximising net present value (NPV), internal rate of return (IRR), or production. It starts by gathering together all the various raw data sets in an oil company, including subsurface (G+G) data, drilling data, completion data, well location data, other well data, production data, and financial data.

It has a workflow for unconventional wells, aimed at helping companies make better predictions of production from a given well. There is frequently wide variation in production levels from different onshore oil wells with similar designs and costs.

This links to the decision of how much proppant to use. For fractured wells, in general, the more proppant you use, the more production you will have. But many of the high proppant unconventional wells in North America are among the most volatile on both a production and a profitability basis.

The data very clearly shows that some high proppant wells, aka "Super Fracs", have been goldmines, while others have been disasters, he said. This is embarrassing at $100 oil; it can be crippling in a lower price market.
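The economics behind the proppant insight can be sketched with a diminishing-returns curve: past some loading, the extra sand buys barrels that no longer pay for themselves, so the value-optimal design sits below the production-maximising one. Every number below is invented for illustration and none comes from OAG:

```python
import numpy as np

# Illustrative saturating model of cumulative production vs proppant.
proppant = np.linspace(1e6, 8e6, 50)                   # lb per well
production = 250_000 * (1 - np.exp(-proppant / 3e6))   # bbl

oil_price = 40.0        # $/bbl, assumed
proppant_cost = 0.50    # $/lb pumped, assumed all-in

# The optimum is where marginal barrels stop paying for marginal sand,
# not at maximum production.
net_value = production * oil_price - proppant * proppant_cost
best = proppant[np.argmax(net_value)]
print(f"value-optimal proppant: {best:,.0f} lb")
```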

Purpose-built workflows

Transforming risk management with big data and advanced analytics is extremely complex. Very few firms have achieved material success with this initiative using traditional commercial off-the-shelf (COTS) software, largely due to the complexity of most COTS packages.

Evolving how decisions are made with data-driven insights requires three things: data management; analysis; and enabling decision makers to test "what if" scenarios pre-transaction or pre-drill. Each "module" that is implemented with commercial software is a high-risk IT project. Very few oil and gas firms have successfully completed three parallel high risk IT projects; even fewer have created the inter-module synergies required to create a cohesive data-driven workflow.

Occam’s Razor [“Entities should not be multiplied unnecessarily”] guides us to simplify each aspect of a complex problem, uniquely achievable in this case with custom data management, analysis, and activation software modules. Some firms may choose to develop these capabilities in-house. Others will partner with firms that have already developed proven solutions. Your chance of success increases dramatically if you only include the capabilities you need to accomplish a specific task, Mr Birdzell said.

For example, OAG Analytics has built tools to automatically detect well identifiers in data, something which is required for nearly every data set. “Taking that purpose built approach has enabled us to reduce huge amounts of complexity.”
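Well identifier detection of this kind usually reduces to pattern matching over free-form fields. A hypothetical sketch for US API-style well numbers; the regex is a simplification of the convention, not OAG's implementation:

```python
import re

# US onshore wells carry API numbers (10-14 digits, often dashed);
# this pattern is a simplified illustration of that convention.
API_RE = re.compile(r"\b\d{2}-?\d{3}-?\d{5}(?:-?\d{2}){0,2}\b")

def find_well_ids(text: str) -> list[str]:
    """Return API-style well identifiers found in a free-form field."""
    return API_RE.findall(text)

row = "Frac report for 42-123-45678 (offset well 42-123-45679-00)"
print(find_well_ids(row))   # -> ['42-123-45678', '42-123-45679-00']
```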

Purpose-built workflows can lead to lower cost, faster delivery time, and also reduced staff training time, because the software becomes much more intuitive, he said. By comparison, commercial software typically supports every feature for every customer it has ever had. Virtually no single firm needs all of those capabilities.

How to make it work

Before you start a project, you should define your objective, he said. You should then develop a data strategy and then improve it with further iterations. Pilot projects can be useful for testing methods to solve high value problems.

The greatest data-driven value creation we have seen in oil and gas resulted from collaboration among executives, geoscientists, petroleum engineers, big data strategists, data scientists, and big data software engineers, since virtually no one is an expert in all of these requisite disciplines.

It is useful to note where you have the most data. "If we get outside the boundaries of the data, we're extrapolating. The predictions will be nowhere near as reliable as predictions in the data rich area."

Your data strategy must be able to continually evolve, because the type of data the industry works with is always changing and volumes are growing rapidly. Many advanced data projects are stifled because companies want to keep using their old ('legacy') technology, he said.

Machine learning can often uniquely isolate the effects of individual independent variables (stage spacing, proppant, fluid, location, etc.) on return on investment (ROI), i.e. quantifying the impact that the various independent variables have on oil and gas production.

Scalable machine learning solutions should be able to build and test many different algorithms, and then measure which of these produce the models that best represent real world system behaviour.
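A minimal sketch of building and testing several algorithms, using scikit-learn cross-validation on invented stand-in data; the point is the comparison loop, not these particular model families:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

rng = np.random.default_rng(0)

# Stand-in completion data (columns might be stage spacing, proppant,
# fluid volume); the target stands in for first-year production.
X = rng.normal(size=(400, 3))
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.3, size=400)

# Build several candidates and measure which generalises best, rather
# than committing to one model family up front.
for model in (LinearRegression(),
              RandomForestRegressor(n_estimators=200, random_state=0),
              GradientBoostingRegressor(random_state=0)):
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(type(model).__name__, round(score, 3))
```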

Be wary of ‘overfiltering’, i.e. reducing large and complex data sets to small samples with stronger correlations, as it often contributes to spurious conclusions.

One useful data visualisation technique is a graph showing the impact different parameters have on production. You can also visualise data with a 'chord diagram', showing the magnitude of relationships among different parameters with lines connecting them; thicker lines indicate a stronger relationship between parameters.

As oil and gas data gets increasingly complex, we are seeing more data sets in which "everything in the data set is related to everything else," he said. "Those characteristics have additional complexity that is frequently beyond the reach of traditional analysis techniques like XY plots. Machine learning can be fantastic for navigating this intricate web of complexity. It can isolate the effect of these individual parameters on what we're trying to understand."

View Luther's talk on video at

www.findingpetroleum.com/event/c8621.aspx



List of attendees

'Transforming Subsurface Science', The Geological Society, London, April 18, 2016

Allan Induni, Geoscientist

Ashleigh Hewitt, Director/Consultant Geophysicist, A.S.H. Exploration Services Ltd.

Christian Bukovics, Partner, Adamant Ventures Jonathan Gaylor, Affinity

Alexander Estrin, AL Capital Management Feargal Murphy, ARKeX

Stephen Rippington, Director and Consultant, Astute Geoscience Ltd. Christian Richards, Sales Consultant, Austin Bridgeporth

Andrew Harber, Consultant Acquisition Geophysicist, Barnes Geophysical Services Ltd. John Boucher, Director, Beagle Geoscience Joe M Boztas, Director/Interpreter, Boz Seismic Services Ken Agu, Executive Director, Bresson Energy Limited

Bryn Austin, Director & Geological/ Geophysical Consultant, Brynterpretation Ltd.

Jim House, Director, GeoSeis Ltd Faouzi Khene, GXT

Waclaw Jakubowicz, Managing Director, Hampton Data Services Norman Hempstead, Director, Hempstead Geophysical Svcs

Tim Gibbons, Managing Director, Hoolock Consulting

Kevin Phillips, Basin Research Geologist, IHS Mohamed khiar, Geoscientist, IHS Global

Peter Dolan, Chairman, Ikon Science Limited

Keely Harris, Asset Manager, Impact Oil & Gas Chris Godsave, Technical Support Manager, Impact Oil and Gas Neal Jones, Business Development, Independent Consultant Neil Dyer, Independent

Jeremy Trenchard, Consulting manager, Cegal

Abimbola Durogbitan, Principal Geoscientist Consultant, Independent Consultant

Siebe Breed, Structural Geologist, CGG NPA Satellite Mapping

Ben Dewhirst, Geologist, Independent Resources PLC

Adam Thomas, Senior Consultant, CGG John Glass, Consultant Geologist, Cloverfield Consulting Ltd. John Price, Exploration Advisor, Clymenia Consulting

Roger Doery, Consultant, Consultant

Diwin Amarasinghe, Geophysical Specialist, Consultant Micky Allen, Consultant Tim Willans, Consultant

Dan Kunkle, Director, Count Geophysics Maria Mackey, Energy Sector, Cray UK

Robert Ward, Advisor, Decision Frameworks Nigel Flood, Multiclient Sales Manager, Med,N/NW Africa, Dolphin Geophysical Howard Davies, Business Dev Manager, DownUnder GeoSolutions

Martin Riddle, Technical Manager, Envoi Kate Overy, Senior Geoscientist, ERC Equipoise Ltd.

Jonathan Moore, Product Manager, Evaluate Energy Ltd.

Karl Jeffery, Editor, Finding Petroleum Richard McIntyre, Sales Manager, Finding Petroleum

Avinga Pallangyo, Conference Coordinator, Finding Petroleum


Glenn Mansfield, Director, Flare Solutions Limited

Salar Golestanian, Managing Director, Finity Asset

Manouchehr Takin, Independent Consultant John Griffith, Upstream Advisor, JJG Consulting International Ltd

Ewan Rossiter, Exploration Geologist, JX Nippon Exploration and Production (U.K.) Limited Jonathan Bedford, Director, JXT Ravi Chandran, Director, Kalki Consultants Limited

Peter Allen, Consultant, Layla Resources Genni Wetherell, Junior Geologist, LGO Energy PLC

Neville Hall, Director, Llahven Ltd.

Ewan Whyte, Business Development Manager, LR Senergy Alan Smith, Director, Luchelan Limited

Amrit Brar, Marketing and Sales Manager, Lynx Information Systems

Martin Hodge, Data Processing, Time Imaging & Data Management Geophysicist, MoveOut Data Seismic Services David Weeks, Geoscientist, Neftex Rachel Zaborski, Neftex

David Buddery, Consultant, Neoseismic Ltd

Samuel Vye, Business Development, NextGeo Chris Kennett, NextGeo Mohamed Al Buraiki

Luther Birdzell, CEO, OAG Analytics

Mark Robinson, Managing Director/ Geoscientist, Oil and Gas Consultancy

Matthew Thompson, Technical Services Team Leader & Geoscientist, OPC Ltd Robert Parker, Consultant, Parker

Sebastien Duc, Exploration Manager, Middle East & North Africa, Perenco

Howard Dewhirst, Managing Director, Petroalbion P/L

Carlo Buscaglia, Subsurface Team Lead, Petrofac

Cyrille Reiser, Reservoir Characterisation Director, PGS

Mike James, Business Development Manager, Marine Contract, Africa, PGS Denis Potapov, SCRM Manager, PGS Robert Holden, PGS


Steve Pitman, VP Corporate Marketing, PGS Daniel Buckingham, Broker, Pronto Business Funding

Aisha Sanda, PVE Consulting

Alastair Bee, Partner, Richmond Energy Partners Jerome Foreman, Principal Geoscientist, Sasol Petroleum

David Webber, Seismic Operations Supervisor, Sceptre Oil & Gas Miles Dyton, Schlumberger

Tom Martin, Director, Shikra Consulting

Naeem Datardina, Senior Software Engineer, Silixa Ltd. Patrice Carbonneau, Head of Software, Silixa Ltd.

Alexander Chalke, Business Development Director, Simpson Booth Charles Gassier, Consultant, Strategic Fit Glen Burridge, StrategicFit

Duncan Irving, EMEA Oil and Gas Practice Lead, Teradata

Peter Roberts, Business Development Manager, Tessella Nigel Quinton, Director of Exploration, Tower Resources plc Mick Michaelides, General Manager, Troika International Mike Branston, Senior Geophysicist, WesternGeco Andrew Zolnai, Owner, zolnai.ca


What did you enjoy most about the event?

Everything... timely manner

Timely and interesting look at a subject that is often not given enough attention. Peter Dolan (Ikon Science limited)

Lectures in the first session & networking

Talking to other attendees.

It was a good opportunity to network. Stephen Rippington (Astute Geoscience)

Networking with various people from different horizons.


Learning LEARNING. Outstanding info, and the networking. Andrew Zolnai (zolnai.ca)

Very good presentations from Wally and Duncan.

The subject in my view was a very important one and I have a strong interest in the science of sound technology which is underrated by many. I like what you do. (Pronto Business Funding)

Discussions by audience and panellists. Manouchehr Takin (Independent Consultant)

