
Page 1: Oil IT Journal, Volume 22 Number 6, 234th issue, 2017. IVAAP microservices.

Houston-based INT, purveyor of software ‘widgets’ that are widely used by major upstream software vendors and by in-house oil and gas company developers, has announced Ivaap*, a new microservices-based back end for its geoscience and engineering data visualization and analysis solutions. The Ivaap framework is designed to connect to multiple data streams such as a Witsml server. Developers can aggregate, analyze and display well and other data from multiple sources, including INT’s own GeoServer. Ivaap is cloud-enabled and ‘network-secure’ for remote collaboration.

Microservices technology, which favors lightweight, componentized software over monolithic applications, is cloud-friendly and scalable for concurrent processing, performance and high availability. INT’s compact widgets are well suited to the microservices-style architecture.

INT CEO Olivier Lhemann told Oil IT Journal, ‘Ivaap is organized as a suite of services that are easy to discover and use. Ivaap leverages Hateoas, which enforces the inclusion of hyperlinks within its web responses. The structure of a Witsml data source, a LAS file or a SQL database can easily be explored by following the links returned by a request.’

‘This means that rather than enabling interoperability through the definition of new standards, microservices offer tools that other developers or partners can use to solve their problem.’

‘We use the Akka framework to package our microservices so that they are highly concurrent, secure and resilient. Our customers can use other technologies to augment the functionality of the platform and integrate with their own services.’
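The link-following Lhemann describes can be sketched in a few lines. The sketch below is hypothetical: the endpoint names, the ‘links’ array shape and the in-memory stand-in for HTTP GET are all invented for illustration and are not Ivaap’s actual API.

```python
# Hypothetical HATEOAS-style responses: each resource lists hyperlinks
# to related resources, so a client needs no hard-coded route map.
RESPONSES = {
    "/": {"name": "root", "links": [{"rel": "wells", "href": "/wells"}]},
    "/wells": {"name": "wells", "links": [{"rel": "well", "href": "/wells/A-1"},
                                          {"rel": "well", "href": "/wells/A-2"}]},
    "/wells/A-1": {"name": "A-1", "links": []},
    "/wells/A-2": {"name": "A-2", "links": []},
}

def get(href):
    """Stand-in for an HTTP GET that returns a parsed JSON body."""
    return RESPONSES[href]

def discover(href="/"):
    """Explore the whole API by following returned links only."""
    found = [href]
    for link in get(href)["links"]:
        found.extend(discover(link["href"]))
    return found

print(discover())  # ['/', '/wells', '/wells/A-1', '/wells/A-2']
```

The same loop would walk a Witsml store or a SQL catalog if the server returned links to logs, trajectories or tables; the client code does not change, which is the interoperability argument.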

Another early adopter of microservices is GE, whose Predix leverages a suite of ‘discoverable’ microservices. While cloud-deployed microservices are clearly of interest to IT, the big question for the business is whether they will really enhance interoperability between software components from different vendors. We asked Lhemann if Ivaap would ease interoperability with, say, GE’s Predix. He replied, ‘We have limited experience with Predix but, as it is based on Pivotal’s Cloud Foundry, Predix applications deployed to the cloud will access external resources via the Open Service Broker API, launched last December. Eventually, Open Service Broker could be a good bridge between Ivaap and Predix. We are not quite there yet, but this seems like a good way to proceed in the future.’
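The Open Service Broker API Lhemann mentions is a small REST contract: a platform such as Cloud Foundry asks a broker for its catalog (GET /v2/catalog), then provisions and binds the services it finds there. The sketch below shows only the discovery side; the ‘ivaap-witsml’ service, its plan and the IDs are invented examples, not a real broker.

```python
# A catalog in the shape a broker returns from GET /v2/catalog.
# Service name, plan and IDs below are hypothetical.
CATALOG = {
    "services": [{
        "id": "6ad9cdd7-1111-4a65-9d09-000000000001",
        "name": "ivaap-witsml",
        "description": "Well data access via a Witsml back end",
        "bindable": True,
        "plans": [{
            "id": "6ad9cdd7-1111-4a65-9d09-000000000002",
            "name": "standard",
            "description": "Shared back end",
        }],
    }]
}

def discoverable_services(catalog):
    """What a platform would offer its users from this broker."""
    return [(s["name"], [p["name"] for p in s["plans"]])
            for s in catalog["services"]]

print(discoverable_services(CATALOG))  # [('ivaap-witsml', ['standard'])]
```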

More from INT.

* INT Visualization and Analytics Platform.

In this issue

Highlights: GBC IIoT in O&G, Ikon interview, More from PNEC, Blockchain news, Pipeline software, VR revisited, MS’ Red Carpet, ThinAnywhere back.

Editorial 2
Ikon interview, Getting data right 3
Cgdms, Divestco, Statoil, Roxar, 3ESI, CMG, Baker Hughes, IHS Markit 4
Software, hardware short takes, Consortium corner 5
GBC IIoT/digital solutions in Oil & Gas 6-7
People, Done deals 8
More from PNEC, Blockchain news, Siemens, BP 9
Sales, partnerships, Standards stuff 10
Pipeline news, IDS, Jaspersoft, Open Inventor, Hololens 11
Mitsubishi, OSIsoft, Microsoft, Quorum, ThinAnywhere, Implico, Actenium 12

Retired C-level execs (including former ConocoPhillips chairman Archie Dunham) and leading VCs back Cerebra AI’s push into oil & gas.

INT rolls out a microservices-based back end. CEO Olivier Lhemann reveals the technology under the hood and the potential for interoperability with other frameworks such as GE’s Predix.

Flutura & JAG

Houston-based Jerry Allen Group (JAG Resources) has partnered with AI-focused startup Flutura to ‘accelerate the revolution’ of artificial intelligence and the industrial internet in oil and gas. The partnership focuses on the use of Flutura’s flagship Cerebra IIoT platform.

JAG Resources is a group of retired energy sector leaders who chose to remain active by mentoring start-up companies bringing ‘innovation, disruptive and transformative technologies’ to the energy sector. JAG advisor Archie Dunham (retired ConocoPhillips chair) said, ‘Competitive advantage in the energy marketplace will go to forward-thinking players who invest in artificial intelligence capabilities like Cerebra. In oil and gas, the risk of digital inaction is greater than the risk of failure.’

Cerebra runs advanced diagnostic algorithms for equipment health ‘episode’ detection and provides OEMs with the ability to scale value-added offerings across equipment classes. Flutura is backed by ‘leading’ venture capital firms Vertex Ventures of Singapore, Lumis Partners of Connecticut and Silicon Valley’s The Hive. More from Flutura and JAG Resources.

Next in Oil IT Journal: EAGE 2017, Paris.


Page 2

Oil Information Technology Journal 2017 http://www.oilIT.com

© 2017 The Data Room

An announcement from MIT researchers on breakthrough computing using ‘photonics’ highlights the potential for analog devices in artificial intelligence. Editor Neil McNaughton recalls earlier work using light to ‘image’ seismics. Unfortunately no longer a ‘politically correct’ use case for MIT!

Computing with light!

Early on in my career I was a young geophysicist in the head offices of a major EU oil company. One day there was a commotion in a room near mine and soon the ‘next big thing’ was unveiled, an optical bench for analyzing seismic data. This remarkable tool shone a laser beam through a 35mm slide (remember them?) of a seismic line. The light then passed through a lens which focused the beam to a point. No surprises there. But what was mind-blowing (to me at least) was the fact that the information, the pattern of light at the focal point, represented a Fourier transform of the seismic image. For those of you who have not come across Joseph Fourier’s chef d’oeuvre, a Fourier transform splits information into its frequency components. A spectrum analyzer if you like. This laser optical bench split seismic into its spatial (rather than temporal) frequencies.

This seemed to me rather fanciful until the machine’s champion began sliding optical gratings and wedges into the device at the focal point, demonstrating just how powerful a filtering device this was. You could remove any directional component in the slide and bring out features ‘hidden in the data.’ It was so powerful that it may have brought out some features that were not there at all.
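The optical bench’s trick, transform, mask at the focal plane, transform back, is easy to mimic digitally in one dimension. Below is a stdlib-only sketch: a discrete Fourier transform splits a signal into frequency components, a ‘grating’ zeroes the unwanted band, and the inverse transform reconstructs the filtered signal. The signal itself is synthetic, chosen only to make the filtering visible.

```python
import math, cmath

def dft(x):
    """Discrete Fourier transform: samples -> complex frequency bins."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * f * t / n)
                for t in range(n)) for f in range(n)]

def idft(spec):
    """Inverse transform back to the time (or space) domain."""
    n = len(spec)
    return [sum(spec[f] * cmath.exp(2j * cmath.pi * f * t / n)
                for f in range(n)).real / n for t in range(n)]

n = 64
# a slow 'geological' component (2 cycles) plus fast 'noise' (20 cycles)
signal = [math.cos(2 * math.pi * 2 * t / n) +
          0.5 * math.cos(2 * math.pi * 20 * t / n) for t in range(n)]

spec = dft(signal)
# the grating at the focal point: keep only bins below 10 cycles
# (and their mirror-image negative frequencies at the top of the spectrum)
masked = [c if (f <= 10 or f >= n - 10) else 0 for f, c in enumerate(spec)]
cleaned = idft(masked)  # the 20-cycle component is gone
```

The bench did exactly this, but in two dimensions and at the speed of light, which is also why masking too aggressively can ‘bring out features that were not there at all.’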

I later realized that this was not exactly the ‘next big thing’ but the last, having been developed a decade earlier by United Geophysical and sold as the LaserScan. In the 1960s (before my time!) this device was of interest to seismic processors, even though digital processing was already well established. Digital geophysics was invented a decade earlier (yes, in the 1950s) at MIT’s Geophysical analysis group, MIT-GAG. But the laser/analog device was capable of instant processing at a higher resolution than would have been practical with the digital technology of the time. Some examples of LaserScan are given in Ernie Cook’s 1965 paper on geophysical operations in the North Sea and another by John Fitton and Milton Dobrin in the October 1967 Geophysics.

My next encounter with non-digital, analog devices does not have anything to do with this editorial, but it was so clever, and I doubt that I’ll ever have a better opportunity to talk about it, so here goes. In the mid 1970s GPS did exist but it was not very good. In fact, although it was widely adopted, the first sonar-doppler aided marine GPS systems were a step back from radio navigation, of which there were many competing systems. One of these (unfortunately I can’t remember what it was called and can’t find any references) used an analog delay line and a radar-type chirp that was broadcast over the air. The signal was also sent, as an acoustic surface wave, across a solid-state device. The distance travelled across the device (at the speed of sound) was selected so that it took about the same time as the radio waves travelling to shore-based beacons (at the speed of light). By cross-correlating the returning radio signal with the output of the delay line, the time of travel (and hence the distance from) the shore-side beacon could be measured very accurately. At least that was the idea. If my memory serves me well, the navigation service was not in operation for very long as the GPS brigade got their act together soon after the system was introduced and the rest is history.
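The measurement principle, slide one signal past the other and look for the correlation peak, survives unchanged in digital form. A stdlib-only sketch with an invented chirp; the 37-sample delay stands in for the radio travel time to the beacon:

```python
import math

def best_lag(reference, received):
    """Cross-correlate and return the delay (in samples) of the best match."""
    top, top_score = 0, float("-inf")
    for lag in range(len(received) - len(reference) + 1):
        score = sum(r * received[lag + i] for i, r in enumerate(reference))
        if score > top_score:
            top, top_score = lag, score
    return top

# a chirp (rising frequency) has a sharply peaked autocorrelation,
# which is why radar and this navigation system used one
chirp = [math.sin(2 * math.pi * (0.01 + 0.002 * t) * t) for t in range(100)]

# the received signal: the same chirp, arriving 37 samples late
received = [0.0] * 37 + chirp + [0.0] * 63

print(best_lag(chirp, received))  # -> 37, the measured travel time
```

Multiply the lag by the sample interval and the propagation speed and you have the distance to the beacon, which is what the analog delay line computed without a single digital sample.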

You may be wondering what the point of all this is in today’s age of digital ‘big’ data. Well, a recent paper in Nature Photonics, ‘Deep learning with coherent nanophotonic circuits’ by Yichen Shen et al. from MIT, describes the use of an optical, analog computer to perform artificial neural network (ANN) ‘deep learning.’ Seemingly, today’s computing hardware, despite ‘significant efforts,’ is ‘inefficient at implementing neural networks,’ just as the digital computers of the 1960s weren’t up to some geophysical processing tasks. And the solution may again be computing with light.

As an aside, this kind of photonics is not to be confused with quantum computing, which is also touted as a solution for ANN. Quantum computing is not, as far as I know, yet feasible. MIT’s ‘photonics’ optical computer just uses regular light, no quanta, not even digital pulses.

MIT’s optical ANN promises an enhancement in computational speed and power efficiency over state-of-the-art electronics. The researchers have demonstrated the concept with a programmable nanophotonic processor featuring a cascaded array of 56 programmable Mach-Zehnder interferometers. The system was trialed on a ‘typical’ ANN-style problem, speech recognition, where it performed reasonably well, scoring 77% accuracy.

Commenting on the breakthrough, Shen said that the architecture could perform ANN calculations much faster and using less than one-thousandth as much energy per operation as conventional electronic chips. Energy consumption, by the way, is a big issue in high performance computing. ‘Light has a natural advantage in doing matrix multiplication; dense matrix operations are the most power-hungry and time-consuming part of AI algorithms.’
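The claim about dense matrix operations is easy to make concrete: one layer of an ANN is essentially y = f(Wx + b), and the W·x product swamps everything else. A toy sketch, with arbitrary layer sizes:

```python
def dense_layer(W, x, b):
    """One ANN layer: matrix-vector multiply, bias add, ReLU."""
    pre = [sum(w * xi for w, xi in zip(row, x)) + bi
           for row, bi in zip(W, b)]
    return [max(0.0, v) for v in pre]

# operation count for a single 512 -> 256 layer
n_in, n_out = 512, 256
matmul_ops = n_in * n_out      # multiply-adds inside W.x: 131,072
other_ops = 2 * n_out          # bias adds and ReLUs: 512
share = matmul_ops / (matmul_ops + other_ops)
print(f"{share:.1%} of the work is the matrix product")  # 99.6%
```

An optical processor that does the W·x part ‘for free’ therefore attacks almost the whole cost, just as the optical bench did the Fourier transform for free.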

And, one might say, of geophysical imaging. In fact, the MIT team expects other applications in signal processing. If it wasn’t so politically incorrect these days, they might have added ‘and in seismic prospecting for oil.’ But MIT-GAG is of a long-forgotten past. MIT’s current Energy Initiative, MITEI, is an altogether greener thing, even though it is funded by oil and gas companies.

By the way, after the knock-off device across the corridor from my office was installed, I would occasionally sneak across and peek into the laser room. I don’t recall it being used much. In fact I think it was one of those things that you buy, use a couple of times to amaze your friends and then forget about. A bit like my Panono!

@neilmcn

Oil IT Journal ISSN 1632-9120 (formerly Petroleum Data Manager) is a confidential newsletter produced by The Data Room for paid-up subscribers. All material is © 1996-2017 The Data Room SARL. Oil IT Journal is a subscription-based publication. Forwarding or otherwise sharing your copy and/or online access codes with third parties is strictly prohibited.

Page 3

Oil IT Journal interview, Mark Bashforth, Ikon Science

Ikon Science CEO Mark Bashforth and CTO Denis Saussus discuss working smarter at $50 oil, collapsing engineering and geoscience silos and the return of geophysics to shale drilling.

What’s it like being an upstream software boutique in the downturn?

MB Everyone did fine at $120 oil! Times are hard for many but, in my experience, mid-sized specialists like Ikon suffer less in the downturns than others. I joined Ikon in February when founder Martyn Millwood Hargrave was looking for a commercially minded person to drive future growth. Martyn is still chairman. We are now coming strong out of the downturn and are lean and in better shape than our main competitor*! We are nimble and can react quickly to our clients in IOCs, NOCs and large independents who come to us for rock physics, quantitative interpretation (QI) and our JiFi joint inversion.

Where do you see growth coming from?

MB The rebound in North America is strong. Companies are seeing the green shoots of returning OPEX and a resurgence of research money. The EU is OK, Asia Pacific so-so. Overall the demand is for software, which is always a more robust business. Services are not so strong.

What is driving the renewed interest from operators?

MB Although the oil price is important, what really matters is long-term price stability. Clients won’t invest if the oil price yo-yos up and down. A realistic range is around $50 for the next 12-18 months. Demand is picking up, especially for unconventionals. At $120 there was no time to think and analyze. At $50 operators have to work smarter on where to drill, how to drill and how to maximize recovery. This is where Ikon comes in. Working faster is not enough. Operators need to change from the cookie-cutter approach, take a step back and analyze their workflows.

But this goes against the factory drilling paradigm. How do you convince the engineers?

MB We see the silos as collapsing. Asset team managers want more control. They increasingly look holistically across the business and are bringing the different disciplines together, including geophysics.

DS Geopressure and geomechanics on their own did not have much impact on unconventionals at $120. Some analysis was done ‘just in case’ but then the factory drilling approach came along. Geophysics was replaced by statistical drilling. But now, at $50, you can only drill so many wells and companies are looking for that extra edge. The pendulum has swung back re geophysics, geomechanics and QI. Now this stuff ‘may be worth a shot!’ There is no silver bullet here, but Ikon has slowly added value to the methodology. Our software is a differentiator. The larger upstream software vendors have built ‘elephant’ platforms that can do anything but do nothing very well! These folks are feeling the pain now as they see a drop in maintenance revenue. Ikon created the rock physics market which is now in every company’s workflow. JiFi is a step change, bringing in fracking and integrating geology and geophysics. Our real-time pore pressure predictor gets closer to the driller with practical workflows and the application of high-end science.

How do you cohabit with the big guys?

DS We have a collaboration deal with Schlumberger to bring key functionality into Petrel. There is now a large group of Petrel users interested in RokDoc. It has been a win-win.

How do you approach geophysical number crunching?

DS Our platform is Java-based so we run on Windows and Linux. We also have support for MPI so our number crunching can run on clusters. We are in the process of a move to the cloud, extending our parallel processing capability. Industry is generally very slow to move to the cloud. But last year saw a change, especially with smaller operators. A couple of US operators have outsourced all of their IT. Landmark’s DecisionSpace has been on the cloud for a couple of years. Schlumberger is moving slowly that way and will bring us along. It is not really our role to initiate such a move for operators.

MB Oil and gas as a whole is a laggard in technology adoption. We need to stop these risk-averse ways of looking at technology. Yes, our industry is unique, but not so unique that it can’t learn from others. Many oils actually make their own software. Would they be happy if Ikon drilled wells? It doesn’t make sense!

More from Ikon Science.

* After the interview, CGG (whose Jason unit is an Ikon competitor) filed for protection from bankruptcy in France and other jurisdictions.

Getting data right - don’t model, argue! O’Reilly’s free book advocates abandoning data modeling in favor of Tamr’s DataOps.

Getting Data Right (GDR), tackling the challenges of big data volume and variety, is a 77-page free book apparently without sponsorship, so where’s the catch? We looked the gift horse in the mouth and noted that GDR is published by Tamr. Otherwise the seven co-authors provide an interesting narrative around big and not-so-big enterprise data. Tom Davenport immodestly proposes his own ‘law’ thus: ‘The more an organization knows about a particular business entity, the less likely it is to agree on a common term and meaning for it.’ This leads to ‘corollaries,’ viz, ‘A manager’s passion for a particular definition of a term will not be quenched by a data model specifying an alternative definition,’ and ‘consensus on the meaning of a term is achieved not by data architecture, but by data arguing. Data modeling doesn’t often lead to dialog, because it’s not comprehensible to most nontechnical people.’

Other authors offer similar pithy, if hard to execute, advice: Michael Stonebraker on data curation and the data lake, where we learn that data ingestion to the lake involves converting XML to CSV (and that’s a good thing?). Ihab Ilyas advocates ‘humans in the data cleansing loop’ and the ‘judicious involvement of experts’ and reference sources. Michael Brodie writes on the ‘fourth paradigm’ of data science with an interesting use case of CERN’s Atlas project. Andy Palmer introduces the next big thing, ‘DataOps.’ James Markarian advocates replacing sluggish old ETL with snazzy ‘data unification’ à la Tamr. So there was a catch, but quite an interesting read all the same!

Page 4

Divestco’s mapping 101 Calgary geoscience data managers society presentation on Esri and other GIS solutions.

Speaking at a recent gathering of the Calgary geoscience data managers society, Divestco GIS guru Terry Steinkey gave a useful 101 presentation on the fundamentals of mapping. Steinkey ran through the essentials of geodetics and the relative merits of different projections and datums, along with a plug for the essential work of the EPSG. ‘Issues,’ like 200m-plus errors in positioning, can easily result from a misuse of projections.

GIS was invented by Canadian Roger Tomlinson and is today a ‘multi-billion dollar’ industry. Popular GIS applications include Esri, which can be extended with modules such as Divestco’s own GeoCartaGIS for ArcView, adding G&G functionality. The 101 includes a brief exposé of the different Esri data formats, the shapefile and the geodatabase.

The free and open source QGIS, ‘less aesthetic and harder to learn than Esri,’ is nevertheless a ‘direction that many organizations are taking.’ Google Earth Pro (GEP) is a ‘great starting point’ for web-based mapping. GEP development has now curiously been taken over by Esri and ‘they are not doing a good job!’ Perhaps because Esri has its own ArcGIS Online web mapper with a freeware edition. But this is ‘more for fun than production.’ Serious work needs an upgrade to (a paid) ArcGIS Server.

Baker Hughes, IHS Markit, CMG team Partnership promises ‘new level’ of G&G/engineering solutions. CMG’s healthy 2017 results.

CMG, IHS Markit and GE unit Baker Hughes have announced a technology alliance to ‘deliver a new level’ of geoscience analytics and engineering solutions to the upstream. The partnership sets out to offer a ‘new choice’ for integrated workflows spanning geological characterization, reservoir modeling, production data analysis and reservoir flow simulation. Software in the alliance includes IHS Kingdom, Baker Hughes JewelSuite and CMG’s suite of reservoir simulation software. The collaboration is to deliver ‘new levels of insight and efficiency for integrated geoscience analytics.’

IHS’ Russell Roundtree said, ‘Our customers want options, not inflexible, closed systems, to stay agile in these difficult times.’ Baker Hughes’ Martin Brudy added that the alliance would optimize connectivity and provide ‘adaptable, scalable’ solutions.

CMG recently reported its fiscal 2017 year-end results, which are pretty spectacular for an upstream software house in the downturn. As of March 31, 2017, CMG had an enviable $63.2 million (Canadian) in cash, no debt and an $822.6 million market capitalization.

Statoil’s total uncertainty management program complete Emerson/Roxar embeds results of multi-year R&D in commercial release of Tempest/Enable.

Emerson’s Roxar unit has completed its ‘Total uncertainty management’ (TUM) program, a joint venture with Statoil that addresses history matching, uncertainty management and ‘quantification’ across the reservoir characterization workflow. TUM has informed major updates to Roxar’s Tempest/Enable history match application, notably with the addition of ensemble Kalman filter-based history matching that is said to have given ‘a marked improvement’ in statistical accuracy. Other TUM spin-offs include an application connector for chained workflows and the implementation of an ‘ensemble smoother’ that is ‘particularly successful in handling the production effects seen on 4D seismic.’ All TUM developments are now available in the commercial 8.0 release of Roxar Tempest. The joint venture has also enabled deployment of TUM methodology on clusters and in the cloud, said to be essential for larger integrated workflows.

Roxar’s Kjetil Fagervik said, ‘In field development plans, investment or divestment proposals, accurate forecasting, uncertainty and financial risk management are some of the industry’s greatest challenges.’
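The ensemble Kalman filter history matching mentioned above nudges an ensemble of model realizations toward whatever parameter values make the simulated response match observed production. A heavily simplified, stdlib-only sketch for a single scalar parameter and a toy ‘simulator’; real history matching works on full reservoir models, not a scalar:

```python
import random

def enkf_update(ensemble, observed, noise_std, simulate):
    """One EnKF analysis step: shift each member by the Kalman gain
    times its mismatch against a perturbed observation."""
    sims = [simulate(x) for x in ensemble]
    n = len(ensemble)
    mx, md = sum(ensemble) / n, sum(sims) / n
    cov_xd = sum((x - mx) * (d - md)
                 for x, d in zip(ensemble, sims)) / (n - 1)
    var_dd = sum((d - md) ** 2 for d in sims) / (n - 1)
    gain = cov_xd / (var_dd + noise_std ** 2)
    return [x + gain * (observed + random.gauss(0, noise_std) - d)
            for x, d in zip(ensemble, sims)]

random.seed(0)

def simulate(perm):
    """Toy forward model: production rate = 2 x permeability multiplier."""
    return 2.0 * perm

prior = [random.gauss(3.0, 1.0) for _ in range(100)]  # prior guesses, mean ~3
posterior = enkf_update(prior, observed=10.0, noise_std=0.5,
                        simulate=simulate)
# the observed rate of 10 implies a permeability multiplier near 5;
# the posterior ensemble mean moves from ~3 most of the way toward it
```

The ensemble smoother Roxar mentions applies the same kind of update to the whole production history at once rather than one observation at a time.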

3ESI-Enersight presentation on reliable technology and software SEC’s ‘reliable technology’ reporting rules open a route to increased reported reserves.

3esi-Enersight has just published a position paper on the use of ‘Reliable technology in SEC reserves reporting’ authored by John Lee (Texas A&M). Lee is in a good position to explain the niceties of the SEC’s regulations since his stint as an ‘academic engineering fellow’ at the SEC, where he was principal architect of the modernized reporting rules. Reliable technology must be ‘based on sound scientific and engineering principles’ and must ‘lead to the correct conclusion the vast majority of the time.’ Lee’s thesis is that reserves estimation techniques, including software, can meet the criteria for reliable technology providing a developer can back it up with ‘extensive field evidence.’

Most operators don’t have the resources to pioneer reliable technology and should let vendors do the heavy lifting and propose novel technologies that enhance accuracy. While tried-and-true methods may meet current reporting rules, new software-based reporting ‘can increase PUD estimates in unconventional resources, by expanding proved area well beyond immediate offsets and increasing reserves with more favorable aggregation of individual well EURs.’ Smart companies will take advantage of reliable technology to increase reported reserves using software that others have developed and validated as meeting the SEC’s rules.

Lee’s approach is curious in that it equates ‘reliable’ with larger. One might have thought that using ‘reliable technology’ could occasionally prove disappointing! More from 3esi-Enersight and our October 2009 editorial on the ‘perfect storm’ of reliable technology and shale reserves.

Page 5

Consortium corner OTM’s Geomechanics Initiative. DNV GL standardizing subsea processing. US DoE’s $20 million for oil and gas R&D. TNO’s field optimization benchmark Round 2. SEG teams with Kudos.

OTM has launched the Geomechanics Initiative to address geomechanical challenges such as pore pressure prediction, wellbore stability, fracture and fault seal analysis and compaction and subsidence. A survey found strong support for the initiative, although some expressed concerns over other private consortia that fail to share IP and ‘only open the doors after a safety or an environmental issue.’ OTM plans one meeting per year and a public website to promote the initiative. Project reports and updates will, curiously, be held private.

DNV GL’s JIP on standardizing subsea processing has moved into Phase 2. Phase 1 saw completion of a functional description of subsea pumping. Phase 2 will deliver standardized guidelines. JIP members include Aker, GE, OneSubsea and TechnipFMC alongside operators Shell, Statoil and Woodside.

The US Department of Energy is to invest $20 million in new oil and gas research projects. Target areas include increasing recovery from unconventional oil and gas wells and preventing offshore spills and leaks. The funding is said to support the Office of Fossil Energy’s efforts to ensure ‘environmentally sustainable’ domestic and global supplies of oil and natural gas.

Dutch R&D outfit TNO has developed a field development optimization benchmark challenge under the auspices of its ISAPP research program. The benchmark is based on a reservoir model for the fictitious Olympus field. Following on from the 2008 Brugge challenge, Olympus adds the complexity of optimizing well placement. Download the challenge models here.

The SEG has teamed with Kudos*, a private organization that claims to ‘help researchers ensure their publications get found, read and cited in a world of information overload.’ Kudos aggregates metrics on a published paper and ‘maps outreach activities against those metrics.’

* Nothing to do with Strava!

Software, hardware short takes …

Blue Marble Geographics’ Global

Mapper 18.2 introduces 3D digitizing and

connectivity to Amazon web services. The

SDK includes support for Amazon's cloud-based

simple storage service, 3D fly-through and

improved display of vector features and

models in the 3D Viewer.

Dynamic Graphics has released CoViz

4D V9.0 with quantitative visualization

and analysis functions targeting reservoir

engineering. DGI’s WellArchitect 5.0 adds

functionality for directional drilling

including clearance scans, data display in

differing projection systems and Witsml

feed ingestion.

Emerson/Roxar has announced RMS 10.1

with enhanced seismic-to-flow simulation

workflows, integrated decision support and

integration between domains.

A new publication from Halliburton,

'Digital transformation for oil and gas

production operations,' introduces 'Voice

of the Oilfield’ technology. VotO’s high

level architecture spans downhole

instrumentation, Landmark’s Field

Appliance IoT connector and

DecisionSpace Production 365 running in

the cloud*.

Hexagon has announced the Smart digital

asset collaboration module for its

Intergraph flagship. SDA-C helps owner

operators build a digital asset model in the

cloud during the project phase for use later

in operations and maintenance. SDA-C

was a joint development with a ‘major

owner operator.’ Other ‘cloud ready,’ SDA

modules are planned.

Kadme’s Whereoil 4 promises a ‘wealth

of new capabilities’ including faster spatial

search, flexible GUI configuration options

and new well log QC and visualization.

LandWorks has announced V5.5 of its

land management suite which now ‘fully

supports’ the Esri ArcGIS Online and

Portal platforms.

Sintef has released V 2017a of MRST, the

Matlab reservoir simulation toolbox. The

free software can be downloaded here.

MRST updates include support for multi

segment wells, inflow control devices and

sensitivity analysis of input parameters.

Quorum has announced myQuorum Land-

on-demand, a cloud-based edition of its

land management suite for small and mid-

sized E&P companies.

A team from NIST/CU has ‘launched’ a

‘Comb and Copter’ system to map

atmospheric gases around oil and gas

facilities. The C&C comprises a ground-

based laser and drone-mounted sensor that

analyzes absorbed light to identify gases

in near-real time.
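
For readers curious about the principle, open-path gas sensing of this kind rests on the Beer-Lambert law, which relates the light absorbed along the laser path to gas concentration. The sketch below is purely illustrative; the absorption coefficient and path length are invented numbers, not NIST's:

```python
import math

def concentration_from_absorbance(i0, i, epsilon, path_m):
    """Beer-Lambert: absorbance A = log10(I0/I) = epsilon * c * L,
    so concentration c = A / (epsilon * L)."""
    absorbance = math.log10(i0 / i)
    return absorbance / (epsilon * path_m)

# Illustrative numbers only: a 2% dip in received laser intensity over
# a 500 m path, with an assumed absorption coefficient of 0.04.
c = concentration_from_absorbance(i0=1.0, i=0.98, epsilon=0.04, path_m=500.0)
```

In practice a frequency comb samples thousands of wavelengths at once, so the same calculation is solved simultaneously for many absorption lines to tell one gas from another.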

Rock Flow Dynamics’ tNavigator 17.2

adds a new network designer module and

3D seismic interpretation functionality

with manual and autotracking and time to

depth conversion.

Terrasciences has released LAS Probe, a

freeware utility to inventory and QC

curves in multiple LAS files. LAS Probe is

available for Windows and Linux.

TrendMiner 2017R1 adds support for

large companies operating across multiple

sites with different historian/MES systems

to gain analytics insights without changing

their existing systems. Tags from a variety

of historians can now be indexed and

searched within a single analytics

environment. TrendMiner ships with

connectors for Wonderware Historian,

OSIsoft PI, Honeywell PHD, Yokogawa

Exaquantum and AspenTech IP21.

SpectraLogic has published its Digital

Data Storage Outlook 2017, a useful 25

page guide to data storage hardware trends.

* See also our coverage of Halliburton’s

‘Open enterprise architecture’ in ‘More

from PNEC’ on page 9 of this issue.

Blue Marble Geographics, Dynamic Graphics, Roxar, Halliburton, Hexagon, Kadme, LandWorks,

Sintef, Quorum, NIST/CU, Rock Flow Dynamics, Terrasciences, TrendMiner, SpectraLogic.


Page 6

Oil Information Technology Journal 2017 http://www.oilIT.com

© 2017 The Data Room

GBC* is an established conference

organizer for the downstream. The IIOT &

Digital Solutions for Oil & Gas event held

in Amsterdam last month was its first

venture upstream. Kari Jordan provided a

keynote introduction to Shell’s Tacit IT

initiative. Digitalization is changing the

energy industry and helping the world

decouple global emissions from economic

growth, notably with the internet of things,

machine learning and big data/analytics

(BDA). Shell has many IoT experiments

running and a small team working on

cognitive computing.

Shell’s data represents a

‘huge untapped source of

richness.’ Shell is running

robotics proof-of-concept

trials and also has a team

working on blockchain.

Jordan thinks that

blockchain, which has no

central authority, ‘may

change current thought

process around business

models.’ Shell’s iScope virtual reality

geoscience center got a plug as did

Honeywell’s operator training simulator

for the Prelude platform. Analytics in Shell

is ‘quite mature,’ but there are still

questions as to how to incorporate these

developments in future plants.

McKinsey’s Sverre Fjeldstad made the

bold claim that digital can be used to ‘fix

the economics of oil and gas,’ whose

business model ‘has been destroyed in the

last decade.’ Since the 2015 downturn,

companies have done an excellent job of

restoring margins by squeezing suppliers,

making employees run faster and by

‘losing the gold plating.’ But this is not

enough in a lower for longer oil price

scenario. In particular, shale oil ‘has not

been profitable since 2014.’ Future

progress will come from digital and

advanced analytics in predictive maintenance

and 'a lot of other use cases.'

McKinsey has applied BDA to several

years of control room parameter historical

data. Machine learning was applied to a

500GB data set to identify five ‘themes’ of

settings that made for periods of high

production. Predictive models promise a

production hike of ‘1-2%.’ Having said

this, Fjeldstad observed that plugging real

time/ad hoc analytics back into operations

was ‘a few years away!’

The conference theme, the Industrial

Internet evokes standardization. Stephen

Mellor of the Industrial Internet

Consortium (IIC) opined that while it was

relatively easy to standardize the bottom of the

stack, the hard part is to standardize higher

up in the stack, above operating systems

and communications protocols. The search

is on for commonalities across verticals

where there is a need for global standards.

Unfortunately, ‘companies

like to embrace, then

extinguish their own

standards as do nations.’

There are different

international IIoT

standards across the US

(NIST), Japan (IoT

Acceleration Consortium),

China (CAICT) and the

EU (AIOTI). Mellor

offered the IIC’s Track & Trace testbed as

an example of what could be achieved.

This involved a joint venture between

Airbus and Bosch to leverage IoT-enabled

tool tags such that less time was wasted

looking for tools! Having said that, Mellor

revealed that ‘the IIC is not a standards

organization!’ The IIC will ‘establish

frameworks, evaluate existing standards,

identify requirements and propose them to

standards bodies such as ISO.’

IHS Markit’s Oscar Abbink pushed back

the digital time line to the 2004 CERA

White Paper that introduced the ‘digital

oilfield.’ DOF program starts peaked in

2006 with the expectation of up to 20%

savings in OPEX. Poster children included

Kvaerner’s unmanned ‘Subsea on a Stick’

facility and real time reservoir management

as practiced by Chevron in Kern

River. Big data is not exactly new. There

are already ‘pockets of excellence’ in the

industry which may spread with the next

wave of digitalization. This will see

‘convergence around a core of data-driven

analytics and optimization.’

Einar Landre traced Statoil’s GoDigital

program that kicked off early in 2016.

GoDigital’s premise was that there were

‘threats and opportunities’ in the digital

space which could impact unsustainable,

post-downturn margins. This led to the

creation of a digital center of excellence in

March 2017. The CoE is now operational

and ready to deliver the digital roadmap.

The roadmap derived from 38

‘lighthouses,’ idealized future states that

provide direction. GoDigital uses a bottom-up

approach, 'top-down involves more

politics than you need.’ The next step is to

select a technical platform for BDA and to

‘develop a new mindset for an unmanned

world.’ Statoil wants to stay profitable,

even at $30 oil.

Shahrul Rashid and Jefferi Bin Kamarudin

presented Petronas’ digital roadmap for

the downstream. Today there may be up to

45,000 instruments in a refinery, keeping

operations safe. Petronas’ refinery of the

future program envisages an integrated

operations and maintenance center with

data captured to the Petronet cloud

database. A Nagios-based dynamic risk

analysis and early warning system, co-developed

with Near Miss Management, also ran.

'dSCE,' a digital toolkit for

operations and maintenance leverages

Petronas’ pervasive wireless network and

handheld devices providing information at

workers’ fingertips.

Asger Klindt presented Maersk Drilling’s

work towards a common IIoT platform to

support a digital twin of a drilling rig to

support predictive maintenance and

drilling productivity enhancement. Maersk

could not find an off-the-shelf IoT solution

and so has engaged with GE to develop a

solution around Predix and SmartSignal.

Predix is not suited to offshore low

bandwidth Vsat communications so

Maersk is working on ‘edge’ analytics on

the rig. Earlier this year Maersk was

reported to have deployed Maana’s

knowledge graph at its shipping business.

Hatem Oueslati showed how Baker

Hughes (now a GE company) is using real

time downhole data in its Remote

Operations Services. Here, cross-functional

teams compare real time and historical

data and analytics. A modern rig can

produce over a terabyte/day. Big data

support leverages Cassandra along with


Shell’s Tacit digital initiative. McKinsey, ‘fix oil and gas economics with digital.’ IIC on elusive IoT

standards. IHS on digital ‘pockets of excellence.’ Statoil’s GoDigital/CoE. Petronas’ downstream

digital roadmap. Maersk on Predix and ‘edge’ analytics. Baker Hughes’ Remote Operations Services.

Movus’ FitMachine. Honeywell’s Connected Plant. Gazprom/Rostelecom team on an ‘Open partner-

ship for the industrial internet of things in oil and gas.’

GBC IIoT and digital solutions in oil and gas, Amsterdam


Oil Information Technology Journal ISSN 1632-9120 Volume 22

[email protected] Page 7 234th issue

Energistic’s Witsml, ETP and global log

curve mnemonics. Data analysis and

performance benchmarking is performed

with the BHI Signals platform.

While others presented somewhat high-

level approaches to the IIoT, Greg Harris’s

presentation of CD Power’s use of Movus

FitMachine was more pragmatic. CD

Power supplies electricity generators to

onshore drillers. These are now monitored

with Movus’ FitMachine, a hockey puck

sized, self-powered magnetic device that

transmits vibration and temperature into

Movus’ Google/TensorFlow-powered AI

engine. Machine behavior is learned in a

couple of weeks, after which the

system issues alerts if machine health

deteriorates. Movus’s ‘zero IT’ footprint

receives software updates over the wifi

network. Five have been fitted onto CDP

LNG compressors too.
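
The 'learn normal, then alert' pattern FitMachine embodies can be sketched as a simple statistical baseline; the readings and the three-sigma threshold below are illustrative assumptions, not Movus' actual TensorFlow model:

```python
from statistics import mean, stdev

def learn_baseline(training):
    """Learn normal machine behavior from an initial training period."""
    return mean(training), stdev(training)

def is_alert(reading, mu, sigma, k=3.0):
    """Flag readings more than k standard deviations above baseline."""
    return reading > mu + k * sigma

# Hypothetical vibration RMS samples from the learning period, then
# two live readings: one normal, one indicating deterioration.
baseline = [1.0, 1.1, 0.9, 1.05, 0.95, 1.0, 1.02, 0.98]
mu, sigma = learn_baseline(baseline)
print(is_alert(1.03, mu, sigma), is_alert(2.5, mu, sigma))  # False True
```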

What follows is a narrative derived from

the panel discussion that wound up day

one of the conference. A potential issue

raised regarding BDA was that ‘data

scientists become insiders and get to know

more about the business than anyone!’ In a

similar vein, BDA is more of a human

problem than a technological one: if you

want to access, say, HR data, the first

question you will be asked is ‘why?’

Nobody wants to relinquish control over

the data asset and such reticence to share

information is ‘a major cultural roadblock

to change.’ On the HR front, the risk of a

shortage of data scientists recalled the

scramble for cyber security specialists of a

couple of years back. There again, there

are a lot of smart math-trained people in

oil and gas that could fill data scientist

roles. Data ownership is also a vexed

issue. ‘If you have a GE turbine, is the data

yours or GE’s? For a major operator, the

data is a part of its asset, even though it

may be shared with GE such that they can

improve and service the machine. Cyber

security is getting increasingly risky as ‘all

the data eggs are in the same data lake/

basket’ instead of being spread across

multiple legacy systems.

Christophe Romatier presented

Honeywell’s Connected Plant, delivered

from its UOP unit, a century-old line of

business that now ‘makes over 60% of the

world’s gasoline.’ Like GE, Honeywell

recently decided to transform itself into an

industrial software company. The

connected plant offering is a digital twin of

the plant with a big data engine inside.

Connected Plant components include a

Hadoop data lake and metadata model,

Spark stream processing (time series

database) and Honeywell’s cloud-based

data historian.
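
The stream-processing layer of such an architecture boils down to windowed aggregation of tagged time series. The sketch below uses plain Python rather than Spark, and the tag name and window size are illustrative assumptions:

```python
from collections import defaultdict

def window_average(events, window_s=60):
    """Group (tag, epoch_seconds, value) events into fixed time windows
    and average each tag per window - the basic shape of a time-series
    stream aggregation."""
    buckets = defaultdict(list)
    for tag, ts, value in events:
        buckets[(tag, ts // window_s)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

# Hypothetical historian events for one flow-controller tag.
events = [("FIC101.PV", 0, 10.0), ("FIC101.PV", 30, 14.0),
          ("FIC101.PV", 75, 20.0)]
print(window_average(events))
```

A production engine adds what the toy omits: out-of-order arrival, watermarks and fault-tolerant state, which is where Spark and the data lake earn their keep.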

Mikhail Korolkov (Gazprom) and

Aleksey Kostrov (Rostelecom) presented

their ‘Open partnership for the industrial

internet of things in oil and gas.’ The

combination of Russia’s major gas

producer (13% of world production) and

the national telco is to deliver a platform

for data analytics, IoT connectivity and a

marketplace, starting in 2018. A Russian

national cloud platform will provide

connectivity to oil wells and production

facilities for real time drilling monitoring,

security and more. Gazprom plans to use

the platform in future operations, starting

with facility design in a virtual space. The

model will then run as a prototype to check

performance. The system is to support

plug and play construction, model-based

production optimization and self-

diagnosing equipment. Target architecture

centers on the digital twin. Korolkov

observed that although Honeywell has a lot

of answers, there is also a lot of hype

around the digital twin. Specific use cases

are still to be determined. The scope of the

digital twin, how to keep it in sync with

reality and its cyber security all need

further thought.

In the ensuing debate, the issue of the

intellectual property (IP) in the digital twin

was raised. This is a recurring question and

a subject of tension between owner

operators and software vendors, curiously,

since 'Operators have no mechanism for

monetizing IP.’ One major reported ‘more

negative than positive examples’ of

collaboration as the supplier got all the

goodies and ‘left us with nothing.’ The

perennial issue of information handover

from construction to operations was raised.

Construction may take place in a virtual

space but the idea of this being carried

through across handover is ‘light years

from today.’ ‘What exactly is the digital

twin and how can we get there realistically?'

This issue has plagued oil and gas but

in one flagship rail project in the UK,

‘everyone involved has access to all the

data all the time.’ The commercial world

could learn a lot from this government-

backed approach. Another speaker

acknowledged that ‘nobody has cracked

this’ stakeholders ‘need to want to share.’

The chair invited speakers to give their

evaluation of the state of the art of the

IIoT, with a subjective scoring of

technology and take-up. A major vendor

opined that all are ‘just starting.’ Most

have reasonably established DOF solutions

with sensors and are taking their first steps

in analytics. But nobody is running fully

autonomous plant. Most are at stage 2

moving to 3 or 4 (out of 5). Most are just

at the asset configuration stage, ‘what do I

have,’ although companies in the power

and other regulated sectors are ahead. The

fact that majors work in silos represents

both a problem and the potential for a big

win. But with the downturn, budgets for

research have been cut. ‘Maybe more are

at 2 to 3.’ A major got a ripple of laughter

when he opined that ‘vendors are also at

level 2.’ ‘Everyone knows how to get there

but…’

Russel Herbert reported that OSIsoft’s PI

System is used on some 1.5 billion real

time data streams representing 45 million

boe/day. Shell alone has 7.5 million

connected devices performing 100,000

calculations/minute for 15,000 users. PI is

therefore central to the first wave of the

digital transformation and OSIsoft intends

to stay ahead in the next wave of the cloud,

BDA, integrated operations and mobility.

Everyone is asking ‘can’t we do more with

the data?’ The answer is, yes we can! 80%

of the value of analytics is at the operations

level and involves data from

counters, trends, pressure rising too

quickly, or how many times a machine cycles on and off. Here

‘no advanced analytics is required.’

The remaining 20% runs at the enterprise

level, with the application of machine

learning to time series data, blending data

from maintenance, finance and maybe

running in the cloud. Tagged historian data

may not mean much outside of plant. This

needs to be fixed upfront for digital

initiatives, adding context and metadata to

raw tags. All oils use multiple analytics

solutions providers, SAP/Hana, Watson,

Azure, and the OSIsoft Cloud. Such

multiple connection points may pose a

security risk. Another approach has data

lakes sitting beside PI which adds

complexity. Streaming analytics is hard

because contextualization takes time

although there are some successes.

OSIsoft’s recommended approach is,

unsurprisingly, to have PI at the heart of

the solution with all other digital oilfield

tools working off of PI. Shell, ENI and

Element Analytics were cited as users/

partners.
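
The 'no advanced analytics required' operations-level checks Herbert described, pressure rising too quickly and on/off cycle counts, amount to a few lines of code. The thresholds and sample data below are invented for illustration:

```python
def pressure_rise_alarms(samples, max_rise_per_step=5.0):
    """Flag steps where pressure rises faster than the allowed rate -
    the kind of simple operations-level check that needs no ML."""
    return [i for i in range(1, len(samples))
            if samples[i] - samples[i - 1] > max_rise_per_step]

def onoff_cycles(states):
    """Count off->on transitions from a boolean run-status tag."""
    return sum(1 for a, b in zip(states, states[1:]) if not a and b)

pressures = [100, 102, 104, 112, 113, 121]
print(pressure_rise_alarms(pressures))      # [3, 5]: two steps rise > 5
print(onoff_cycles([0, 1, 1, 0, 1, 0, 1]))  # 3 off->on transitions
```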

* Visit the Global Business Club and the

IIoT/Digital Solutions event home page.


The API has named Megan Barnett

Bloomgren VP communications.

Gary Baughman is CEO of Veritas

Capital’s Aprim unit.

Aveva has appointed Steen Lomholt-

Thomsen as chief revenue officer. He hails

from IHS Markit.

Tony Canfield is now VP engineering with

BCCK. He hails from DCP Midstream.

CSE Icon has promoted Hoon Chew Toh

to president and general manager.

GE’s new Baker Hughes unit has named

Lorenzo Simonelli, president and CEO,

Jennifer Hartsock, CIO, and Matthias

Heilmann, president and CEO, digital

solutions. John Flannery is to become

chairman and CEO of the parent GE when

Jeff Immelt retires at year end 2017.

Gene Hallman and Armando Ferri are now

respectively south and east regional

directors with Blackeagle Energy

Services.

Pat Mullen is now president and CEO with

CB&I. He was previously COO.

Norbert Seimandi heads up CGI's new

global center of excellence in Lyon,

France.

ConocoPhillips has appointed Sharmila

Mulligan to its board. She previously held

senior roles at Aster Data, Hewlett-

Packard and Netscape.

Dawson Geophysical has promoted Philip

Lathram to VP IT and Tom Phillips to VP

applied geophysical technology.

DNV GL has appointed Ulrike Haugen

chief communications officer. She hails

from ABB.

Frontier Integrity Solutions has named

Keyth Pengal as CEO and Bill Boyer to its

board.

GSE Systems has elected Jack Fuller to its

board.

Chris Weber is executive VP and CFO at

Halliburton. He hails from Parker

Drilling.

Erik Staffeldt has been promoted to senior

VP and CFO at Helix Energy Solutions.

Tony Tripodo becomes executive VP and

senior advisor.

Matt Bowyer and Amanda Turner have left

Merlin Energy Resources. Bowyer is to

take an MSc in renewable energy.

Erinn Broshko is now director and

executive chairman of New West Energy

Services.

Offshore Technical Compliance has

appointed David McCubbin as COO and

Dan Phelps as director, compliance

inspection services.

Occidental has appointed Cedric Burgher

as CFO, replacing retiree Chris Stavros.

Burgher hails from EOG Resources.

Scott Tidemann is now CEO of Petrosys.

Scott Key is chairman of the board with P2

Energy Solutions. Fritz Smith is chief

revenue officer. Both come over from IHS

Markit.

Margaret Barron has rejoined the PPDM

association as chief, professional

development.

Shannon Rasmussen is VP engineering at

QS Energy. He was previously with

Citrine Energy.

RocketFrac Services has appointed Ron

Spoehel as chairman. He was previously

with NASA.

Carnegie Mellon University’s Software

Engineering Institute has appointed Bobbie

Stempfley as director of its CERT division.

She hails from Mitre Corp.

Stress Engineering Services has hired

Brian Weaver as sales manager. He was

previously with GE Oil & Gas.

The Williams Companies has appointed

Chad Zamarin senior VP strategy,

succeeding retiree Frank Billings. Zamarin

was previously with Cheniere Energy.

We’re hiring

Berkana Resources is looking for a

manager for its new Calgary office.

Statoil is to recruit ‘more than 50 skilled

workers annually’ to maintain competence

and capacity on the Norwegian continental

shelf.

CGI aims to recruit 250 experts over the

next two years in Lyon, France.

Folks, facts, orgs ...

Altair has acquired Carriots, developer of

its eponymous internet of things platform.

BP Ventures has invested $20 million into

Caltech AI/cognitive computing boutique

Beyond Limits. BPV’s Meghan Sharp is

to join the BL board.

IFP Training unit RSI is to merge into

French process simulation boutique Corys.

The combined company will be owned

50% by AREVA, 25% by EDF and 25%

by IFPT.

3esi-Enersight has acquired Energy

Navigator.

Ensco has acquired Atwood Oceanics in

an all-stock transaction.

Honeywell is acquiring Nextnine whose

ICS Shield cyber security solution will be

added to its Connected Plant offering.

AFGlobal has acquired Key Energy

Services’ technology development and

controls system business unit Advanced

Measurements.

Japan’s SoftBank has acquired a

‘significant minority interest’ in OSIsoft

from early investors Kleiner Perkins, TCV

and Tola Capital. OSIsoft backer, Mitsui

keeps the stake it acquired in 2016.

Teradata has acquired StackIQ, a

developer of a ‘bare metal’ software

platform for the cloud.

Vela has acquired Petrosys.

Tibco is to acquire data science specialist

Statistica.

Trimble has acquired Network Mapping

Group, adding data modeling and 3D

visualization to its energy solutions

portfolio.

Altair/Carriots. BP Ventures/Beyond Limits. RSI/Corys. 3ESI-Enersight/Energy Navigator. Ensco

Atwood. Honeywell/Nextnine. AFGlobal/Advanced Measurements. SoftBank/OSIsoft. Teradata/

StackIQ. Vela/Petrosys. Tibco/Statistica. Trimble/Network Mapping Group.

Done deals

API, Aprim, Aveva, Berkana, BCCK, CSE Icon, Baker Hughes, Blackeagle, CB&I, CGI, ConocoPhillips,

Dawson, DNV GL, Frontier Integrity, GSE, Halliburton, Helix, Merlin, New West, Offshore Technical

Compliance, Occidental, Petrosys, P2 Energy Solutions, PPDM, QS Energy, RocketFrac, Carnegie

Mellon/CERT, Stress Engineering Services, The Williams Companies, Statoil.


A new publication from Siemens, ‘The

race to a digital future, assessing digital

intensity in US manufacturing,’ divides the

digital enterprise world into two camps,

‘efficiency experts’ and ‘revenue re-

inventors’. Both use digital technologies to

improve productivity and efficiency, but

the latter is ‘leading the way in exploring

how digital transforms business models and

unlock new markets.’

In a somewhat oblique citation of a

publication from Tessella, Siemens kindly

places BP in the latter camp, thanks to the

‘new technology’ of the BP Well Advisor

(Oil IT Journal, Jan 2013). Siemens surveyed some

200 US companies across five industrial

sectors including oil and gas. All sectors

‘have significant progress still to make as

they move toward a digitally enabled

future.’

BP as Siemens’ poster child for ‘revenue re-invention.’

Survey finds all US manufacturers lag on digital

Oil Information Technology Journal

ISSN 1632-9120

(formerly Petroleum Data Manager)

is published by The Data Room

© 2017 The Data Room SARL.

All rights reserved.

All trademarks acknowledged.

No reproduction without written permission

of the publisher. Copy and other

information to [email protected].

Tel +331 4623 9596

More from PNEC Data & Information Management Shell/Flare ‘redefine’ standard search model. Halliburton’s ‘open enterprise architecture.’

PNEC always offers rich pickings for the

data and knowledge managers and we had

to leave out a couple of interesting

presentations in our report last month. Joel

Graves and Paul Cleverley (Shell and

Flare Consultants) are ‘redefining’ the

standard information search model for oil

and gas. The standard model leverages

natural language processing, geo-

referencing and taxonomies to retrieve

‘pages of lists of records, documents or

websites.’ Shell is now working on a

‘transformational approach’ that adds

neural network-based analysis of large text

corpuses to identify geological analogues

and to progress ‘from finding to

understanding.’ Instead of building a

database of field analogues using

traditional manual techniques, these can

now be built automatically from textual

data.
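
The retrieval-by-similarity idea can be illustrated, in a much cruder form than Shell's neural networks, with bag-of-words cosine similarity; the field descriptions below are invented examples:

```python
import math
from collections import Counter

def cosine_similarity(a, b):
    """Bag-of-words cosine similarity between two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    norm = math.sqrt(sum(v * v for v in va.values())) * \
           math.sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

# Hypothetical one-line field descriptions.
field_a = "turbidite sandstone reservoir deepwater fan"
field_b = "deepwater turbidite fan sandstone reservoir"
field_c = "fractured carbonate platform reservoir"
# The two deepwater descriptions score higher than the carbonate one.
print(cosine_similarity(field_a, field_b) > cosine_similarity(field_a, field_c))
```

Neural embeddings replace the raw word counts with learned vectors, so that, say, 'turbiditic' and 'turbidite' land close together, but the nearest-neighbor retrieval step is the same.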

In the last couple of years some 120 North

American upstream outfits went bust

leaving a debt pile of $80 billion. In this

context, Jody Winston and Shashank

Panchangam presented Halliburton’s

‘Open enterprise architecture’ that is set to

‘solve E&P’s biggest challenges.’ Their

thesis is that E&Ps are undergoing a

generational change that shows the ‘folly’

of doing business in the same old way.

Changes are so pervasive that companies

must come together to solve shared challenges

in computational infrastructure, software

development and deployment. The cost of

proprietary and closed systems will drive

companies toward solutions that reduce

existing high cost IT infrastructure,

overcome the obstacles of proprietary

databases and eliminate siloed applications.

How? Well, on the one hand,

‘software is eating the world,’ but it is not

proprietary software where vendor lock-in

is a concern. Rather it is cloud-based, open

source software like the Berkeley Data

Analytics Stack. This itself is built on a

constellation of open source tools - Apache

Mesos, Alluxio/Hadoop and Apache

Spark. Roll-in stateless programs based on

a REST API and an open source database

like Apache Cassandra. Cost savings from

migrating to open source in the cloud are

expected to be significant. The authors

cited GE Oil and Gas as an oil country

poster child for the approach.

More from PNEC.

Blockchain news Natixis/IBM/Trafigura on Hyperledger. BP/ENI/Wien on BTL Group’s Interbit. Xpansiv Data’s digital

feedstock on GEM OS. IBM/Energy Blockchain Labs on Hyperledger.

Natixis, IBM and Trafigura have

announced a blockchain solution to

support trade finance for US crude oil

transactions. The distributed ledger

platform, built on a ‘permissioned’ version

of the Linux Foundation’s open source

Hyperledger Fabric, captures major steps

in crude oil transactions into a blockchain,

ensuring improved transparency, enhanced

security, and optimized efficiency. All

parties share the same ledger and can

track the status of a transaction, from the

time a new trade is confirmed and

validated, to when the crude is inspected,

delivered and the letter of credit cancelled.
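
The core mechanism, each step hash-chained to its predecessor so that history cannot be silently rewritten, can be sketched in a few lines. This toy ledger is not Hyperledger Fabric, which adds permissioning, consensus and smart contracts; the transaction steps are taken from the article:

```python
import hashlib
import json

def add_block(chain, payload):
    """Append a block whose hash covers the payload and the previous
    block's hash, so any later tampering breaks the chain."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    chain.append({"prev": prev_hash, "payload": payload,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})
    return chain

def verify(chain):
    """Recompute every hash and check the links."""
    prev = "0" * 64
    for block in chain:
        body = json.dumps({"prev": prev, "payload": block["payload"]},
                          sort_keys=True)
        if block["prev"] != prev or \
           block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = block["hash"]
    return True

# The major steps in a crude cargo's life, captured in order.
chain = []
for step in ["trade confirmed", "cargo inspected", "cargo delivered",
             "letter of credit cancelled"]:
    add_block(chain, {"step": step})
print(verify(chain))  # True; editing any earlier block would make this False
```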

Following a successful trial, BP, Eni

Trading & Shipping and Wien Energie

are going live with BTL Group's 'Interbit,'

a 'proprietary, private' blockchain-based

energy trading platform. BTL is

inviting other energy companies to join in

the six month ‘go-to production’ phase

which will see the launch of a commercial

version of the energy trading solution,

linked into live environments. BTL is also

working on the applications of its

technology in other contexts. EY partnered

on the pilot.

San Francisco-based Xpansiv Data has

announced the ‘world’s first’ Digital

Feedstock for commodity markets. The

proof of concept used production data

from Wyoming-based natural gas producer

Carbon Creek Energy to create a digital

representation of CCE’s plant atop of

Xpansiv’s registry and marketplace

applications. These run on Gem OS, a

blockchain interoperability platform that

works across protocols including

Ethereum, Hyperledger and more. Xpansiv

now generates its ‘digital feedstock’ at

thousands of CCE’s ‘frack-free’ well sites

daily.

IBM and Beijing-based Energy

Blockchain Labs have rolled out the

‘world’s first’ blockchain-based green

asset management platform based (again)

on the open source, openly governed

Hyperledger Fabric.


A presentation by Todd Buehlman at the

recent FME International User Conference

showed how Logic Solutions Group uses

FME to automate loads and enrich data for

oil and gas clients, notably BP’s Lower 48

Onshore unit where ‘cumbersome’ IHS

and Drillinginfo data has been QC’d and

loaded into Esri Geodatabases.

Norwegian drilling contractor Songa

Offshore has implemented the IFS IoT

business connector to make internet of

things data captured across its oil rigs

accessible from its IFS Applications suite.

France-headquartered Anticip has won a

three-year contract to provide its Tracking

EyeDefend geolocalization services to

Luloil’s operations in Irak. Anticip was

founded by former members of France’s

GIGN, an elite tactical gendarmerie unit.

C&J Energy Services has deployed MiX

Telematics’ fleet management solutions

for over 3,500 drilling support vehicles.

MiX Fleet Manager is 'ELD-

ready’ and supports FMCSA requirements.

Saudi Aramco has renewed its contract for

Coreworx’ Interface Management for an

extra five years. Coreworx has been used

on $60 billion worth of projects in the

Kingdom since 2011 including oil and gas,

infrastructure and new city builds.

Norway’s EPIM joint industry body

organization has awarded the contract for

its EPIM ID identity and authentication

service to ATEA. EPIM ID will provide

accounts management and authentication

services and ensure secure internet access

to EPIM's applications.

Gazprom Neft has signed a five-year

technology cooperation agreement with

Weatherford to ‘further develop current

business relationships and collaboration

between the companies.’

The Geological Society of London’s new

website for its Lyell Collection of journals

and books was developed atop HighWire

Press’ JCore platform.

IT Vizion’s Operational Excellence

solution is now available on the OSIsoft

Marketplace. Vizion OE extends PI

System infrastructure and system of record

by analyzing PI System data and reporting

operational bad actors.

Katalyst Data Management has signed its first major well data (i.e. non-seismic) iGlass project, a 'paper to digits' initiative to scan and classify well files from some 20,000 US Lower 48 wells.

OGA, the UK regulator, has awarded CGG GeoConsulting a contract to supply it with digital well products. These are used internally by OGA and will be released into the public domain via the OGA's website, to be used freely by E&P companies looking for new prospects on the UKCS. CGG has also extended its contract for PTT E&P's dedicated processing center for an extra three years.

PGS has extended its seismic trace management service agreement with OvationData out to 2023. The deal covers cloud data storage and management of PGS' global seismic library. OvationData also provides remastering of legacy data to modern high-density media.

Seeq has added connectivity to Inductive Automation's 'Ignition' scada system, allowing users of Seeq Workbench to 'wrangle' process data into insights, improved asset availability, product quality and process efficiency.

Stress Engineering Services has teamed up with DeepMar Consulting to expand its existing upstream service offerings. The integrated team will provide analysis, testing, materials, real-time health monitoring, predictive forecasting, efficient workflow processes and operational guidance.

Total has kicked off round 2 of its 'Plant 4.0' start-up incubator program, designed to accelerate the use of digital technology in industry. The call is out for innovative solutions in leak detection, corrosion monitoring, non-invasive flow measurement and valve position displays.

PI System integrator Capula is to provide support and expertise for TrendMiner users in the United Kingdom and Ireland.

Wood Group has teamed with Librestream on 'eXpert,' a remote collaboration video link for oil and gas.

Sales, deployments, partnerships … FME, Logic Solutions, IFS, Anticip, MiX Telematics, Coreworx, EPIM, ATEA, Weatherford, HighWire, IT Vizion, Katalyst, CGG GeoConsulting, OvationData, Seeq, Inductive Automation, Stress Engineering, DeepMar, Total, Capula, TrendMiner, Wood Group, Librestream.

The OMG’s Cloud Standards Customer

Council has published a paper defining a

standard reference architecture for

blockchain applications.

ISO is also in on the act with the announcement of 'new blockchain international standards in pipeline' (read 'in the pipeline!'). The ISO effort is conducted under the auspices of ISO TC 307, with Standards Australia's Craig Dunn in the chair.

The EU Commission recently held a 'high-level meeting' to 'leverage interoperability to create the internet of energy.' The IoE is to build on ETSI/Saref, the smart appliances reference ontology, and OneM2M, 'the global initiative for IoT standardization.'

A recent meeting of NEN, a Dutch standards body, co-hosted with Shell, heard from ISO/TC 67/WG 4, the working group for reliability and cost-related standards in the petroleum and other industries. The group presented a revised ISO 14224 standard for reliability and maintenance data and a new ISO 19008 cost coding standard.

The US NIST held its second big data public workshop last month to promote NBDIF, a 'vendor-neutral, consensus-based, technology-independent' big data interoperability framework. That should be easy enough!

Energistics has put online its recent webinar providing a 101 introduction to the Resqml reservoir data interoperability standard. The presentation was made jointly by Energistics COO Jana Schey and BP's Lisa Towery. Energistics also recently published a case study of 'lean, automated reporting powered by Witsml,' co-authored with IDS.

Standards stuff OMG and ISO announce (different) blockchain standards! EU Commission's Internet of Energy. ISO 14224 reliability standard. NIST's NBDIF big data framework. Energistics' Resqml 101.

Oil Information Technology Journal ISSN 1632-9120 Volume 22

The 9.8 release of Thermo Fisher Scientific's Open Inventor, out around September 2017, will support virtual reality application development on the HTC Vive and Facebook's Oculus Rift headsets. These currently predominate in the market. Both combine immersive visualization, sensors for user displacement detection and controllers for manipulation. Open Inventor adds high quality rendering and high-level generic objects to extend support for data manipulation.

Microsoft’s HoloLens ‘mixed reality’

headset promises a blended view of the

real and digital worlds. The kit is currently

only available in a ‘developers’ edition at

prices of $3,000 to $5,000.

IFS has developed a proof-of-concept HoloLens application that adds a holographic overlay to enterprise data from its IFS Applications industrial ERP flagship. Field service personnel can view holographic data overlain on the real-world object being inspected.

A joint venture between Worley Parsons, CGI and 3D software boutique Taqtile has announced Manifest, a HoloLens/Azure field inspection and maintenance solution integrated with Microsoft Dynamics 365.

HoloLens does not yet have an end-user edition. This, according to Slashdot and blogger Brad Sams, is because Microsoft canceled the second version and is now working on an 'even more advanced' iteration to be available in 2019.


Jaspersoft for IDS DataNet

Blog posting traces reporting software's evolution from Flash to Tibco's HTML5-based solution.

In a recent blog, Colin Dawson reports on IDS' use of Tibco Jaspersoft as an 'embeddable' reporting, visualization and query engine for use in its drilling reporting solution, IDS DataNet. DataNet has, to date, reported on over 150,000 drilling operations on nearly 500 rigs worldwide.

An early version of DataNet was built on Flash. When this was deprecated a few years ago, IDS carried out a 'lengthy investigation' into embeddable technologies before deciding on Tibco's Jaspersoft with its HTML5-based visualization and multi-tenancy. JasperReports was an additional bonus, speeding DataNet report generation tenfold.

IDS is now recreating legacy reports in Jaspersoft Studio and creating custom visualization widgets for wellbore schematics, histograms and more, such that customers can run DataNet visualizations on their own data. IDS sees DataNet as helping operators automate manual work by, for example, moving from Microsoft Excel to web-delivered data management. Tibco also reports use of Jaspersoft in GE's Software & Analytics offering, although it is not so clear that this is still leveraged in Predix.

Pipeline software news

DNV GL Pipeline Evaluation Portal. SNAM's €200 million digital transformation. Hexagon/Leica's Captivate Pipeline solution. Schneider on integrity management. Splunk for major.

DNV GL has announced the Pipeline Evaluation Portal to 'connect pipeline expertise with real-time data.' The PEP was inspired by a survey of 700-plus worldwide pipeline professionals that found that '61% believe most pipeline failures could be avoided by investments in new technologies' and '67% say the oil and gas industry needs a new way of monitoring pipelines.' The PEP exposes DNV GL's models, data and probabilistic assessment tools to customers through a web browser. The first application ready for testing via the portal performs interactive lateral buckling assessment.

In its in-house publication Perspectives, DNV GL reports the use of data science to improve Italian Snam's gas transport and to 'progress from preventative to predictive' integrity management. Snam is investing over €200 million in a five-year digital transformation and is looking to export its technology to worldwide gas operators through a new global solutions unit. Field staff now use about 1,000 tablet computers under the company's SMART gas (Sistema Manutenzione Rete Trasporto Gas) maintenance system. Data collected in the field on construction and maintenance operations and from field instruments is now available for analytics in real time. Information is fed back to field workers on a 'need-to-know' basis.

Hexagon announced a 'pipeline solution to end all pipeline solutions,' aka the 'industry's first end-to-end fluid solution,' at its recent HxGN Live event in Las Vegas. It turns out that the end-to-end solution is limited to pipeline surveying. It comes from the Leica Geosystems unit, which has rolled out 'Captivate Pipeline' to streamline tracking and reporting of pipeline surveys. CP includes a bar code reader to collect pipe attributes and is integrated with Blue Sky's Dash pipeline survey software.

Lars Larsson (Schneider Electric) has authored an informative 14-page white paper, 'Pipeline integrity: best practices to prevent, detect and mitigate releases.' The paper covers advanced technologies that enhance pipeline integrity, particularly computational pipeline monitoring (CPM), with reference to the 2012 API RP 1130 standard. CPM methods in use today include line balance techniques, real-time transient modeling, monitoring and statistical analysis of pressure and flow, and acoustic monitoring.
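Of these, line balance is the simplest to illustrate: metered volume in minus metered volume out, accumulated over a time window, should hover near zero, and a sustained imbalance suggests a release. A minimal sketch follows; the flow figures, window size and tolerance are illustrative assumptions, not taken from the white paper or API RP 1130.

```python
# Minimal line-balance CPM sketch: compare metered inflow and outflow
# over a sliding window and flag a possible release when the cumulative
# imbalance exceeds a tolerance. All figures here are illustrative.
from collections import deque

def line_balance_alarm(samples, window=6, tolerance=5.0):
    """samples: iterable of (flow_in, flow_out) per time step.
    Returns the indices at which the windowed imbalance breaches tolerance."""
    recent = deque(maxlen=window)
    alarms = []
    for i, (flow_in, flow_out) in enumerate(samples):
        recent.append(flow_in - flow_out)
        if len(recent) == window and sum(recent) > tolerance:
            alarms.append(i)
    return alarms

# A small leak appears at step 6: outflow drops by 1.5 units per step.
flows = [(100.0, 100.0)] * 6 + [(100.0, 98.5)] * 10
print(line_balance_alarm(flows))  # first alarm at step 9, once imbalance accumulates
```

Real CPM systems also correct for line pack, temperature and meter drift, which is why the paper pairs line balance with real-time transient modeling.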

Big data specialist Splunk reports on the use of Splunk Enterprise to manage and monitor tens of thousands of pipeline field devices deployed across an unnamed US energy company's 50,000-mile pipeline network. Log data from Schneider Electric OASyS DNA and other scada systems is captured into Splunk's big data lake, where a variety of analytical techniques raise alarms.
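The core of this kind of log analytics is aggregation: extract a device identifier and severity from each log line, then count error events per device to surface 'bad actors.' A plain-Python sketch of the idea; the log-line format and device names are invented for illustration and are not OASyS DNA's actual format.

```python
# Sketch of the aggregation a log-analytics search performs over scada
# logs: count ERROR events per device. The line format is hypothetical.
import re
from collections import Counter

LOG_PATTERN = re.compile(r"(?P<level>\w+)\s+device=(?P<device>\S+)")

def error_counts(lines):
    """Return a Counter of ERROR events keyed by device id."""
    counts = Counter()
    for line in lines:
        match = LOG_PATTERN.search(line)
        if match and match.group("level") == "ERROR":
            counts[match.group("device")] += 1
    return counts

logs = [
    "INFO device=rtu-17 heartbeat ok",
    "ERROR device=rtu-42 comms timeout",
    "ERROR device=rtu-42 comms timeout",
    "WARN device=rtu-09 low battery",
]
print(error_counts(logs).most_common(1))  # rtu-42 tops the list with 2 errors
```

In Splunk itself this would be a one-line search rather than Python, but the per-device counting is the same operation.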

High-end virtual reality for Oculus and HTC Vive. IFS, Worley Parsons' 'mixed reality' trials.

OpenInventor on Oculus. News from the HoloLens front


Oil Information Technology Journal 2017 http://www.oilIT.com

© 2017 The Data Room

This is not an oil and gas tale, but Mitsubishi Hitachi's collaboration with OSIsoft and Microsoft on a new digital platform for thermal power plant operations is perhaps the shape of things to come for the downstream. Mitsubishi's Tomoni digital platform provides 'soup-to-nuts' support for power plant operations that minimizes unplanned downtime and ups asset performance.

At the 2017 Hannover Messe earlier this year, Mitsubishi Hitachi Power Systems announced that Tomoni was to join the 'Red carpet incubation program' (RCIP), a joint venture between OSIsoft and Microsoft that uses OSIsoft's PI Integrator for Microsoft Azure to 'automatically clean, prepare and transmit' PI System data to Microsoft's cloud.

Oil and gas operators may identify with OSIsoft's Prabal Acharyya, who commented, 'Data preparation is a necessary first step in analytics, but it consumes inordinate amounts of time and energy. [This collaboration] will accelerate cloud-based advanced analytics solutions for the power industry.' RCIP is said to reduce the burden involved in data preparation, lowering the total time, cost and energy consumed, as compared with traditional data analytics methods and business applications. Joseph Sirosh, VP of Microsoft's data group, added, 'RCIP bridges the OT-IT gap and helps customers get valuable insights from the PI System and Cortana Intelligence.'

Microsoft’s Red Carpet, more PI in the sky

Plug and play OPC connectivity simplifies IoT solutions. Quorum as Azure IoT poster child.

Microsoft has announced IoT Central, a software-as-a-service (SaaS) offering that 'reduces the complexity of internet of things solutions.' A preconfigured IoT solution, providing plug and play connectivity from OPC UA and classic devices into the Azure cloud, was unveiled at Hannover Messe earlier this year. Another IoT novelty from Microsoft is Azure Time Series Insights, an analytics, storage and visualization service for billion-event scale cloud data. Azure Stream Analytics for edge devices makes it possible to perform streaming analytics 'at the device level.'
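Streaming analytics 'at the device level' essentially means reducing the raw event stream on the device and forwarding only aggregates to the cloud. A minimal sketch of one such reduction, a tumbling-window average; the window size and sensor readings are illustrative assumptions, and this is plain Python, not Azure Stream Analytics query syntax.

```python
# Tumbling-window average over a sensor stream: the kind of reduction an
# edge analytics job performs before forwarding results to the cloud.
# Window size and readings are illustrative.
def tumbling_averages(readings, window=4):
    """Yield one average per non-overlapping window of `window` readings."""
    batch = []
    for value in readings:
        batch.append(value)
        if len(batch) == window:
            yield sum(batch) / window
            batch = []

stream = [20.1, 20.3, 19.9, 20.1, 25.0, 25.2, 24.8, 25.0]
print(list(tumbling_averages(stream)))  # two windows: ~20.1 then ~25.0
```

The payoff is bandwidth: eight raw readings leave the device as two numbers, which is what makes cloud-scale fleets of field devices tractable.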

The poster child for Microsoft's Azure IoT is long-time partner and oil country software house Quorum. Quorum reports that today, 'the rate of innovation far exceeds what traditional technology infrastructure can handle.' Cloud computing is imperative to achieving IoT scalability. Quorum has retooled its oil and gas field operations toolset to run as a service (but not a 'microservice!') in the Azure cloud.

If you fancy taking the IoT for a test spin, the June 2017 issue of Microsoft's MSDN Magazine provides step-by-step instructions for deploying Microsoft's IoT solution for remote monitoring and predictive maintenance using a DIY testbed comprising a USB camera and a Raspberry Pi device* running the Windows 10 IoT Core OS, controlled from a remote Windows workstation.

* A bare-bones computer for enthusiasts that usually runs a Linux-based OS.
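At its simplest, the predictive maintenance half of such a testbed amounts to watching a sensor reading drift away from its smoothed baseline. A hedged sketch of that anomaly step; the simulated vibration readings, smoothing factor and threshold are all illustrative assumptions, not the code from the magazine article.

```python
# Flag readings that deviate from an exponentially weighted moving
# average (EWMA): a minimal stand-in for the anomaly-detection step of
# a predictive maintenance pipeline. Readings and thresholds simulated.
def ewma_anomalies(readings, alpha=0.3, threshold=3.0):
    """Return indices of readings that stray > threshold from the EWMA."""
    baseline = readings[0]
    anomalies = []
    for i, value in enumerate(readings[1:], start=1):
        if abs(value - baseline) > threshold:
            anomalies.append(i)
        baseline = alpha * value + (1 - alpha) * baseline
    return anomalies

vibration = [1.0, 1.1, 0.9, 1.0, 1.2, 5.5, 1.0, 1.1]  # spike at index 5
print(ewma_anomalies(vibration))  # the spike at index 5 is flagged
```

A production system would feed flagged events into an alerting or work-order pipeline rather than printing them, but the detection logic scales down to exactly this on a Raspberry Pi.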

Microsoft Internet of Things Central

Mitsubishi Hitachi leverages OSIsoft-Microsoft incubator in power plant digitization.

We have not heard much from Houston-based Mercury International Technology (MIT!) since the October 2007 deal which gave Schlumberger exclusive rights to its remote visualization technology, ThinAnywhere. Last year, MIT renegotiated the deal. The exclusivity clause has gone and MIT can now sell ThinAnywhere direct again.

To entice users back into the fold, MIT has rolled out a new release of TA with functionality which will be exclusive to those who sign a new maintenance agreement. Customers who have service agreements with third parties (notably Schlumberger) will not benefit from the new goodies. These include RHEL 7 support, new compression that doubles graphics performance, mobile licenses and support for Java 3D OGL as used in Landmark's latest release. For Windows, TA now includes the above new features plus a new Windows 10 client edition. More from MIT.

ThinAnywhere, 'we're back!' Ten-year exclusive deal with Schlumberger expires. Remote visualization now selling direct.

Implico rolls out OpenTAS at Gunvor Ingolstadt refinery. Tank truck filling continues as new terminal management system is paged in.

Software and consulting company Implico has deployed its OpenTAS terminal automation solution at Gunvor's refinery in Ingolstadt, Germany. The upgrade was performed while the refinery continued in normal operations. One by one, Implico decoupled each loading bay from the legacy system and upgraded the hardware and field equipment (with help from Actemium) before putting the bay back on stream. Implico also set up development and acceptance testing systems during the migration period. This allowed the software company's developers to test the nuts and bolts of the new functions beforehand and verify compatibility of hardware and software.

Implico’s Volkmar said, ‘We also took the

opportunity to replace the old data

interfaces. Gunvor now uses a new XML

format for data exchange which will

improve data communications in the long

run.’ OpenTAS’ provides manufacturer-

independent communication with DCS/

PLC systems.

OpenTAS leverages FLEXX, an EU XML-based standard for downstream oil data exchange that is aligned with PIDX' petroleum industry e-commerce protocol.