Fighting Oil Well Inefficiencies: The Promise of Big Data

From planning and maintenance to equipment deployment, oil well inefficiencies are wide-ranging and require diverse solutions. Low oil prices have driven cost-cutting, but efficiency gains require investment in new technologies and big data.

Last year GE Intelligent Platforms Software announced a new production optimization project to connect BP oil wells around the world to the industrial internet. GE’s data management software would give field engineers real-time access to common machine and operational data across all wells, information that could improve efficiency, prevent failures and minimize downtime. Deployment at 650 BP wells is scheduled for 2016, expanding to 4,000 in coming years.

In a press release, Kate Johnson, CEO of GE Intelligent Platforms Software, points out: “Based on industry averages, for each week a well is out of commission, operators experience revenue losses of more than $3 million for a subsea well.”


Big Data: A Long-Term Trend For Oil & Gas?

David Haake, cognitive solutions team leader for the chemicals and petroleum division at IBM, suggests big data could be an essential tool for tackling the multitude of oilfield inefficiencies. Capturing, pooling and processing information from areas as varied as equipment reliability, geotechnical conditions, weather forecasts and even human knowledge could improve operations and result in higher yield and greater production revenues.

“After 100 years of drilling and trillions of dollars of investment, we are still only managing to extract less than 30% of the available oil from the ground. If 100 years of best-practice efforts results in these levels, then is it really best practice?”

The concept of “big data” is only about three years old, he notes, but if all available relevant information is analyzed effectively, its use could become a long-term trend for the industry.

“The two biggest problems in production operations are equipment reliability and science- and nature-related issues. Are you drilling in the best place, and what are the geotechnical conditions? Reliability is the holy grail of the oil business. If you are reliable and safe, then you will do as well as science and nature will let you.”

Pulling together all available information could make oil wells much more efficient.

“Take the weather. Just having access to the best weather prediction technology can help with preventive measures. Why shut an offshore platform down a week early if you could have gone on drilling for another three days without putting anyone in danger?”

New Monitoring and Detection Solutions

But it all depends on data acquisition. New measuring and monitoring solutions can gather valuable information. Quebec-based Opsens Solutions develops fiber optic sensors designed to measure temperature, pressure and strain in difficult environments. Among them is the OPP-W, a fiber optic pressure and temperature sensor offering long-term accuracy, durability, low drift and high fidelity in the harshest applications, including downhole oil and gas.

Emerson Process Management also offers measurement and analytical technologies. These include its Roxar Downhole Wireless PT Sensor System, which measures, online and in real time, the previously inaccessible pressure and temperature behind the casing in subsea production wells.

UK-based Ti Thermal Imaging provides thermal imaging services to detect metal fatigue, measure pipe wall thickness and corrosion, and assess weld integrity. The company’s new Android-based TICOR thermal imaging reporting software cuts survey and reporting time by 25% by streamlining data capture, input and analysis. According to CEO Richard Wallace:

“It can also be integrated into WEBCOR, our online predictive maintenance and condition monitoring program, so data can be used to implement a maintenance plan to reduce unplanned shutdowns.”
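The condition monitoring idea behind systems like these can be sketched in a few lines: compare each new sensor reading against a rolling baseline and flag deviations large enough to warrant inspection. The pressure values, window size and threshold below are illustrative assumptions, not details of any vendor’s product.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, k=3.0):
    """Return indices of readings that deviate more than k standard
    deviations from the rolling mean of the preceding `window` samples."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged

# Hypothetical downhole pressure trace (psi): stable, then a sudden spike
pressure = [3000, 3002, 2999, 3001, 3000, 3002, 2998, 3450, 3001, 3000]
print(flag_anomalies(pressure))  # the spike at index 7 is flagged
```

In practice such a rule would be one of many feeding a predictive maintenance program, which is exactly where pooled, cleaned data pays off.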

After Big Data, Data Lakes?

But human experience and behavior are essential too, even within big data, stresses IBM’s Haake. Valuable first-hand knowledge gained through decades-long oil field experience could be harvested and fed into a data lake (a large-scale storage repository and processing engine) via cognitive analytics.

“There is no magic to big data or cognitive analytics, but they depend on ingesting the information in the first place, which requires the creation of a data lake. That is the real work. The problem is there are tens of thousands of wells and at least 10,000 oil fields operated by hundreds of companies. And while we are trying to help clients make data lakes, the reality is that they are more of a ‘data swamp,’ with lots of small and dirty puddles.”

The available information is indeed messy and muddled. It concerns many different areas potentially relevant to oil well production, but needs to be cleaned up to be useful.

“We can clean these puddles of individual well-level data and build effective data lakes for a field to give great field-level insight, but no one is applying big data across an entire enterprise yet, even if they can see the promise of it.”
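Cleaning those “puddles” into a usable field-level lake is mostly unglamorous plumbing: dropping incomplete rows, normalizing units, and deduplicating repeated uploads before pooling records across wells. A minimal sketch, with hypothetical well names, field names and units:

```python
def clean_and_pool(puddles):
    """Merge per-well record lists into one field-level dataset:
    drop incomplete rows, normalize temperatures to Celsius, and
    deduplicate by (well, timestamp)."""
    seen, lake = set(), []
    for well, records in puddles.items():
        for rec in records:
            if rec.get("pressure_psi") is None or rec.get("temp") is None:
                continue  # drop incomplete rows
            temp_c = rec["temp"]
            if rec.get("temp_unit") == "F":  # normalize Fahrenheit readings
                temp_c = (temp_c - 32) * 5 / 9
            key = (well, rec["ts"])
            if key in seen:
                continue  # deduplicate repeated uploads
            seen.add(key)
            lake.append({"well": well, "ts": rec["ts"],
                         "pressure_psi": rec["pressure_psi"],
                         "temp_c": round(temp_c, 1)})
    return lake

puddles = {
    "well-A": [
        {"ts": 1, "pressure_psi": 3000, "temp": 90.0, "temp_unit": "C"},
        {"ts": 1, "pressure_psi": 3000, "temp": 90.0, "temp_unit": "C"},  # duplicate
        {"ts": 2, "pressure_psi": None, "temp": 91.0, "temp_unit": "C"},  # incomplete
    ],
    "well-B": [
        {"ts": 1, "pressure_psi": 2950, "temp": 194.0, "temp_unit": "F"},
    ],
}
lake = clean_and_pool(puddles)
print(lake)  # two clean rows survive, well-B converted to Celsius
```

Real deployments face the same steps across hundreds of schemas and vendors, which is why Haake calls the ingestion “the real work.”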

He adds that big data could improve the amount of oil being extracted, noting that “even an incremental improvement of just 1%, for example, would make a considerable difference to a producer now, and even more so when the price of oil starts to rise again.”
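A back-of-the-envelope calculation shows why even 1% matters. The production rate and oil price below are illustrative assumptions, not figures from the article:

```python
# Hypothetical field: 50,000 bbl/day at $50/bbl, with a 1% yield gain.
DAILY_OUTPUT_BBL = 50_000   # assumed field production, barrels per day
PRICE_PER_BBL = 50.0        # assumed oil price, USD
IMPROVEMENT = 0.01          # the 1% incremental gain Haake mentions

extra_bbl_per_day = DAILY_OUTPUT_BBL * IMPROVEMENT
extra_revenue_per_year = extra_bbl_per_day * PRICE_PER_BBL * 365
print(f"{extra_bbl_per_day:.0f} extra bbl/day -> "
      f"${extra_revenue_per_year:,.0f}/year")  # 500 extra bbl/day -> $9,125,000/year
```

At those assumed numbers, a single field recovers over $9 million a year from a 1% gain, and the figure scales directly with price, which is the point of Haake’s remark.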
