Four tips to take you beyond the big data hype cycle
Blog post by Svetlana Sicular

Using big data analytics in your business can save tens of millions of dollars — or open up new opportunities. But it’s early days, so here are suggestions that will help you avoid the “trough of disillusionment.”

As 2013 kicked off, Gartner analyst Svetlana Sicular noted in her blog that big data is sliding down into the Trough of Disillusionment, the steep drop in the Gartner Hype Cycle that follows the Peak of Inflated Expectations. (If you’re not familiar with the Gartner Hype Cycle, check out the illustration on Svetlana’s blog.)

In my experience with big data, there’s no reason for disillusionment. Big data analysis can create huge amounts of value. As with most worthwhile pursuits, it takes work to unlock that value. In the last three years, as a member of the CIO staff at Intel, I’ve spent a big chunk of my time developing business intelligence and analytics solutions that have resulted in tremendous cost and time savings and substantially improved time to market.

Beyond my own personal anecdotes, Gartner’s most recent Hype Cycle report seems to agree that there is in fact substance behind the hype: if you can stick it out past knowledge gathering and initial investment to actual deployment, you’ll move beyond disillusionment and start seeing results. As a matter of fact, many organizations are already finding value in big data and investing even more heavily in related projects for 2014.

However, the report also notes that 2013 is the year of experimentation and early deployment, which is why many may not be singing the praises of big data initiatives just yet.

If you find yourself in this stage, there’s no reason to despair. Here are four tips for steering clear of the ‘trough of disillusionment’ and deriving value from your big data implementation.

Think even bigger. Think of a larger, more comprehensive model of business activity and figure out how you can populate it from as many data sources as possible. Then you can see the big picture. After you envision what infrastructure you need to support data at that scale, ask yourself if you could increase your data by a factor of 10 or more and still use the same infrastructure.

This is what Oregon Health & Science University (OHSU) is doing on a big data project to speed up analysis of human genomic profiles, which could help with creating personalized treatments for cancer as well as supporting many other types of scientific breakthroughs. Calculating about a terabyte of data per patient, multiplied by potentially millions, OHSU and its technology partners are developing infrastructure to handle the massive amount of data involved in sequencing an individual’s human genome and noting changes over time. With breakthroughs in big data processing, the cost for such sequencing could come down to as low as $1,000 per person for this once-elite research, which means demand will skyrocket. And when demand skyrockets, so will the data.
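To make the scale concrete, here is a rough back-of-envelope calculation (a sketch only: the one-terabyte-per-patient figure comes from the project described above, while the patient counts are hypothetical round numbers):

    # Back-of-envelope sizing for genomic data at population scale.
    # The ~1 TB/patient figure is taken from the OHSU example above;
    # the patient counts are hypothetical round numbers.
    TB_PER_PATIENT = 1.0

    for patients in (1_000, 1_000_000, 10_000_000):
        total_tb = patients * TB_PER_PATIENT
        print(f"{patients:>10,} patients -> {total_tb:>14,.0f} TB (~{total_tb / 1_000:,.0f} PB)")

At a million patients, a terabyte each already implies roughly an exabyte of raw data, which is why the question of whether your infrastructure could absorb another factor of 10 is worth asking before you build, not after.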

Find relevant data for the business. Learn from line of business leaders what their challenges are, what’s important to them, and what they need to know to increase their business impact. Then search for data to see if you can help them solve their business problems. That’s exactly what happened with Intel’s internal big data initiatives. We were asked to help the sales team focus on which resellers to engage, when, and with what products. In 2012, the results of this project drove an estimated $20 million in new revenue and opportunities, with more expected in 2013.

Be flexible. We are in a phase of rapid innovation. It isn’t like implementing enterprise resource planning. From a technology standpoint, you must be fluid, flexible, and ready to move to a different solution if the need arises. For example, the database architecture built to collect “smart grid” energy data in Austin, Texas, with Pecan Street Inc., a nonprofit group of universities, technology companies, and utility providers, is now on its third iteration.

As smart meters generate more and more detailed data, Pecan Street Inc. is finding new ways for consumers to reduce energy consumption as well as helping utilities better manage their grids. But Pecan Street had to be willing to keep changing its infrastructure to meet demand. The bottom line: If you think you know what tools you need to build big data solutions, a year from now it will look different. Be ready to adapt.

Connect the dots. At Intel, we realized there could be tremendous benefit in correlating design data with manufacturing data. A big part of our development cycle is “test, reengineer, test, reengineer.” There is value in speeding up that cycle. The analytics team began looking at the manufacturing data—from the specific units that were coming out of manufacturing—and tying it back to the design process.

In doing so, it became evident that standard testing processes could be streamlined without negatively impacting quality. We used predictive analytics to streamline the chip design validation and debug process by 25 percent and to compress processor test times. By making processor test times more efficient, we avoided $3 million in costs in 2012 on the testing of one line of Intel Core processors. Extending this solution into 2014 is expected to reduce expenditures by $30 million.
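The article does not describe Intel’s actual models, but the general technique at work here, training a predictive model on historical measurements and skipping an expensive downstream test only for units the model says will pass with very high confidence, can be sketched roughly as follows. The data, features, and threshold are synthetic illustrations, not Intel’s pipeline:

    # Minimal sketch of predictive test streamlining (illustrative only,
    # not Intel's actual pipeline). A classifier is trained on historical
    # unit measurements; the costly downstream test is skipped only for
    # units predicted to pass with very high confidence.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5_000
    X = rng.normal(size=(n, 8))  # stand-in for inline electrical measurements
    y = X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.3, size=n) > -1.5  # synthetic pass/fail

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

    # Skip the downstream test only above a strict pass-probability threshold.
    pass_prob = model.predict_proba(X_te)[:, 1]
    skip = pass_prob > 0.99
    print(f"tests skipped: {skip.mean():.1%}")
    print(f"failure rate among skipped units: {(~y_te[skip]).mean():.2%}")

The trade-off in those last two lines is the whole game: compressed test time only counts if the failure rate among skipped units stays effectively at zero, which is what “without negatively impacting quality” means in practice.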

We are only at the beginning of understanding how we can use big data for big gains. Far from being disillusioned with big data, we find many exciting possibilities as we look at large business problems holistically and see ways to help both the top line and the bottom line, all while helping our IT infrastructure run more efficiently and securely. It’s not easy to get started, but it is certainly well worth the time and effort.

 

Big Data Needs Thick Data
Article from Ethnography Matters

Big Data can have enormous appeal. Who wants to be thought of as a small thinker when there is an opportunity to go BIG?

The positivistic bias in favor of Big Data (a term often used to describe the quantitative data that is produced through analysis of enormous datasets) as an objective way to understand our world presents challenges for ethnographers. What are ethnographers to do when our research is seen as insignificant or of little value? Can we simply ignore Big Data as too muddled in hype to be useful?

No. Ethnographers must engage with Big Data. Otherwise our work can be all too easily shoved into another department, minimized as a small line item on a budget, and relegated to the small data corner. But how can our kind of research be seen as equally important to algorithmically processed data? What is the ethnographer’s 10-second elevator pitch to a room of data scientists?

…and GO!

Big Data produces so much information that it needs something more to bridge and/or reveal knowledge gaps. That’s why ethnographic work holds such enormous value in the era of Big Data.

Lacking the conceptual words to quickly position the value of ethnographic work in the context of Big Data, I have begun, over the last year, to employ the term Thick Data (with a nod to Clifford Geertz!) to advocate for integrative approaches to research. Thick Data uncovers the meaning behind Big Data visualization and analysis.

Thick Data: ethnographic approaches that uncover the meaning behind Big Data visualization and analysis.

Thick Data analysis primarily relies on human brainpower to process a small “N”, while Big Data analysis requires computational power (of course, with humans writing the algorithms) to process a large “N”. Big Data reveals insights with a particular range of data points, while Thick Data reveals the social context of and connections between data points. Big Data delivers numbers; Thick Data delivers stories. Big Data relies on machine learning; Thick Data relies on human learning.

CAUTION

As the concept of “Big Data” has become mainstream, many qualitative researchers, from Genevieve Bell (Big Data as a person) to Kate Crawford (algorithmic illusion, data fundamentalism) and danah boyd (privacy concerns), have written essays on the limitations of Big Data. Journalists have also added to the conversation: Caribou Honig defends small data, Gary Marcus cautions about the limitations of inferring correlations, and Samuel Arbesman calls for us to move on to long data. Our very own Jenna Burrell has produced a guide for ethnographers to understand big data.

Inside organizations, Big Data can be dangerous. Steven Maxwell points out that “People are getting caught up on the quantity side of the equation rather than the quality of the business insights that analytics can unearth.” More numbers do not necessarily produce more insights.

Another problem is that Big Data tends to place a huge value on quantitative results, while devaluing the importance of qualitative results. This leads to the dangerous idea that statistically normalized and standardized data is more useful and objective than qualitative data, reinforcing the notion that qualitative data is small data.

These two problems, in combination, reinforce and empower decades of corporate management decision-making based on quantitative data alone. Corporate management consultants have long been working with quantitative data to create more efficient and profitable companies.

With statistically sound analysis, consultants advise companies to downsize, hire, expand, merge, sell, acquire, shut down, and outsource, all based on numbers (e.g. McKinsey, Bain & Company, BCG, and Deloitte).

Without a counterbalance, the risk in a Big Data world is that organizations and individuals start making decisions and optimizing performance for metrics—metrics that are derived from algorithms. And in this whole optimization process, people, stories, and actual experiences are all but forgotten. The danger, writes Clive Thompson, is that “by taking human decision-making out of the equation, we’re slowly stripping away deliberation—moments where we reflect on the morality of our actions.”

INSPIRATION and EMOTION

by Tricia Wang

The rest of this article can be read here: https://ethnographymatters.net/2013/05/13/big-data-needs-thick-data/

 

SPE Abu Dhabi Section Meeting 30th October 2013

Integrated Operations and Digital Oilfield enabling new operational and project concepts

Tony Edwards, CEO of Stepchange Global, the International Digital Oilfield Consultancy, will be speaking at SPE’s Abu Dhabi Section Meeting on Wednesday 30th October, at the Hilton Hotel, Abu Dhabi Corniche, from 18.30.

He says: “Integrated Operations (IO) and Digital Oilfield (DOF) have been major performance improvement initiatives within many oil and gas companies for much of the last decade. It is therefore surprising to find that, although many companies have been successful in applying this to brownfield operations, the application to new greenfields has made less of an impact. The value potential that is accessible from the application of IO to new project concepts will be outlined in the areas of CAPEX, OPEX, safety and risk, and production and availability.”

Tony will be arriving in Abu Dhabi direct from the SPE Intelligent Energy Conference in Dubai, having delivered two papers, “The Art of Intelligent Energy 2” and “Intelligent Energy – Changing How Work Gets Done”, the latter delivered in collaboration with MIT. Stepchange Global will also be exhibiting during the conference, on stand D40. The exhibition is free to enter, and details can be found on the SPE Intelligent Energy site here:

https://www.intelligentenergy-me.com/

About StepChange Global:

StepChange Global is a market leader in the application of Intelligent Energy, Integrated Operations and Collaborative Working. They work at the cutting edge of how people, process, technology, environment and organisations change and influence each other to produce tangible performance improvements, offering an unrivalled resource in the application of Intelligent Energy and Integrated Operations (IO) capabilities. Visit their website at www.stepchangeglobal.com.

Contact Tony directly at tony.edwards@stepchangeglobal.com.

Collaborative Environments are Key
Article by Tony Edwards

With today’s focus on unmanned or minimally manned remote offshore platforms and on decreasing operational costs, the use of “collaborative environments” ensures that operational issues can be solved with real-time information.

Collaboration centres make the best use of scarce resources by creating an operations hub where experts from a variety of disciplines can access information, troubleshoot, monitor, and optimize the oil and gas fields, all from a single location.

Integrated Operations (IO) provides the perfect collaborative environment for communication, data collection, reporting, monitoring, and information sharing. These physical workspaces are intended to help people make better, more informed decisions so they can take the appropriate actions across the enterprise, on time and in real time. Opportunities can be prioritized, with the common goal of maintaining optimal, unbroken production.

Innovations in various collaborative technologies are helping companies make integrated operations a reality. Today’s collaboration centres provide not only a high-tech physical workspace, but a new way of operating. Access to a complete array of digital, real-time data linked with state-of-the-art technology facilitates the operations process and gives personnel the comfort level to make decisions quickly and intelligently.  This ability to make rapid, informed decisions will define efficient operations.

Santos integrates intelligent field with transformative operations centre
Changing how gas fields are operated

Stepchange Global are delighted to have been working with Santos on their new operations centre.

Control rooms are adding a few new tasks to their job descriptions. While they’ll always be responsible for process control and process safety, they’re now evolving into integrated operations centres that are responsible for making the right business decisions, too. However, to run efficient and profitable applications, these centres need data from equally intelligent fields of sensors, instruments, remote terminal units (RTUs) and networking devices.

Integrated operations centres and intelligent fields are together transforming jobs, field devices, and the very nature of the control room.

For instance, Santos Ltd. has been supplying natural gas in eastern Australia for 50 years, and one of its primary applications includes about 10 compressor stations, more than 1,000 wells, and regional control hubs producing coal-seam natural gas in an area covering several hundred square kilometres, located up to 1,000 km west of Brisbane. Obviously, managing and coordinating the operations of all these far-flung wells and facilities is a complex and usually time-consuming job.

“We have many residential communities in our area, and so we must engage and negotiate with them. Then, our coal seam gas is low pressure, and so we need to bring more equipment into the field to secure it,” said Patrick Gorey, controls advisor for engineering and reliability at Santos. “We needed a more efficient way to run our plants, monitor and control the fields, and integrate operations. Supporting numerous staff in the field was becoming cost prohibitive, and so we also needed to provide real-time data to our maintenance and operations personnel to allow them to optimize performance of the wells, well-pads and fields.”

To achieve its vision, Santos began developing and designing its Brisbane Primary Control Center (PCC) in 2010, and finished building it about one year ago. The PCC is part of Santos’ overall Brisbane Operations Center (BOC), and its upgrade will help supply the Gladstone Liquefied Natural Gas (GLNG) project, a joint venture between Santos and three of the world’s largest energy companies: Petronas, Total and Kogas. GLNG is expected to increase the number of Santos’ wells in the region west of Brisbane to about 6,000 over the life of the project.

“We’re implementing our new hubs this year, and will begin increasing LNG production next year,” said Gorey.

Gorey and Mike Ilgen, business development and industry marketing director for Emerson Process Management, Asia Pacific, presented “Santos Brisbane Primary Control Center—A Collaborative Environment for the Intelligent Field” on the second day of Emerson Global Users Exchange 2013 on Oct. 1 at the Gaylord Hotel and Convention Center in Grapevine, Texas, near Dallas. Also, a YouTube video of Santos’ project is at https://bit.ly/1c1QSmm

Ilgen added that integrated operations centers and intelligent field applications are transforming the oil and gas and other process industries. “Most international oil companies (IOCs) and national oil companies (NOCs) have some form of these new centers, and research indicates that intelligent operations and smart fields can improve overall gas recovery by 5% and overall oil recovery by 10%, which can mean billions of dollars in added revenue,” said Ilgen. “However, all these integrated operations (iOps) centers are based on more and better data from their intelligent fields and the pervasive sensors located downhole and at the wellheads, compressor stations and other facilities. This data can then go anywhere in the world, and is then treated and analyzed by intelligent applications to improve decision making and operations. And, the faster this whole data-to-decision loop can go, the better companies can run their businesses.”

Ilgen added that intelligent fields and integrated operations will eventually tie together entire production and business operations as they’ve never been melded before, joining operations, maintenance, collaborative areas, multi-disciplinary teams, production planning and scheduling, and enterprise functions.

For Santos’ natural gas extraction and processing operations, the “gas field of the future” means no longer driving or flying by helicopter to remote wells, which is costly and can pose added safety risks. Now, Santos’ operators and engineers are implementing predictive diagnostics, including using wireless RTUs to conduct remote well tests from their PCC and DeltaV system in Brisbane. Likewise, some maintenance tasks used to require three or four days’ worth of travel and permitting to perform a three-hour job; now they need only one hour of remote travel plus the three hours on the job.
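The article does not publish Santos’ actual tags or logic; purely as a toy illustration of the data-to-decision loop Ilgen describes, a remote limit check over incoming RTU readings might look something like this (the tag names and operating limits are invented for the example):

    # Toy illustration of a remote well check, not Santos' actual system.
    # RTU readings arrive as tag/value pairs; out-of-limit tags are flagged
    # for a control-centre operator instead of triggering a field visit.
    # Tag names and operating limits below are invented for this example.
    OPERATING_LIMITS = {  # tag -> (low, high)
        "wellhead_pressure_kpa": (150.0, 400.0),
        "casing_temp_c": (10.0, 60.0),
        "gas_flow_sm3_per_day": (5_000.0, 50_000.0),
    }

    def check_well(readings):
        """Return human-readable alarms for any readings outside limits."""
        alarms = []
        for tag, value in readings.items():
            low, high = OPERATING_LIMITS.get(tag, (float("-inf"), float("inf")))
            if not low <= value <= high:
                alarms.append(f"{tag}={value} outside [{low}, {high}]")
        return alarms

    # Simulated snapshot from one remote well's RTU.
    snapshot = {
        "wellhead_pressure_kpa": 455.0,  # out of range: flags an alarm
        "casing_temp_c": 42.0,
        "gas_flow_sm3_per_day": 31_000.0,
    }
    print(check_well(snapshot) or "all readings within limits")

The point of the real centre is running checks like this continuously, at scale, against live data from thousands of wells, so that the default response to an anomaly is a remote diagnosis rather than a drive.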

Gorey explained that the new PCC design was based on a predicted organization model for 2013, which consists of:
•    Three stations and spare capacity;
•    Four asset-based collaborative environment (CE) joint workspaces with support for one field location;
•    Two discipline-based CEs for wells, maintenance and projects;
•    Two “war rooms” with video conferencing to support event-based issues;
•    Hub-based CEs for field team use; and
•    One field office in the town of Roma.

To design and build the new PCC, Gorey added, the Santos team held meetings with relevant internal teams, including engineering and technical support, and with field-based team representatives. Next, they met with their design teams, including local architects, services consultants and the construction team. Together, they generated and optimized a floor-plan layout and created a detailed scope and costs for it. Emerson serves as the main automation contractor (MAC) for GLNG and helps collect and deliver its real-time information. Lastly, the new PCC was built, and it has been up and running for about 12 months.

“We also hired a change-management company, Step Change Global, to help us look at what we were doing now, and how we wanted to do it in the future,” explained Gorey. “We all assessed our opportunities and challenges, and obtained a good understanding of the operation and what we were trying to achieve. We couldn’t just set up new screens and fancy tools. We had to evaluate our specific business goals, and use them to help define our control center goals. This meant asking basic questions like, ‘If we need an important KPI, are the operators able to see it?’ Equipment makes gas, but people are essential to make sure it runs properly and efficiently. So, what we’re really doing is driving behaviors to achieve our goals.”

Gorey added that the new PCC has:

  • Fundamentally changed the way that Santos operates its gas fields in the Bowen and Surat basins;
  • Integrated its planning, scheduling, operations and maintenance;
  • Improved its ability to monitor and control all its upstream facilities and collaborate with teams in the field to improve operations;
  • Improved compliance with its planning and production;
  • Reduced personnel in the field and trips to the field by more than 50%; and
  • Already achieved payback in less than one year.

“We have developed a world-class remote operation centre,” Gorey stated. “The centre has changed the way our gas fields in the Bowen and Surat Basins are operated. Emerson’s team brought the process automation expertise we needed to meet global standards, and their solutions have equipped us with the ability to centrally monitor the production and progress of our intelligent assets up to 1,000 kilometres apart.”

This article was published by Emerson Exchange: https://community.emerson.com/process/emerson-exchange/b/weblog/archive/2013/10/01/santos-integrates-intelligent-field-with-transformative-operations-center.aspx?WT.i_asset_id=ExchangeDailyTues&WT.mc_id=1401

 

It’s Not Just the Technology
By Tony Edwards

Clearly, Integrated Operations (IO) is not just about technology. The most significant change that has taken place is in the way companies are organized in order to maximise the value from having this real-time data and information.

Many businesses see this as a technology-enabled transformation programme, whereby the way people work, from the offshore technician to the HQ commercial analyst, is fundamentally changed.  In order to achieve this, core work processes need to be updated, resistance to change needs to be addressed and overcome, and organizational models and structures need to be re-aligned.

Commonly referred to as the integration of “people, process, technology, and organization”, this fundamental change delivers a capability that adds value in day-to-day operations.

Having this real-time information at one’s fingertips allows the company to:

•    Maximize the throughput of production systems
•    Reduce and recover from unplanned events
•    Balance short-term production goals with long-term ultimate recovery
•    Reduce costs by optimizing maintenance planning
•    Maximize the use of scarce resources
•    Carry out remote operations and remove people from harm’s way

Technology on its own is not going to work, as shown by our favourite equation:

NT + OO = EOO

or

New Technology plus Old Organization = Expensive Old Organization

Investment in technology without investment in people, process and organization could prove to be a very expensive mistake in the long run.