Increasing productivity by taking away software
Friday, June 10, 2011
Many oil and gas companies have a complex array of software tools used to work with subsurface data to build models and make calculations on which key decisions are made. Can it be simplified?
Bringing in new software is often justified by calculating productivity gains from the individual new application; however, introducing yet another tool may actually reduce overall productivity.
The company may gain more by taking software away, or rationalising what it has. “If you want to make productivity gains it might be worth looking at whether you can simplify things,” said Ed Evans, co-founder and Managing Director of New Digital Business (NDB), a consultancy to the upstream oil and gas industry and a past manager of technical systems with BG Group, speaking at the April 20th Finding Petroleum London conference “Business opportunities with subsurface data”.
“Subsurface software functions within a complex environment of infrastructure, data and operating systems. This complexity often negatively impacts end user productivity.”
Looking at a particular workflow, such as planning a well path, it is not unusual for a company to have, say, three or four tools for visualising and modelling the reservoir, a range of tools for plotting well paths, and then the drillers want to use their own tools for well paths, bottom hole assembly, and drilling and sampling operations design.
This leads to multiple internal data transfers, data dead ends for calculated risk or multiple realisations and extended timescales for work and rework.
When it comes to managing the portfolio of software tools, some companies do it very well, some do it only occasionally, so results deteriorate, and other companies don’t do it at all, he said.
Productivity can decrease if there are too many (or not enough) choices of software tools; if people don’t know how to use them; if there is a lot of searching for data, or reformatting it; if it is difficult to move data between different tools; if there is a lot of system downtime or the network is slow; if people spend too much time having to re-do work to fit into the wider business process; if there is no support or help if things go wrong; if people aren’t confident in the system.
Productivity can be increased where the software tools mirror or enhance the existing business processes; if people know how to use the software; if information is available in the right format to load into the software; if people trust the tools and trust the data; if the systems are available and responsive when required; if people understand how this particular business process fits into the broader business; if support is available; if there is confidence in the system.
So you can see how people can be much more productive overall if the company’s technical systems environment is well managed and delivers the data and applications effectively to a well trained workforce. A well-managed application portfolio is a critical element of that.
“We have fewer and fewer resources. We need to know that when we ask people to do a task they can do that with confidence using the software tools available,” he said.
There are many reasons for over-complex application toolsets: individual staff members’ preferences for particular tools, inherited software, a lack of pruning, or simply a lack of control or planning in this area.
Individual users are often adept at justifying the need for new technology or retaining the status quo according to their preference. For example, a reservoir modeller in Egypt might say that the reservoir is very complex so it requires a special tool to model it, rather than the one the company usually uses.
But then the team doing reservoir simulation might want to use a certain tool because that’s what they’ve always done, and that one doesn’t integrate well with the reservoir modelling tool.
There are other examples of geologists being sent to remote sites and being expected to use software applications they have never used before. Is it better to build up the users’ skill set or to rebuild the model in the more familiar package? “You can get different results with different tools,” he said.
Who should lead the process of controlling applications? The company IT department are often concerned about the range of applications and the cost of support and maintenance but are not in a position to decide or dictate which software applications the company should use. “Where a CIO may be confident in questions of infrastructure or data and information management, they are often much less confident with the application software. They feel that it’s more of a user domain,” he said.
So where is the business case for reducing complexity in the applications portfolio and who should lead this work and own the results?
“Every time you add a new application, you’ve got to integrate it with the others. So managing the applications suite is not just about the purchase cost of new applications but the net impact of new tools on user productivity.”
“The person who is responsible for how a function is carried out in the business should be responsible for the tools used in carrying out that function.”
Depending on the organisation structure this person may be a Chief Geologist or Head of Reservoir Engineering. The process can be facilitated by IT or a project manager.
Making a choice
If you have several software tools which all do the same thing, and you want to simplify things, then a decision needs to be made as to which tools the company is going to use as a standard.
It is much easier if there is a “discipline head” in the company who will make decisions about which tools people in the discipline are going to use.
One of the barriers to controlling applications can be the difficulty in understanding the value of each tool to the business, and a lack of clarity around “who uses it to do what?” Ideally you would start with the business process and match the application to the function, but subsurface business processes are difficult to map and defy conventional process modelling.
Process modelling
Conventional process modelling defines the tasks which need to be carried out, the order in which they should be carried out and which tasks need to happen before others (dependencies). When each part of the process is completed the ‘dependent’ tasks can go ahead.
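Such a conventional, dependency-driven process can be sketched as a directed graph of tasks and prerequisites. Here is a minimal Python illustration; the task names are hypothetical stand-ins for a well-planning workflow, not taken from the article.

```python
from graphlib import TopologicalSorter

# Hypothetical well-planning tasks: each task maps to the set of
# tasks that must be completed before it can begin (its dependencies).
tasks = {
    "interpret_seismic": set(),
    "build_structural_model": {"interpret_seismic"},
    "build_reservoir_model": {"build_structural_model"},
    "plan_well_path": {"build_reservoir_model"},
    "design_drilling_ops": {"plan_well_path"},
}

# A topological sort yields an order in which every task's
# prerequisites are finished before the task itself starts.
order = list(TopologicalSorter(tasks).static_order())
print(order)
```

This is exactly the kind of fixed ordering that, as the next paragraph explains, breaks down for subsurface work, where the tasks and their weights shift with the geological context.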
In modelling the subsurface it is quickly apparent that the tasks undertaken, the order in which they happen and the amount of effort or value placed on each task all depend upon the geological context, the amount of data and the ‘size’ of the envisaged investment, so the process is different each time. Also, the ‘products’ of the processes are never finished; the seismic structural model, for example, can always be refined and updated based on new data.
To get around this problem and tie the software to the business process, New Digital Business (NDB) suggests that you define the specific tasks as components of the business process, but don’t try to combine them or work out the schedule or dependencies. The existing software tools can then be listed against the components, as in NDB’s ‘Dog-Tag’ model.
The ‘Dog-Tag’ model can be used to classify the existing software tools according to the stage of the subsurface data process they are used in.
For example at the “play evaluation” stage, you can have tools to analyse wells, do basin dynamics, hydrocarbon charge, framework and reservoir at the appropriate level of detail. At the “prospect evaluation” stage you can have tools to analyse different aspects of the prospect. You have other subsurface tools for developing the reservoir; and tools used during production.
This mapping exercise makes it easier for a discipline head to make a decision about which software tool the company is going to standardise on to do each specific task.
You can develop lists of software tools which every asset should have available, and specialist tools which need to be available to certain individuals for tasks they do every now and again perhaps as a service to the asset teams.
Don’t set targets for how many software applications you ultimately want to be using – because users might actually need all of the software tools on their computers. “It is more important to try to work out exactly what people need,” he said.
Once you have developed this clear model, everybody in the company can use it, even if they are not subsurface specialists. “It is something even IT managers can understand,” he said.
The ‘Dog-Tag’ model can be used to fill in gaps or remove duplicates in functionality as determined and agreed by the function. By aiming for a tool per task it is much easier for users to make choices about their training and technical development and for the support organisation to develop their data management processes and infrastructure plans. Controlling the applications portfolio is an essential cornerstone of an effective technical systems environment.
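The mapping described above can be sketched as a simple lookup from process component to the tools that serve it, from which duplicated functionality and gaps fall out directly. The stage, task and tool names below are illustrative assumptions, not NDB’s actual catalogue.

```python
# Map each process component (stage, task) to the tools that serve it.
# Names here are hypothetical examples, not a real tool inventory.
portfolio = {
    ("play evaluation", "basin dynamics"): ["ToolA"],
    ("play evaluation", "hydrocarbon charge"): ["ToolB", "ToolC"],  # duplicated functionality
    ("prospect evaluation", "well path planning"): [],              # gap: no tool assigned
}

# Components served by more than one tool are candidates for rationalisation;
# components with no tool are gaps to fill.
duplicates = {component: tools for component, tools in portfolio.items() if len(tools) > 1}
gaps = [component for component, tools in portfolio.items() if not tools]

print(duplicates)
print(gaps)
```

Aiming for one tool per task, as the article suggests, corresponds to every component in such a map holding exactly one entry.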