Use Case Examples

The term use case is used here to avoid confusion with the term workflow, which has a special meaning in this document. This topic includes a number of descriptions of use cases involving workflows and activities. Although the examples mention E&P concepts from the drilling and earth modeling domains, these concepts are not part of the activity model and can be found elsewhere in Seabed.

We recommend that you briefly skim over all of the use cases, and then review in detail the one or two that are of highest interest.

1. Drill a well. A drilling activity is initiated for exploration well red-cable-27 for client Big Oil, Inc. This activity begins by locating a best practice (workflow) based on the specific conditions expected: a deepwater North Sea well, high temperature and high pressure expected in the overburden, and a completion interval in the Cretaceous Zone. Subsequently, study data is identified and pulled together in a project. This data is considered to be of known high quality, because the original source of the data, the identity of the analysts and vendors involved in its handling, and the processing and QA steps to which it has been subjected can all be quickly ascertained. A plan is drawn up, and after the authorization for expenditure (AFE) is approved, staff, material, and equipment are assembled, and drilling commences. Tubulars and casing shoes are set and cemented three separate times during the process to avoid well control and formation damage problems. Drilling is interrupted on three occasions, twice for hole stability problems and once due to weather conditions. During the drilling, the trajectory is continuously compared to the plan, and variances from the expected geometry and progress are noted. An encounter with an unexpected fault forces a review of the overburden structural model (and in particular a recalculation of the stress field) and the generation of a revised plan. Depth-based drilling parameters are sampled and continuously recorded (rate of penetration, weight on bit, revolutions per minute, torque, hook-load, drilling fluid parameters, and so on), as are more static data such as drill-string descriptions. Throughout the drilling, daily reports are sent from the rig back to the operator's field office in Aberdeen. The reports summarize the day's progress, account for the time and money spent by coded categories, list quantities of consumables on hand, and detail any safety or environmental incidents.
Data from this and similar operator work in the same field is pooled and used to generate monthly management reports grouped by activity class and cost.
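The continuously recorded, depth-based drilling parameters lend themselves to a simple record structure. The following sketch is purely illustrative; the class, field names, and units are assumptions, not part of the activity model:

```python
# Illustrative sketch only: a depth-indexed record of the continuously
# sampled drilling parameters named above. Names and units are assumed.
from dataclasses import dataclass

@dataclass(frozen=True)
class DrillingSample:
    measured_depth_m: float
    rate_of_penetration_m_per_hr: float
    weight_on_bit_kN: float
    rpm: float
    torque_kNm: float
    hook_load_kN: float

def daily_progress(samples: list) -> float:
    """Depth drilled over the reporting interval, in meters,
    as a daily report might summarize it."""
    depths = [s.measured_depth_m for s in samples]
    return max(depths) - min(depths)

samples = [
    DrillingSample(1200.0, 12.5, 80.0, 120.0, 9.5, 950.0),
    DrillingSample(1260.0, 11.0, 85.0, 118.0, 9.8, 955.0),
]
print(daily_progress(samples))  # 60.0
```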

2. Create an interpretive backtracking computational example. A geoscientist selects a workflow for execution. The goal is to create an initial earth model by interpreting available seismic data, calibrating the velocity model with well data from a nearby field, converting the seismic to depth, creating a fault framework model, and creating a 3D property model from a combination of seismic and well data. The 3D property model includes a 3D stress field and a pore pressure model calculated for the overburden, and fluid saturation and transmissibility data in what appears to be a promising reservoir. This data is subsequently used to identify drilling targets and well trajectories, along with full drilling plans for five exploration wells. Subsequent experience from the drilling team on one of them (red-cable-27) indicates stability problems along its trajectory and the presence of an unidentified fault. Based on these findings, another version of the earth model is generated starting from a revised fault framework model, and the stress field is recalculated. The differences explain the stability problems encountered and are used to recalculate a revised trajectory plan that reaches the original targets with a trajectory less likely to be disrupted by formation instability. The resulting model holds up well in the resumed drilling and initial well testing. At the management exploration drilling review meeting, the predicted production volume estimates are challenged based on similar fields nearby, but are successfully defended by demonstrating data traceability from initial acquisition, through quality control and acceptance steps and all subsequent processing, right through to the data's use in the proposed business decision. As a consequence, there is a decision to put three of the original five exploration wells into production.
A dataset consisting of the key and consistent elements of the field subsurface earth model and associated processing history is generated, and added to the drilling records for all five wells, all of which are archived for later use in simulation studies aimed at minimizing lifting costs, and creating an infill drilling program.
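The data traceability that defends the estimates can be pictured as walking a dependency graph from each result back to its original acquisition. A minimal sketch follows; the class and data names are hypothetical, not the actual Seabed model:

```python
# Hypothetical sketch of data-dependency traceability: each data item
# records the activity that produced it and that activity's inputs,
# so lineage can be walked back to the originally acquired data.
class DataItem:
    def __init__(self, name, produced_by=None):
        self.name = name
        self.produced_by = produced_by  # Activity, or None if acquired

class Activity:
    def __init__(self, name, inputs):
        self.name = name
        self.inputs = inputs  # list of DataItem

def lineage(item):
    """All upstream data items, nearest first."""
    result = []
    if item.produced_by:
        for parent in item.produced_by.inputs:
            result.append(parent)
            result.extend(lineage(parent))
    return result

seismic = DataItem("raw seismic")
logs = DataItem("well logs (nearby field)")
velocity = DataItem("velocity model",
                    Activity("calibrate velocity", [seismic, logs]))
depth_model = DataItem("depth-converted model",
                       Activity("depth conversion", [seismic, velocity]))

print([d.name for d in lineage(depth_model)])
```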

3. Promote and refine an actual history for reuse as a best practice. Upon review of the records of the drilling of red-cable-27, the driller points out that a proposed method of avoiding swabbing-induced fluid inrush during the drill-string removal phase, prior to the setting of casing, had been effective in a situation that had rapidly escalated into a dangerous one. The method involved a protocol that synchronized a controlled upward acceleration of the tool with a transient mud pressure pulse each time the tool-string was raised to remove a joint of drill-pipe. Although the procedure required more time than the usual practice, it was effective in defusing a very tense situation. A drilling and workflow expert is called in to try to leverage this experience for reuse as a best practice. In discussion with the driller, the expert identifies several details about the events at the time, and the specific conditions that appear to have been key to both the need for and the success of the method. Based on this, the expert creates a revised version of the Tool_Runout workflow called Pressure_Compensated_Tool_Runout, and associates it with the conditions for which it is appropriate. An XML version of the resulting workflow description is exported and distributed to three other field offices, for incorporation into their catalogs of best practices.
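The export step might look like the following sketch, which serializes a workflow description using Python's standard library. The element and attribute names here are assumptions for illustration, not the actual schema of the exported XML:

```python
# Hypothetical serialization of the revised workflow description.
# Tag and attribute names are illustrative, not the actual schema.
import xml.etree.ElementTree as ET

workflow = ET.Element("Workflow",
                      name="Pressure_Compensated_Tool_Runout",
                      derivedFrom="Tool_Runout")
conditions = ET.SubElement(workflow, "ApplicableConditions")
ET.SubElement(conditions, "Condition").text = (
    "swabbing-induced inrush risk during drill-string removal")
step = ET.SubElement(workflow, "Step", order="1")
step.text = ("synchronize controlled tool upward acceleration with a "
             "transient mud pressure pulse at each joint removal")

xml_text = ET.tostring(workflow, encoding="unicode")
print(xml_text)
```

A file produced this way could then be imported by the other field offices into their own best-practice catalogs.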

4. Create an Activity_Template for a computational activity. The IT department of Big Oil decides to use a particular vendor's product for computing deviation stations from well directional survey data, and negotiates a corporate license. An IT technical expert creates the template objects defining this application, the various input and output data roles involved, and the best practice definitions describing when it is appropriate to use the application. These template-level definitions accompany the application distribution kit, making it easy and efficient to install the application as a component in an application framework system. The template objects make it easy to know when to use the application, help bind its input data when the application starts, and create the required data dependencies between the inputs and outputs. Because the computation requires minimal human interaction, such activities are eligible to run in batch mode. The field sites quickly find that knowing with certainty which input directional survey data was used in computing the output deviation data makes it unnecessary to keep the input data under most circumstances. This allows earlier pruning of this data and less recomputation, with the result that projects are simpler and less cluttered.
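How template objects might bind input data at activation and record input-to-output dependencies can be sketched as follows. All class, role, and data names here are hypothetical:

```python
# Hypothetical sketch: a template names the input and output data roles;
# an activation binds actual data to the input roles and records the
# input-to-output dependencies when the computation completes.
class ActivityTemplate:
    def __init__(self, name, input_roles, output_roles):
        self.name = name
        self.input_roles = input_roles
        self.output_roles = output_roles

class Activation:
    def __init__(self, template, bindings):
        missing = [r for r in template.input_roles if r not in bindings]
        if missing:
            raise ValueError(f"unbound input roles: {missing}")
        self.template = template
        self.bindings = bindings       # role -> actual input data
        self.dependencies = []         # (output data, input data) pairs

    def record_outputs(self, outputs):
        for out_data in outputs.values():
            for in_data in self.bindings.values():
                self.dependencies.append((out_data, in_data))

template = ActivityTemplate(
    "Compute_Deviation_Stations",
    input_roles=["directional_survey"],
    output_roles=["deviation_stations"],
)
run = Activation(template, {"directional_survey": "survey_v3"})
run.record_outputs({"deviation_stations": "stations_v1"})
print(run.dependencies)  # [('stations_v1', 'survey_v3')]
```

With the dependency recorded, the input survey can be pruned and the stations recomputed from a re-fetched copy only if ever needed, which is the behavior the field sites exploit.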

5. Workflow composition example. Several sites mention that the processing steps for a common workflow involve repetitive and time-consuming actions, such as repeating some data selections multiple times. To simplify the process, a couple of expert users assemble a composite workflow from existing workflow steps or sub-workflows in their repository. The plumbing details of hooking the data outputs of one step into the inputs of one or more other steps are specified in advance, to reduce user involvement when the composite workflow is activated. The overall inputs and outputs of the composite workflow are exposed, giving the resulting composite workflow a uniform appearance to the application framework. As a result, as with the atomic template example (use case 4), input expectations can be bound to actual data when the activity class is selected for execution, and data dependencies are recorded, all in a standard manner. Since the workflow involves non-computational approval steps (AFE), it effectively mixes computational and non-computational workflow classes. Again, best practice definitions are provided to guide people in selecting the resulting workflow.
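The composition described above, in which internal plumbing links hide connected endpoints and only the unconnected ones are exposed as the composite's interface, can be sketched as follows (hypothetical names, not the actual framework API):

```python
# Hypothetical sketch of workflow composition: "links" wire one step's
# output into another step's input; endpoints not consumed or produced
# internally become the composite workflow's own inputs and outputs.
class Step:
    def __init__(self, name, inputs, outputs):
        self.name, self.inputs, self.outputs = name, inputs, outputs

class CompositeWorkflow:
    def __init__(self, steps, links):
        # links: {(consumer step, input name): (producer step, output name)}
        self.steps, self.links = steps, links

    def exposed_inputs(self):
        return [(s.name, i) for s in self.steps for i in s.inputs
                if (s.name, i) not in self.links]

    def exposed_outputs(self):
        internal = set(self.links.values())
        return [(s.name, o) for s in self.steps for o in s.outputs
                if (s.name, o) not in internal]

select = Step("select_data", ["project"], ["dataset"])
process = Step("process", ["dataset"], ["result"])
wf = CompositeWorkflow(
    [select, process],
    {("process", "dataset"): ("select_data", "dataset")},
)
print(wf.exposed_inputs())   # [('select_data', 'project')]
print(wf.exposed_outputs())  # [('process', 'result')]
```

Because only the exposed endpoints are visible, the framework can bind and track the composite exactly as it would an atomic template, which is the uniformity the use case relies on.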

The following table summarizes the major features of the Activity model touched by these use case examples:

| Use Case | Best Practice | Workflow | Data Dependency Involvement | Workflow Sharing | Activity | Kind of Activity Supported | Data Involvement | Other |
|---|---|---|---|---|---|---|---|---|
| 1 | Use | Selection and activation | Used to quantify data quality | No | Adapts to real world | Oilfield | Yes | Tracking plan versus actual; data lifecycle and generalization |
| 2 | Use | Selection and activation | Recorded and used to keep computed results fresh | No | Adapts to real world | Mixed Computation and Oilfield | Yes | Data versioning and support for backtracking interpretation; data lifecycle |
| 3 | Create | Created from actual history | None | Yes | Reuse | Oilfield | No | Learning from experience |
| 4 | Create | Created from product specifications | Creates | Yes | Batchable | Computation | Yes | Data lifecycle |
| 5 | Create | User tailored for simplicity | Creates | Yes | Composite | Mixed Computation and Oilfield | Yes | Uniformity between simple and compound workflows; data lifecycle |