When using source data, understanding a well's behavior depends on the details found in the signals that well provides. However, what if those signals are somehow masking issues rather than illuminating them? To better explain, consider the figure above. It shows the gas injection rate on a single well over a period of 20 days. The data displayed is downsampled to show an average point every 4 hours, and the candlestick bars show the minimum and maximum value within each of those 4-hour windows. As can be seen in the top graph, the injection rate is constantly changing.
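As a minimal sketch of the downsampling described above, the snippet below aggregates a raw injection-rate series into 4-hour candlesticks (mean, min, max) with pandas. The data here is simulated noise and the column names are illustrative assumptions, not the project's actual source system.

```python
import numpy as np
import pandas as pd

# Simulate 20 days of 1-minute injection-rate readings with noise
# (hypothetical values; real data would come from the well's sensor).
rng = np.random.default_rng(0)
index = pd.date_range("2024-01-01", periods=20 * 24 * 60, freq="min")
rate = pd.Series(500 + rng.normal(0, 40, len(index)),
                 index=index, name="gas_injection_rate")

# Aggregate into 4-hour windows: the plotted point is the window mean,
# and the candlestick bar spans the window's min and max.
candles = rate.resample("4h").agg(["mean", "min", "max"])

print(candles.shape)  # 20 days / 4 hours = 120 windows
```

Plotting the mean alongside the min/max range is what makes a noisy signal like the one in the top graph visually obvious.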
On this project, during a detailed well review, a Production Engineer asked why the gas injection rate appeared to be changing on a subset of wells on one of his routes. The Production Engineer knew that no changes were being made and that the compressors were operating correctly. Additional review found more wells with similarly fluctuating signals. Closer examination determined that the proximity of the compressor to the injection sensor was causing interference. Acoustic filters were added to the compressors, and the result was a clear and consistent signal (shown in the lower graph of the figure above).
Production Engineers can't simply assume that the source data is correct. If this engineer had used the highly variable gas injection rates to make recommendations, the projected injection rates would have been consistently wrong. In this case, only a small set of wells was affected, but from an organizational perspective, it likely would have cast doubt on injection recommendations across all fields.
One of the key takeaways here is that good data goes beyond sensors. Over the next few weeks, we will be discussing what "good data" is and what good data quality looks like. We hope you will join in on our discussions by posting in the comments section of each blog. If you want a jump start on the ideas, you can request our white paper entitled "Data Quality Fuels the AI Race."