The oil industry lowers fiber-optic sensing cables into wellbores to better understand why hydraulic fracturing doesn’t free trapped oil at expected rates in shale reservoirs, but the massive data streams these cables return are hard to analyze. Now a team of researchers from Texas A&M University and the Colorado School of Mines has created an algorithm to clean up this subsurface data and offer a clear view of how and where these fracturing processes succeed and fail.
“Our quantitative characterization retrieves more information about fracture geometries within a reservoir than a simple qualitative analysis would,” said Dr. Kan Wu, associate professor and Chevron Corporation Faculty Fellow in the Harold Vance Department of Petroleum Engineering. “We’ve tested our algorithm and already applied it in the field.”
The results were published in the Society of Petroleum Engineers’ journal SPE Production & Operations.
Traditional data interpretation methods, though incredibly helpful to engineers, rely strictly on qualitative information or on probabilities inferred from patterns in the data. In contrast, the new algorithm was developed to extract quantitative, measurable data, such as changes in temperature, pressure, or rock deformation within a reservoir. It infers the fracturing events that produced those changes and accurately models how far and how fast the fractures traveled, what directions they took, and how large they grew.
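To make the idea of quantitative interpretation concrete, here is a minimal, purely illustrative sketch of one way low-frequency DAS strain-rate data might be denoised and searched for the fiber channel where a fracture intersects a monitor well. This is not the authors' published algorithm; the smoothing window, the synthetic data, and the use of cumulative strain as a fracture-hit proxy are all assumptions made for illustration.

```python
# Illustrative sketch only -- not the published algorithm.
# Denoise low-frequency DAS strain-rate data with a moving average,
# then find the channel with the largest cumulative strain as a crude
# proxy for the location of a fracture hit. All values are hypothetical.
import numpy as np


def smooth(signal: np.ndarray, window: int = 5) -> np.ndarray:
    """Moving-average low-pass filter applied along the time axis."""
    kernel = np.ones(window) / window
    return np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, signal
    )


def locate_fracture_hit(strain_rate: np.ndarray) -> int:
    """Return the fiber channel whose cumulative strain is largest."""
    cumulative = np.abs(np.cumsum(smooth(strain_rate), axis=1)).max(axis=1)
    return int(np.argmax(cumulative))


# Synthetic example: 50 fiber channels x 200 time samples of noise,
# plus a sustained extension signal injected on channel 23.
rng = np.random.default_rng(0)
data = rng.normal(0.0, 0.1, size=(50, 200))
data[23, 80:160] += 1.0  # simulated fracture-hit strain rate
print(locate_fracture_hit(data))  # -> 23
```

The sustained positive strain rate on one channel accumulates far faster than the zero-mean noise on its neighbors, which is why a simple cumulative-strain maximum recovers the injected channel here.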
Low-frequency distributed acoustic sensing (DAS) data gathering has only been around for about five years, so not all of the information received from fiber-optic-equipped wells has been fully deciphered. Each well also has its own range of characteristics because subsurface structures vary enormously. This complexity is why Wu and her colleagues, fellow faculty member Dr. George Moridis, professor and Robert L. Whiting Chair, and Dr. Ge Jin, assistant professor of geophysics at Mines, needed considerable time to meticulously develop their algorithm.
First, the researchers tested the algorithm’s ability to clean the data and interpret simple streams from known fracture processes, so they could work backward through the information to find the starting point of a fracture’s growth. As the algorithm was extended to handle more complex information, they improved its ability to model forward and predict how new, more complex fractures initiate and grow.
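The backward-then-forward workflow described above can be sketched with a toy model. Assuming, purely for illustration, that fracture half-length follows a power law L(t) = C·tⁿ, one can fit the parameters to observed fracture-front positions (the backward step) and then extrapolate future growth (the forward step). The growth law, parameter names, and numbers below are assumptions, not the authors' published model.

```python
# Illustrative sketch only: fit a power-law growth model L(t) = C * t**n
# to observed fracture-front positions, then forward-predict later growth.
# The growth law and all values here are hypothetical.
import numpy as np


def fit_growth(times: np.ndarray, lengths: np.ndarray):
    """Backward step: least-squares fit of log L = log C + n * log t."""
    n, log_c = np.polyfit(np.log(times), np.log(lengths), 1)
    return np.exp(log_c), n


def predict_length(t: float, c: float, n: float) -> float:
    """Forward step: extrapolate the fitted model to a later time."""
    return c * t ** n


# Synthetic observations: front positions (m) at times (minutes) that
# exactly follow L = 12 * sqrt(t), so the fit should recover C=12, n=0.5.
t_obs = np.array([10.0, 20.0, 40.0, 80.0])
l_obs = 12.0 * t_obs ** 0.5
c, n = fit_growth(t_obs, l_obs)
print(round(c, 2), round(n, 2))               # -> 12.0 0.5
print(round(predict_length(120.0, c, n), 1))  # forward prediction at t=120
```

Fitting in log-log space turns the power law into a straight line, so an ordinary linear least-squares fit suffices; a real workflow would of course use a physics-based fracture-propagation model rather than a one-parameter curve.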
Wu is an expert in rock mechanics, Jin in geophysics and DAS technology, and Moridis in advanced numerical methods and high-performance computing of coupled processes. Because of the team’s multidisciplinary backgrounds, the algorithm possesses incredible flexibility to grow and adapt to the type of data it receives. For instance, Yongzan Liu, the graduate student on the project for over two years, is now a postdoctoral researcher at Lawrence Berkeley National Laboratory, using similar methods and modeling on fiber-optic data from hydrate-bearing sediments to monitor natural gas production.
Liu, Wu, Moridis, and Jin are the first to develop this type of algorithm and publish results. The goal of their research is to eventually automate the algorithm so that feedback from fracturing events happens in near real time on a drill site. This way, engineers can quickly tailor fracture design efforts to each well’s particular composition.
“The industry needs this type of tool to understand fracture geometry and to monitor fracture propagation,” said Wu. “The more efficient it becomes, the better it will help optimize hydraulic fracture and completion designs and maximize well production.”
- This press release was originally published on the Texas A&M University website