Leveraging HPC for Monitoring in Oil and Gas
The speaker, Ashe Menon, is the VP of Operations at National Oilwell Varco (NOV). NOV uses high-performance computing (HPC) to optimize drilling, reservoir modeling, and data analysis. As a global leader in oil and gas technology, NOV's innovative use of HPC enhances operational efficiency and drives industry advancements. Menon is also responsible for Auredia, a startup focused on using technology to augment frontline manufacturing employees and increase profitability. He previously served as the Senior VP of Global Operations for the Grant Prideco division at NOV. Before that, he spent several years as an entrepreneur until his company was acquired by NOV, the largest oilfield equipment manufacturer in the world.
Menon began his talk with NOV's motto: "We power the industry that powers the world." When there are no other options, he said, NOV gets the call. For instance, the plane that was successfully landed on the Hudson River years ago was lifted from the water by NOV's cranes, and the company was called in to help supply power to New York after Hurricane Sandy.
He then touched on some of the greatest challenges in oil and gas drilling. In some cases, once the drill bit hits the seabed, there is still one and a half Mount Everests' worth of depth to traverse before hitting oil. Other challenges include remote locations, rugged terrain, low connectivity, and the cost of downtime, which can run as high as $2 million per day when things break. Machine learning helps predict what is going to fail.
He walked through several simulations a data scientist had produced and explained them this way: what if your car stops because the fuel tank is empty, but all your dials are still moving and you don't know which one measures fuel level? The gauge that changes the most is probably the one tied to the problem. Data scientists must go in blind and determine which gauge measures the fuel. There is a definite correlation between drilling failures and machine data, but scaling that insight is the difficult part.
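The "gauge that changes the most" heuristic can be sketched in a few lines of code. This is a minimal illustration, not NOV's actual method: the sensor names and readings are hypothetical, and each gauge is scored by how far its recent readings drift from its own baseline, measured in baseline standard deviations.

```python
# Illustrative sketch of the "gauge that changes the most" idea.
# Sensor names and values are hypothetical, using only the stdlib.
from statistics import mean, stdev

def rank_gauges_by_change(baseline, window):
    """Score each gauge by how far its recent readings drift from its
    baseline, normalized by the baseline's standard deviation."""
    scores = {}
    for gauge, history in baseline.items():
        mu, sigma = mean(history), stdev(history)
        recent = mean(window[gauge])
        scores[gauge] = abs(recent - mu) / sigma if sigma else 0.0
    # Highest score first: the gauge that "changes the most".
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Normal operating history for each (hypothetical) gauge...
baseline = {
    "rpm":       [120, 121, 119, 120, 122],
    "torque":    [55, 54, 56, 55, 54],
    "fuel_flow": [10, 10, 11, 10, 10],
}
# ...versus readings taken just before the machine stopped.
window = {"rpm": [121, 120], "torque": [55, 56], "fuel_flow": [2, 1]}

ranking = rank_gauges_by_change(baseline, window)
print(ranking[0][0])  # fuel_flow deviates most, so inspect it first
```

Normalizing by each gauge's own variability matters: a gauge that naturally swings a lot should not outrank a normally steady gauge that suddenly collapses, which is exactly the "which dial measures the fuel" problem Menon described.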
"Everything should be made as simple as possible, but not simpler," Menon said, quoting Albert Einstein. He used the line to argue that ten years of hard data are worthless to the industry unless the data has integrity. For instance, when a large dataset was queried to determine the longest instance of downtime, the most common recorded reason turned out to be "Other," and the second most common was "N/A." Data integrity and proper labeling are therefore of utmost importance in data collection.
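The "Other"/"N/A" problem is easy to demonstrate: before any modeling, tally the labels and measure how much of the dataset is uninformative. The records and label names below are hypothetical, a minimal sketch of that sanity check.

```python
# Illustrative sketch: audit label quality in downtime records before
# trusting any analysis built on them. The data below is hypothetical.
from collections import Counter

UNINFORMATIVE = {"Other", "N/A", "", None}

downtime_reasons = [
    "Other", "N/A", "Other", "Pump seal failure", "Other",
    "N/A", "Top drive fault", "Other", "N/A", "Pump seal failure",
]

counts = Counter(downtime_reasons)
junk = sum(n for label, n in counts.items() if label in UNINFORMATIVE)

print(counts.most_common(2))  # [('Other', 4), ('N/A', 3)]
print(f"{junk / len(downtime_reasons):.0%} of records carry no signal")
```

If the two most common labels are "Other" and "N/A," as in Menon's example, no amount of HPC horsepower will extract a reliable answer from the dataset; the fix has to happen at collection time.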
Menon discussed the surplus of workers and the scarcity of HPC talent. Because AI will do much of the work in the future, he argued, the shortage of data scientists is not as big a problem as one might expect. "Data is doing the work," said Menon. "But we can train anyone to do the work." He gave several examples of forensic anthropologists and baristas alike working in data science, which is why he hires for attitude rather than only for data science degrees.
Menon finished his talk, delivered both online and in person, with a question-and-answer period.
Menon has been published in more than 10 oil and gas industry magazines and has presented papers at conferences around the world. He is an alumnus of the University of Houston and Harvard Business School, and he received the 2017 Smart Industry Top 50 Digital Innovators award.