Process modelling: A complex matter
23 Nov 2010
At a recent modelling software conference in London, delegates heard how Spanish petrochemicals major Repsol had achieved massive savings by employing predictive modelling to design out two large distillation columns at a new petrochemical/propylene oxide facility - an industry first.
A group of refiners, meanwhile, was reported to be performing model-based energy optimisation of refinery preheat trains, while Procter & Gamble (P&G) is achieving major performance gains from its application of model-based design and operations techniques.
These - and other case studies from companies such as Shell, PURAC and Sulzer Chemtech - highlighted the major strides being made in the use of predictive modelling to optimise process design and operation.
Given these achievements, which were presented by end-users of PSE’s gPROMS software, the mood at the event turned surprisingly downbeat when discussion moved on to the future of this advanced technology.
The problem, it seems, is that the technology is struggling to find champions - termed ‘priests’ by one delegate - outside the core user base, which tends to share strong expertise in applied mathematics. Indeed, delegates reported that modelling was sometimes a ‘dirty word’ at their companies, as it was seen as costly, time-consuming, complex and outside the everyday scope of most mechanical, civil and chemical engineers.
Setting the scene, Nilay Shah, professor of process systems engineering at Imperial College London, highlighted a need to get scientists, rather than just engineers, involved in modelling - and at the earliest possible stage of a project: “[We must] get an appreciation of the value of modelling into people and processes upstream of where it is traditionally used. In most organisations, that means transitioning modelling from engineering to science.”
From a process systems engineering viewpoint, therefore, modelling needs to come into play much earlier in the value chain, rather than being introduced late, when it becomes disruptive and inefficient. It should, argued Shah, be applied at the very early, data-poor stage of a project - otherwise the wrong decisions can be made on the basis of the wrong data.
Cultural barriers
For Ben Weinstein, head of chemical systems modelling and simulation at P&G, there remain many “cultural” barriers to the use of modelling in a whole range of business units across organisations. Many engineers, he added, feel threatened by models and need convincing that the technology will actually make them more effective, rather than replace them.
“We don’t want to go from incremental success to incremental success, but any time you [show a value proposition] to a new business unit, it is like starting all over again - even if you can show them results from other areas,” said Weinstein.
“We have this saying - ‘Not invented here’ - so that if it is not invented in your business unit it doesn’t seem to have as much credibility,” continued Weinstein, who believes that one way to break down these barriers is to establish a comprehensive unit operations library to drive investments and decisions based on predictive modelling.
“Even still,” he said, “it is difficult to convince management, even when you have success. Right now, hopefully, we are starting to get lots of traction, as we have done some very successful things that have taken years to develop.”
In a similar vein, Michael Baldea of industrial gas company Praxair said that the use of modelling in applications such as flowsheet optimisation, equipment design and dynamic processes “generates significant value, with a lot more potential going forward.”
So the challenge is more organisational, said Baldea. “You have an intelligent, educated user base, but their education is not around the skills needed to handle models for optimisation at this point in time. These skills are not part of the curriculum and, the way I see it going, they are not going to become part of the curriculum.”
Early experiments
While employees need to be taught to use these models, it must be done without major impact on their workflow and workload, noted Baldea, who is currently pushing to create an internal user group at Praxair.
This, he said, “will ensure that: a - everybody gets educated in how to use the latest products; b - we have a consistent approach to modelling across the organisation; and c - we minimise the amount of re-work that we have to do when we do modelling.”
Meanwhile, Shah highlighted the value of modelling in guiding early experiments, explaining scale issues and supporting technical due diligence. It can also, he said, help people to identify the real intellectual property of their new invention, as well as improve their understanding of the process.
Despite these paybacks, Shah estimates that, 90% of the time, process modelling is used only after the equipment has been built and the company needs to de-bottleneck, troubleshoot or scale up a process.
“This applies whether it is carbon capture and storage, where you want to employ your model before you build on a very large scale and spend a lot of money, or indeed if it is a very small-scale process, such as looking at new bio-based materials.”
Richard Jarvis of the UK’s National Nuclear Laboratory alluded to experiences where experimental facilities were designed without enough consideration of how they were to be used, or of precisely what data they would be required to collect. Applying model-based techniques to identify data uncertainties at an earlier stage of the project would avoid this problem, he said.
Likewise, Weinstein of P&G said that 70-80% of the cost and success of any enterprise is fixed at the front end. “That’s where we need to push modelling,” he said. “Once the process is in place, you are going to improve it marginally by a couple of per cent, but the front end is where you make and break a business proposition.”
Delegates at the gPROMS user conference in London voiced concerns about the shortage of people coming into the engineering professions with any real understanding of the mathematics of modelling and its capabilities.
According to Richard Jarvis of the National Nuclear Laboratory, less than 10% of his job interviewees show a basic understanding of mass transfer and heat transfer coefficients, and their correlations.
“It seems there are walls between the different areas and people can’t think across those areas,” Jarvis suggested. “Simulation might be a way of breaking down those barriers.”
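As an aside, the kind of correlation Jarvis refers to is compact enough to express in a few lines of code. The sketch below is illustrative only and is not drawn from the article: it assumes the textbook Dittus-Boelter correlation for turbulent flow in a smooth tube, with nominal property values for water, and simply shows how a heat transfer coefficient falls out of the dimensionless groups.

```python
# Illustrative sketch: tube-side heat transfer coefficient from the
# Dittus-Boelter correlation, Nu = 0.023 * Re**0.8 * Pr**0.4
# (fully turbulent flow in a smooth pipe, fluid being heated).

def heat_transfer_coefficient(velocity, diameter, density, viscosity,
                              conductivity, heat_capacity):
    """Return the convective heat transfer coefficient h in W/(m^2 K)."""
    re = density * velocity * diameter / viscosity    # Reynolds number
    pr = heat_capacity * viscosity / conductivity     # Prandtl number
    if re < 1e4:
        raise ValueError("Dittus-Boelter assumes turbulent flow (Re > 1e4)")
    nu = 0.023 * re**0.8 * pr**0.4                    # Nusselt number
    return nu * conductivity / diameter               # h = Nu * k / D

# Example: water at roughly 20 degC flowing at 1 m/s through a 25 mm tube
# (property values are nominal, for illustration only).
h = heat_transfer_coefficient(velocity=1.0, diameter=0.025, density=998.0,
                              viscosity=1.0e-3, conductivity=0.6,
                              heat_capacity=4182.0)
print(f"h ~= {h:.0f} W/(m^2 K)")  # roughly 4,000 W/(m^2 K)
```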
However, Ben Weinstein of P&G noted that, with many advanced modelling tools, you need a modelling and simulation expert to run them.
“Until the tools get to the point where they are readily usable by someone - creating flow sheets, topologies, allowing them to be creative with the tools - then it is going to continue to be a barrier,” he said.
Likewise, Nilay Shah of Imperial College London said that the decline in the number of mathematics graduates and undergraduates highlighted the need to provide functionality that works across a range of users with different levels of mathematical ability.
In recent years, he explained, Imperial College has completely changed the way it teaches engineering towards instilling a much more fundamental understanding of mathematics from the outset.
“Many students have a fear of mathematics and develop a recipe approach to dealing with the subject. We teach that mathematics is the most powerful tool within engineering, which helps students to understand chemical reactions, engineering, mass transfer, heat transfer and process design,” he said.
“We break mathematics down to the concepts that are important to engineering and go through them from a fundamental basis and then up towards getting into some of the engineering applications of that. As a result, final year and MSc students learn how to model quite complex systems and solve real problems.”