
"Prediction is very difficult, especially if it's about the future"
 - Niels Bohr, Nobel laureate in Physics

Whether you like it or not, as a simulation engineer you are in the prediction game. Put simply, your job is to predict how an abstract design would perform in the real world, hopefully accounting for the most challenging operating conditions it is likely to experience during its working life.

Compared with other professional forecasters such as economists, television meteorologists or political commentators, the audience for engineering predictions is more critical and less likely to forgive. While incorrect weather forecasts are quickly forgotten (at least those that don't involve hurricanes), and one rarely takes economists seriously, the cost of getting an engineering prediction wrong can be enormous. The failure of a product in service can have serious consequences, particularly in safety-critical applications where unforeseen failure can result in injury or loss of life. Even in less serious circumstances, the unexpected failure of a product can put off consumers, damaging brand reputation and potentially incurring large warranty expenses.

The problem is that uncertainty is a fundamental part of all prediction; no engineering prediction is perfect and no simulation model is a complete representation of the real-world scenario. Every model is based upon a set of underlying assumptions that allow it to be solved numerically but ultimately influence the accuracy of the prediction. As engineers, we are responsible for acknowledging and understanding the uncertainty in our predictions and, wherever possible, for minimizing that uncertainty through judicious modeling assumptions.

As Bill mentioned, modeling choices are often dictated by economic and practical constraints. Not so long ago, almost every model had a symmetry plane (even when we knew deep down that asymmetrical flow was at least a possibility). Our simulations were usually 'steady state' (even though we knew that most fluid mechanics problems are inherently unsteady), and if we didn't make them 'isothermal' or 'incompressible', then our simulation models were usually surrounded by adiabatic or fixed-temperature boundary conditions. It's not that these simulations were necessarily bad*, it's just that they were the best that we could do with the resources available at the time (and of course any of these assumptions are still perfectly acceptable today if used in the right context).

[Image: man falling into a net]

Whatever the problems of yesteryear, simulation technology continues to evolve rapidly. Combined with the increased availability of computer processing power, this gives engineers the opportunity to improve the quality and accuracy of their engineering simulations. Deciding exactly how much physics (and chemistry) to include in a numerical model is one of the key challenges of any simulation project. It requires a deep understanding of the complexity of the system and often involves making difficult choices.

The best simulation engineers deal with this by 'solving the problem in their head' before even starting to build the model. They undertake a careful process of 'identification and formulation', in which they attempt to determine the relative influence of the various physical phenomena that underlie their engineering problem. It's not just a matter of seeing how much physics you can throw at a problem; the art of the simulation engineer lies largely in knowing how much physical complexity can be safely excluded from the simulation.
 
Whereas previous generations of engineers could take some comfort in the 'safety net' of extensive physical testing to rescue them from the occasional poor prediction, Computer-Aided Engineering is increasingly the victim of its own success as simulation continues to displace hardware testing as industry's verification method of choice. Although this increased confidence in simulation is well-deserved (and has been hard-earned through many years of successful prediction), it brings with it a great deal of pressure to 'get the answer right' almost every time.

Although none of us will likely ever make 'the perfect prediction', the best way of improving our prediction accuracy is through a continual process of critical review, assessing how well our previous predictions matched up to reality. For me, 'Simulating Systems' is about constantly challenging modeling assumptions and, wherever possible, making better choices that reduce the amount of uncertainty in the prediction - ultimately resulting in better and safer products.

 

 

*Actually, some of them really were bad simulations. In the early days of my career, I remember someone enthusiastically showing me a fantastic simulation of flow past a two-dimensional oil rig.
