On 22-23 June 2013, a symposium was held to celebrate the careers of three giants of CFD: Antony Jameson, Phil Roe, and Bram van Leer. The so-called JRV Symposium, sponsored by the Air Force Office of Scientific Research and the University of Kansas, was actually titled Four Decades of CFD: Looking Back and Moving Forward (conference website). About 85 people gathered in San Diego prior to the annual AIAA Fluids conferences for this once-in-a-lifetime event.
Over the course of two days, speakers – many of the biggest names in CFD and computational science – provided insight into the honorees’ early careers and described how their work is being used today. Late on the second day the honorees themselves took to the podium to share their respective visions about what lies ahead. As conference organizer Z.J. Wang said, examining the body of work created by these three gentlemen will allow us to move forward with confidence and conviction.
What follows are just a few moments that stood out for me.
In his look back at Jameson’s career, John Vassberg (Boeing) touted the magnificence of the FLO22 code, which he called a major paradigm shift in aerodynamics. The code has been run literally hundreds of times per day since 1975 and, remarkably, only one bug has ever been found since its release. In discussing the early benchmarking of the code, Vassberg joked that the ONERA M-6 wing must have been invented before the circle due to its extensive use in validation. Vassberg also ran FLO22 in real time in the background while giving his presentation. The solution on 154 million nodes took only 1,400 seconds.
Meng-Sing Liou (NASA Glenn) presented some work on multi-disciplinary optimization and made the interesting statement that aerodynamic optimization involves making the flow more 2D (or maybe I misunderstood him). Liou mentioned openmdao.org, an open-source Python-based framework for optimization.
Princeton’s Gigi Martinelli cited work from 1984 on structured grid RANS computations for high Reynolds number flow which was notable for the use of multigrid time stepping for convergence acceleration. A conclusion of this work was that care must be taken to counteract the negative effects of high aspect ratio meshes. As more evidence of a Jameson game-changer, Martinelli cited his work on limiters in the 90s. Martinelli also discussed how CFD has changed over the years from basic research to software engineering. As an example of that, he spoke of an effort to rewrite his codes in an object-oriented framework for better long-term maintenance.
NRL’s Elaine Oran gave a frank presentation on the computation of reacting flows. With the current state of the art, they don’t know whether the laws of fluid mechanics are applicable to the phenomena they seek to simulate. They know definitively that the modeling of chemical reactions is wrong.
INRIA’s Frederic Alauzet came at the issue from the meshing standpoint because, as he said, “no mesh, no simulation.” Even though mature meshing technology was widely available by the end of the 1990s, and the 2000s saw progress in handling increasing geometric complexity, the trick now is how to correctly generate anisotropic meshes and, specifically, what metrics tell us whether these highly stretched meshes are any good. The future of meshing in his opinion is anisotropic meshing for viscous resolution, for high-order elements, and for moving geometry.
Phil Colella (Berkeley Lab) made an interesting observation. He expects bytes per flop in computers to decrease by an order of magnitude over the next decade due to power concerns. Therefore, to remain efficient we must get more computations per stored data and this leads us to high order methods.
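Colella’s argument can be made with back-of-envelope arithmetic: a high-order element method does more floating-point work per byte of data it touches, so it suffers less when bytes per flop shrink. A minimal sketch, where the flop counts are my own illustrative assumptions (one multiply-add per dense-operator entry), not figures from his talk:

```python
def dg_intensity(p, bytes_per_value=8):
    """Rough flops-per-byte of applying a dense 1D derivative
    operator in a degree-p element method with n = p + 1 nodes."""
    n = p + 1
    # Dense n x n matrix-vector product: each output node costs
    # ~2*n flops (a multiply-add per matrix entry in its row).
    flops_per_node = 2 * n
    # Each node carries one double-precision value (8 bytes).
    return flops_per_node / bytes_per_value

for p in (1, 3, 7):
    print(p, dg_intensity(p))   # intensity grows linearly with order
```

Under these assumptions the arithmetic intensity rises from 0.5 flops/byte at p=1 to 2.0 at p=7, which is the connection between shrinking memory bandwidth and high-order methods.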
I was looking forward to Chris Depcik’s talk on the Dangers of Commercial CFD. (I felt bad for spoiling his joke that he was the only guy in attendance without a Ph.D. in aero.) Because undergraduates treat CFD as a black box, it’s imperative that it work correctly. Cool pictures do not imply validation.
- Both geometry and physics must be accounted for when meshing.
- Best practice calls for the use of industry specific tools tuned for specific problem types.
- Quick turnaround is essential.
- You must realize that the CFD user likely doesn’t do CFD as a full-time job.
- You can’t be effective at CFD without practice and training.
Cessna’s David Levy was very clear when he said that the Drag Prediction Workshop (DPW) was all about the grid. Because of that he believes that adaptive grids are exactly where we need to go. DPW results indicate that a 10-count variation in drag is common across all CFD. And even with grids of 130 million points the flow is still under-resolved and not trending toward asymptotic convergence.
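For scale, a drag count is the standard aerodynamics unit of 1 × 10^-4 in drag coefficient. A trivial converter makes the size of that spread concrete; the cruise CD below is a typical transport-aircraft figure I’ve assumed, not a number from the workshop:

```python
def counts_to_cd(counts):
    # 1 drag count = 1e-4 in drag coefficient CD.
    return counts * 1e-4

spread = counts_to_cd(10)        # 0.001 in CD
typical_cruise_cd = 0.028        # assumed transport-aircraft cruise CD
print(100 * spread / typical_cruise_cd)  # a few percent of total drag
```

A 10-count spread is a few percent of total cruise drag, which is enormous when one count can translate into meaningful payload or range on a transport aircraft.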
Paul Ullrich does earth system climate modeling with 100 km grid resolution. While he’s hoping for 25 km resolution by 2014, most macro features of the earth are sized around 1 km.
Kozo Fujii (JAXA) asked: what does RANS CFD tell us about fluids? Back in 1977 Stanford’s Dean Chapman said that there were only two major motivations for the use of CFD: providing important new technical capability and economics. Fujii does not expect this to change in the coming decades.
During his talk, Antony Jameson repeated something attributed to him from another event – CFD has been on a plateau for 15 years. During that time period we’ve been living on 2nd order accurate methods developed in the 80s and 90s. And while those methods are fine for transonic wings, high lift geometries need high order methods. What’s also needed is a move toward LES or DES. He anticipates mesh sizes of 10^11 grid points by 2040. Despite that, he remains highly motivated by the idea of a “numerical wind tunnel” that exists at the intersection of math, computer science, aerodynamics, and fluid mechanics.
As Phil Roe put it, back in 1978 nothing worked. Now in 2013 almost everything works but nothing works particularly well. In his opinion, all the difficulty in CFD comes from trying to simulate complexity using grids that are too coarse. Stated another way, at small scales nothing about fluid dynamics is particularly difficult. But at large scales the flow becomes arbitrarily complex. What we should be seeking are methods that allow us to accurately bridge the small and large scales.
It was quite an honor to be present for what Bram van Leer called his last conference presentation of a 47 year career. In his mind, 1980 was a pivotal year in the development of CFD – the year of the Riemann solver. Everything we’re doing today is an evolution of that early development.
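For readers who haven’t met one, the flavor of an approximate Riemann solver fits in a few lines. This sketch is only the scalar special case for the 1D inviscid Burgers equation, not Roe’s full Euler scheme, where the averaged wavespeed below becomes a Roe-averaged Jacobian matrix:

```python
# Roe-type approximate Riemann flux for u_t + (u^2/2)_x = 0.

def flux(u):
    return 0.5 * u * u

def roe_flux(uL, uR):
    # Roe-averaged wavespeed a = (f(uR) - f(uL)) / (uR - uL), which
    # for Burgers reduces to the arithmetic mean of the two states.
    a = 0.5 * (uL + uR)
    # Upwind: take the flux from the side the wave comes from.
    return flux(uL) if a >= 0 else flux(uR)

# A right-moving shock (uL=2 > uR=0) picks up the left-state flux.
print(roe_flux(2.0, 0.0))   # 2.0
```

Even this scalar version shares the famous caveat of the full scheme: at a transonic rarefaction (uL < 0 < uR) it needs an entropy fix to avoid admitting an expansion shock.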
When asked about CFD in 2030, the honorees said the following.
- Roe: It’s hard to say without knowing what computers are going to look like then.
- Jameson: How are we going to simulate turbulence in DNS?
- van Leer: How will our software have to change in order to use 100,000 processors?
When asked what has kept them up at night over the course of their careers:
- Roe: He was never quite certain early on that CFD was going to last and was concerned about developing a transferable skill set.
- van Leer: Developing hyperbolic relaxation equations for rarefied gases.
- Jameson: Proving that something was going to work in practice.
Finally, Roe had a very interesting take on debugging CFD codes.
- Rule #1: The bug is not in the section of code you’re looking at.
- Rule #2: Rule #1 is of no practical use.
You may be curious why I attended. It’s because deep inside every mesh generation guy is a little CFD guy struggling to get out.
No one should infer anything from the fact that I didn’t cite all the speakers or include a photo of each one. Keep in mind that I’m a terrible photographer and not all my photos were worth using. Despite my [barely legible] notes, it’s been two months since the event and my memory isn’t what it used to be. And because I’m not a solver algorithm guy, let’s just say that the more technical presentations would be difficult to reduce to blog form.
Here are a couple of online resources for those of you who wish to dig deeper.
Thank you, Z.J., for organizing and hosting this event. It was a pleasure to attend.