Hi, I’m Reza Djeddi and I’m a Lead CFD Application Engineer at Cadence Design Systems. Some of my friends call me “The Jedi” because of how they think my last name is pronounced and the fact that I’m a big Star Wars fan, but it’s actually pronounced more like “jeddy” (though I wish I were an actual Jedi). I earned a Ph.D. in Mechanical Engineering as well as an M.Sc. in Aerospace Engineering from the University of Tennessee, Knoxville. Before joining Cadence, I worked as a Research Assistant Professor in the Mechanical, Aerospace, and Biomedical Engineering Department at UT-Knoxville for four years, where I led NSF/DOE/DoD-funded research projects with my former Ph.D. advisor and mentor, Professor Ekici.
As part of my doctoral research, I developed a full-fledged CFD solver (based on the Finite-Volume method for unstructured grids), called UNPAC. This solver has, at its core, several advanced schemes for convergence acceleration, various turbulence and transition models, adaptive mesh redistribution and refinement tools, and more. This was a daunting task as it involved writing millions of lines of code!

JC: How did you find time to write millions of lines of code while pursuing a PhD? And what language is UNPAC written in? And because I like unraveling the acronyms in program names, what does UNPAC stand for? (Willing to bet the “UN” is unstructured.)
RD: Well, writing the core functionalities of UNPAC took me only about 2.5 months (not to pat myself on the back, but I’m actually a fast programmer, and this was not the first CFD code I had ever written), with an additional month spent on extending it to three dimensions. Before all of this, I spent a few weeks planning how all the data structures would look, which helped a lot along the way as I was able to expand and grow the code very quickly. I wrote UNPAC in Modern Fortran using the latest standards for object-oriented programming. While I’m an experienced programmer in C/C++, I wrote UNPAC in Fortran since all our other in-house codes were in Fortran, and keeping things in the same programming language was key to a quick knowledge transfer within our research group. UNPAC stands for (as you correctly guessed) UNstructured PArallel Compressible Code. I’m actually a big fan of clever acronyms. The automatic differentiation tool that I also developed as part of my Ph.D. research was called FDOT, which stands for Fast Differentiation using the Overloading Technique, while also being synonymous with “f_dot,” which is how we refer to a function’s derivative. Also, earlier during my Ph.D., I developed an incompressible Navier-Stokes solver based on the generalized coordinate system, which I called GENESIS (also named after one of my favorite prog rock bands).
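For readers who haven’t run into the overloading technique, the gist is to define a derived type that carries a value together with its derivative and to overload the arithmetic operators so the derivative propagates automatically through the code. The sketch below is a minimal, generic illustration in Python with made-up names, not FDOT itself (which is written in Fortran):

```python
# Minimal sketch of forward-mode automatic differentiation via operator
# overloading. Purely illustrative; names and structure are not FDOT's.

import math

class Dual:
    """Carries a function value together with its derivative (the 'f_dot' part)."""
    def __init__(self, val, dot=0.0):
        self.val = val   # f
        self.dot = dot   # f_dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def sin(x):
    # chain rule: d/dx sin(f) = cos(f) * f'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# Differentiate f(x) = x * sin(x) at x = 1.2 by seeding dx/dx = 1.
x = Dual(1.2, 1.0)
f = x * sin(x)
print(f.val, f.dot)   # f(1.2) and f'(1.2) = sin(x) + x*cos(x)
```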
JC: Here’s a serious question. Do you think writing millions of lines of code was a good use of your time as a PhD student? I mean as opposed to using an existing code like SU2 or OpenFOAM and simply adding to it versus reinventing a lot of wheels.
RD: This is a very valid question and one I have heard and been asked a lot. As I mentioned, UNPAC was born out of the necessity for an unstructured grid solver to complement the suite of codes we had at the time in our research group. It also started as a standard CFD solver with basic functionalities but written in a flexible and dynamic way that allowed me to easily grow and expand it. This led to a steady development of the code over many years (it started around the last year of my Ph.D. research and continued throughout my post-doctoral years). Adding different turbulence and transition models, various flux schemes, gradient calculation schemes, and various other features and capabilities happened over the span of several years and, honestly, in the grand scheme of things, it wasn’t a huge time investment for me. Additionally, there is always the argument that learning and understanding someone else’s code can, in most cases, take a lot of time that could instead be spent writing a “better” solver (provided you know the right ingredients). Over the years, I have learned (and am still learning) a lot about CFD codes, how they operate, and best practices, and I strongly believe that this knowledge is a direct result of the time I spent developing and writing my own codes rather than using an existing code like SU2. At the end of the day, I believe that knowing the ins and outs of a CFD code is essential to becoming a successful CFD scientist, and that knowledge is acquired mostly through writing your own code.
Before joining Cadence, during my four years as a research faculty member at UT-Knoxville, I continued developing these tools while mentoring several graduate students and teaching ME/AE courses. Over the years, my research in the field of CFD has led to over 35 papers in leading journals and AIAA conference proceedings.
My fascination with CFD started back in 2006 when I got to write Boundary Element Method (BEM) codes for my undergraduate research project toward my B.Sc. degree in Ocean Engineering and Naval Architecture. I continued working in the field of CFD for hydrodynamics applications in my graduate studies (toward an M.Sc. in Mechanical Engineering) where I was part of a team that developed a numerical towing tank CFD code for marine applications.
In 2013, I joined the CFD lab at UT-Knoxville for my Ph.D. studies, and this is where I had to quickly switch gears and venture into the field of CFD for aerodynamics applications with transonic to supersonic and hypersonic flows, turbomachinery, and rotordynamics. I must add that early on during my journey in CFD, I mostly used in-house structured grid generators for airfoil and wing configurations. However, as I branched out into developing the UNPAC solver, I started to exclusively use Pointwise (now Fidelity Pointwise) for my meshing applications. This is when I fell in love with Pointwise, a love that has been growing ever since! I always enjoyed watching the Tutorial Tuesday videos every week and was fascinated by how dedicated the Pointwise team has been to “knowledge sharing” and to providing quality technical support to their customers and all Pointwise users. So, I think you can imagine how excited I am to have joined this great team at Cadence, where I will be working with other meshing and CFD experts on our North American CFD Application Engineering team.
- Location: Fort Worth, TX
- Current position: Lead CFD Application Engineer
- Current computer: Dell Precision 5550, Intel® Core™ i7-10750H CPU @ 2.60 GHz, 32 GB DDR4 RAM, NVIDIA Quadro T2000 (Max-Q, 4 GB GDDR5)
- One word that best describes how you work: Organized
What do you see as the biggest challenges facing CFD in the next 5 years?
Nowadays, everybody is talking about how to increase the fidelity (pun intended) of CFD simulations for various aerodynamics, rotordynamics, and automotive applications. However, this higher fidelity comes at the price of higher computational cost and a greater time/resource investment, a price many are not willing to pay for more accurate results. Therefore, I believe a big challenge is to make current tools and numerical schemes faster without sacrificing accuracy. I am not talking about coming up with algorithms that scale better for parallel computations, but rather algorithms that can boost solver speed even when run on a single core. Examples would be novel convergence acceleration techniques, AI-guided solution initialization, and robust shock-capturing schemes, to name a few. There is also a lot of talk about moving away from RANS and investing more in DES or even LES. However, IMHO, except for some specific applications, an extremely fast RANS simulation can get us where we want to go!
Another challenge is to increase confidence in CFD results, especially for off-design and edge cases where CFD simulations might be our best bet. In such cases, Uncertainty Quantification (UQ) can be a great add-on that pushes us in the right direction while giving us more (or, in some cases, less) confidence in our CFD results. Normally, we have only a few quantities of interest (like drag, lift, or figure of merit) while there are many input variables (like model coefficients, solver settings, mesh parameters, etc.), making adjoint methods the logical choice for UQ. However, developing robust adjoint codes for large-scale multiphysics CFD solvers can be very challenging!
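To make that few-outputs/many-inputs argument concrete, here is the standard discrete-adjoint bookkeeping in generic form (a sketch, not tied to UNPAC or any particular solver): for a converged residual R(Q, α) = 0 and a scalar output J(Q, α),

\[
\left(\frac{\partial R}{\partial Q}\right)^{T}\lambda \;=\; \left(\frac{\partial J}{\partial Q}\right)^{T},
\qquad
\frac{dJ}{d\alpha} \;=\; \frac{\partial J}{\partial \alpha} \;-\; \lambda^{T}\,\frac{\partial R}{\partial \alpha},
\]

so a single linear (adjoint) solve per output J yields sensitivities with respect to every input α, whereas a forward or finite-difference approach would require one solve per input.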
What are you currently working on?
Since joining the team in June, I have been working on a few benchmark cases, aside from handling support cases and working with the software developers on getting the latest and greatest version of Fidelity Pointwise released. One of these cases involves using Pointwise for a “mining” application where the customer is looking to mesh an entire mine, including all the underground infrastructure, the ore body, the surface topography, and fault lines. Another case is focused on meshing a gas turbine rotor with a complex tip geometry (both fluid and solid blocks) for conjugate heat transfer applications, as well as scripting the entire workflow to automate the process. Moreover, I have been looking into some classical benchmark test cases, like the transonic ONERA M6 wing, and comparing the meshes generated in Fidelity CFD (using the surface-to-volume and volume-to-surface techniques) with a Pointwise-generated mesh. I am running simulations with the Fidelity CFD, FUN3D, SU2, and OpenFOAM solvers to compare the results obtained on the same mesh.

What project are you most proud of and why?
Developing the early versions of the UNPAC solver and the FDOT toolbox during the last two years of my Ph.D. would be the two projects that I am most proud of. I must say that it was a challenging experience because I wanted to have both tools developed independently while having them work together interactively (for the aerodynamic design optimization framework). All of this was supposed to happen in the span of two years while, at the same time, I was also leading the efforts on a completely different DOE-funded project with Oak Ridge National Laboratory on the design of small modular hydro-turbines. Saying that it was a stressful time would be an understatement, but I think that in the end I gained a lot of experience in handling multiple tasks in parallel and was able to become more organized in my workflow.
Are you reading any interesting technical papers we should know about?
I have been closely following the work of the team developing the NASA Advanced Supercomputing (NAS) division’s Launch, Ascent, and Vehicle Aerodynamics (LAVA) software for the past couple of years. More specifically, I am interested in many of the recent papers presented by the team at NASA Ames on Wall-Modeled Large Eddy Simulation (WMLES) as well as hybrid RANS-LES using LAVA for maximum lift coefficient predictions of the High-Lift Common Research Model (HL-CRM). I am currently reading two of their most recent papers, which were presented at the AIAA AVIATION 2022 conference in Chicago (papers 2022-3434 and 2022-3523).
What software or tools do you use every day?
I use OneNote for taking notes and documenting my benchmark studies and support cases, as well as tips and tricks on how to resolve day-to-day issues (be they meshing-, CFD-, or, in general, IT-related) for future reference. I have been doing this for many years now and have found the practice very rewarding, as I can always go back and review my well-organized notebooks (with different sections and subsections) to find the necessary info on any specific project/topic! I use Teams to stay connected with the application engineering team and the developers. This may sound like I am a big fan of everything Microsoft (which I’m not), but I also use Microsoft To Do (and lately, Asana) for task management to handle multiple tasks and projects at the same time. For code editing and scripting, I use vim (and not emacs!) on Unix/Linux servers and workstations, Xcode on Mac, and Visual Studio Code on Windows. I use ParaView and VisIt for CFD post-processing and visualization and Google Chrome for browsing the web. It goes without saying that I use Fidelity Pointwise and Fidelity CFD on a daily basis for meshing/CFD applications (testing, SPRs, benchmarks, etc.) as well as for handling support cases.
JC: So glad to have another vim user onboard. Let’s see if an emacs user will comment. Seriously, I see why you classify your workstyle as “organized.” For years I vacillated between paper notes and many different online note-taking systems. In the end, I’m back to paper and don’t see that changing.
RD: In the early years of my graduate studies and research, I always preferred to write my notes on paper, which, over the years, resulted in a huge pile of notebooks, binders, and folders just to keep things organized. I guess two main reasons pushed me toward using online note-taking apps. First, I needed to be able to “search” my notes instead of going through all my binders to find something I was looking for. The second reason was the ability to copy/paste code snippets as well as figures/plots. As a researcher and a code developer, I would sometimes try out several different things and liked to keep a record of those code snippets as well as the solution plots in my notes. I must add that I still enjoy handwriting my notes while including some sketches of an idea I’m working on, but a couple of years ago I switched to using my iPad with an Apple Pencil instead! These days at work, I sometimes go back to using a small notebook to take quick notes but try to move the important stuff from that notebook to my OneNote app at a later time.
What does your workspace look like?
Although I started my job at Cadence back in June, my wife and I only moved to Fort Worth, Texas, in late July, and, at the time of writing this, I have only been coming to the office for a week! That means I have yet to get fully settled into my new office, and I am still moving stuff around. However, I have the essentials in place to do my job while I try to make my workspace look prettier!
What do you do outside the world of CFD?
I love reading, going out for walks with my wife, and watching movies, but, without a doubt, most of what I do outside the world of CFD has to do with music. I am a semi-professional musician and play (and own) multiple instruments (electric/acoustic guitars, bass, drums, piano, ukulele, and mandolin). I love to get together with friends to jam, but if I’m all by myself, I’ll lay down multiple tracks (with drums, bass, and keys) and then play solo (with my electric guitar) on top, almost like a one-man band! I would say there is not a single day that passes that I don’t practice, make, or at least listen to music!
JC: I too am a music lover but I don’t play, only listen. I’m playing music virtually all the time while working. What are your favorite styles or who are your favorite artists?
RD: That’s great, me too! My favorite genre is progressive rock, although I also occasionally enjoy classic rock and pop rock. My favorite bands would be Pink Floyd, Rush, Toto, Dream Theater, Led Zeppelin, and Genesis. I’m also a big fan of solo guitarists such as B.B. King, Eric Clapton, Stevie Ray Vaughan, and, last but not least, John Mayer. In addition, my wife and I are both fans of Coldplay, as we have a lot of great memories of listening to their songs and seeing them live.
What is some of the best CFD advice you’ve ever received?
I am a big believer in questioning CFD results very rigorously, and that is something I have always done, whether as a CFD solver developer or as a user. Grid convergence is an essential part of that process, but so is solver convergence, with, for example, the lift and drag coefficients settling down to within a count.
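As a rough illustration of that last check (assuming the common convention of one drag count being 1.0e-4, and with a made-up file name, column layout, and averaging window), a force-history test might look something like this:

```python
# Rough sketch of a "settled to within a count" check on a coefficient history.
# One drag count is taken as 1.0e-4; the file name, column layout, and window
# size are illustrative assumptions, not any particular solver's output format.

def settled_within(history, tol=1.0e-4, window=500):
    """Return True if the last `window` samples vary by less than `tol`."""
    tail = history[-window:]
    return len(tail) == window and (max(tail) - min(tail)) < tol

with open("forces.dat") as f:                              # hypothetical history file
    cd_history = [float(line.split()[1]) for line in f]    # assume Cd is in column 2

print("settled" if settled_within(cd_history) else "keep iterating")
```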
My Ph.D. advisor and mentor always used to tell me about the importance of boundary conditions in a CFD simulation. Typically, when you write a CFD solver, discretizing and coding the algorithm for the interior nodes is much more straightforward to figure out and implement. However, a lot can go wrong at the boundaries, whether they are the far-field boundaries in external flow applications, the interface boundaries in multi-stage turbomachinery cases, or, in general, the near-field boundaries in any type of flow simulation. In many cases, the sources of instability (or solver divergence) can be traced back to the boundaries. Also, apart from proper mesh resolution near the wall, accurate gradient calculation at these boundaries is crucial for boundary layer modeling, heat flux predictions, laminar-to-turbulent transition, as well as Shock Boundary Layer Interaction (SBLI) and shock-induced boundary layer separation. So, I always keep an eye on the boundaries and how they are set up, and I also try to look into how they are implemented in the solver (if it is open source).
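To show where that boundary sensitivity enters in practice, here is a minimal sketch of a Green-Gauss cell-gradient loop, written in generic Python with hypothetical data structures (an illustration of the technique, not UNPAC’s or any solver’s actual implementation): the boundary-face value comes straight from the boundary condition, so any sloppiness there pollutes the near-wall gradient.

```python
# Minimal sketch of a Green-Gauss cell gradient, written to highlight where the
# boundary faces enter. The per-face dict layout is a hypothetical illustration,
# not any particular solver's data structure.

import numpy as np

def green_gauss_gradient(phi_owner, faces, cell_volume):
    """grad(phi) ~ (1/V) * sum over faces of phi_f * (n_f * A_f) for one cell.

    Each face dict carries:
      'normal_area' : outward unit normal scaled by face area (length-3 array)
      'phi_neighbor': value in the neighboring cell, or None for a boundary face
      'phi_bc'      : face value supplied by the boundary condition (boundary faces only)
    """
    grad = np.zeros(3)
    for face in faces:
        if face['phi_neighbor'] is not None:
            # Interior face: simple average of the two adjacent cell values.
            phi_f = 0.5 * (phi_owner + face['phi_neighbor'])
        else:
            # Boundary face: the value comes straight from the boundary
            # condition, which is exactly where a careless BC shows up
            # as a bad near-wall gradient.
            phi_f = face['phi_bc']
        grad += phi_f * np.asarray(face['normal_area'])
    return grad / cell_volume
```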
If you got to choose, where would you and I go for dinner?
Given that we have just moved to Fort Worth, I don’t know many great places to eat around here just yet. The good thing is that I love all kinds of food, and for me, it’s the company that matters more than the food. With that said, I would say we could eat at Uncle Julio’s (one of our favorite Mexican restaurants, as we’ve been to their Reston, Virginia location many times before and twice here in Fort Worth, so far!). I am also a big fan of Thai cuisine, and after trying out a couple of Thai places here in Fort Worth, I liked the food at Malai Kitchen, so I’ll say we could grab their Chicken Pad Thai or Drunken Noodles. I must add that I love spicy/hot food, so good Thai food for me has to be spicy!
If we were in Knoxville, then I would say we could eat at Season’s Innovative Bar & Grille (we loved their Beef Medallions as well as their Rack of Lamb) or maybe have some delicious Hickory Smoked BBQ Baby-Back Ribs at Calhoun’s On The River. But again, given that I am new in town, I am going to keep my eyes and ears open for good restaurants here in Cowtown (so maybe ask this question again in a few months and I’ll probably have a better answer!).
JC: I think leaving the reader with thoughts of Knoxville BBQ is a great place to end. Thanks for taking time to share this with us.