(adaptively) Refining Work/Play Balance

"these waves we're seeing may be mesh induced artifacts"

Weaving creativity and cleverness into your work is a lot of fun when you can do it. In this post we’ll share some image-processing-inspired tests we performed to exercise new (in-development) mesh adaptation capabilities within our 3D Voronoi body-fitted mesh generator, Stitch.

Like most adaptive mesh refinement approaches, Stitch leverages a scalar field (sometimes referred to as a sensor, marker, or refinement criterion) to express the desired mesh resolution within a region of space. In terms of improving a CFD solution’s predictive capability, the choice of sensor (or sensors) is extremely important. In this exercise, however, we only want to assess the robustness of the implementation against challenging features that might occur in a sensor field, such as:

  • large length scale disparities,
  • thin interfaces,
  • lack of smoothness.
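A synthetic sensor exhibiting these stress-test features can be built in a few lines of NumPy. This construction is our own illustration, not one from the post: a sharp circular step provides the thin, non-smooth interface, while a slowly varying background gradient supplies the length-scale disparity.

```python
import numpy as np

# Hypothetical stress-test sensor field on a 256 x 256 grid.
x, y = np.meshgrid(np.linspace(-1.0, 1.0, 256), np.linspace(-1.0, 1.0, 256))

# Thin, non-smooth interface: a sharp step across a circle of radius 0.5.
step = (x**2 + y**2 < 0.25).astype(float)

# Large length-scale disparity: a broad, slowly varying background gradient.
background = 0.1 * (x + 1.0) / 2.0

# Combined sensor, clipped to [0, 1]; higher values request finer cells.
sensor = np.clip(step + background, 0.0, 1.0)
```

A robust adaptation scheme should tolerate the order-one jump across the circle without producing degenerate cells, while still resolving the gentle background variation.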

Instead of using a simulation result to produce the sensor, we wanted to see what happens when the refinement criterion is derived from an image. The workflow is depicted from left to right in the images below: first a source image is chosen and converted to grayscale. This grayscale image is our sensor field, where the grayscale magnitude at each pixel acts as the refinement metric. Next we bin the grayscale values into discrete levels, assigning each region of the mesh an isotropic length scale. The number of bins determines the number of refinement levels that appear in the resulting mesh. Stitch is initialized with a coarse uniform mesh, and all refinement is prescribed by the binned grayscale image. A very CFD approach to building a Voronoi diagram image filter!
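The grayscale-and-bin step can be sketched in a few lines of NumPy. This is a hypothetical illustration of the idea only; the post does not show Stitch's actual sensor input format, and the BT.601 luma weights below are just one common grayscale conversion.

```python
import numpy as np

def image_to_refinement_levels(rgb, n_bins=5):
    """Convert an RGB image array (H, W, 3, values in [0, 1]) into a
    binned grayscale sensor: level 0 = coarsest ... n_bins - 1 = finest
    (by the convention chosen here)."""
    # Grayscale conversion via ITU-R BT.601 luma weights.
    gray = rgb @ np.array([0.299, 0.587, 0.114])
    # Bin the continuous grayscale values into n_bins discrete levels.
    levels = np.minimum((gray * n_bins).astype(int), n_bins - 1)
    return gray, levels
```

Each discrete level would then map to one isotropic target length scale handed to the mesher, which is why the number of bins fixes the number of refinement levels in the result.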

Left to right: source grayscale image; binned into five discrete values; refinement of a uniform initial mesh.

We gave these tools to our excellent summer intern Parker McLaughlin, who put Stitch through its paces by selecting a battery of images and adjusting various mesh generation parameters. Below are a few of the images she produced, in addition to The Great Wave off Kanagawa featured at the top of this post.

Of course, the natural progression after mesh generation is to run the flow solver on these meshes, and what better flow to evaluate than the image itself? In CharLES, our LES flow solver, we specify the temperature and velocity initial conditions as functions of the source grayscale image (no binning). Two such animations that Parker generated are shown below. Without watching the videos, which are animated backwards in time, can you guess their initial conditions (hint: both initialize from images on this page)?
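Schematically, the image-to-initial-condition mapping might look like the following NumPy sketch. The linear scalings and the choice of fields here are our own illustrative assumptions, not CharLES's actual input mechanism.

```python
import numpy as np

def image_initial_conditions(gray, T_min=300.0, T_max=350.0, u_max=1.0):
    """Map a grayscale field (values in [0, 1], no binning) onto
    temperature and streamwise-velocity initial conditions.
    The linear scalings are hypothetical, for illustration only."""
    T = T_min + (T_max - T_min) * gray   # brighter pixels -> hotter
    u = u_max * (gray - gray.mean())     # zero-mean velocity perturbation
    return T, u
```

Initializing from an unbinned image keeps the full grayscale gradient information in the flow field, so the image's fine structure is what the solver then diffuses and advects away as the animation runs forward.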

Clearly there are more practical applications of this and the other technologies we’ve been developing over the past two years, but we wanted to share this entertaining misuse of Stitch. If you need a brief diversion, come up with a caption for the Great Wave image at the top of the post and leave it in the comments!

Credit where credit is due:
The mesh adaptation developments described here were funded through NASA SBIR Award #80NSSC19C0091, titled “Fast, Parallel, High-Quality Voronoi Mesh Generator”. The Another Fine Mesh blog provided the inspiration both to resuscitate this blog and to meld mesh generation and art.

