Spherical Harmonics Lighting from ARKit Environment Probes Sun, 31 Mar 2019 16:03:46 +0100
I've been playing with ARKit a bit, and I've added support for it in VidEngine. There's a small AR app there if you want to check it out: SampleAR.
One of the things I got excited about in ARKit 2 is the environment probes: ARKit automatically generates environment textures from camera imagery. It fills in the missing parts of the texture with machine learning, and then keeps updating the textures as you move the camera around.
I thought it would be a good idea to compute a Spherical Harmonic (SH) light approximation from those environment maps, so during rendering I can compute the diffuse light from just a few coefficients. The environment probes can be defined with an extent, so the area covered doesn't need to be the whole scene. That goes well with SH lights, since you may want to place different probes at different locations.
The way I've implemented this is through a LightSource called SHLight. The SHLight has an extent as well, defined as a cube volume. When you place an SHLight in the scene, an environment probe is placed at its center. Then, for a few frames, I initialize the samples across the sphere, using jittered stratification. This could be done offline and saved to a file, but I'm doing it on init for the time being, just in case I want to change the number of samples. After init, I read all the samples with just one draw call, since the cubemaps ARKit generates aren't that big. I'm currently reading 10000 samples. With all the samples, I compute the Spherical Harmonics during a few frames, and store the matrices for irradiance approximation in a buffer that the DeferredLightingPlugin uses for rendering.
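The jittered stratification step can be sketched like this in Swift. This is just a sketch of the idea, not VidEngine's actual code: `jitteredSphereSamples` is a hypothetical function, and the uniform mapping from the unit square to the sphere is the standard one from the Gritty Details notes.

```swift
import Foundation

/// Generates n*n jittered-stratified directions on the unit sphere.
/// Hypothetical sketch; VidEngine's actual sample layout may differ.
func jitteredSphereSamples(_ n: Int) -> [(x: Double, y: Double, z: Double)] {
    var samples: [(x: Double, y: Double, z: Double)] = []
    samples.reserveCapacity(n * n)
    for i in 0..<n {
        for j in 0..<n {
            // One jittered sample per stratum of the unit square
            let u = (Double(i) + Double.random(in: 0..<1)) / Double(n)
            let v = (Double(j) + Double.random(in: 0..<1)) / Double(n)
            // Map uniformly from the square to the sphere
            let theta = 2.0 * acos(sqrt(1.0 - u))
            let phi = 2.0 * Double.pi * v
            samples.append((x: sin(theta) * cos(phi),
                            y: sin(theta) * sin(phi),
                            z: cos(theta)))
        }
    }
    return samples
}
```

With 100 strata per dimension this gives the 10000 samples mentioned above, and every direction is already unit length by construction.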
You can read about the irradiance map representation in this paper: An Efficient Representation for Irradiance Environment Maps. And you can read all about Spherical Harmonics in Spherical Harmonic Lighting: The Gritty Details.
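For reference, the irradiance reconstruction from the first paper boils down to a quadratic form per color channel: E(n) = nᵀ M n, with n = (x, y, z, 1) and M a 4×4 matrix built from the 9 SH coefficients. Here's a minimal Swift sketch; the function names are mine, but the c1..c5 constants and the matrix layout come from the paper.

```swift
import Foundation

// Builds the 4x4 irradiance matrix M for one color channel from the
// 9 SH coefficients, ordered [L00, L1-1, L10, L11, L2-2, L2-1, L20, L21, L22].
// Constants are from Ramamoorthi & Hanrahan's paper.
func irradianceMatrix(_ L: [Double]) -> [[Double]] {
    let (c1, c2, c3, c4, c5) = (0.429043, 0.511664, 0.743125, 0.886227, 0.247708)
    let (L00, L1m1, L10, L11) = (L[0], L[1], L[2], L[3])
    let (L2m2, L2m1, L20, L21, L22) = (L[4], L[5], L[6], L[7], L[8])
    return [
        [ c1 * L22,  c1 * L2m2, c1 * L21,  c2 * L11 ],
        [ c1 * L2m2, -c1 * L22, c1 * L2m1, c2 * L1m1 ],
        [ c1 * L21,  c1 * L2m1, c3 * L20,  c2 * L10 ],
        [ c2 * L11,  c2 * L1m1, c2 * L10,  c4 * L00 - c5 * L20 ]
    ]
}

// Evaluates E(n) = nᵀ M n for a surface normal, with n = (x, y, z, 1).
func irradiance(_ M: [[Double]], normal n: (x: Double, y: Double, z: Double)) -> Double {
    let v = [n.x, n.y, n.z, 1.0]
    var e = 0.0
    for i in 0..<4 {
        for j in 0..<4 {
            e += v[i] * M[i][j] * v[j]
        }
    }
    return e
}
```

In the engine this quadratic form runs per pixel in the shader, but the matrices themselves only need to be rebuilt when the environment map changes.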
So it takes a few frames to see any light. I initialize the irradiance matrices with a gray ambient light, so it's not totally dark. And every time ARKit updates the environment map, I read the samples again and recompute the Spherical Harmonics. I blend the coefficients of the irradiance matrices so the transition from one light to the updated one is smooth. The blending is done on the CPU; most of the time there's a single set of coefficients, so I don't want to do unnecessary processing on the GPU every frame.
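The blend itself can be as simple as a per-coefficient linear interpolation. A hypothetical sketch, assuming the irradiance matrices are stored as nested arrays of doubles:

```swift
import Foundation

// Linearly interpolates every coefficient of the old irradiance matrix
// toward the new one; t = 0 keeps the old light, t = 1 is the new light.
// Hypothetical sketch; VidEngine's actual storage types may differ.
func blend(_ old: [[Double]], _ new: [[Double]], t: Double) -> [[Double]] {
    var result = old
    for i in 0..<old.count {
        for j in 0..<old[i].count {
            result[i][j] = old[i][j] + (new[i][j] - old[i][j]) * t
        }
    }
    return result
}
```

Stepping `t` a little each frame until it reaches 1 gives the smooth transition, and once the blend is done there's again a single set of coefficients to upload.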
The SampleAR app doesn't do much at the moment, other than showcasing these SH lights and plane detection. You can place cubes, spheres, and a building from Santorini (I made this model for Snake on a Sphere). When you place an object on a detected surface, an SHLight is placed slightly above the object, covering one cubic meter, so the object is inside its volume. For every object you place, a new SHLight is created. If you toggle on the debug mode, you'll see something like this,
The grids are the planes detected by ARKit. The green wires define the extent of the SH lights and the environment probes, and the "reflective spheres" are debug objects that show what the cubemaps captured by ARKit look like. Here's a small video,
Because there are several SH lights in the scene, I'm using the stencil buffer to mask the volumes that have already been rendered. This doesn't account for intersecting volumes. We could blend the lights in those cases, but I'm not considering it for now. The stencil buffer setup can be summarized in this table,
           Comp.   Ref.   Read   Write          Depth
           func.   Val.   Mask   Mask    Fail   Fail    Pass
    Face   ≠       L|A    A      W       Keep   Repl.   Keep
    Face   ≠       0      L      L       Keep   Zero    Keep
    light  ≠       A      L      L|A     Keep   Keep    Replace
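Each row of that table maps almost one-to-one onto Metal's MTLStencilDescriptor. Here's a sketch of the third row (the "light" pass); the bit values I assign to L and A are made up for illustration, and VidEngine's actual masks may differ.

```swift
import Metal

// Assumed bit assignments for the L and A stencil masks (hypothetical):
let lightBit: UInt32 = 0x1   // L: volume already lit
let areaBit: UInt32 = 0x2    // A: pixel inside the current volume

// Third row of the table: Comp. ≠, Ref. A, Read L, Write L|A,
// Fail Keep, Depth Fail Keep, Pass Replace.
let stencil = MTLStencilDescriptor()
stencil.stencilCompareFunction = .notEqual
stencil.readMask = lightBit
stencil.writeMask = lightBit | areaBit
stencil.stencilFailureOperation = .keep
stencil.depthFailureOperation = .keep
stencil.depthStencilPassOperation = .replace

let desc = MTLDepthStencilDescriptor()
desc.frontFaceStencil = stencil
desc.backFaceStencil = stencil
// The reference value (A) is set on the render command encoder:
// encoder.setStencilReferenceValue(areaBit)
```

The first two rows would be built the same way, with front and back face descriptors split when the two "Face" rows need different operations.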
So I need to do 3 passes per light, which is not ideal. Even less ideal is that I have to create a different encoder every time I change the shader. Only 2 shaders are needed for this, but because I have to swap back and forth between them, I ended up with a horrible loop that creates many encoders. Check the drawSHLights function in DeferredLightingPlugin. I think I will ask on Stack Overflow, because there may be another way of doing this in Metal.
The app still has some glitches. The cursor starts to "flicker", or rather, leave a trail, from time to time. I'm not sure if it's because ARKit goes nuts, or if I should smooth out the movement of the camera and objects by myself. I'll be investigating these issues next.