This page is the latest incarnation of my official website. It’s a compendium of things I have published and other stuff I’ve made in my spare time.

In this blog, I will keep track of updates to this site. If you are interested, just subscribe to the RSS feed.


Spherical Harmonics Lighting from ARKit Environment Probes
Sun, 31 Mar 2019 16:03:46 +0100
I've been playing with ARKit a bit, and I've added support for it in VidEngine. There's a small AR app there if you want to check it out: SampleAR.

One of the things I got excited about in ARKit 2 is the environment probes: ARKit automatically generates environment textures from camera imagery. It fills in the missing bits of the texture with Machine Learning, and then it keeps updating the textures as you move the camera around.

I thought it would be a good idea to compute a Spherical Harmonic (SH) light approximation from those environment maps, so during rendering I can compute the diffuse light from just a few coefficients. The environment probes can be defined with an extent, so the area covered doesn't need to be the whole scene. That works well with SH lights, since you may want to place different probes at different locations.

The way I've implemented this is through a LightSource called SHLight. The SHLight has an extent as well, defined as a cube volume. When you place an SHLight in the scene, an environment probe is placed at its center. Then, for a few frames, I initialize the samples across the sphere, using jittered stratification. This could be done offline and saved to a file, but I'm doing it on init for the time being, just in case I want to change the number of samples. After init, I read all the samples with just one draw call, since the cubemaps ARKit generates aren't that big. I'm currently reading 10000 samples. With all the samples, I compute the Spherical Harmonics during a few frames, and store the matrices for irradiance approximation in a buffer that the DeferredLightingPlugin uses for rendering.
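For illustration, here's a minimal sketch of that jittered stratified sampling step in Swift. The function name and structure are mine, not the actual VidEngine code; it just uses the usual square-to-sphere mapping.

    import Foundation
    import simd

    // Jittered stratified sampling of the sphere: one jittered sample per cell
    // of an n×n grid on the unit square, mapped to spherical coordinates with
    // theta = 2·acos(sqrt(1 - u)), phi = 2π·v so the samples are uniform.
    func generateSphereSamples(sqrtNumSamples n: Int) -> [SIMD3<Float>] {
        var samples: [SIMD3<Float>] = []
        samples.reserveCapacity(n * n)
        for i in 0..<n {
            for j in 0..<n {
                // Jitter inside the cell so the samples don't line up in a grid
                let u = (Float(i) + Float.random(in: 0..<1)) / Float(n)
                let v = (Float(j) + Float.random(in: 0..<1)) / Float(n)
                let theta = 2 * acos(sqrt(1 - u))
                let phi = 2 * Float.pi * v
                samples.append(SIMD3<Float>(sin(theta) * cos(phi),
                                            sin(theta) * sin(phi),
                                            cos(theta)))
            }
        }
        return samples
    }

    // 100 × 100 = 10000 samples, as mentioned above
    let sphereSamples = generateSphereSamples(sqrtNumSamples: 100)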

You can read about the irradiance map representation in this paper: An Efficient Representation for Irradiance Environment Maps. And all about Spherical Harmonics in Spherical Harmonic Lighting: The Gritty Details.
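With that representation, the diffuse irradiance for a normal n is just a quadratic form per color channel, E(n) = nᵀ M n with n = (x, y, z, 1). Here's a CPU-side Swift sketch of what the lighting shader evaluates (illustrative only, not the actual shader code):

    import simd

    // Diffuse irradiance for one color channel, given its 4x4 matrix M
    // from the Ramamoorthi & Hanrahan representation: E(n) = nᵀ M n
    func irradiance(normal n: SIMD3<Float>, matrix M: float4x4) -> Float {
        let n4 = SIMD4<Float>(n, 1)
        return dot(n4, M * n4)
    }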

So it takes a few frames to see any light. I initialize the irradiance matrices with a gray ambient light, so it's not totally dark. Every time ARKit updates the environment map, I read the samples again and recompute the Spherical Harmonics. I blend the coefficients of the irradiance matrices so the transition from the previous light to the updated one is smooth. The blending is done on the CPU, since most of the time there's a single set of coefficients, and I don't want to do unnecessary processing every frame on the GPU.
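The blend itself could look something like this (a sketch; the IrradianceMatrices type and field names are made up, not VidEngine's actual ones):

    import simd

    // One irradiance matrix per color channel (Ramamoorthi & Hanrahan)
    struct IrradianceMatrices {
        var red: float4x4
        var green: float4x4
        var blue: float4x4
    }

    // Linear blend between the previous and the freshly computed matrices,
    // with t going from 0 to 1 over a few frames
    func blend(_ a: IrradianceMatrices, _ b: IrradianceMatrices, t: Float) -> IrradianceMatrices {
        return IrradianceMatrices(
            red:   (1 - t) * a.red   + t * b.red,
            green: (1 - t) * a.green + t * b.green,
            blue:  (1 - t) * a.blue  + t * b.blue)
    }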

The SampleAR app doesn't do much at the moment, other than showcasing these SH lights and plane detection. You can place cubes, spheres, and a building from Santorini (I made this model for Snake on a Sphere). When you place an object on a detected surface, an SHLight is placed slightly above the object, covering one cubic meter, so the object is inside. For every object you place, a new SHLight is created. If you toggle on the debug mode, you'll see something like this,

The grids are the planes detected by ARKit. The green wires define the extent of the SH lights and the environment probes, and the "reflective spheres" are debug objects that show what the cubemaps captured by ARKit look like. Here's a small video,

Because there are several SH lights in the scene, I'm using the stencil buffer to mask the volumes that have already been rendered. This doesn't account for intersecting volumes. We could blend the lights in those cases, but I'm not considering it for now. The stencil buffer setup can be summarized in this table,

              Comp.  Ref.  Read  Write  Fail  Depth    Pass
              func.  Val.  Mask  Mask          Fail
------------------------------------------------------------
Back face       ≠    L|A    A     W     Keep   Replace  Keep
Front face      ≠     0     L     L     Keep   Zero     Keep
Render light    ≠     A     L    L|A    Keep   Keep     Replace

So I need to do 3 passes per light, which is not ideal. Even less ideal is that I have to create a different encoder every time I change the shader. There are only 2 shaders needed for this, but because I need to swap between one and the other, I had to create a horrible loop that creates many encoders. Check the drawSHLights function in DeferredLightingPlugin. I think I will ask on Stack Overflow, because there may be another way of doing this in Metal.
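For reference, this is roughly how one of those stencil states can be built in Metal. It's a hedged sketch of the back-face row of the table above, not the exact code in DeferredLightingPlugin; the mask values for A and W are passed in as placeholders.

    import Metal

    // Back-face pass: compare ≠, read mask A, write mask W,
    // keep on stencil fail, replace on depth fail, keep on depth pass
    func makeBackFaceStencilState(device: MTLDevice,
                                  readMask: UInt32,   // "A" in the table
                                  writeMask: UInt32   // "W" in the table
    ) -> MTLDepthStencilState? {
        let stencil = MTLStencilDescriptor()
        stencil.stencilCompareFunction = .notEqual
        stencil.readMask = readMask
        stencil.writeMask = writeMask
        stencil.stencilFailureOperation = .keep
        stencil.depthFailureOperation = .replace
        stencil.depthStencilPassOperation = .keep
        let descriptor = MTLDepthStencilDescriptor()
        descriptor.backFaceStencil = stencil
        descriptor.isDepthWriteEnabled = false
        return device.makeDepthStencilState(descriptor: descriptor)
    }

    // The reference value ("L|A" in the table) is set per draw on the
    // render encoder with setStencilReferenceValue(_:)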

The app still has some glitches. The cursor starts to "flicker", or rather, leave a trail, from time to time. I'm not sure if it's because ARKit goes nuts, or if I should smooth out the movement of the camera and objects by myself. I'll be investigating these issues next.

 
Inverse Transform without matrices?
Sat, 23 Mar 2019 22:27:57 +0000
Given an affine transform, expressed as a translation (or position), a scale, and a rotation, how do you compute its inverse?

Well, if you write the transform as a matrix (for column vectors, so the first operation is on the right-hand side),

M = T * R * S
then the inverse is (I'm writing the inverse with a single quote instead of the superscript -1),
M' = (T * R * S)' = S' * R' * T' 
Can we write that as another affine transform? That is,
M' = S' * R' * T' = T_ * R_ * S_
Well, a person from the future wrote on math.stackexchange how to extract the translation, the scale, and the rotation from a given affine transform.

So far, so good. But if you look at the comments, someone points out that the extracted rotation matrix might be a combination of shear and rotation!

This might be a trivial problem, but I never encountered it before. The issue is that, if the scaling is anisotropic, then there's certainly shearing going on in the inverse. That is, you can extract T, R, and S from M using what's described in math.stackexchange, but you can't extract T_, R_, and S_.

What can we do, then?

Well, the translation can still be computed the same way. If t is the position in the original transform, then the new position is,

t_ = S' * R' * (-t)
and we don't have to use the matrix forms for this. If your rotation is a quaternion, simply invert the quaternion and rotate t with it. Then, divide each component by the scale.
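In code, assuming the transform stores a position, a scale vector, and a quaternion (the struct below is a sketch, not VidEngine's actual Transform type), that's simply:

    import simd

    struct Transform {
        var position: SIMD3<Float>
        var scale: SIMD3<Float>
        var rotation: simd_quatf
    }

    extension Transform {
        // t_ = S' * R' * (-t): rotate -t by the inverse quaternion,
        // then divide each component by the scale
        var inversePosition: SIMD3<Float> {
            return rotation.inverse.act(-position) / scale
        }
    }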

But can we convert (S' * R') into (R_ * S_)? Well, we can use the Singular Value Decomposition to see what it would look like,

S' * R' = U * Σ * V'
Σ is a diagonal matrix, so a scaling matrix. But unless V is the identity, I don't see how this could be written as an (R_ * S_).

So in the end I bit the bullet and use matrices whenever I need the inverse and the scaling may not be isotropic...

What was I trying to do? I was computing ray-object intersections in an AR sample app. These are computed on the CPU, and it would be costly to transform all the triangles of the object to test the intersection, so I convert the ray to model space instead. That's why I needed the inverse of the world transform of the object. You can see the final commit with the bug fix and several unit tests that exercise the different conversions: Fix transform.inverse (VidEngine)
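For illustration, the conversion looks roughly like this (names are made up, not the ones in the commit): the ray origin transforms as a point and the direction as a vector,

    import simd

    struct Ray {
        var origin: SIMD3<Float>
        var direction: SIMD3<Float>
    }

    // Bring a world-space ray into model space so the intersection test can
    // run against the object's untransformed triangles
    func toModelSpace(_ ray: Ray, worldInverse: float4x4) -> Ray {
        let o = worldInverse * SIMD4<Float>(ray.origin, 1)     // point: w = 1
        let d = worldInverse * SIMD4<Float>(ray.direction, 0)  // vector: w = 0
        return Ray(origin: SIMD3<Float>(o.x, o.y, o.z),
                   direction: normalize(SIMD3<Float>(d.x, d.y, d.z)))
    }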

Please message me on Twitter @endavid if you have any comments or suggestions!

 
Cleaning up my Mac drive
Fri, 25 Jan 2019 21:43:15 +0000

We accumulate lots of crap on our hard drives. Even more than in our homes, because the "crap" is not usually visible, so we keep storing and storing. Space is not much of a problem... or is it?

It turns out it is a problem for me. Every now and then I get warnings that I'm running out of space and I have to do some cleanup. But because I only do this every so often, I always end up spending some time Googling mysterious big system files to check whether or not it's OK to delete them.

So I thought I could write a small guide for myself that I can come back to next year. Some parts of it may be useful to others, but some are dev-oriented.

But first, how much space do we have left? You can right-click on the hard-disk icon in any Finder window and click Get Info,

Or from the console, just type,

    df -h

    Filesystem      Size   Used  Avail Capacity	Mounted on
    /dev/disk1s1   234Gi  223Gi  6.7Gi    98%    /

Then, if you have no idea where to start, use a tool like GrandPerspective to get a visual overview of the biggest offenders. Here's my hard drive right now,

Big blocks are big files. You can hover over the blocks to find their location on disk. The blocks are also grouped, so you can see that all the big blocks on the top left belong to the same folder. To see more details, I just go to that folder in the terminal and type du -h. In this case, that folder is the iOS DeviceSupport folder and it's taking 42GB.

So let me start my check list of usual suspects.

Development

  • ~/Library/Developer/Xcode/iOS\ DeviceSupport. This folder contains device symbols for debugging iOS apps. It gets huge quite fast (I had 42GB today!). I usually delete symbols for devices I don't support anymore. You can read about it in this Stack Overflow answer.
  • ~/.android. Similarly, you may find caches and devices here if you are also doing Android development. Since I only develop for Android from time to time, I usually delete the whole folder when I don't need it.
  • ~/Library/Caches/com.apple.dt.Xcode. It's also safe to delete this cache, but I usually leave it there, because I'm scared Xcode will become slower or something.
  • ~/Library/Developer/CoreSimulator/Devices. These are the iOS simulators for Xcode. This also gets big (I had 18GB today). You can delete the ones you don't need, but then it may take you some time to restore them when you need them. The safe way to delete these is using simctl. Read this answer on Stack Overflow.
  • ~/Library/Developer/Xcode/Archives. These are the archives of the apps you have published in the App Store. You probably don't want to delete these, but you could move the old ones to an external drive.

Media

  • Photos. If you use the Photos app, by default it stores your pictures in the Photos Library file inside your ~/Pictures folder. But you can move the file to an external drive, and open that file from the external hard disk.
  • Movies. Same as above. If you use iMovie, you can keep your iMovie Library file in an external hard disk.
  • Music. At the moment, I keep all my music on my laptop's hard disk, because I listen to it often. I guess the best option would be to clean up from time to time and move albums to an external disk? I haven't put much thought into this. If you haven't started accumulating mp3s, I'd recommend just using Spotify or some other streaming service.
  • iPhone backups. This is a bit tricky for a normal user. These backups take a lot of space. You can delete the older ones from iTunes → Preferences → Devices. The location where these backups are stored can't be changed, but you can create a symbolic link to an external disk. Move the ~/Library/Application\ Support/MobileSync/Backup folder to the external drive, and type ln -s /Volumes/YourExternal/Backup ~/Library/Application\ Support/MobileSync/ in a terminal window.

A bit of Marie Kondo

While we are at it, we could do some "marikondoning" on our drive. Is that 300MB PDF from 2001 still useful? Do you really want to move it to an external drive? Wouldn't it be better to just delete it completely and reduce entropy? Are you ever gonna come back to it? Have you EVER read it? Do you smile when opening that file? Burn it if not!

There are a quadrillion million small files that won't catch our attention in GrandPerspective because they are small. But they clutter our disk, and it takes too much time to sort them once it's gotten like this. So you could move all the crap to a folder called "unsorted", for instance. That's what I do with the things on my Desktop, to keep it always tidy. Then, just rely on Spotlight to search and find them. But for important things, make sure to keep them tidy in folders with relevant names (e.g. Documents/bills).

And that's my 5-cents! I hope it's useful for others as well.

Happy 2019!

 

Previous year