Welcome to EnDavid.com. Here you can find a compendium of things I have published and other stuff I've made in my spare time.

If you get lost, try visiting the Site Map.

On this main page you can find my blog, where I keep track of updates to this site and post technical articles from time to time. If you are interested, just subscribe to the RSS feed.


Words don't come easy to me
Tue, 17 Sep 2019 08:52:31 +0100

Syllabits & word discovery

I've been working on an English version of my word puzzle game Silabitas. I'm going to call it Syllabits, and it's looking like this at the moment,

The basic rule is that you can only place a piece on the board if you make a word with it, so you have to connect it with something already on the board every time. So above "se" you could place "po", to make the word "pose". Or you could put "re", to make the word "resell", since "se" is also connected to "ll".

Every stage has multiple solutions, so sometimes you can just try placing pieces at random. When you do that, more often than not you discover new words you didn't know about. I think this aspect of word discovery is quite fun.

Dictionaries

So it is very important to have access to a dictionary to check the words you discover. You can see the list of words you've made in a stage, and click on any of them to get its definition from an online dictionary. In Silabitas, that is the Diccionario de la lengua española de la Real Academia Española, and in Sil·labetes it is the Diccionari de la llengua catalana de l'Institut d'Estudis Catalans. For Syllabits, I decided to go for the Oxford Dictionary of English at Lexico.

In order to check if a word exists I'm using the word list from the system spellchecker, a 2MB file stored at /usr/share/dict/web2, augmented with even more words that I found here, for a total of 370,103 words. This includes verb conjugations and other English word transforms, like adding -er to adjectives.
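Checking whether a word exists is then just a set lookup. In Swift, loading that list could look something like this (a minimal sketch; the merged 370K-word list lives in a different file, but the idea is the same):

    import Foundation

    // Load the spell checker word list into a Set for fast membership checks.
    // /usr/share/dict/web2 is the file mentioned above; one word per line.
    let wordListPath = "/usr/share/dict/web2"
    let contents = try! String(contentsOfFile: wordListPath, encoding: .utf8)
    let words = Set(contents.split(separator: "\n").map { $0.lowercased() })

    print(words.contains("pose"))   // true
    print(words.contains("qwerty")) // most likely false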

However, I soon found out that this spell checker has way too many words. While playing, I would make words that I later couldn't find in the Oxford dictionary. I find this really annoying and I don't think it's a good user experience. I guess spell checkers keep accumulating words, whatever the source and however old they are. But I needed something more up to date.

Filtering words

I decided to check the words from the spell checker against Lexico. If a word is not found, Lexico sends you to an error page. But more interestingly, if you search for something like "bigger", it redirects you to the root word, "big" in this case. The same goes for verbs.

So just by checking the headers, if I see the page is redirecting me to a definition, I don't need to look any further, since I know the word is in the dictionary. The only problem is that I would have to make at least 370K header requests to Lexico. A request was taking 300 ms on average, so it would take at least 31 hours.
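Something along these lines would do the check in Swift (just a sketch, not my actual script, and the Lexico URL scheme is simplified here):

    import Foundation

    // Sketch: HEAD request to Lexico; URLSession follows redirects by default,
    // so the final URL tells us whether we ended up on a definition page.
    func isInLexico(_ word: String, completion: @escaping (Bool) -> Void) {
        let url = URL(string: "https://www.lexico.com/definition/\(word)")!
        var request = URLRequest(url: url)
        request.httpMethod = "HEAD" // headers only, no need to download the page
        URLSession.shared.dataTask(with: request) { _, response, _ in
            guard let http = response as? HTTPURLResponse else {
                completion(false)
                return
            }
            let finalPath = http.url?.path ?? ""
            completion(http.statusCode == 200 && finalPath.contains("/definition/"))
        }.resume()
    }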

I wasn't too worried about the time, but I was afraid that, with so many requests, they might think I was attacking their site, although I was only sending one request at a time. It turns out the requests were coming back at an even slower pace, but I kept filtering the spell checker list...

... until I went on vacation for a few days. When I came back, they must have noticed that I was making too many requests, because they started returning 429 errors. Those didn't appear last week. So I guess in a sense I helped them improve their site? Should I be proud? 🙈

The 429 errors came with a "Retry-After" field, set to 5 minutes. So I changed my script to retry after the requested amount of time. It took the script more than 5 days to check the last 20% of the words. The final filtered list contains 165,589 words.
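The retry logic itself is tiny; in Swift it would be something like this (a sketch, my actual script may differ):

    import Foundation

    // If we got a 429, read the Retry-After header (in seconds) and wait that
    // long before retrying the same word. Lexico was sending 300, i.e. 5 minutes.
    func retryDelay(from response: HTTPURLResponse) -> TimeInterval? {
        guard response.statusCode == 429,
              let retryAfter = response.value(forHTTPHeaderField: "Retry-After"),
              let seconds = TimeInterval(retryAfter) else {
            return nil
        }
        return seconds
    }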

It was worth the effort. Now you can play the game reassured that you will be able to find an online definition for the words you discover while playing.

Words that didn't make it

Exploring the list of more than 200K words that got discarded from the spell checker list is quite interesting. This whole thing started because I noticed the word "garrafa" in that list. That's a Spanish word. I know Merriam-Webster includes many Spanish words in its dictionary, from American influence, I suppose, but "garrafa" isn't there either. Perhaps it was there in the past and has since been removed, while the spell checker hasn't been updated.

Another funny word I found is "naricorn". It's not in the Oxford dictionary, but I found it in Merriam-Webster (naricorn). Interestingly, when you visit that definition, you get this message:

Love words?
You must — there are over 200,000 words in our free online dictionary, but you are looking for one that’s only in the Merriam-Webster Unabridged Dictionary.
Start your free trial today and get unlimited access to America's largest dictionary,

So it must be a rare word indeed! 😃

Some other interesting ones not in the dictionary are words like "peacefuller". I guess if you are already "full", you can be "fuller" than that 😃 Merriam-Webster does redirect you to the definition of "peaceful", but Oxford/Lexico doesn't.

There are also a lot of words beginning with "anti-" or "ultra-" that didn't make it either. Like "antiagglutinating", "antiauthoritarianism", "antibenzaldoxime", "antimeningococcic", "ultradolichocranial", "ultrafashionable", "ultrafederalist", or "ultraphotomicrograph". Again, some of these appear in Merriam-Webster. I'll let you guess which ones.

More fun coming!

Stay tuned for the release of the game! I want to release it with plenty of stages; I'm targeting 60 for the release. I'm also planning to add support for iOS 13 dark mode. In the meantime, try the Spanish version, Silabitas.


Spherical Harmonics Lighting from ARKit Environment Probes
Sun, 31 Mar 2019 16:03:46 +0100
I've been playing with ARKit a bit, and I've added support for it in VidEngine. There's a small AR app there if you want to check it out: SampleAR.

One of the things I got excited about in ARKit 2 is the environment probes: ARKit automatically generates environment textures from camera imagery. It completes the missing bits of the texture with machine learning, and then keeps updating them as you move the camera around.

I thought it would be a good idea to compute a Spherical Harmonic (SH) light approximation from those environment maps, so during rendering I can just render the diffuse light based on a few coefficients. The environment probes can be defined with an extent, so the area covered doesn't need to be the whole scene. That goes well with the SH lights, since you may want to place different probes at different locations.
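For reference, placing one of these probes manually looks roughly like this in ARKit (a sketch; SampleAR's actual setup may differ, and you can also let ARKit place probes automatically with .automatic):

    import ARKit

    // Sketch: manual environment texturing, with one probe placed at a given
    // world transform (e.g. the center of an SHLight) covering one cubic meter.
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal]
    configuration.environmentTexturing = .manual

    let session = ARSession()
    session.run(configuration)

    let probeTransform = matrix_identity_float4x4 // wherever the SHLight sits
    let probe = AREnvironmentProbeAnchor(transform: probeTransform,
                                         extent: SIMD3<Float>(1, 1, 1))
    session.add(anchor: probe)
    // ARKit later delivers the anchor's environmentTexture (an MTLTexture)
    // through the ARSessionDelegate update callbacks.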

The way I've implemented this is through a LightSource called SHLight. The SHLight has an extent as well, defined as a cube volume. When you place an SHLight in the scene, an environment probe is placed at its center. Then, for a few frames, I initialize the samples across the sphere, using jittered stratification. This could be done offline and saved to a file, but I'm doing it on init for the time being, just in case I want to change the number of samples. After init, I read all the samples with just one draw call, since the cubemaps ARKit generates aren't that big. I'm currently reading 10000 samples. With all the samples, I compute the Spherical Harmonics during a few frames, and store the matrices for irradiance approximation in a buffer that the DeferredLightingPlugin uses for rendering.
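To give an idea of the sampling and projection, here's a simplified sketch (not the exact VidEngine code: sampleEnvironment stands in for reading the ARKit cubemap, and VidEngine stores irradiance matrices rather than the raw coefficients shown here):

    import Foundation
    import simd

    // The first 9 real spherical harmonic basis functions for a unit direction.
    func shBasis(_ d: SIMD3<Float>) -> [Float] {
        let (x, y, z) = (d.x, d.y, d.z)
        return [
            0.282095,                   // Y00
            0.488603 * y,               // Y1-1
            0.488603 * z,               // Y10
            0.488603 * x,               // Y11
            1.092548 * x * y,           // Y2-2
            1.092548 * y * z,           // Y2-1
            0.315392 * (3 * z * z - 1), // Y20
            1.092548 * x * z,           // Y21
            0.546274 * (x * x - y * y)  // Y22
        ]
    }

    // Jittered stratified sampling: one jittered sample per cell of an n x n grid
    // in (u, v), mapped to directions on the sphere.
    func jitteredSphereSamples(gridSize n: Int) -> [SIMD3<Float>] {
        var dirs: [SIMD3<Float>] = []
        for i in 0..<n {
            for j in 0..<n {
                let u = (Float(i) + Float.random(in: 0..<1)) / Float(n)
                let v = (Float(j) + Float.random(in: 0..<1)) / Float(n)
                let theta = 2 * acos(sqrt(1 - u))
                let phi = 2 * Float.pi * v
                dirs.append(SIMD3(sin(theta) * cos(phi),
                                  sin(theta) * sin(phi),
                                  cos(theta)))
            }
        }
        return dirs
    }

    // Project the environment onto the SH basis. sampleEnvironment is a stand-in
    // for reading the ARKit cubemap in a given direction.
    func projectToSH(samples: [SIMD3<Float>],
                     sampleEnvironment: (SIMD3<Float>) -> SIMD3<Float>) -> [SIMD3<Float>] {
        var coeffs = [SIMD3<Float>](repeating: .zero, count: 9)
        for dir in samples {
            let radiance = sampleEnvironment(dir)
            let basis = shBasis(dir)
            for i in 0..<9 {
                coeffs[i] += radiance * basis[i]
            }
        }
        // Monte Carlo weight: area of the unit sphere over the number of samples.
        let weight = 4 * Float.pi / Float(samples.count)
        return coeffs.map { $0 * weight }
    }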

You can read about the irradiance map representation in this paper: An Efficient Representation for Irradiance Environment Maps. And all about Spherical Harmonics in Spherical Harmonic Lighting: The Gritty Details.

So it takes a few frames to see any light. I initialize the irradiance matrices with a gray ambient light, so it's not totally dark. And every time ARKit updates the environment map, I read the samples again and recompute the Spherical Harmonics. I blend the coefficients of the irradiance matrices so the transition from one light to the updated one is smooth. The blending is done on the CPU since, most of the time, there's a single set of coefficients, so I don't want to do unnecessary processing every frame on the GPU.
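The blending itself is a plain linear interpolation of the matrix entries, something along these lines (a sketch, assuming one float4x4 irradiance matrix per color channel as in the Ramamoorthi paper):

    import simd

    // Blend the previous and the freshly computed irradiance matrices so the
    // light transition is smooth. t goes from 0 to 1 over a few frames.
    func blendIrradiance(from old: [float4x4], to new: [float4x4], t: Float) -> [float4x4] {
        let tv = SIMD4<Float>(repeating: t)
        return zip(old, new).map { a, b in
            float4x4(columns: (
                simd_mix(a.columns.0, b.columns.0, tv),
                simd_mix(a.columns.1, b.columns.1, tv),
                simd_mix(a.columns.2, b.columns.2, tv),
                simd_mix(a.columns.3, b.columns.3, tv)
            ))
        }
    }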

The SampleAR app doesn't do much at the moment, other than showcasing these SH Lights and plane detection. You can place cubes, spheres, and a building from Santorini (I made this model for Snake on a Sphere). When you place an object on a detected surface, an SHLight is placed slightly above the object, covering one cubic meter, so the object is inside it. For every object you place, a new SHLight is created. If you toggle on the debug mode, you'll see something like this,

The grids are the planes detected by ARKit. The green wireframes define the extent of the SH lights and the environment probes, and the "reflective spheres" are debug objects that show what the cubemaps captured by ARKit look like. Here's a small video,

Because there are several SH lights in the scene, I'm using the stencil buffer to mask the volumes that have already been rendered. This doesn't account for intersecting volumes. We could blend the lights in those cases, but I'm not considering it for now. The stencil buffer set up can be summarized in this table,

                  Comp.  Ref.  Read  Write  Stencil  Depth    Pass
                  func.  val.  mask  mask   fail     fail
    ---------------------------------------------------------------
    Back face      ≠     L|A    A     W     Keep     Replace  Keep
    Front face     ≠      0     L     L     Keep     Zero     Keep
    Render light   ≠      A     L    L|A    Keep     Keep     Replace
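In Metal, each row of that table maps to an MTLStencilDescriptor. For instance, the back-face pass could be set up like this (a sketch with made-up mask bits; the actual values in VidEngine may differ):

    import Metal

    // Hypothetical bits for L (light) and A; W is a full write mask here.
    let lightBit: UInt32 = 0x1
    let ambientBit: UInt32 = 0x2
    let fullWriteMask: UInt32 = 0xFF

    // Back-face pass: compare notEqual, read mask A, keep on stencil fail,
    // replace on depth fail, keep on pass.
    func makeBackFacePassState(device: MTLDevice) -> MTLDepthStencilState? {
        let backFace = MTLStencilDescriptor()
        backFace.stencilCompareFunction = .notEqual
        backFace.readMask = ambientBit
        backFace.writeMask = fullWriteMask
        backFace.stencilFailureOperation = .keep
        backFace.depthFailureOperation = .replace
        backFace.depthStencilPassOperation = .keep

        let desc = MTLDepthStencilDescriptor()
        desc.depthCompareFunction = .lessEqual
        desc.isDepthWriteEnabled = false
        desc.backFaceStencil = backFace
        return device.makeDepthStencilState(descriptor: desc)
    }

    // The reference value (L|A in the first row) goes on the render encoder:
    //   encoder.setStencilReferenceValue(lightBit | ambientBit)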

So I need to do 3 passes per light, which is not ideal. Even less ideal is that I have to create a different encoder each time I change the shader. There are only 2 shaders needed for this, but because I need to swap between one and the other, I ended up with a horrible loop that creates many encoders. Check the drawSHLights function in DeferredLightingPlugin. I think I will ask on Stack Overflow, because there may be another way of doing this in Metal.

The app still has some glitches. The cursor starts to "flicker", or rather, leave a trail, from time to time. I'm not sure if it's because ARKit goes nuts, or if I should smooth out the movement of the camera and objects by myself. I'll be investigating these issues next.


Inverse Transform without matrices?
Sat, 23 Mar 2019 22:27:57 +0000
Given an affine transform, expressed as a translation (or position), a scale, and a rotation, how do you compute its inverse?

Well, if you write the transform as a matrix (for column vectors, so the first operation is at the right hand side),

M = T * R * S
then the inverse is (I'm using a single quote instead of -1 to write the inverse),
M' = (T * R * S)' = S' * R' * T' 
Can we write that as another affine transform? That is,
M' = S' * R' * T' = T_ * R_ * S_
Well, a person from the future wrote in math.stackexchange how to extract the translation, the scale, and the rotation of a given affine transform.

So far, so good. But if you look at the comments, someone points out that the extracted rotation matrix might be a combination of shear and rotation!

This might be a trivial problem, but I had never encountered it before. The issue is that, if the scaling is anisotropic, then there's certainly shearing going on in the inverse. That is, you can extract T, R, and S from M using what's described in math.stackexchange, but you can't extract T_, R_, and S_.

What can we do, then?

Well, the translation can still be computed in the same way. If t is the position in the original transform, then the new position is,

t_ = S' * R' * (-t)
and we don't have to use the matrix forms for this. If your rotation is a quaternion, simply invert the quaternion and rotate t with it. Then, divide each component by the scale.
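As a sketch, assuming a Transform type with position, scale, and a quaternion rotation (and keeping in mind the caveat below about anisotropic scale), it would look like this in Swift:

    import simd

    struct Transform {
        var position: SIMD3<Float>
        var scale: SIMD3<Float>
        var rotation: simd_quatf

        // t_ = S' * R' * (-t): rotate by the inverse quaternion,
        // then divide each component by the scale.
        var inverse: Transform {
            let invRotation = rotation.inverse
            let invPosition = invRotation.act(-position) / scale
            // Note: the result is only a valid Transform if the scale is
            // isotropic; otherwise the true inverse contains shear (see below).
            return Transform(position: invPosition,
                             scale: SIMD3<Float>(repeating: 1) / scale,
                             rotation: invRotation)
        }
    }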

But can we convert (S' * R') into (R_ * S_)? Well, we can use the Singular Value Decomposition to see how it would look like,

S' * R' = U * Σ * V'
Σ is a diagonal matrix, so it's a scaling matrix. But unless V is the identity, I don't see how this could look like an (R_ * S_).

So in the end I bit the bullet, and I use matrices whenever I need the inverse and I know the scaling may not be isotropic...

What was I trying to do? I was computing ray-to-object intersections in an AR sample app. These are computed on the CPU, and it would be costly to transform all the triangles of the object to test the intersection, so I'm converting the ray to model space instead. That's why I needed the inverse of the world transform of the object. You can see the final commit with the bug fix and several unit tests that exercise the different conversions: Fix transform.inverse (VidEngine)
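Converting the ray itself is just a matter of transforming the origin as a point and the direction as a vector; roughly like this (a sketch, not necessarily how VidEngine does it):

    import simd

    struct Ray {
        var origin: SIMD3<Float>
        var direction: SIMD3<Float>
    }

    // Bring a world-space ray into model space so the triangles can stay put.
    // Points get w = 1 (translation applies); directions get w = 0 (it doesn't).
    func toModelSpace(_ ray: Ray, worldInverse: float4x4) -> Ray {
        let o = worldInverse * SIMD4<Float>(ray.origin.x, ray.origin.y, ray.origin.z, 1)
        let d = worldInverse * SIMD4<Float>(ray.direction.x, ray.direction.y, ray.direction.z, 0)
        return Ray(origin: SIMD3<Float>(o.x, o.y, o.z),
                   direction: normalize(SIMD3<Float>(d.x, d.y, d.z)))
    }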

Please message me on Twitter @endavid if you have any comments or suggestions!


Cleaning up my Mac drive
Fri, 25 Jan 2019 21:43:15 +0000

We accumulate lots of crap on our hard drives, even more than in our homes, because the "crap" is not usually visible, so we keep storing and storing. Space is not much of a problem... or is it?

It turns out it is a problem for me. Every now and then I get warnings that I'm running out of space and I have to do some cleanup. But because I only do this every so often, I always end up spending time Googling mysterious big system files to check whether it is OK to delete them.

So I thought I could write a small guide for myself that I can come back to next year. Some parts of it may be useful to others, but some are dev-oriented.

But first, how much space do we have left? You can right-click on the hard-disk icon in any Finder window and click Get Info,

Or from the console, just type,

    df -h

    Filesystem      Size   Used  Avail Capacity	Mounted on
    /dev/disk1s1   234Gi  223Gi  6.7Gi    98%    /

Then, if you have no idea where to start, use a tool like GrandPerspective to get a visual overview of the biggest offenders. Here's my hard drive right now,

Big blocks are big files. You can hover over the blocks to find their location on disk. The blocks are also grouped, so you can see that all the big blocks on the top left belong to the same folder. To see more details, I just go to that folder in the terminal and type du -h. In this case, that folder is the iOS DeviceSupport folder, and it's taking 42GB.

So let me start my check list of usual suspects.

Development

  • ~/Library/Developer/Xcode/iOS\ DeviceSupport. This folder contains device symbols for debugging iOS apps. It gets huge quite fast (I had 42GB today!). I usually delete symbols for devices I don't support anymore. You can read about it in this Stack Overflow answer.
  • ~/.android. Similarly, you may find caches and devices here if you are also doing Android development. Since I only develop for Android from time to time, I usually delete the whole folder when I don't need it.
  • ~/Library/Caches/com.apple.dt.Xcode. It's also safe to delete this file, but I usually leave it there, because I'm scared Xcode will become slower or something.
  • ~/Library/Developer/CoreSimulator/Devices. These are the iOS simulators for Xcode. This also gets big (I had 18GB today). You can delete the ones you don't need, but then it may take you some time to restore them when you need them. The safe way to delete these is using simctl. Read this answer on Stack Overflow.
  • ~/Library/Developer/Xcode/Archives. These are the archives of the apps you have published in the App Store. You probably don't want to delete these, but you could move the old ones to an external drive.

Media

  • Photos. If you use the Photos app, by default it stores your pictures in the Photos Library file inside your ~/Pictures folder. But you can move the file to an external drive, and open that file from the external hard disk.
  • Movies. Same as above. If you use iMovie, you can keep your iMovie Library file in an external hard disk.
  • Music. At the moment, I keep all my music in my laptop hard disk, because I listen to them often. I guess the best option would be to clean up from time to time and move albums to an external disk? Haven't put much thought into this. If you haven't started accumulating mp3, I guess I would recommend just to use Spotify or any other streaming service.
  • iPhone backups. This is a bit tricky for a normal user. These backups take a lot of space. You can delete the older ones from iTunes → Preferences → Devices. The location where these backups are stored can't be changed, but you can create a symbolic link to an external disk. Move the ~/Library/Application\ Support/MobileSync/Backup folder to an external drive, and type ln -s /Volumes/YourExternal/Backup ~/Library/Application\ Support/MobileSync/ from a terminal window.

A bit of Marie Kondo

While we are at it, we could do some "marikondoning" on our drive. Is that 300MB PDF from 2001 still useful? Do you really want to move it to an external drive? Wouldn't it be better to just delete it completely and reduce entropy? Are you ever gonna come back to it? Have you EVER read it? Do you smile when opening that file? Burn it if not!

There are a quadrillion small files that won't catch our attention in GrandPerspective because they are small, but they clutter our disk. It takes too much time to sort them once it's gotten to this point. So you could move all the crap to a folder called "unsorted", for instance. That's what I do with the things on my Desktop, to keep it always tidy. Then just rely on Spotlight to search and find them. But for important things, make sure to keep them tidy in folders with relevant names (e.g. Documents/bills).

And that's my 5-cents! I hope it's useful for others as well.

Happy 2019!


Previous year

Next year