XaiaX (Dev)
@developer.xaiax.net
Software development focused sub-account of xaiax.net
It's not perfect. (I wrote an init for LABColor that's generic over BinaryInteger so it doesn't need to cast the RGB values to Int8 here)

Actually this was three or four successive tab completes starting with everything after "granularity:" (I'm going to default it to 5 so it doesn't do 16M checks)
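Roughly what that generic init could look like (a sketch only; the LABColor fields and the placeholder floating-point init are stand-ins, the BinaryInteger constraint is the real point):

```swift
// Minimal sketch, not the real type: LABColor here is a stand-in with a
// placeholder floating-point initializer.
struct LABColor {
    var l: Double
    var a: Double
    var b: Double

    // Assumed existing init taking normalized 0...1 RGB components.
    // (The real conversion to LAB would happen here.)
    init(red: Double, green: Double, blue: Double) {
        (l, a, b) = (red, green, blue)  // placeholder, not a real conversion
    }

    // Generic over any integer type, so UInt8 / Int / Int32 components can
    // be passed straight through without casting at the call site.
    init<T: BinaryInteger>(red: T, green: T, blue: T) {
        self.init(red: Double(red) / 255.0,
                  green: Double(green) / 255.0,
                  blue: Double(blue) / 255.0)
    }
}

// Pixel components come in as UInt8 and need no cast:
let rgb: [UInt8] = [200, 120, 30]
let color = LABColor(red: rgb[0], green: rgb[1], blue: rgb[2])
```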
December 10, 2025 at 7:35 PM
Ok, so WTF am I actually going to do today?

1. go look at the indexer code and re-internalize how it all works.
2. figure out how to build the coordinator I mentioned above to delegate image types to sub indexers
3. build the sub indexers
4. maybe add UI elements to control these
November 26, 2025 at 8:07 PM
When writing the sheets to local storage, I take a two-tiered approach. Any sheet that isn't yet full (a full sheet has all 16 rows, and every row is full) is stored as a TIFF (lossless compression) so it can be appended to without generational loss.

Completed sheets are saved as HEIC, which is fine for tiles.
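A rough sketch of that two-tiered write path using ImageIO; the function name, the LZW setting for TIFF, and the HEIC quality value are illustrative assumptions, not necessarily what the app does:

```swift
import Foundation
import ImageIO
import UniformTypeIdentifiers

// Sketch: write an in-progress sheet losslessly as TIFF, a completed
// sheet as HEIC.
func writeSheet(_ image: CGImage, to url: URL, isComplete: Bool) -> Bool {
    let type: UTType = isComplete ? .heic : .tiff
    guard let dest = CGImageDestinationCreateWithURL(
        url as CFURL, type.identifier as CFString, 1, nil) else { return false }

    let options: [CFString: Any]
    if isComplete {
        // Lossy is fine for finished sheets that will only ever be read.
        options = [kCGImageDestinationLossyCompressionQuality: 0.8]
    } else {
        // LZW-compressed TIFF (compression value 5) keeps partial sheets
        // lossless so they can be reopened and appended to.
        options = [kCGImagePropertyTIFFDictionary:
                    [kCGImagePropertyTIFFCompression: 5]]
    }
    CGImageDestinationAddImage(dest, image, options as CFDictionary)
    return CGImageDestinationFinalize(dest)
}
```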
November 26, 2025 at 8:07 PM
With the image sheets I can load however many I can fit into main memory. Since they're immutable while the generation task is running, there are no concurrency concerns: the order of operations is irrelevant, so there's no overhead for tracking what's accessing what.
November 26, 2025 at 8:07 PM
Ok, but why store the sheets at all? Why not just read from the local thumbnail store when you're doing the index?

This would work, and it would save some overhead and space on the index side, but pulling the individual thumbnails out while *drawing* the resulting image has more overhead.
November 26, 2025 at 8:07 PM
They each build their own sub-index and then when they're all ready I can build a merged index on the fly as the user makes changes to what media types they allow.

Currently I just completely exclude Hidden images but this could allow the user to opt-in to including them.
(Would not pre-index them)
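A sketch of what that on-the-fly merge could look like, with invented types (MediaKind, TileEntry) standing in for the real ones:

```swift
// Sketch of the on-the-fly merge: each media kind keeps its own pre-built
// sub-index, and the working index is just the concatenation of whichever
// kinds the user currently has enabled.
enum MediaKind: Hashable {
    case photo, screenshot, livePhoto, burst, animatedGIF, hidden
}

struct TileEntry { /* packed location, average color, etc. */ }

struct MergedIndex {
    var subIndexes: [MediaKind: [TileEntry]] = [:]

    // Hidden images could stay out of subIndexes entirely until the user
    // opts in, at which point that one sub-index gets built and added.
    func merged(enabled: Set<MediaKind>) -> [TileEntry] {
        subIndexes
            .filter { enabled.contains($0.key) }
            .flatMap { $0.value }
    }
}
```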
November 26, 2025 at 8:07 PM
So I think my new approach will be to have an index coordinator that figures out what each incoming image type is, and hands it off to a separate indexer per relevant type. This would allow me to potentially do things like "use all frames of an animated GIF/Live Photo/etc".
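A sketch of that hand-off, reusing the hypothetical MediaKind enum from the sketch above; the SubIndexer protocol and the classification rules are simplified stand-ins for the real logic:

```swift
import Photos

// Sketch: the coordinator classifies each incoming PHAsset and hands it
// to the sub-indexer registered for that kind.
protocol SubIndexer {
    func ingest(_ asset: PHAsset)
}

final class IndexCoordinator {
    private var indexers: [MediaKind: SubIndexer] = [:]

    func register(_ indexer: SubIndexer, for kind: MediaKind) {
        indexers[kind] = indexer
    }

    func ingest(_ asset: PHAsset) {
        indexers[classify(asset)]?.ingest(asset)
    }

    private func classify(_ asset: PHAsset) -> MediaKind {
        if asset.mediaSubtypes.contains(.photoScreenshot) { return .screenshot }
        if asset.mediaSubtypes.contains(.photoLive)       { return .livePhoto }
        if asset.representsBurst                          { return .burst }
        if asset.playbackStyle == .imageAnimated          { return .animatedGIF }
        return .photo
    }
}
```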
November 26, 2025 at 8:07 PM
Some people might have larger libraries. You could just use a 64-bit int and have 35 bits left over to store 35 billion sheets and 10 trillion total images, but that would bring additional storage complications that are beyond the scope of this project.

I just store a sheet ID separately.
November 26, 2025 at 8:07 PM
You also need to store which sheet it's in, but with a typical image size of 114x64 (16:9 frame) or 86x64 (4:3) you can fit ~18 to ~24 source images per strip (call it 20) and 16 strips per sheet, so 320 per sheet. With a photo library of 20,000 images that's 63 sheets, so a 6-bit sheet ID on top of the 29-bit location gives you 35 bits.
November 26, 2025 at 8:07 PM
If I want to be hyper-efficient, the location of an image can trivially be stored in a single 32-bit integer.

Row/Column 4 bits (≤ 16 strips)
Linear offset 11 bits (2048px)
Short axis 6 bits (64px)
Extent 8 bits (256px)
That's 29 bits.

(You need short axis for greater than 4:1 aspect ratio images)
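For concreteness, one way that packing could look in Swift; the field order and the convention of storing lengths as (value - 1), so 64 and 256 fit in 6 and 8 bits, are my assumptions:

```swift
// Hypothetical packing of the four fields into a UInt32. Lengths are
// stored as (value - 1) so 64px fits in 6 bits and 256px in 8 bits.
struct TileLocation {
    let strip: UInt32        // 0...15   -> 4 bits (row or column index)
    let offset: UInt32       // 0...2047 -> 11 bits along the strip
    let shortAxis: UInt32    // 1...64   -> 6 bits (stored as shortAxis - 1)
    let extent: UInt32       // 1...256  -> 8 bits (stored as extent - 1)

    var packed: UInt32 {
        strip
            | (offset << 4)
            | ((shortAxis - 1) << 15)
            | ((extent - 1) << 21)
    }

    init(packed: UInt32) {
        strip     =  packed        & 0xF
        offset    = (packed >> 4)  & 0x7FF
        shortAxis = ((packed >> 15) & 0x3F) + 1
        extent    = ((packed >> 21) & 0xFF) + 1
    }

    init(strip: UInt32, offset: UInt32, shortAxis: UInt32, extent: UInt32) {
        (self.strip, self.offset, self.shortAxis, self.extent) =
            (strip, offset, shortAxis, extent)
    }
}
```

That uses bits 0 through 28 and leaves the top 3 bits of the UInt32 free, with the sheet ID stored separately as described above.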
November 26, 2025 at 8:07 PM
The indexing process doesn't actually take that long, and merging takes negligible time. (I also check on startup to see if there are new images to add to the index since the last run)

iOS also has a volatile storage area for caches and the like, where the system can reclaim space because everything in it can be recreated on the fly.
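Assuming the volatile area meant here is the standard Caches directory (whose contents the system may purge when space runs low), getting at it looks like:

```swift
import Foundation

// Anything written under Caches (like the tile sheets) must be
// recreatable, since the system may purge it to free space.
let cachesURL = FileManager.default.urls(for: .cachesDirectory,
                                         in: .userDomainMask).first!
let sheetsURL = cachesURL.appendingPathComponent("TileSheets",
                                                 isDirectory: true)
try? FileManager.default.createDirectory(at: sheetsURL,
                                         withIntermediateDirectories: true)
```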
November 26, 2025 at 8:07 PM
What I realized this morning is that I could break these sheets up by source type. All screenshots, all Live Photos, bursts, etc. Then the user could enable or disable a type in the settings and it wouldn't require rebuilding the sheets.

I can store partial indexes for each type, and combine.
November 26, 2025 at 8:07 PM
Once I get a full strip, I set it aside until I get 32 full strips, which I then write to a single 2048x2048 image. I store the metadata needed to extract which image is in which rectangle on this sheet later.

(There are multiple non-full strips so that I can reduce unused space)
November 26, 2025 at 8:07 PM
So to build the index I parse through all of the photos, ignoring whatever I don't want to include, and write a new shrunk image that's 64px on the short side, in vertical strips for 64px-wide images and horizontal strips for 64px-tall ones.

I (arbitrarily) limit these to 2048px on the long axis.
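As a worked version of that sizing rule (the 256px cap on the long side comes from the thumbnail-size post above; the exact rounding is a guess):

```swift
import CoreGraphics

// Sketch: short side scaled to 64px, long side capped at 256px (the
// thumbnail limit), which is why images wider/taller than 4:1 end up
// with a short side under 64px.
func tileSize(for source: CGSize) -> CGSize {
    let landscape = source.width >= source.height
    let aspect = landscape ? source.width / source.height
                           : source.height / source.width
    let short = min(64, (256 / aspect).rounded(.up))
    let long  = min(256, (64 * aspect).rounded(.up))
    return landscape ? CGSize(width: long, height: short)
                     : CGSize(width: short, height: long)
}

// e.g. 4:3 landscape -> 86x64, 16:9 portrait -> 64x114, 6:1 panorama -> 256x43
```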
November 26, 2025 at 8:07 PM
I experimented a lot with different tile resolutions and it turns out going above 64x64 is kind of pointless because you either end up with a gigantic output image or you use fewer tiles and it gets very chunky.

So I only need to store images as 64px along the short axis. (Mostly)
November 26, 2025 at 8:07 PM
On iOS the thumbnail for every image fits within a 256x256 frame, constrained by aspect ratio. So a 4:3 image is in there at 256x192, a 3:1 panorama shot is in there at 256x85, a portrait mode 9:16 image is 144x256, etc.

This is ideal for my tile scenario because I don't need anything bigger.
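Requesting one of those aspect-fit thumbnails via the Photos framework looks roughly like this (the delivery/resize options are my choices, not necessarily the app's):

```swift
import Photos
import UIKit

// Minimal sketch: ask Photos for the aspect-fit thumbnail of an asset.
// A 4:3 asset comes back around 256x192, a 9:16 one around 144x256, etc.
func requestThumbnail(for asset: PHAsset,
                      completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat   // single callback
    options.resizeMode = .fast                  // nearest cached size is fine
    PHImageManager.default().requestImage(
        for: asset,
        targetSize: CGSize(width: 256, height: 256),
        contentMode: .aspectFit,
        options: options) { image, _ in
            completion(image)
        }
}
```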
November 26, 2025 at 8:07 PM
I don't like the way I've built out the individual image parser's asynchronous functionality, and Swift's support for asynchronous code has changed a lot since I last touched all this, so I may take that opportunity to learn how to use the new stuff and rewrite all of that.
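Purely as an illustration of the kind of rewrite that's on the table, here's completion-handler-free parsing with async/await and a task group; ImageParser and ParseResult are invented stand-ins for the real parser types:

```swift
// Illustration only: the "new stuff" is async/await + TaskGroup in place
// of nested completion handlers.
struct ParseResult: Sendable { let id: String }

struct ImageParser: Sendable {
    let id: String
    func parse() async throws -> ParseResult {
        // ... real decoding / color extraction would happen here ...
        return ParseResult(id: id)
    }
}

func parseAll(_ parsers: [ImageParser]) async throws -> [ParseResult] {
    try await withThrowingTaskGroup(of: ParseResult.self) { group in
        for parser in parsers {
            group.addTask { try await parser.parse() }
        }
        var results: [ParseResult] = []
        for try await result in group {
            results.append(result)  // arrives in completion order
        }
        return results
    }
}
```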
November 26, 2025 at 3:58 AM
It has been too long since I looked in here, and there's a bunch of code that isn't used by anything, but now I can't remember whether that was something I was adding for future use or just hadn't got to deleting yet.

So I got to defactor some things, which is nice.
November 26, 2025 at 3:58 AM
Getting screenshots out was relatively trivial, but I went in circles a bit thinking about pulling animated GIFs out (since it only shows the thumbnail frame) before deciding I don't care that much.

Might need to give the user the ability to specify either an allowed or denied collection, or both.
November 26, 2025 at 3:58 AM
Well, that doesn't appear to have broken anything. Committed a bunch of changes that Xcode made automatically. (mostly adding @retroactive to a bunch of extensions I added, which seems appropriate)

Ok, I guess now it's time to figure out how to filter out screenshots from the index.
November 25, 2025 at 10:49 PM
Actually, maybe don't? Seems like this method isn't actually called by anything in the project. Maybe I fixed this by writing a different method that avoided this problem.

Alrighty let's see what happens if I just comment this out.
November 25, 2025 at 10:29 PM