The tragic part is that I'm also responsible for the complexity that's already there. I started this project a couple years out of college and now, sixteen years later, I can tell immediately upon opening any given source code file that, "oh, this part was written by Intern Nick" (as opposed to "haggard, old, battle-hardened Nick").
There are maybe 3 or 4 distinct coding styles that have come and gone over the years, and whenever I encounter the earliest one while going in to make a change or add a feature, I know I'm in for trouble.
Still, it's getting there. Slowly.
This is a low-level, nitty-gritty, "bare metal" programming talk (hilariously dunking on the use of C++ as too high-level a language, delivered at the premier C++ conference to C++ programmers). So I wouldn't recommend it unless that's something you're interested in (although he's got a pretty quippy answer during the Q&A starting at 1:17:40 that may be entertaining to a general audience). It was recorded in 2014, I think I first saw it in 2016, and it took a couple more years to really sink in.
I saw this joke a few years ago, around the same time. There's a little bit of inside baseball going on there, but the general idea is that as you develop as a low-level systems programmer, you discover the allure of all the fancy toys built into your language, but eventually learn that they're more trouble than they're worth. When I discovered that video, I was knee deep in the middle panel. Now I'm almost ready to make the transition to the last one.
The horrifying discovery that started leading me down that path came when I found a clever way to ask Microsoft's C++ runtime to tell me how often I was asking the system for more memory. (Technical version: they give you a hook that will run your callback whenever "malloc" or "new" is called, so not only can you track the count, but you can also see exactly where in the code each request was made.) I wondered how bad it could be...
Pretty bad.
On the play screen, with just about every feature disabled, as of Synthesia 10.8, with the song paused(!), the app asks for new memory more than 1,500 times per frame. Much worse: if you turn on any label mode (even just "Octaves"), that number jumps to more than 4,100 allocations per frame! So, assuming the usual 60Hz, Synthesia is banging on the memory allocator around a quarter million times per second. (This scales linearly with the number of notes on the screen at once, and these numbers are from a relatively "intense" song. For a simple song you get roughly 700 allocations per frame with no labels and 2,400 with labels enabled.)
Whenever I actually let myself stop and think about it, I become disgusted that I let it spiral out of control like that.
It is only because we have these fantastically powerful super computers that an absurd number like that can fly under the radar in the first place. At least I'm in good(?) company. Each key press in the address bar in Chrome used to cause 25,000 allocations.
This same kind of carelessness is why browser tabs idle around a couple hundred MB of RAM these days. Abstractions sloppily built on top of abstractions, ad infinitum.
Assuming "Watch and Listen Only" with a completely hands-off user (no MIDI, mouse, keyboard input, etc.), Synthesia knows precisely 100% of everything that is going to happen during the course of the entire song, completely in advance... so that number should be zero allocations per second (after the initial loading).
But to get there would take a dramatically different programming style. Something closer to pure C. I'm not crazy enough to rewrite everything yet, but Synthesia 10.9 is going to have a new label system (with near-zero allocations) and I have vowed that each release after that is going to at least halve the number of per-frame allocations on the play screen.
These changes won't really do anything visible to the user unless you're on a really low-spec machine, but fixing all of this will help me sleep better at night. (If I were forced to pick silver linings: your tablet battery will probably last longer after these changes and it might help pave the way for something like a Raspberry Pi version of Synthesia someday. Those aren't the reason I'm doing it, but they're benefits that will come along for the ride.)