Lukrecia continues to have difficulty downloading the dataset from the AWI servers. She has informed IT and has learned that there was an ‘issue’ with the server, and that they hoped to get it repaired within a couple of days. So we wait.
In the meantime, I continue to build a model based on the Antarctic data sample. Many of the issues to be addressed are the same, so I am not too perturbed about the delay in getting hold of the model data.
Yesterday, I dismantled the metro-based timing system so that the software runs against the global transport instead. This opens up the potential to have multiple rhythmic elements working in sync, which may make it possible to follow the movements of a single element in the mix. I had been concerned about how to accommodate the sine wave drones, which need a very slow tempo. I resolved this by dropping a counter object and a sel object in after an instance of metro: the counter cycles through 8 bars, and the sel pushes out a bang each time the cycle completes. That more or less equates to the 30000ms timer I had running before linking everything up to the global transport.
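The metro → counter → sel chain can be sketched in Python as an analogy (this is not Max code; the 4/4 metre and ~64 bpm tempo are assumptions chosen so that 8 bars lands near the old 30-second timer):

```python
# Python analogy of the metro -> counter -> sel chain:
# metro bangs every quarter note, counter cycles through 8 bars of 4/4,
# and sel lets a bang through only when the counter wraps back to 0.

BEATS_PER_BAR = 4          # assumed 4/4 metre
BARS = 8
CYCLE = BEATS_PER_BAR * BARS  # 32 quarter-note bangs per drone refresh

def drone_gate(beat_index: int) -> bool:
    """True on the bang that should refresh the drone (like 'sel 0')."""
    return beat_index % CYCLE == 0

# At ~64 bpm a quarter note lasts 60000 / 64 = 937.5 ms,
# so 32 beats = 30000 ms - roughly the old fixed timer.
fires = [i for i in range(96) if drone_gate(i)]  # -> [0, 32, 64]
```

The useful property over the old fixed timer is that the refresh point is now locked to the transport, so it stays in phase with every other metro however the tempo changes.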
The trouble now is that with quarter notes and triplets the rhythms already sound maddeningly dense and complex. The CPU is also taking a hammering, so I have upped the I/O and signal vector sizes to 2048 - despite having no idea what that actually means! I presume it's increasing the buffer size. It certainly drops the CPU load from 95% to about 45%.
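A rough way to see what that buffer setting trades away, assuming a 44.1 kHz sample rate (the rate itself is my assumption, not stated in the patch):

```python
# A vector size of 2048 means the audio engine processes 2048 samples
# as one block, so the scheduler wakes up far less often (lower CPU)
# at the cost of added latency per block.
SAMPLE_RATE = 44100   # assumed sample rate in Hz
VECTOR_SIZE = 2048    # samples per block

block_ms = 1000 * VECTOR_SIZE / SAMPLE_RATE
print(round(block_ms, 1))  # -> 46.4 ms per block
```

So the CPU saving comes from doing fewer, bigger chunks of work, which is harmless for a generative drone piece but would matter for anything played live against the engine.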
With CPU and rhythms becoming too dense, it does seem that a more useful approach would be to isolate parts of the model and calibrate them individually. But how will they sound when they come back together?
Global transport allows multiple metros to run as slaves, which can trigger on quarter notes (4n) or quarter note triplets (4nt). Using counter and sel I can refresh the drones on much longer cycles.
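Those note-value symbols map onto durations once a tempo is fixed. A small sketch of the conversion, using Max's 480-ticks-per-quarter-note convention (the helper name and the table beyond 4n/4nt are my own additions for illustration):

```python
# Convert a transport note value to milliseconds at a given tempo.
# Tick counts follow the 480-ticks-per-quarter convention.
TICKS = {"4n": 480, "4nt": 320, "8n": 240}

def note_ms(value: str, bpm: float) -> float:
    quarter_ms = 60000 / bpm          # one quarter note in ms
    return quarter_ms * TICKS[value] / 480

note_ms("4n", 120)   # -> 500.0 ms
note_ms("4nt", 120)  # -> ~333.3 ms
```

This also shows why the drones needed the counter/sel trick: there is no note-value symbol anywhere near a 30-second cycle, so counting bars is the natural way to get there.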
Setting up global transport was worth the pain as it, theoretically, enables me to link the engine up to Logic and/or Live. This may help shift my sense of the work being suitable for an installation and bring it back into the territory of being the foundation of a performance or a recording.
Finally, in this pretty productive spurt, I built a standalone patch of MIDI values corresponding to 16 different scales. Using a combination of dict and coll objects I can now pull out a series of MIDI values which - again, theoretically - will give me the option to constrain output within a key signature and modality.
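The idea behind that patch can be sketched as a pitch-class lookup (a hypothetical Python analogy of the dict/coll structure; the two scales shown and the tie-breaking rule are my own illustrative choices, not the actual patch contents):

```python
# Store each scale as pitch classes (0-11), then snap an incoming
# MIDI note to the nearest pitch class in the chosen scale.
SCALES = {
    "C major": [0, 2, 4, 5, 7, 9, 11],
    "C minor": [0, 2, 3, 5, 7, 8, 10],
    # ... 14 more scales in the real patch
}

def constrain(midi_note: int, scale: str) -> int:
    degrees = SCALES[scale]
    octave, pc = divmod(midi_note, 12)
    # ties between two equally near degrees resolve downward here
    nearest = min(degrees, key=lambda d: abs(d - pc))
    return octave * 12 + nearest

constrain(61, "C major")  # C#4 (61) snaps down to C (60)
constrain(63, "C minor")  # Eb4 is already in the scale -> 63
```

Feeding every generated note through a gate like this is what should keep the whole engine inside one key signature and modality at a time.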