There's a big problem in music which I solved today. It started when I was awake in the night listening to Yesterday Once More by The Carpenters. I became acutely aware of the power of the subtleties of rhythm, even though the rhythm is supposed to be regular.
There are two forces in recording music now: live-played parts, generally recorded in one go and layered, just as has been done since the time of the Beatles and before; and sequenced parts, which are generally tied to a timing grid, a fixed metronome. Playing live is far more expressive and efficient, but there is less control; once it's done, the only real option is to retake it. You have ultimate control over sequences, but they sound cold and mechanical almost all of the time. I spend most of my time trying to inject feeling into them; the crunch point is the timing.
Now I mix both methods, but this has problems. When playing live I can easily hear where the beats are and where the rhythm is, and I can pick these out in my sequencer, but they won't be neatly lined up on the screen, on the grid of time. Yes, in a way this is the point - the live music has feeling exactly because it is not exact - but it makes editing slow and awkward. My music, since I started to play long live piano parts, is a hybrid: some parts sequenced to the machine, sometimes loose and free, with extra parts added off-grid to match some soundtrack, typically a piano recording (it would probably be a drum track for musicians who use drums).
But today I worked out how to combine both. The idea is essentially to squish the grid, changing the tempo every beat, so that everything looks lined up on screen but is in fact organic and live. I did this by programming Prometheus to look at a MIDI sequence of notes which I play by hand as a tick-track. The software calculates the time between notes, then continually changes the tempo to follow this ebb and flow. This way the organic timing of the hand-played piano is exactly replicated in the digital sequence, while lining it all up so that the beats look like beats.
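To make the mechanism concrete, here is a minimal sketch of the calculation in Python, assuming each note in the hand-played tick-track marks exactly one beat; the function names and data layout are my own illustration of the idea, not Prometheus's actual code:

    # Each hand-played note marks one beat; the tempo is re-set at every beat
    # so the grid bends to the playing instead of the playing bending to the grid.
    def tempo_map_from_beats(onset_times):
        """Given note onset times in seconds, one per beat, return a list of
        (beat_number, bpm) pairs describing the tempo change at each beat."""
        tempo_map = []
        for beat, (t0, t1) in enumerate(zip(onset_times, onset_times[1:])):
            interval = t1 - t0      # seconds between consecutive played beats
            bpm = 60.0 / interval   # the tempo that makes this gap exactly one beat
            tempo_map.append((beat, bpm))
        return tempo_map

    # Example: a performance that breathes around 120 BPM.
    onsets = [0.0, 0.52, 1.01, 1.55, 2.03]
    for beat, bpm in tempo_map_from_beats(onsets):
        print(f"beat {beat}: {bpm:.1f} BPM")

Applying the resulting tempo changes at every beat means each played note lands exactly on a grid beat, which is what makes the sequence look lined up while keeping the live timing.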
Musically, I could have done everything this way by ignoring the grid, and of course I toy with the sequenced tempo in lots of ways too, but this makes editing much faster - it is easier to see where the beats and measures are when they are laid out in the proper place - and it is more efficient to play the song once with feeling than to painstakingly hand-program every nuance of mood. Now I can, for example, use 'live' timing but have every other instrument programmed. The timing, the beat, the groove is so strongly the driver of feeling in a song that it's bound to change the result. With this change I needed to add a suite of tools for processing the timing data - of course it can all be tweaked, adjusted, copied, flipped and abused, etc. - but that's done too now; all I need to do is try it out.
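As an illustration of the kind of processing such tools might do - my own guess at what 'tweaked' and 'flipped' could mean, not a description of Prometheus - here are two small operations on the tempo map from the sketch above:

    def scale_tempo_deviation(tempo_map, amount, reference_bpm=None):
        """amount = 0.0 flattens to a strict tempo, 1.0 keeps the playing as-is,
        and values above 1.0 exaggerate the push and pull."""
        bpms = [bpm for _, bpm in tempo_map]
        if reference_bpm is None:
            reference_bpm = sum(bpms) / len(bpms)   # centre on the average tempo
        return [(beat, reference_bpm + (bpm - reference_bpm) * amount)
                for beat, bpm in tempo_map]

    def flip_tempo_map(tempo_map):
        """Reverse the ebb and flow so the rubato runs backwards."""
        reversed_bpms = [bpm for _, bpm in reversed(tempo_map)]
        return [(beat, bpm) for (beat, _), bpm in zip(tempo_map, reversed_bpms)]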
For the moment I think I'll pause for a day or two to regain some musical energy and passion. I've worked solidly for months on music. A year ago at this time I was working on Burn of God. I'd not sung a trained note or played guitar then - beyond the tinny strums in Palace of Skeletons (one of my favourite tracks on that album). So much has changed since.