Hi there, I think I did something like this custom nav bar thing. I added an aggregator page to my faircamp site, and it integrates with a customised nav bar at the top that matches the rest of the site.
But the page has to be added after the site is generated.
The page is constructed from a few basic elements. You could take a look at the source html of the page if it helps:
Hey, if you've never listened to my track "Psychedelic Ghost Stories," from my album of the same name, I would really appreciate 3 minutes of your attention!
It's one of my two favorite tracks from the album but it's got waaay too few streams (it's near the end of the album, so a lot of folks don't make it that far).
If you've found my work a bit too dense/intense/chaotic for your taste, it's worth a listen; it's very chill and atmospheric.
Niice. Really enjoyed that. Thank you. Yes I can picture a mid 80s trip to the corner video store. And a strange movie in a language I do not know catches my eye. It's on beta. There is a robot and a desert and a weird ancient weapon and a crystal involved somehow. The owner can't remember getting it in. And I remember that we have a betamax player in the attic...
I don't know if this suits... those vsts look pretty advanced. But I was having a lot of fun recently putting the output of Festival (an old open source speech synthesis program) through Surge XT's vocoder.
Trying to get back into creative coding. The one thing that I've always wanted to do and never figured out is synchronizing music with something like p5.js or p5.py or something similar. What I'd like to do is drive the sketch with MIDI data, which is also being used to play music.
Had a go with the mido library in python. You can poll the incoming midi in the p5 draw function and use the data there. The biggest problem is that the draw updates and the incoming midi are not synchronised*. Maybe not a big problem unless you have loads of incoming midi messages and want to use all of them.
*Kind of got around this by passing the whole pending backlog of messages from the input port to the output port each time draw is called. Output sounded okay to me, but...
... I am probably not the best judge of that. Anyway it was fun to try and I hadn't heard of p5 before. If you want to see my attempts let me know and I'll post them somewhere (just adapted the basic p5 example to have the circle color and size set by the incoming midi notes)
Could be even simpler - from what you mentioned above about using Ableton Live, I guess you don't need to pass the messages through the p5 sketch. So no output port needed.
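For anyone curious, the polling idea boils down to something like this. A minimal sketch, assuming the p5 python library and mido; the function and variable names are my own for illustration, not from any particular project. The note-to-visual mapping is a plain function so you can poke at it without a midi port plugged in:

```python
# Hypothetical sketch: mapping incoming midi note-ons to circle colour/size.
# The mido/p5 calls are shown as comments; the mapping itself is plain python.

def note_to_visual(note, velocity):
    """Map a midi note-on to (hue, diameter)."""
    hue = (note % 12) * (255 // 12)   # pitch class -> position on a colour wheel
    diameter = 20 + velocity          # louder -> bigger (velocity is 0..127)
    return hue, diameter

# Inside a p5 draw() you might drain the pending messages each frame
# (assumes `port = mido.open_input()` was done in setup):
#
# def draw():
#     for msg in port.iter_pending():          # all messages since last frame
#         if msg.type == 'note_on' and msg.velocity > 0:
#             hue, d = note_to_visual(msg.note, msg.velocity)
#             fill(hue, 255, 255)
#             circle(width / 2, height / 2, d)
```

Since draw only runs at the frame rate, `iter_pending()` is what handles the sync mismatch: you get everything that arrived since the last frame rather than just one message.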
mido is a really nice python midi library. Used it in a bunch of procedural music generation stuff. Recently tried something like the reverse, generating midi from an image:
The influencer... was seduced by the Donk Side of the Bonk. They ceased to be Everything BonkWaver and "became" Donk Fader. When that happened, the fun that was bonk music was destroyed.*
So, what I told you was true... from a certain point of view.
*it wasn't destroyed really. The bonk is eternal. As is the notBonk. But that is a story for another day.
If your average Victorian or Edwardian person were to find themselves transpirited in time to hereabouts... I reckon they'd be like 'yeah it's alright... but not weird enough'
These guys were already psychedelic, except that was just how they saw stuff... no 'scene' or anything like that. It was just the re-imagining of everything. I personally get the 'fin de siècle' vibe from their stuff, very similar to today
Have you seen those Victorian Christmas cards? They are bonkers
These are some python midi tools developed for procedural music generation. (Most of the music theory is from a great module called python-musical, but it needed converting to use midi for stuff I did in the past. Since then I've embellished it a bit and added more scales and scale manipulation.)
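The scale side of it is roughly this idea (a toy sketch, not the actual module): a scale is just a list of semitone steps, and you walk those steps up from a root note to get midi note numbers.

```python
# Toy sketch of scale generation: intervals are semitone steps.
MAJOR = [2, 2, 1, 2, 2, 2, 1]   # whole/half-step pattern of a major scale
MINOR = [2, 1, 2, 2, 1, 2, 2]   # natural minor

def scale_notes(root, intervals, octaves=1):
    """Return midi note numbers for `octaves` of a scale starting at `root`."""
    notes = [root]
    for _ in range(octaves):
        for step in intervals:
            notes.append(notes[-1] + step)
    return notes

# C major from middle C (midi 60) gives the white keys up to the next C.
```

Scale manipulation then falls out naturally: rotate the interval list and you get the modes, transpose by changing the root, and so on.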
I'm working on another example script to convert an image to midi. It's working quite nicely so I'll tidy it up and add it later
You can get an idea of how it's working from this test image. After generating the midi file, the midi can be visualised (using Rosegarden here). You can kind of see how the colors are translated to notes. You can even make out the shuttle among the midi notes! I was pleasantly surprised by that.
It was a lot of fun trying this out and I'll definitely use it to make some tunes at some point. If anyone wants to try it or use it, instructions are in the readme in the repository.
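If you just want the gist of the image-to-midi mapping without reading the repo, here's a toy version (the details differ from the real script, and these names are made up for illustration): columns become time steps, bright pixels become notes, and row position sets the pitch.

```python
# Rough sketch of an image-to-midi mapping: scan the image column by column.

def image_to_notes(pixels, low_note=36, threshold=128):
    """pixels: rows of 0-255 brightness values. Returns (time_step, note) pairs."""
    events = []
    height = len(pixels)
    width = len(pixels[0])
    for x in range(width):                 # each column is one time step
        for y in range(height):
            if pixels[y][x] >= threshold:  # bright enough -> becomes a note
                note = low_note + (height - 1 - y)   # top row = highest pitch
                events.append((x, note))
    return events

# A tiny 3x3 "image" with one bright pixel: middle column, top row.
img = [
    [0, 255, 0],
    [0,   0, 0],
    [0,   0, 0],
]
# Each resulting (time_step, note) pair could then become a mido
# note_on/note_off message pair when writing the file.
```

That's also why the shuttle shows up in the piano roll: the note grid is basically a thresholded copy of the image, with time along x and pitch along y.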
Yeah that would be fun. Each still of the video is a step in time. So could decide on how elements or sections of a frame could be translated to simultaneous notes...
I'm out and about & away from PC, so can't right now. It's a bit frustrating as I really want to try it on those bit art bot posts!
Coding knowledge is not needed to use the script, but you do need python on your computer. If you're up for installing python, you're good to go.
edit: Though you've got to be prepared for a bit of command line / powershell action