The video of my TEDx talk is now available on the official YouTube channel.
Please share it around and promote it!
I think this is a concise, useful way to explain what the Eigenharp and other new electronic musical instruments are about. The more people understand what can be done, the more they’ll ask for improvements from software and hardware makers, and the more we’ll get out of it as musicians.
Wolfgang Palm is working on a computer version of his WaveGenerator iPad synth … and it includes full voice-per-channel MIDI support that works out of the box with the Eigenharp’s poly mode for hosting plugins.
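Voice-per-channel MIDI means each sounding note gets its own MIDI channel, so per-note pitch bend and controllers apply independently instead of affecting every voice at once. A minimal sketch of the idea, assuming a simple round-robin channel allocator (this is a hypothetical illustration, not WaveGenerator’s or EigenD’s actual code, and voice stealing when channels run out is omitted):

```python
class VoicePerChannelAllocator:
    """Round-robin allocator: each new note gets its own MIDI channel
    so per-note pitch bend and CCs don't interfere with other voices."""

    def __init__(self, channels=range(1, 16)):
        # channel 0 is often reserved for global/master messages
        self.free = list(channels)
        self.active = {}  # note number -> channel

    def note_on(self, note):
        channel = self.free.pop(0)  # take the least recently used free channel
        self.active[note] = channel
        return channel

    def note_off(self, note):
        channel = self.active.pop(note)
        self.free.append(channel)  # recycle the channel last, not first
        return channel
```

Recycling a released channel last rather than first gives a note’s release tail time to fade before the channel is reused with different bend or CC values.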
Here’s a quick noodle I recorded with it while testing:
Last weekend, Paul Harriman, Randy Brown and I (Geert Bevin) met up at the Electro-Music 2013 festival in New York and performed an unprepared Eigenharp trio improvisation. This video is the recording of that unique experience, which is probably a world first, but certainly not the last!
Programmers are the wizards of our age. We weave the fabric of modern society, tell the stories and bring dreams to life. Geeks rule today’s world, and programmers make it all happen. The uninitiated marvel at what we accomplish without an inkling of how everything comes to fruition. Many centuries ago, musicians held a similar role in society, but today they have lost that grasp on the world. Programmers are in an unprecedented position: we can not only make a good living, but fuel our work with passion, creativity, collaboration, innovation, and forward thinking. Programmers can make a significant difference in a world that is truly our oyster, and that is worth celebrating.
It’s still a work in progress, but for the benefit of Randy Brown and anyone else who’s interested, here’s a quick (very rough) demo of my work on the Eigenharp Ableton control surface script. It currently depends on a very Heath Robinson modification to the MIDI Input agent, so it’s not published just yet, but once I’ve tidied things up I intend to release it properly so that others can use and amend it to their hearts’ content.
Because I’ve had to use the built-in MacBook webcam for now, the Eigenharp is shown sideways, but essentially each of the five columns on the Alpha represents a track in Live. Rows 1-6 mirror the clips that have focus (determined by the 5×6 blue box on the Live screen, which can be moved from the Alpha). Row 7 turns each track on/off, row 8 solos each track, and row 9 arms each track for record. It’s multi-touch, so you can arm or solo multiple tracks by pressing several buttons at once, and any changes in Live are automatically reflected on the Alpha. Row 10 gives a Stop button for each track and will thus stop any clip currently playing on that track. Rows 11 and 12 provide an up/down/left/right controller to move the focus box in Live.

Because it works through a control surface script in Live, there’s no project-by-project configuration at that end – if the EigenD setup containing the relevant agents and talkers is in place, it’ll just work with a brand new Live set. Likewise, although this is set up for an Alpha, fairly minor tweaking makes it possible to configure other sized interfaces for different Eigenharps. With this 5×12 grid there seems to be no latency on my i7 machine, so theoretically you could have a page in your EigenD setup that shows a 5×24 clip launcher. It’s not shown in the video, but you can also map buttons to control Live’s transport, metronome, tempo, etc., all directly from the ‘harp.
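The key-to-action layout described above can be sketched as a single dispatch function. This is a hypothetical illustration of the mapping only, not the actual control surface script (the action names and the `focus_track`/`focus_scene` parameters are invented for the example; the real script talks to Live’s Python API):

```python
# Columns 0-4 are the five tracks inside the focus box; rows are
# numbered 1-12 as in the text above.
def action_for_key(column, row, focus_track=0, focus_scene=0):
    """Translate an Alpha key press into a (hypothetical) Live action."""
    track = focus_track + column
    if 1 <= row <= 6:                        # rows 1-6: launch focused clips
        return ('launch_clip', track, focus_scene + row - 1)
    elif row == 7:
        return ('toggle_track_on', track)    # row 7: track on/off
    elif row == 8:
        return ('toggle_solo', track)        # row 8: solo
    elif row == 9:
        return ('toggle_arm', track)         # row 9: arm for record
    elif row == 10:
        return ('stop_track', track)         # row 10: stop playing clip
    else:
        return ('move_focus', column, row)   # rows 11-12: move the focus box
```

Keeping the mapping in one pure function like this is what makes resizing the grid (say, to a 5×24 clip launcher) a matter of changing a few constants.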
The modular setup scripts and configs are designed to allow anyone to create EigenD setups from scratch, only selecting what they want. This is useful, as it not only allows unique customizations but also results in smaller setups which are quicker to load.
Below is a short overview video that gives you a taste of the possibilities that this opens up.
This is a first look at the application that I’ve been working on for the Leap Motion Controller. It provides multi-dimensional MIDI expression through hand gestures.
Geco has been designed for live performance: it operates at extremely low latency while requiring very few resources on your computer, so it can run perfectly alongside any MIDI-capable software.
In this demo I show some of the configuration capabilities while using Native Instruments’ Razor synth in Reaktor.
40 different control streams with both hands
any control stream can be mapped to MIDI CC and Pitchbend messages on 16 different channels
instantly switch between related control streams by opening or closing your hands
carefully designed GUI for an immediate overview of the active MIDI mappings
real-time low-latency visual feedback of your hand movements and MIDI data
integrated virtual MIDI port on Mac OS X
connects to any known MIDI output port on your computer
fully customisable user interface (colours, graphical elements)
flexible document management that can be loaded while performing gestures
high performance and near-zero latency engine with virtually no CPU impact when the real-time visualisations are hidden
MIDI decimation setting to allow integration with legacy hardware that has limited MIDI bandwidth
Yesterday afternoon I took a few hours to write an EigenD agent for the Leap. Currently it only supports sending out the x, y and z axes of the two palms relative to the Leap device itself. The API is very intuitive and it shouldn’t take long to gradually add support for hand directions and fingers, but I wanted to play around with it for a while first.
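The core of such an agent boils down to normalizing the palm position the Leap reports (in millimetres, relative to the device) into control values an EigenD pipeline can consume. A minimal sketch, assuming invented axis ranges rather than the SDK’s actual interaction box (this is not the agent’s real code):

```python
# Hypothetical working ranges in millimetres, relative to the device.
# Real code would calibrate these or query the SDK.
RANGES = {'x': (-200.0, 200.0),   # left-right
          'y': (50.0, 450.0),     # height above the device
          'z': (-200.0, 200.0)}   # towards/away from the user

def palm_to_controls(x, y, z):
    """Clamp and scale one palm position into 0.0-1.0 control values,
    one per axis."""
    controls = {}
    for axis, value in (('x', x), ('y', y), ('z', z)):
        lo, hi = RANGES[axis]
        controls[axis] = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return controls
```

Clamping matters here: a developer unit with reduced range (as mentioned below) simply pins the value at 0.0 or 1.0 at the boundary instead of producing garbage.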
In this experiment I control two effects in Native Instruments’ Guitar Rig 5. It’s processing one of my songs playing from iTunes, and you can hear the recorded result. I did have to align the audio to the video manually, so that might not be perfect.
This developer version of the Leap device has somewhat less range and a restricted field of view, which is why it sometimes misses detection at the natural boundaries. The production versions will not suffer from this.