Project: LEDfx via MIDI

Something I’ve always wanted to do is throw together MIDI + audio-reactive light shows. The day I got my first Philips Hue White+Ambience bulbs (maaany years ago) the idea hit me, and it was gradually reinforced by the appearance of WS2812 LED strips paired with ESP32 microcontrollers flashed with a copy of WLED.

LEDfx has been around for a while, but MIDI control has only recently been properly (if very basically) integrated into its UI, so I promptly installed the latest version on my MacBook.

What makes this all the more desirable is my acquisition of an Akai MPC Live II. Akai’s MIDI-over-IP solution works fine on macOS, which makes this a fairly wire-free, decoupled system (other than audio, which is sent to the MacBook via a Behringer UCA222).

Annoyingly, LEDfx’s Web-MIDI implementation doesn’t work with Safari, so it has to be run in Chrome. Once that discovery is made, it’s just a case of popping into LEDfx’s settings to turn on the MIDI scene control feature, saving some scenes, then firing a MIDI note-on event at each scene’s settings page to assign it.
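
For anyone testing the mapping without a hardware controller to hand, here’s a minimal sketch (using Python’s mido library) of firing those note-on events from a script. The port name and note numbers are placeholders for whatever your own MIDI routing and scene assignments actually use:

```python
# Minimal sketch: send the note-on events that the Web-MIDI scene mapping
# listens for, via the mido library.
# PORT_NAME and SCENE_NOTES are placeholders, not LEDfx defaults.
import time
import mido

PORT_NAME = "IAC Driver Bus 1"   # assumption: a virtual MIDI bus on macOS
SCENE_NOTES = [36, 37, 38]       # assumption: one note per saved scene

with mido.open_output(PORT_NAME) as port:
    for note in SCENE_NOTES:
        # A note-on is what gets mapped to each scene in the UI.
        port.send(mido.Message("note_on", note=note, velocity=100, channel=0))
        time.sleep(4)  # let each scene run for a few seconds
        port.send(mido.Message("note_off", note=note, velocity=0, channel=0))
```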

What we get is: audio-reactive scenes that can be switched by MIDI. There’s no program change or control change support and no access to scene parameters yet… but what’s there is enough for a very low-cost, very usable system with little-to-no learning curve to get results. All the scene reactivity is done via LEDfx’s built-in FFT analysis of the incoming audio, and its built-in effects have various parameters that can be driven by the power or volume of specific frequency bands, etc.
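
As a rough illustration of the kind of analysis involved (this isn’t LEDfx’s code, just the general technique), reducing an audio block to per-band power with numpy looks something like this; the band edges here are arbitrary:

```python
# Illustrative sketch of frequency-band analysis: take a block of audio
# samples, run an FFT, and reduce it to per-band power values that an
# effect parameter could follow.
import numpy as np

SAMPLE_RATE = 44100
BANDS = {"lows": (20, 250), "mids": (250, 4000), "highs": (4000, 16000)}  # Hz, rough split

def band_power(samples: np.ndarray) -> dict[str, float]:
    """Return mean spectral power per frequency band for one audio block."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples)))) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    return {
        name: float(spectrum[(freqs >= lo) & (freqs < hi)].mean())
        for name, (lo, hi) in BANDS.items()
    }

# e.g. map the "lows" value onto an effect's brightness or speed parameter
block = np.random.randn(1024)  # stand-in for a real audio block
print(band_power(block))
```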

I do long for realtime MIDI CC control over effect parameters though.
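
If I get impatient, a bridge script is probably how I’d fake it: listen for a CC and push values at LEDfx’s REST API. Everything below – port name, virtual ID, endpoint, payload – is an assumption rather than a verified route, so treat it as a sketch only:

```python
# Speculative sketch: forward a MIDI CC value into an effect parameter
# over HTTP. The route and payload are guesses -- check the LEDfx API
# docs for the real ones before using anything like this.
import mido
import requests

PORT_NAME = "MPC Live II"               # assumption: the MPC's network MIDI port
LEDFX_URL = "http://127.0.0.1:8888"     # assumption: local LEDfx instance
VIRTUAL_ID = "tv-ambilight-top"         # assumption: one of my virtual strips
WATCHED_CC = 74                         # assumption: the knob I'd map

with mido.open_input(PORT_NAME) as port:
    for msg in port:
        if msg.type == "control_change" and msg.control == WATCHED_CC:
            # Scale the 0-127 CC range to a 0.0-1.0 parameter value.
            value = msg.value / 127.0
            requests.put(
                f"{LEDFX_URL}/api/virtuals/{VIRTUAL_ID}/effects",  # hypothetical route
                json={"config": {"brightness": value}},            # hypothetical payload
                timeout=1,
            )
```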

In the vid below, a single dedicated MIDI track per sequence running on the MPC switches my LEDfx scenes, which then control my TV ambilight WLED strip (configured in LEDfx as 4 virtual strips – top, bottom, and mirrored sides), the ARGB fans in my PC case (they’re not linked to the PC itself), and my Hue side lamps (set up as a 2-pixel strip). I can also mute the sequence playback and trigger scenes in realtime via the MPC’s pads.
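
For the curious, here’s a toy illustration (not LEDfx code – it handles this through its segment config) of how one physical WLED loop can be sliced into those four virtuals, with one side reversed so both sides animate outward together; the pixel counts are made up:

```python
# Toy illustration of the "one physical strip, four virtuals" layout.
import numpy as np

TOP, SIDE, BOTTOM = 60, 34, 60                 # assumption: pixels per edge
strip = np.zeros((TOP + SIDE + BOTTOM + SIDE, 3))  # flat RGB buffer for the whole loop

def segments(buf: np.ndarray) -> dict[str, np.ndarray]:
    """Split the loop into four virtual strips, mirroring one side."""
    top = buf[:TOP]
    right = buf[TOP:TOP + SIDE]
    bottom = buf[TOP + SIDE:TOP + SIDE + BOTTOM]
    left = buf[TOP + SIDE + BOTTOM:][::-1]     # reversed to mirror the other side
    return {"top": top, "right": right, "bottom": bottom, "left": left}

print({name: len(seg) for name, seg in segments(strip).items()})
```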

The onscreen content is a point cloud of me, sat on the sofa filming it with my phone, generated in realtime using an old Kinect 360 with Delicode’s Z-Vector software running on the same MacBook (which is also being driven by the incoming audio and the MIDI beat clock from the MPC).