CoreMidi/OSC support

@Fred no problem. Like I said, I’m happy to implement it, but I’m not sure what MIDI users need API-wise — I assume it’s all about interacting with external hardware.

Mirroring the CoreMIDI API would be the weakest option, as it is highly complex.

@Simeon I imagine something simple: a function to create a MIDI port (specifying which channels to send or receive on: OMNI/1/2…), and two functions for sending and receiving MIDI messages. Processing’s MidiBus is a good place to start for inspiration.
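Purely as a sketch of the shape such an API could take, not anything that exists in Codea — every name below is invented for illustration:

```lua
-- Hypothetical Codea MIDI API: none of these functions exist yet.
-- Open a port on channel 1 ("omni" could mean all channels).
local port = midi.openPort{ channel = 1, direction = "inout" }

-- Sending: note-on for middle C (note 60) at velocity 100, then note-off.
port:send{ type = "noteOn", note = 60, velocity = 100 }
port:send{ type = "noteOff", note = 60, velocity = 0 }

-- Receiving: a callback fired for each incoming message.
port.received = function(msg)
    if msg.type == "noteOn" then
        print("note", msg.note, "velocity", msg.velocity)
    end
end
```

Two entry points for sending and receiving plus one constructor would already cover the MidiBus-style use cases mentioned above.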

I would also like to be able to send MIDI from Codea. I’d like to design my own iPad instrument and have it send virtual MIDI data to an iPad synth running in the background, so the synth handles the sound for that instrument. There are SDKs for Jack MIDI and, on a related subject, Audiobus. Is there any way of getting these accessible in Codea, or do they have to be thrown in the mix at the Xcode stage? I’m new to this stuff, so sorry if this is ridiculous. This is the new MIDI/audio iOS connection app (related topic), and this is the current audio app interconnect standard.
If Codea could interact with these apps, it would become the Max for Live / Native Instruments Reaktor of iOS.

I know a lot of people (iPad musicians) that would be thrilled with MIDI in Codea!

Hi @Simeon, having now read the CoreMIDI API, I agree it is very complex! Having only used MIDI functions in programs, I didn’t realise there was so much to it. Does the Processing interpretation that @SaveAs mentioned match the simplicity we are looking for in Codea? On @Asynchronous’s comment: I’ve also got Audiobus, and it is great! They claim their API integration is very easy… you have to apply to see it, though.

Here is an example of MidiBridge in Processing:
The script sends MIDI notes to the Ableton Live DAW when the tentacles blink their eyes, and receives MIDI notes from the kick drum so the tentacles stretch.
Imagine something like this in Codea! I’m thrilled :slight_smile:

Hello @saveAs, there is just a big blank in Safari. We are using iPads here… so thanks to Banana (or some other fruit), many things do not work…

I also don’t have CoreMIDI experience, but I heard Jack has an easy API, handling both audio and MIDI.

@Jmv38 LOL! Here is the non-embedded link.
For some reason Vimeo always gets embedded.

Just want to say that I am very interested in MIDI (and Audiobus) support. I have been hoping Gideros and/or Corona would acquire virtual MIDI support for about a year now, but it seems hopeless.

Recently Jack has been released for iOS, with a Jack Audio kit for iOS.

As Jack has been developed for a few years on other platforms such as OS X, it has optimised latency, probably the most important aspect of any MIDI implementation. It also allows parallel branches to talk with more than one app at the same time.
Importantly you are allowed to use the Jack iOS sdk in any commercial app.
They say the kit can be integrated quickly into any app.

So my question is: would it be possible to write a Lua wrapper for the Jack iOS SDK and run it on the iPad? MIDI messages could then be sent to a synth running in the background to handle the actual sound.
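If such a wrapper existed, driving a background synth from Lua could look roughly like this. Everything here is invented for illustration — `jack` is a hypothetical Lua binding, and the port names are made up; only the raw MIDI byte values are real:

```lua
-- Hypothetical Lua binding over the Jack iOS SDK (no such module exists).
local jack = require("jack")  -- invented module name

-- Register this app as a Jack client and connect an output to a synth.
local client = jack.client("CodeaInstrument")
local out = client:midiOutPort("out")
out:connect("BackgroundSynth:midi_in")  -- destination name is illustrative

-- Raw MIDI bytes: 0x90 = note-on on channel 1, note 60 (middle C), velocity 100.
out:write{ 0x90, 60, 100 }
-- 0x80 = note-off on channel 1.
out:write{ 0x80, 60, 0 }
```

The byte-level messages are standard MIDI, so any synth listening on that Jack port would respond regardless of how the wrapper itself is shaped.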

Also imagine a second branch sending network MIDI messages to a second iPad running OnSong to control your backing tracks, song sheets, and so on!

This would be superb while developing virtual instruments on the iPad!

Just wanted to say: great, great app!
MIDI support would be fantastic!
Not sure how relevant this is, but just in case.

Jack support in Codea would be phenomenal. It’s the best way to route audio and MIDI on Linux, and now on iOS as well. That way we could hook up our USB MIDI devices to Codea and make MIDI sequencer apps, synthesizers… great stuff. This would be so unbelievably awesome I would quit my job, never shower, and spend the rest of my days hacking in Codea :slight_smile:

While you’re at it throw in SuperCollider :smiley:

Jack FTW!

I would love to see OSC and/or CoreMIDI support built into Codea.
Here are two frameworks that might help and make it easier
to get started with CoreMIDI.


As both a Lua programmer and a hardcore MIDI user and coder, I’d point out that there are many functional similarities between games and interactive music apps. I already have my own Lua MIDI implementation running on the main OSs, based on a simple pattern: iterate ports, open an In port, open an Out port, send messages, subscribe to receive messages by port + message (coroutine style; Lua is a natural here), and finally close the port. Nic at Audionics is currently writing an iOS CoreMIDI wrapper (MidiBus) that will help candy-wrap a lot of the CoreMIDI complexities (VM, NM, CM) and nicely integrate Codea into the iOS musician community, which hosts a lot of extremely hot developers and, interestingly, already has some apps running on Lua under their hoods. But only Codea so far has figured out how to let users write code: a match made in heaven. For a taste of the community, check out the forums, and then dive into the list of Audiobus-compatible apps. Several have very interesting experimental touch interfaces.
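The iterate/open/send/subscribe/close pattern described above might look roughly like this in Lua. The `midi` module here is a stand-in for the poster’s own implementation; all names and signatures are guesses, not a real library:

```lua
-- Sketch of the port-lifecycle pattern: iterate, open, send, subscribe, close.
-- `midi` is a stand-in module; nothing here is a real published API.
for i, name in ipairs(midi.ports()) do
    print(i, name)            -- list available In/Out ports
end

local inp  = midi.openIn(1)   -- open the first In port
local outp = midi.openOut(1)  -- open the first Out port

outp:send(0x90, 60, 100)      -- note-on, middle C, velocity 100

-- Coroutine-style subscription: the handler yields until a matching
-- message (here, by message type) arrives on the subscribed port.
local listener = coroutine.create(function()
    while true do
        local msg = inp:receive("noteOn")  -- yields until a note-on arrives
        print("got note", msg.note)
    end
end)
coroutine.resume(listener)

inp:close(); outp:close()     -- finally, close both ports
```

The coroutine keeps receive logic in straight-line code instead of callback soup, which is exactly why Lua is a natural fit here.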

pgmidi is the framework mostly used so far, but I think it exposes a little too much. Still, start there, yes.

And of course I’d be coding up a storm if MIDI appeared in Codea. I already have plenty of ideas on OS X / W7 that I want on my iPad.

Just a little shout-out: I would still love to see MIDI implemented. Kino is an iPad app with MIDI built in and Lua-ish scripting, but no English instructions. There are so many MIDI remote apps out there, but all with limited scripting ability; Codea could still be the Reaktor of the iPad. Fingers crossed. Peace.

This was posted over 4 months ago…

I would like to see this in, but this is a dead thread.