MIDI - why is there no interest?

In the early days of Codea, the iPad was finicky and really only a single-tasking beast. MIDI was not even available.

That has changed in the past two years. Really changed. Amazingly so, to the point where iPads are now regularly in use on stage. With AudioBus, MidiBridge, and IAA it is now possible to design sound stages of arbitrary complexity (limited only by imagination and hardware, an ever-rising limit :), including fully professional synthesizers, effects, and recording workstations, with support for multi-channel audio interfaces and multiple discrete controllers, not to mention Virtual MIDI connectivity entirely inside the box. I have one setup with an iPad Air and a Focusrite 18i20 interface that is capable of full HD surround sound.

But developing controllers for it with any kind of significant intelligence is out of reach for non-developer users, and seemingly beyond many actual developers. Codea could change that just by adding the basics: enumerate available ports, open an input, open an output, a callback on message receive, send message, and close. There is even a free iOS code library of professional robustness called MidiBus (by Audeonics) that makes this simple and reliable to implement, and Apple are already quite happy with it. And you at Codea have, for your part, already dealt with Apple's squeamishness about allowing end-user access to Lua scripting. This is a unique combination and opportunity.
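
To make "the basics" concrete, here is a rough sketch of what that surface could look like from Lua. Everything here is hypothetical and invented for discussion: the midi table, openInput, openOutput, and send are not Codea or MidiBus API.

function setup()
    -- enumerate available ports (hypothetical midi.inputs/midi.outputs)
    for _, name in ipairs(midi.inputs()) do print("in:  " .. name) end
    for _, name in ipairs(midi.outputs()) do print("out: " .. name) end

    -- open an input with a receive callback, and an output, by port name
    input = midi.openInput("Launchkey MIDI", midiReceived)
    output = midi.openOutput("Virtual Out")
end

-- callback on receive: raw status byte plus two data bytes
function midiReceived(status, data1, data2)
    print(string.format("rx %02X %02X %02X", status, data1, data2))
end

function touched(touch)
    if touch.state == BEGAN then
        output:send(0x90, 60, 100) -- note on, middle C, velocity 100
    elseif touch.state == ENDED then
        output:send(0x80, 60, 0)   -- note off
    end
end

-- and close when done: input:close(); output:close()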

Not to mention that making music has a lot in common with game play. And, in the context of Codea, MIDI on iOS provides a way to make really immersive sound effects for games, far beyond the simplistic tone generator that is all one has today. Implementing MIDI also opens up a world of hardware controllers for game interactions.

But there is nothing like Codea for doing scripted, interactive musical-instrument UIs and algorithmic composition. Nothing. And there are many thousands of potential users.

So why the meh?

(FYI: I'm a seasoned professional audio engineer, MIDI expert, embedded systems programmer, and musician, currently employed by a well-known gaming console company as their consoles' audio engine developer. I have also developed a dataflow mesh scripting engine, embedding Lua, that runs natively on standard PC platforms. I know this stuff backwards. I can help, I can teach, and I really want to be able to use Codea for this; there are no alternatives.)

Thanks for reading this far, thanks for thinking about it, and especially thanks for acknowledging this post, as prior and less detailed posts have been ignored. Especially thanks if something comes out of it.

I would definitely like to see MIDI support in the future. I would enjoy fiddling around with my Launchpad using Codea (Launchpad Tetris?), or writing some synthesizers for my Launchkey.

@dwarman I would really like to include APIs relating to audio and MIDI. The main problem is I don’t know what those APIs should look like. I don’t work with audio (much) and so don’t have a good understanding of how iPads are used in audio creation, MIDI, and especially how apps interoperate through things like AudioBus.

I would be very happy to look over an API proposal that implements functionality you’d like to see in Codea. Basically, write the code the way you want it to work and I’ll have a look at what it might take to implement. We can use this thread to discuss a potential API.

I’m pretty fussy about API design, so I’d like anything we include relating to this to be well-thought-out and not tacked on just for the sake of having the features.

@dwarman - welcome! We love having people with different skills like yours. I hope you can find a way to use them with Codea.

Thanks for such immediate and positive responses! I'm going to refresh and deepen my knowledge of the MidiBus iOS library, but as a first hack I'd be talking about exposing the library API in Lua. Or some subset. Or, after some thought, a candy-wrapper. I'll be a little while doing this, but in the meantime you might want to explore it yourself:

http://www.audeonic.com/midibus/

Whatever I come up with will probably be a subset of what I have running in my datamesh system. I'll have to minimize it some, since it has elaborations not totally necessary for the Codea environment.
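
To give a flavour of what a candy-wrapper might look like, here is a sketch that hides raw status bytes behind musical verbs. It assumes the hypothetical output:send(status, data1, data2) primitive from the earlier sketch; the NoteSender class and its methods are invented for illustration, not real MidiBus or Codea API.

NoteSender = class()

function NoteSender:init(port, channel)
    self.port = port               -- an opened (hypothetical) output port
    self.ch = (channel or 1) - 1   -- channels are numbered 0-15 on the wire
end

function NoteSender:noteOn(note, velocity)
    self.port:send(0x90 + self.ch, note, velocity or 100)
end

function NoteSender:noteOff(note)
    self.port:send(0x80 + self.ch, note, 0)
end

function NoteSender:controlChange(cc, value)
    self.port:send(0xB0 + self.ch, cc, value)
end

-- usage: keys = NoteSender(output, 1); keys:noteOn(60); keys:noteOff(60)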

Hi Dwarman,
Have you thought any more about MIDI with Codea? I am so desperate to create something very specific, something that does not exist at this point and that I think would be very well received in the music creation world. Maybe I could explain what I am looking for and you can tell me if I am dreaming or not?
Cheers
Nic

@nrprioleau No way! Your name is Nic and you are into making a music / MIDI app? My name is Nico and I just posted a thread outlining a concept for an audio thread API, and I have done low-level DSP and MIDI work on my hardware / software synth products.

I have the skills to help make this happen, I hope @Simeon or @John or someone internal contacts me back about my proposal. Codea will get MIDI and DSP ASAP if I can help it.

@AxiomCrux Nico, Thanks for that info!! I would be willing to work together to make it happen! Anything I can do!

@nrprioleau would you care to provide an outline of what your concept is / or what you “are looking for” to tell “if you are dreaming”? :stuck_out_tongue_winking_eye:

I personally have mainly been using Codea for prototyping ideas, as well as for learning Lua to see if it is a viable way to implement some things. Currently I am working a lot with shaders.

My biggest goal in life is to create a comprehensive, scalable, modular creative toolset that makes an inherent relationship between audio and visual elements from the bottom up. Audulus 3 is one app that I am working on a little bit with the main developer; if you are into audio apps I highly recommend checking it out. It is extremely capable yet fun and easy to use (compared to Reaktor and Max/PD).

@Simeon @John Fundamental access to audio input/output would be so awesome and welcome. I already have a couple of ideas where I could apply this: for example a podcasting app, or a static site generator for audio-driven blogs.

We already have access to the speakers (although it would be nice to have more options, e.g. 3D sound). But the microphone is still missing and would be endlessly great to have, even at a very low level (buffers)…

I looked at Apple's APIs, and microphone input works in two parts. There is a recorder (start, stop, etc.) and a session (settings). You set up a session and then use the recorder to control it.

I imagine using it like this in Codea:
(maybe someone has a better idea)

function setup()
    local audio_settings = {
        channels = 2, -- 1 mono, 2 stereo
        sampleRate = 44100, -- 44.1 kHz, the standard CD rate
        inputGain = 1,
        inputLatency = 0,
        volume = 1,
        fileFormat = "mp3" -- file suffix (mp3, wav, m4a,..)
    }

    record_session = recorder(audio_settings) -- call recorder class with custom audio settings
    -- under the hood a soundbuffer() is created and filled while the session is active and running
end

function appInterrupted() -- by an incoming call or something (works like orientationChanged callback)
    record_session:pause()
end

function appResumed() -- counterpart to app interruptions
    record_session:resume()
end

function touched(touch)
    if touch.state == BEGAN then
        record_session:start()
    end

    if touch.state == ENDED then
        record_session:stop()
        saveAudio("Documents:myCoolRecording") -- saves current session buffer
    end
end

Playing the recorded session would be:

function setup()
    recorded_audio = readAudio("Documents:myCoolRecording") -- returns a soundbuffer() with {get} properties for position, length, etc.

    recorded_audio.volume = .5 -- you can also change some props... to control speakers

    parameter.watch("AUDIO_LENGTH", recorded_audio.length)
    parameter.watch("PLAYHEAD_POSITION")
end

function draw()
    PLAYHEAD_POSITION = recorded_audio.position
end

function buttonPlayTapped(t) ... end
function buttonStopTapped(t) ... end

function touched(touch)
     if touch.state == ENDED then
         if buttonPlayTapped(touch) then
             recorded_audio:play()
         elseif buttonStopTapped(touch) then
             recorded_audio:stop()
         end
     end
end

But I don't know what a MIDI API could look like. I'll leave that proposal to somebody who is experienced with that kind of stuff…
BUT here is something worth looking at: http://audiokit.io

I have an idea for a virtual MIDI keyboard. I just need a channel thru (on ch #1) to play MIDI preset patches, but I don't know the code… I would create keys in the draw function, handle them in a touched function, and whatever key is touched would send its note number through ch #1 and play it accordingly with that patch. From there I could build on it with other ideas and add them on accordingly…
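
A minimal sketch of that idea, reusing the hypothetical midi.openOutput and output:send names from the earlier sketches (none of this exists in Codea today): draw an octave of keys, and have touched() translate each key press into a note on / note off on channel 1.

KEY_COUNT = 12  -- one octave of keys
BASE_NOTE = 60  -- MIDI note number of middle C

function setup()
    output = midi.openOutput("Virtual Out") -- hypothetical
    keyWidth = WIDTH / KEY_COUNT
end

function draw()
    background(40)
    for i = 0, KEY_COUNT - 1 do
        fill(230)
        rect(i * keyWidth + 2, 0, keyWidth - 4, HEIGHT / 2)
    end
end

function touched(touch)
    -- map the touch to a key index, clamped to the last key at the right edge
    local key = math.min(KEY_COUNT - 1, math.floor(touch.x / keyWidth))
    local note = BASE_NOTE + key
    if touch.state == BEGAN then
        output:send(0x90, note, 100) -- note on, channel 1 (status 0x90)
    elseif touch.state == ENDED then
        output:send(0x80, note, 0)   -- note off, channel 1
    end
end

Selecting the preset patch would be a program change message (status 0xC0 on channel 1, one data byte) sent once before playing.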