Hello everyone, I’m trying to create a simple rhythm game as a personal project, and then build on it once I have it up and running with one song. My target audience is visually impaired and blind gamers. I want to create a rhythm game similar to the Nintendo game “Rhythm Heaven”. The main goal of the game is to tap or flick your finger on the screen to the rhythm of the song, which can be either a repetitive pattern with slight variations, or a memorized rhythm that is played back and the user is asked to repeat exactly, kind of like Simon Says for the second mode. The tap would trigger one sound and the flick another, to differentiate the two for the non-visual player. I want to know how I can use the BPM of a song, the position in time, and the intervals in the music sequence to work out when the player should tap or flick the screen. I’ve been trying to come up with a solution, but haven’t found one yet. Please help me ASAP.
I don’t think Codea has any ability to “read” a song file; it just plays them, so I don’t know how it could know BPM, intervals, etc. It does know position, though.
I was thinking something like
BPM = (number of beats in song) * 60 / (number of seconds in song)
to test user input, but how can I translate that into Codea code?
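In plain Lua, that formula might look something like this (nothing Codea-specific here; the beat count for each song still has to be known up front):

```lua
-- Direct translation of the formula above
local function computeBPM(beatCount, songLengthSeconds)
    return beatCount * 60 / songLengthSeconds
end

-- From BPM, derive the interval between beats
local function secondsPerBeat(bpm)
    return 60 / bpm
end

local function millisecondsPerBeat(bpm)
    return 60000 / bpm
end

-- e.g. a 2-minute song with 240 beats is 120 BPM,
-- so one beat every 0.5 seconds (500 ms)
local bpm = computeBPM(240, 120)
```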
Look up the sound/music part of the Codea reference to get the song length; the number of beats you have to figure out yourself.
I was also thinking that a song could be selected through a button which calls a function for that specific song. So if I choose “Electrik”, Codea would call the Electrik function and use that song’s length in seconds, its number of beats, and an array of beats to calculate when the user should tap or flick the screen to the beat of the song. It would then use a timing function to detect the user’s input, and use a history to store the correctly hit beats in order to score the user based on their performance.
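Roughly, I’m imagining something like this sketch (the song data here, times, BPM and so on, is made up for illustration):

```lua
-- Hypothetical per-song table: length, BPM, and an array of beat
-- times (in seconds) with the gesture the player should make.
local songs = {
    Electrik = {
        length = 180,             -- seconds (example value)
        bpm    = 128,             -- example value
        beats  = {                -- { time, gesture } pairs
            { 0.94, "tap" },
            { 1.41, "flick" },
            -- ...
        },
    },
}

-- History of correctly hit beats, used for scoring afterwards
local history = {}

-- A tap/flick counts if it lands within `tolerance` seconds of a
-- scheduled beat and uses the right gesture.
local function judgeInput(song, inputTime, gesture, tolerance)
    for _, beat in ipairs(song.beats) do
        if math.abs(inputTime - beat[1]) <= tolerance
           and gesture == beat[2] then
            table.insert(history, beat)
            return true
        end
    end
    return false
end
```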
I have the number of beats figured out, and if I need the beat interval in milliseconds I can take the seconds-per-beat value from the equation I posted and multiply it by 1000 (i.e. 60000 / BPM gives milliseconds per beat). I looked up the music/sound part of the Codea reference, but I’m confused about which functions to use.
How about music.duration (the length of the song) and music.currentTime (the current position in the song)?
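For example, something like this: the beat math lives in a plain Lua helper, and the commented-out touched() part is just a sketch of where it might be wired in inside Codea (songBPM is a hypothetical variable holding whatever BPM you computed):

```lua
-- Given the song position (e.g. Codea's music.currentTime) and the
-- BPM, find which beat we're nearest to and how far off we are.
local function nearestBeat(currentTime, bpm)
    local interval = 60 / bpm
    local beatIndex = math.floor(currentTime / interval + 0.5)
    local offset = currentTime - beatIndex * interval
    return beatIndex, offset
end

-- In Codea's touched() callback, you might then do something like:
-- function touched(touch)
--     if touch.state == BEGAN then
--         local _, offset = nearestBeat(music.currentTime, songBPM)
--         if math.abs(offset) < 0.1 then   -- within 100 ms of a beat
--             -- count this tap as a hit
--         end
--     end
-- end
```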
@flashstep11 Do you mean the player taps to the beat or to the rhythm? The beat is doable because that’s constant throughout most songs. The rhythm would be a LOT harder to do. For rhythm, I would input the rhythm pattern manually for each song.
@ignatz, how would I use those to create the rhythm for the user to follow? @Goatboy76 I mean the rhythm, and I just want to do two or three songs if possible, because I have to demonstrate it next month.
You’ll probably need to use a DAW to figure out the tempo of the original song, and from there you can calculate the specific frame number that each beat should land on. If the tempo is 60 BPM, there is 1 beat per second, so at 60 fps that’s 60 frames per beat. Use a frame counter to keep track of how many frames have elapsed (from when you press song → play) and use modulo to see if your screen tap lands near the song’s beat. You’ll probably need an if() statement to determine whether the tap was less than X frames from a beat, as described by your frame counter (meaning song beats occur on frames 1, 61, 121, 181, etc., so look for taps occurring within X frames of those numbers).
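A minimal sketch of that modulo check (the window size is just an example, and for simplicity this treats the first beat as frame 0; in Codea you’d increment the frame counter in draw(), which runs about 60 times per second):

```lua
-- A tap counts if the current frame is within `window` frames of the
-- nearest beat frame (beats land on frames 0, 60, 120, ... at 60 BPM
-- and 60 fps).
local function tapOnBeat(frameCount, framesPerBeat, window)
    local distance = frameCount % framesPerBeat
    -- The tap could be just after a beat (small remainder) or just
    -- before the next one (remainder close to framesPerBeat), so
    -- take whichever distance is smaller.
    distance = math.min(distance, framesPerBeat - distance)
    return distance <= window
end
```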
Where is the music coming from? Are you using commercial recordings, or royalty-free works (from Soundcloud or wherever), or producing the music “inhouse”?
I would say that this would be very tricky to do with regular MP3 recordings. In addition to calculating the BPM (which will drift by milliseconds in most non-electronic genres over the duration of the track), there is the question of where the beat falls (ie the first beat is unlikely to drop at 00:00). Sure, professional DJ apps can do this MP3 analysis, but I don’t know whether any have open-sourced their algorithms.
A rhythm game, by its nature, has to get this absolutely correct; it’d be a very frustrating and pointless experience if it was even slightly off.
I’d be tempted to say that the easiest way to implement this (and this method is by no means easy) would be to actually sequence the songs yourself in the app (ie not use a prerecorded MP3). Have a look at the Cargo-Bot example. I think because it was written before Codea had its own music API, it uses an ABCmusic format sequencer. Apparently you can search the web for music files in this format, or convert MIDI files to ABC format. I think there is an open-source karaoke sharing community, with people sharing MIDI versions of Beatles songs or whatever. The ABC player might give you the hooks you need to determine when the first beat falls and what the BPM is.
Rhythm heaven! Awesome game
Thanks for the advice guys, really appreciate it
If you had access to the PCM data, you could just look for peaks in the wave data that last longer than 3 samples but are shorter than some other amount. But Codea isn’t designed to work with PCM streams, so I’d probably avoid doing this project in Codea altogether. You’re better off checking out the JUCE framework, because that will give you native access to the PCM data, à la the CoreAudio API.
What is PCM data? I just want to get a working prototype to show my professor, so that he can see I have something working. What other languages do you think would work much better for this, but would still run on my iPad without using Xcode?
I just found this API called Corona; will it be able to handle the rhythm game I want to create?
Rhythm is for poems?
If you don’t want to use Xcode, and you want a prototype on your iPad, I would say Codea is your best bet. I’ve not used Corona, but I gather that it’s a Lua IDE (ie it has quite a bit of crossover with Codea, which I’d describe as an iPad-based Lua IDE). I think you’d need Xcode to get your Corona prototype onto the iPad, though.
@matkatmusic I’m pretty sure beat detection is way more involved than just looking for peaks. ie according to this paper http://www.eecs.qmul.ac.uk/legacy/dafx03/proceedings/pdfs/dafx81.pdf you need a combination of phase and amplitude onset detection. This is why developers license APIs to do the analysis (some of which do the analysis on a server).
Unless you have a budget for this project, I would forget about beat analysis and go for a sequencing approach.
For example, say you want to use “Hey Jude” (obviously, if you’re thinking of distributing this, you can’t use Hey Jude). You could use an MP3, which is a digital version of an analogue recording made in 1968. But then you have to try to reverse engineer the recording to work out exactly when Ringo is hitting that bass drum. This is a mathematically complex problem that most devs would use an off-the-shelf solution for.
Or, you could use a MIDI instrumental version of the track, created for karaoke machines, or for piano instruction purposes. This is not a recording, but a set of instructions for triggering, in sequence, all of the notes that make up the song (using either samples for the instruments, or generated sounds). Then you don’t need to reverse engineer the recording to find the bass-drum hits, you just need to listen out for each time the instruction is given in the sequence that says “play bass drum”.
You’d have to find some way to convert that MIDI file to ABCmusic format (or maybe Hey Jude is already out there in ABCmusic format?), drop it into the ABCmusic player in Cargo-Bot, and work out how to add the hooks you need to the ABCmusic player. So I’m not saying this way is easy. But I think it is a lot easier than doing beat detection on a recording.
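To illustrate the idea in Lua (onNoteEvent is a hypothetical hook that you’d have to add to the ABC player yourself, and the instrument names are made up):

```lua
-- Instead of analysing audio, record the timestamps at which the
-- sequencer triggers the instrument you care about, then grade the
-- player's taps against those timestamps.
local beatTimes = {}

-- Hypothetical hook, called by a modified sequencer each time it
-- plays a note.
local function onNoteEvent(instrument, songTime)
    if instrument == "bassdrum" then
        table.insert(beatTimes, songTime)
    end
end

-- Grade a tap: was it within `tolerance` seconds of a recorded beat?
local function gradeTap(tapTime, tolerance)
    for _, t in ipairs(beatTimes) do
        if math.abs(tapTime - t) <= tolerance then
            return "hit"
        end
    end
    return "miss"
end
```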
Can you show me sample code that would work in Codea and would actually get me going on this project, using sequence detection instead of beat detection with the ABC music player? I really need help with this, and any code would be wonderful.