Another Quaternion Demo

http://www.youtube.com/watch?v=zQ88J2KZpfE

The video was not done using the inbuilt video recorder. Instead, I used the new z-depth feature of setContext (in beta) to record each frame to an image, then put the frames together into a video using ffmpeg. This resulted in a much higher-quality video, and I could control the frame rate precisely (my attempts with the inbuilt recorder were too choppy for my liking).
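(For anyone who wants to test the ffmpeg assembly step without a device: the Codea side of this would use setContext and an image-saving call, but a batch of sequentially numbered placeholder PNGs is enough to exercise the pipeline. Below is a stdlib-only Python sketch that writes minimal grayscale PNGs with the zero-padded naming that ffmpeg's image sequence input expects; the filenames here are illustrative, not the ones from the video.)

```python
# Sketch: generate sequentially numbered placeholder PNGs so the ffmpeg
# assembly step can be tested without Codea. Minimal grayscale PNG writer
# using only the standard library (struct + zlib).
import struct
import zlib

def png_chunk(tag, data):
    # A PNG chunk is: length, tag, data, CRC32 over tag+data.
    chunk = tag + data
    return struct.pack(">I", len(data)) + chunk + struct.pack(">I", zlib.crc32(chunk))

def write_png(path, width, height, shade):
    # Each scanline is a filter byte (0 = none) followed by `width` grayscale pixels.
    raw = b"".join(b"\x00" + bytes([shade]) * width for _ in range(height))
    with open(path, "wb") as f:
        f.write(b"\x89PNG\r\n\x1a\n")  # PNG signature
        # IHDR: width, height, bit depth 8, color type 0 (grayscale), defaults
        f.write(png_chunk(b"IHDR", struct.pack(">IIBBBBB", width, height, 8, 0, 0, 0, 0)))
        f.write(png_chunk(b"IDAT", zlib.compress(raw)))
        f.write(png_chunk(b"IEND", b""))

for frame in range(3):
    # Zero-padded frame index, matching an ffmpeg pattern like Frame%05d.png
    write_png("Frame%05d.png" % frame, 16, 16, frame * 80)
```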

holy sh*t! looks so awesome *.*

Wow, another awesome demonstration! I’m amazed by how smooth the movement is, even with so many elements being drawn.

Andrew, I’m particularly intrigued by your use of ffmpeg to create the video. I need to implement video recording in my Flip Pad Animator app, so I’d be very grateful for any advice or code you could share in this regard.

@time_trial The ffmpeg step was done on my Mac afterwards. The Codea program produced a couple of thousand PNGs, which I then copied to my Mac and assembled into a movie using ffmpeg from the command line.

The exact command was:

ffmpeg -r 30 -i Tetrahedron%05d@2x.png -c:v libx264 -r 30 -pix_fmt yuv420p Tetrahedron.mp4

It may not be optimal.
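(A sketch of the same invocation built programmatically, so the frame rate and filenames only appear once. The filename pattern and flags are taken straight from the command above; you would pass the result to subprocess.run(cmd, check=True) on a machine where ffmpeg is on the PATH.)

```python
# Sketch: build the ffmpeg command from the post as an argument list.
def assemble_command(pattern, output, fps=30):
    return [
        "ffmpeg",
        "-r", str(fps),         # input rate: each image becomes one frame
        "-i", pattern,          # e.g. "Tetrahedron%05d@2x.png"
        "-c:v", "libx264",      # encode to H.264
        "-r", str(fps),         # output frame rate
        "-pix_fmt", "yuv420p",  # pixel format most players accept
        output,
    ]

cmd = assemble_command("Tetrahedron%05d@2x.png", "Tetrahedron.mp4")
```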

Doing it this way is what led to the smoothness of the motion. Since I could compute each image exactly and stitch them together with an exact frame rate, I could be sure that the motion was exactly what I wanted it to be.
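(The timing idea can be sketched in a few lines: render frame n at exactly t = n/fps rather than at whatever wall-clock time the draw call happens to fire, so playback at a fixed frame rate reproduces the motion exactly. `render_at` below is a hypothetical stand-in for the per-frame drawing code, not anything from the original program.)

```python
# Sketch: deterministic frame timestamps instead of measured elapsed time.
FPS = 30

def frame_times(n_frames, fps=FPS):
    # Exact, evenly spaced timestamps: 0, 1/30, 2/30, ...
    return [n / fps for n in range(n_frames)]

def render_at(t):
    # Placeholder: in Codea this would set up and draw the scene for time t.
    return "frame at t=%.4f s" % t

frames = [render_at(t) for t in frame_times(4)]
```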

@Andrew_Stacey did you try to re-read the images from Codea and display them while recording? Is the frame rate bad that way too?

@Jmv38 I didn’t try that. Once I’d saved the images, merging them from the command line seemed the obvious way to combine them into a movie; anything else feels like overkill. Also, this way I get a better resolution, I think.

@Andrew_Stacey I wish I had the command line on the iPad, not only on the PC…

Thanks, Andrew. Like @Jmv38, I’d hoped you’d found a way within Codea itself.

There are possibilities in Xcode though. When I get as far as exporting my app, I am going to try the code samples from http://iphonedevsdk.com/forum/iphone-sdk-development-advanced-discussion/77999-make-a-video-from-nsarray-of-uiimages.html and http://stackoverflow.com/questions/3741323/how-do-i-export-uiimage-array-as-a-movie/3742212#3742212.

hey, @time_trial could we try to be geeks? What about: 1) recording the video with Codea, then 2) finding where the frame period is written in the .mp4 file, and 3) replacing that value with the one we want? If this makes sense; I don’t know how the frame rate is stored in an mp4 though…
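(A note on where the timing actually lives, with a small sketch. In an MP4, the movie header box `mvhd` holds an overall timescale, but the per-frame durations sit in each track's `stts` table, so there is unlikely to be a single "frame rate" value to patch. The Python below walks nested MP4 boxes and reads the timescale from a version-0 `mvhd`, using a synthetic buffer for illustration; it is a format sketch, not a tested patching tool.)

```python
# Sketch: locate the mvhd timescale inside MP4 box structure.
import struct

def find_box(data, path, offset=0, end=None):
    # Depth-first search for a box by path, e.g. [b"moov", b"mvhd"].
    # Each box is: 4-byte big-endian size, 4-byte type, then payload.
    end = len(data) if end is None else end
    while offset + 8 <= end:
        size, box_type = struct.unpack(">I4s", data[offset:offset + 8])
        if box_type == path[0]:
            if len(path) == 1:
                return data[offset + 8:offset + size]
            return find_box(data, path[1:], offset + 8, offset + size)
        offset += size
    return None

def mvhd_timescale(data):
    mvhd = find_box(data, [b"moov", b"mvhd"])
    # version-0 mvhd payload: 1 version + 3 flags + 4 ctime + 4 mtime,
    # then the 4-byte timescale (units per second).
    return struct.unpack(">I", mvhd[12:16])[0]

# Synthetic data: a moov box containing a minimal version-0 mvhd
# with a timescale of 600 (a common QuickTime default).
mvhd_body = bytes(12) + struct.pack(">I", 600) + bytes(4)
mvhd_box = struct.pack(">I4s", 8 + len(mvhd_body), b"mvhd") + mvhd_body
moov_box = struct.pack(">I4s", 8 + len(mvhd_box), b"moov") + mvhd_box
```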

@Jmv38, I don’t want to take Andrew’s post too far off topic, maybe we should open another discussion?

I think once a project is exported to Xcode, the ‘startRecording()’ command will not be available, so I had ignored that completely.

I can see some settings for frame rates in the code in the links above. I am planning to follow Andrew’s methodology: save 20 images per second, then combine them into a video in Xcode.