1.5 beta

Let there be a 1.5 beta! Soon!

And let it support but not require iOS6!

SO MOTE IT BE.

I’ve been racking my brain as to what exciting things may be coming down the pipe, and I’m just clueless. But I saw the tweet the other day, and my mind is full of speculation.

I’m expecting real io.* support - we’re close now, we just need to be able to set the cwd and use os.getenv (and likely jail it to make Apple happy - I’m torn there. I don’t want to be jailed, but I want others to be, because I trust no one). But that’s utility - good, important, but not showy. (While I’m asking for boring stuff, more library support - zlib comes to mind - would be nifty. Bit operations, too. Built-in JSON. I had a big-ass list in some other thread.)
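To make the ask concrete, here’s a rough sketch of the kind of plain-Lua file access I mean - assuming os.getenv is available and HOME points at the app’s sandbox, so files live under HOME/Documents (the file name is just for illustration):

local dir = os.getenv("HOME") .. "/Documents"

-- write a file the ordinary Lua way
local f = assert(io.open(dir .. "/notes.txt", "w"))
f:write("hello from Codea\n")
f:close()

-- read it back
for line in io.lines(dir .. "/notes.txt") do
    print(line)
end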

I’m hoping for “true 3D” - i.e. some lighting. GLSL would be a fun alternative. Both would be better. Model loaders, at least for simple stuff.

Access to the camera (not necessarily the photo roll - I’m talking live video) would be fun. Ditto the microphone. And digital sound playback.

iPhone version?

Seamless Xcode export? Or better - seamless app export? (That strikes me as a VERY tall order.)

What can it beeeee??? I am truly tortured.

What I’m dreaming about is that I’ve set my sights too low. I gotta tell you - meshes blew me away. Totally unexpected, totally cool. Can lightning strike twice? NO PRESSURE NOW. :slight_smile:

It will support iOS 5.

os.getenv will be un-sandboxed.

The rest will be available pretty shortly — there’s one more big component left to build. I’m hoping you will like it.

On a side note, recent versions of OpenGL no longer have built-in lighting. Lighting is implemented via GLSL now — so you have a normal buffer on your mesh, a lighting shader that computes lighting at each vertex (or pixel) and interpolates the result.
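To make that concrete, here’s a rough sketch of per-vertex diffuse lighting from a single directional light, assuming the mesh/shader API previewed later in this thread - the attribute and uniform names here are illustrative, not final:

-- vertex shader: compute a Lambert diffuse term per vertex
local vsrc = [[
uniform mat4 modelViewProjection;
uniform vec3 lightDir;              // normalized light direction

attribute vec4 position;
attribute vec3 normal;              // per-vertex normal buffer
attribute vec2 texCoord;

varying highp vec2 vTexCoord;
varying lowp float vDiffuse;

void main()
{
    vDiffuse = max(dot(normalize(normal), -lightDir), 0.0);
    vTexCoord = texCoord;
    gl_Position = modelViewProjection * position;
}
]]

-- fragment shader: modulate the texture by the interpolated term
local fsrc = [[
uniform lowp sampler2D texture;

varying highp vec2 vTexCoord;
varying lowp float vDiffuse;

void main()
{
    lowp vec4 col = texture2D(texture, vTexCoord);
    gl_FragColor = vec4(col.rgb * (0.2 + 0.8 * vDiffuse), col.a);
}
]]

m = mesh()
m.shader = shader(vsrc, fsrc)
m.shader.lightDir = vec3(0, 0, -1)
local normals = m:buffer("normal")   -- fill alongside the vertex positions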

I don’t think I’ve been particularly quiet about the things I want, but just in case, here’s my “love to see it in 1.5” list.

  1. GPS – probably not at the top of anyone else’s list, but vital to some projects that are right in my grill
  2. Compass – GPS’ little brother, and helpful in addressing a whole other class of problems
  3. Improved sound – something that makes it dead simple to produce precise musical notes, multi-part harmonies, etc. Don’t kill the existing sound() function, just give me a note() function and a play() function for supporting canned audio
  4. Unboxing io. – yes, please
  5. Access to photo library from within an application – so that I can give the user the opportunity to open or save images
  6. Access to the cameras. Less important than the above (and in fact, I’d argue that it’s not too useful without also having access to the photo library)
  7. Common controls. As much fun as I’ve had rolling my own, I’d prefer a neat box around Apple’s standard bits. I’ve already had one app rejected for using non-standard text boxes that didn’t give all the normal functionality.

By “lighting”, I pretty much meant “some pre-written GLSL stuff that does a lighting function”. So - either that, and you can toggle it on or off, or just allowing us full access to GLSL (which, to do it “right”, might suggest quite a bit of extra work - perhaps syntax highlighting and so on).

I just want to specify a light source so my rotating planets look purdy. And the turrets on my spaceships cast shadows, pew pew.

Yes on canned audio - I already have the Sinistar “I LIVE” file saved off. :slight_smile:

I can live without built-in support for things that can be solved by writing a Lua library - GUI controls and playing notes would be nice, but there are already libraries for those in the forum. More important to me are features that can’t be solved in Lua. Just my opinion.

For me personally, GLSL will be huge. It’s a big change from OpenGL to OpenGL ES, so it’s normal now to write your own shader even for simple lighting. But once you’ve written one, you can just copy the shader. Maybe Codea could add a sample for simple lighting with GLSL :slight_smile:

A thing I would really like would be fast rendering to texture. This would save us from using retained mode, where I’m not able to show a dialog for settings or another GUI control without destroying the content. But maybe that’s not possible, because the texture has to be uploaded to the graphics chip and that’s already too slow?

Saving pictures into the album would be nice. But if I can finally do that in Xcode I’m fine. BTW, would it be possible to save the content of what I have painted in retained mode?

Good to hear that it will support iOS 5 - my 1st Gen iPad will never feel the warm embrace of iOS 6. Which is a bit disappointing but at least that means I will still have a functioning maps app. It could also be the thing that pushes me to buy a new iPad. I think I will wait and see what this mini thing looks like and whether Codea can run on it (I assume so but who knows what resolution it will come out with - my guess is the same as the first gen iPad).

.@KilamMalik setContext speed has been improved quite a lot — still not enough to render at full retina resolution every frame (that’s simply a limitation of the iPad GPU), but good enough to start playing with full screen effects.
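For anyone who hasn’t tried it, here’s a minimal sketch of that offscreen pattern using the existing setContext() and image() API - the painting accumulates in an image each frame, so dialogs and other controls can be drawn over it without destroying the content:

function setup()
    canvas = image(WIDTH, HEIGHT)
end

function draw()
    -- paint into the offscreen canvas instead of retained mode
    setContext(canvas)
    if CurrentTouch.state ~= ENDED then
        fill(255, 0, 0)
        ellipse(CurrentTouch.x, CurrentTouch.y, 20)
    end
    setContext()

    -- composite: canvas first, then any dialogs or controls on top
    background(40, 40, 50)
    sprite(canvas, WIDTH/2, HEIGHT/2)
end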

Looking forward to GLSL support! That and live camera stream would be really nice. (faster stream than my modifications to the runtime :wink: )

A couple of screenshots while we wait.

New toolbar for asset management:
http://twolivesleft.com/Codea/Preview/Codea-1.5-Toolbar.jpg

In-editor API tips (will have to get your feedback on this). Also highlighting on long lines works much better.
http://twolivesleft.com/Codea/Preview/Codea-1.5-API.jpg

We do have live camera stream functionality as well. It is especially cool when you pipe the live camera into a GLSL shader.

The Shader stuff is very deep, but John and I are still working on it. So I won’t be able to show you that until the beta is ready.

Just a random idea: any chance of being able to extend the in-app documentation? I’ve been finding the lookup feature quite useful, but I’d like to be able to do it with some of my own functions and classes since I often don’t remember all the intricacies. Being able to add an XML file (or whatever) with the necessary details would be useful.

Interesting idea Andrew. I hadn’t thought of allowing that. I’ll keep it in mind as I work on the documentation browser.

Thanks for the screenshots, @Simeon. Looks really interesting - shaders plus camera will be cool. From the looks of your UI, you are treating shaders as another asset like sprites; will that mean we don’t have to mix them with our Lua code, and can share them across projects?

The API hints look wonderful, and will be good for long argument lists.

I made a video of a shader example I’ve been working on that makes use of some advanced features:

https://www.youtube.com/watch?v=8KJhjLK7zBU

The aspect ratio is off and there’s some stretching, but it works in principle; those issues are just because I’m fitting it to the screen rather than the camera’s natural resolution.

.@John That is wicked cool.

Neat! Shader support is VERY exciting. Lots of bang for the buck, especially on a platform like the iPad.

And despite my misgivings on the education thread, I like the popup parameter help! Very handy.

Thank you for sharing a preview! The hints of upcoming awesome were driving me batty. I’m bad at waiting for Christmas, too.

Any hints on what the API for shaders and camera will look like? :slight_smile:

Shaders have a fairly simple API and are used in conjunction with mesh. We are trying to make it a simple and elegant process.

Creating a shader in code is basically like this:

myMesh = mesh()
-- create a shader with strings for vertex and fragment program sources
myMesh.shader = shader(myVertexSource, myFragmentSource)
myMesh.texture = "Space Cute:Background"

-- shader uniforms are set with properties
myMesh.shader.uniform1 = 1.5

-- additional samplers can be set with images or sprite names
myMesh.shader.customSampler = someImage

-- a new buffer object type is available for managing custom vertex attributes
customBuffer = myMesh:buffer("customVertexAttribute")
-- buffers can be passed around and manipulated independently of the mesh
customBuffer[1] = vec2(x,y)
...

There will be other ways to create shaders, and we are currently working on making it the best possible user experience.
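For context, a purely illustrative sketch of how such a mesh might then be used in a project - myVertexSource and myFragmentSource are the same placeholder strings as above, and the rest is the existing mesh API:

function setup()
    myMesh = mesh()
    myMesh.texture = "Space Cute:Background"
    myMesh:addRect(WIDTH/2, HEIGHT/2, 400, 400)
    myMesh.shader = shader(myVertexSource, myFragmentSource)
end

function draw()
    background(0)
    myMesh.shader.uniform1 = math.sin(ElapsedTime)  -- animate a uniform
    myMesh:draw()
end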

Speaking of the beta - my TestFlight build expires in a couple of days. Should I just go install the release, or is another test flight coming down the pipe? Or am I doing something wrong?

I’m in the same boat. My provisioning profile expires in less than a day.

What a surprise … me too.