Codea Enhancements: Learnable Programming

So - I made it even less far into “Kill Math” before my spine said “rubbish” and I moved on.

He seems to have a bias against symbol manipulation - and at the very bottom, symbol manipulation is all there is.

I think Andrew hit on a giant point - the computer is a tool, a means to achieve a goal, and programming is learning to use that tool. To use any tool, you need to understand both the tool itself, and the goal you’re using the tool to work toward.

There’s a balance there - but the original article seems to be saying “we need to teach people to understand the goal they’re working toward” without teaching about the tools - and that’s not going to work.

I took some art classes in college (I took lots of different stuff in college) - you do need to understand the tools, and how they work, before you can use them toward a goal. I got pretty good with charcoals, surprisingly enough - there are effects you can achieve with them very easily that are quite difficult in other media. Now - understanding light and shadow and making the drawing is the real goal, but doing it well requires you understand the tools - if only to know which ones are appropriate to do the job.

There are a lot of “programming” classes that focus on the tool - perhaps he’s right that the emphasis shouldn’t be on “Learning Java” but rather on “learning to program”. But I think he goes too far, in that you cannot “learn to program” without knowing at least some of the tools. Necessary Prerequisite.

Now - Math. math math math. You know - Pythagoras came up with his famous theorem (or rather, the proof for it) sometime around 500 BC - this significantly predates computers, and computer visualization. Mathematicians have been doing some pretty neat things for a long time without having something show it to them. They use their minds. We can too. Indeed - I think an important part of learning is actually using your mind, being forced to think a bit. You learn better by doing than by simply being shown.

.@Bortels Bret Victor definitely has a bias towards the concrete — he likes to create with a concrete, visual structure and then abstract and encapsulate it, adding and linking symbols as he goes.

He likes to begin with an immediate visual representation, which requires real values that are later abstracted. This is one part I disagree with for non-learning applications. I think starting in the abstract is often a good skill to use.

Regarding the art analogy — I’m not talking about achieving specific effects with a certain tool. I’m just talking about getting the idea that’s in your head onto the page. Your choice of medium will affect that process, but the process itself should not be dependent on your choice of medium. And that process is the one that I’d like to speed up — so you can iterate on more ideas in the same time span.

Like you say, @Bortels:

Now - understanding light and shadow and making the drawing is the real goal, but doing it well requires you understand the tools - if only to know which ones are appropriate to do the job.

Bret Victor’s point is exactly that if the tools are more approachable, more discoverable, and more instantaneous, then they will get in the way less. And you can spend more time developing your internal understanding of light and shadow — surely this is more important than your understanding of how to create certain effects using specific tools.

You say “Your choice of medium will affect that process, but the process itself should not be dependent on your choice of medium”

Thing is - this is patently untrue, and moreover almost cannot be true. If my goal is to, say, draw a face - the process is fundamentally different when I choose to use a pencil and paper, versus say doing it in Codea. I’m not even sure where “should not” can come from. Indeed - should! The process should and must vary depending on the tools you use. This is actually part of the point of tools, and of having an assortment - each tool has strengths and weaknesses, and knowing and using an assortment lets you learn to use the best tool for the job at hand. Saying that the tool shouldn’t matter is saying all tools should be the same - that a hammer is good enough for any job. It ain’t so, and it’s not desirable for it to be so.

As for tools “not getting in the way” - that’s lovely, as far as it goes, but the simple fact is that sometimes, for a complex task, you may need a complex tool. This isn’t something to shy away from - complexity can give you power. Someone who knows Photoshop backwards and forwards can work magic - magic you simply couldn’t do with other tools. More approachable is fine until it starts to actually break functionality.

This guy needs less pie-in-the-sky and more hands-on-the-tools.

Lovely discussion. I read @simeon and tend to agree, then I read @bortels and @andrew_stacey and like the argument they are making too.

My 2c. For several months I’ve been developing a Lua IDE (http://studio.zerobrane.com/) that supports live coding. I’ve taught an intro computer science class to middle- and high-school students (13-17) using this IDE and formed my opinions on what seems to work and what doesn’t.

How much help do you need from the environment you work in? From watching people who are not familiar with programming or IDEs learn, I’d say: as much as you can get, as long as it doesn’t get in the way. There are two benefits to not having to worry about the API and the results of various calls: (1) you avoid the context switching that kills the flow of your work, and (2) you start seeing connections where you saw none previously, because you take in information through more channels and involve centers that were dormant before. Seeing the immediate effect of your changes also shortens your REPL cycle, which, again, helps keep you in the flow.

What you may be losing (to some degree) is the learning that happens when you think about what should happen before it happens and then test your theory. This is very powerful, and one of the qualities that I believe sets good programmers apart from average ones. When you fix or change something, there is always a tendency to make the change and see the result, and the availability of immediate feedback from the environment makes that almost a reflex. But not so fast. If you give yourself just a tad of time to think about what should happen and why you think so, and only then check what really happened, that brief interval is where most of the learning happens.

Bret is very explicit: “These are not training wheels.” People who complain that what’s shown only applies to a small set of graphical applications are largely missing the point. Bret talks about the visibility of effects from changes in programs. When I monkeypatch a small function in a complex system that involves distributed servers with layers of caching, templating, and various backend systems, knowing the potential impact of that change could save hours of my time and someone else’s. And I’ve been doing this for almost 20 years. What saves me is the ability to think about possible consequences based on my knowledge of the system, but even that knowledge is limited, and my environment can go a long way in helping me assess those consequences.
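A minimal sketch of what that kind of monkeypatch looks like in Lua — the `cache` module and its `get` function here are hypothetical, purely for illustration of the wrapping pattern:

```lua
-- Hypothetical module standing in for a small piece of a larger system.
local cache = {}
function cache.get(key)
  return "value-for-" .. key
end

-- Monkeypatch: replace cache.get with a wrapper that records each call,
-- then delegates to the original, so behavior is unchanged but the
-- effects of the change become visible.
local original_get = cache.get
local observed = {}
function cache.get(key)
  observed[#observed + 1] = key   -- record the call for later inspection
  return original_get(key)        -- preserve the original behavior
end

print(cache.get("user:42"))   --> value-for-user:42
print(#observed)              --> 1
```

Undoing the patch is just `cache.get = original_get` — which is exactly the sort of small, reversible change whose downstream consequences a live environment could help you see.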

TL;DR: I’ve been working on enabling my IDE to help my users better see what happens with their programs. For those interested, there is a demo for live coding with Lua (http://notebook.kulchenko.com/zerobrane/live-coding-in-lua-bret-victor-style) and one for live coding with love2d (http://notebook.kulchenko.com/zerobrane/live-coding-with-love).

This is sort of the area where you can have wildly divergent viewpoints - and both can have validity, and be complete hokum, at the same time.

I actually think this divides into the “but it could be so much better” camp, and the “if it was good enough for me, it’s good enough for you” camp. Both sides have issues. Am I interested in making learning less painful? No - not really; it was painful for me, and I think the pain helps. Is it good to strive to teach better? Sure; but at some point, they have to put their money where their mouth is, and stop bagging on JavaScript. :slight_smile:

I actually think there’s one other thing Codea provides, getting back down to nuts and bolts, that’s heinously important - I referred to it obliquely before, when I talked about typing in programs from magazines: Working Examples.

Real-world working examples of real code doing real, non-trivial things will teach you (with some study, and ideally some tweaking and experimentation) far more than a fancy IDE with context-sensitive help, or hours spent poring over the API. Again - I liken it to a natural language. Class is fine, and a dictionary is fine, but if you want to learn French, go listen to native speakers talk; interact with them, find out what they say, and what they say when you try. Likewise - you want to learn Codea, or lua-in-the-large? Go look at the examples. Understand what they do, and how. Change them, improve them, ideally break them, then fix them. I think this activity, more so than API issues, gets to the very heart of “learning to program”.
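For what it’s worth, a working example in the Codea vein can be tiny and still reward that kind of tinkering. This sketch assumes Codea’s `setup()`/`draw()` entry points and its `background()`, `fill()`, and `ellipse()` drawing calls; the stubs at the top exist only so the file also runs in plain Lua for study:

```lua
-- A tiny Codea-style working example: a circle that grows each frame.
-- In Codea, setup() runs once and draw() runs every frame; the stubs
-- below do nothing and only let this file run outside Codea.
if background == nil then
  function background() end
  function fill() end
  function ellipse() end
end

function setup()
  radius = 10               -- tweak me: what does a bigger start do?
end

function draw()
  background(40, 40, 50)    -- dark backdrop
  fill(255, 0, 0)           -- red circle
  radius = radius + 1       -- break me: what happens if this shrinks instead?
  ellipse(200, 200, radius) -- circle at (200, 200) with this diameter
end
```

Reading it tells you one thing; changing the `+ 1` to `- 1` and watching what happens tells you something a reference page never will.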

I have a friend who teaches English as a Second Language - fairly advanced, 3rd or 4th year, so the people in class already have the basics - an OK vocabulary and a grasp of the grammar. It is apparently almost entirely (or close to it) a study of idiom and custom. The “oomph” of English isn’t the words, or the grammar - it’s the deeper understanding that certain phrases (“the sky’s the limit!”) aren’t to be taken literally. You’re not expected to re-invent the idioms (I doubt you could) - you learn from example. I suspect all specialized vocabularies - computer languages included - have these sorts of things - they are extra-language, extra-IDE, meta-programming juiciness.

It is unreasonable - in any science - to expect anyone to push the envelope if they have to spend their time re-inventing the wheel: we see farther because we stand on the shoulders of the giants who went before. A good grounding in the “cultural literacy” of any science/language/pastime is essential if you want to excel. And examples - good examples - are basically the only way I can think of to get that grounding.

Interesting article and great discussion here. Some of the videos in that article were quite intriguing.

As a trial-and-error programmer (AKA a hack), I am all for options that make it easier for me to cut down on the time it takes to get my ideas working as expected. Learning the language is important, but the older I get, the less my memory works. My mother has Alzheimer’s disease and I am beginning to think it is genetic.

So any tips, hints, etc. that may be built in to Codea (with an option to turn it off for those not wanting it) would get my vote.

.@paulclinger, I just wanted to thank you for posting your link. Up until downloading your ZeroBrane Studio, I hadn’t fiddled with Lua on my PC. I am very much liking your interface, along with the help, examples, and the real-time Scratchpad function. Kudos and good luck in your future endeavors.

I’ve checked Khan Academy’s videos; I see they’re written in Java, and I’ve ended up with a lot of errors. They’re still very good videos and easy to understand, which is a plus for a beginner…