by Asim Jalis
One of the neat things about math is that reuse is extremely
simple and pervasive. Since mathematical proofs are written for
people, you can easily allude to someone else's proof and expect
the reader to get it. You can say, "based on the result by this
other person in this other paper, we know that X is true", and
then you can continue from there.
In computer science it is not as easy. You can't say in the
middle of a program, "let's use the zip functionality, since we
know it exists". You have to be a lot more specific.
I just wrote a console stopwatch in Ruby. And then I got stuck.
There appears to be no way to read unbuffered standard input.
That's not quite true. There might be a really complicated way
in which I bind to a Win32 function and then call it. To do
this I would have to cut and paste a lot of magic into my code.
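To give a flavor of the magic, the binding looks something like
this -- a sketch using the old Win32API wrapper that ships with
Ruby on Windows, bound to the C runtime's _getch:

    require 'Win32API'

    # Bind the C runtime's _getch: takes no arguments and
    # returns the key code of one unbuffered keystroke.
    getch = Win32API.new('msvcrt', '_getch', [], 'L')

    loop do
      c = getch.call      # blocks until a key is pressed; no Enter needed
      break if c == 113   # ASCII 'q': quit
      puts "key code: #{c}"
    end

It works, but it is Windows-only, and it is exactly the kind of
pasted incantation I mean.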
In the past I would have struggled to get to the other side of
this dead end. But since I am experimenting with inverting the
necessity->invention paradigm, I gave up quickly and decided to
go with Ctrl-C, which the program does catch. Ctrl-C has a
problem: if I am in the wrong window I can
accidentally kill some other program. I guess I'll worry about
this if it becomes a big problem.
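The Ctrl-C part, at least, is easy: Ruby raises Interrupt when
you hit Ctrl-C, so the core of the stopwatch reduces to
something like this:

    start = Time.now
    puts 'Timing... hit Ctrl-C to stop.'
    begin
      sleep   # sleep forever; Ctrl-C raises Interrupt
    rescue Interrupt
      printf("\nElapsed: %.2f seconds\n", Time.now - start)
    end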
Perhaps the right attitude in approaching innovation as an
effectual process is simply to respect obstacles when they crop
up and then to start moving in a different direction.
Ruby is clearly not the right language for applications that
require sensitivity to keyboard events.
The advantages that I can see of abandoning goals quickly are:
(a) one does not get burned out -- there are fewer negative
associations with programming; and (b) one naturally moves in a
direction that produces the greatest yield.