by Asim Jalis
I am not really criticizing wikis or suggesting that they should
be avoided. I am trying to understand why there are so few things
in software that are like Unix command-line utilities, which can
hook up with each other so easily, and specifically why wikis are
not like this.
While linking wikis to each other is possible, it is not
particularly easy, which feels weird. It seems analogous to the
problem of object reuse. While it's easy to reuse code within a
single code base, it's much harder to do it across code bases.
Where object-oriented systems fall short in this regard, Unix
utilities show us a neat way to achieve extreme reuse.
Someone pointed out that while there is no single program on Unix
as complex as Microsoft Word, you can combine Unix programs to do
as much as Word, and even more.
I am specifically thinking of ways of publishing some small
utilities on the web. A wiki seems to buy me very little over an
HTML file. For example, I have to touch the file system directly
to upload the binary. Of course, I could add an upload
functionality to the wiki. But there is something fundamentally
wrong with this approach. It seems to be the opposite of the Unix
approach. Instead of making tools that do one thing well, adding
upload functionality to the wiki makes the wiki more complex and
monolithic.
Instead of making things more complicated, what if we made them
simpler? For example, the wiki is really two different functions.
There is the rendering function, and there is the editing
function. What if we separated them, so that we had separate
Unix-like functions, each focused on one activity?
The rendering function takes a URL to a text file and just
renders it. That's all it knows how to do.
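As a sketch of how small this could be, a rendering-only CGI might look like the following in plain sh. The pages/ layout, the render_page name, and reading the page name from QUERY_STRING are all assumptions for illustration, not an actual implementation:

```shell
#!/bin/sh
# render.cgi -- sketch of a rendering-only CGI in plain sh.
# Assumptions: pages live as text files under ./pages, and the page
# name arrives in QUERY_STRING (e.g. render.cgi?FrontPage).
render_page() {
  # keep only safe characters so the name cannot escape pages/
  page=$(printf '%s' "$1" | tr -cd 'A-Za-z0-9_')
  printf 'Content-Type: text/html\r\n\r\n'
  printf '<html><body><pre>\n'
  cat "pages/$page.txt" 2>/dev/null || printf '(page not found)\n'
  printf '</pre></body></html>\n'
}
render_page "$QUERY_STRING"
```

Point it at render.cgi?SomePage and it emits the page. That is all it knows how to do.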
The editing function takes a URL to a text file or to a binary
file and lets you edit it. You can either upload a file or type
in its contents.
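Again as a sketch, assuming the same pages/ layout, the heart of an editing-only CGI could be this small. A real edit.cgi would parse the POST body and offer an upload form; save_page is a name made up for this sketch:

```shell
#!/bin/sh
# edit.cgi -- sketch of an editing-only CGI. save_page reads the new
# page contents from stdin, the way a CGI POST body would arrive.
save_page() {  # usage: save_page PageName  (new contents on stdin)
  page=$(printf '%s' "$1" | tr -cd 'A-Za-z0-9_')  # sanitize the name
  mkdir -p pages
  cat > "pages/$page.txt"                         # store the new text
  printf 'Content-Type: text/plain\r\n\r\nsaved %s\n' "$page"
}
```

Because it is separate from rendering, the same script could edit wiki pages, HTML files, or anything else.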
Ward mentioned once, when he was talking about the genesis of the
wiki, that he had separate CGI scripts for each function. He first
wrote the rendering CGI script. Later he wrote a separate CGI
script for editing.
Here is a meta-question: why not write CGI scripts as Unix shell
scripts instead of pure Perl or pure Ruby?
What is the advantage of putting all these functions together?
What reuse does it enable? I suspect very little. There are a few
Perl or Python functions which I might reuse several times, and I
can always put them in a module.
Consider this though. If I create a single edit.cgi script, I can
now use it to edit whatever I want. I could edit wiki files as
well as HTML files with it. The edit.cgi script could automatically
use CVS for source control.
In fact, here is another thought: have a single sh.cgi script
which executes sh commands.
Running arbitrary shell commands from the web could of course be a
security nightmare, so sh.cgi would need to be locked down. It
could be restricted to specific sandbox directories, and to a
whitelist of safe commands such as ls and cat.
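One way to sketch such a locked-down sh.cgi is below. The whitelist, the sandbox directory name, and run_restricted are all assumptions for illustration:

```shell
#!/bin/sh
# sh.cgi, locked down -- a sketch, not a security review. Only a
# whitelist of read-only commands may run, and only inside ./sandbox.
run_restricted() {  # usage: run_restricted CMD [ARGS...]
  cmd=$1; shift
  case "$cmd" in
    ls|cat|grep|wc) ;;                             # read-only whitelist
    *) echo "command not allowed: $cmd"; return 1 ;;
  esac
  for a in "$@"; do                                # no escaping the sandbox
    case "$a" in */..*|..*|/*) echo "bad argument: $a"; return 1 ;; esac
  done
  ( cd sandbox 2>/dev/null && "$cmd" "$@" )
}
```

Refusing absolute paths and .. components keeps the commands inside the sandbox directory; anything not on the whitelist is rejected outright.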
The edit.cgi script could be restricted in the same way. It too
could easily be hijacked to create CGI scripts and then execute
them, so it could be restricted to only edit HTML and wiki files.
A setup like this could allow neat operations such as global
search and replace.
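For example, assuming pages are just plain text files under pages/, a global search and replace is nothing more than grep and sed glued together (the replace_all name and the file layout are assumptions):

```shell
#!/bin/sh
# Global search and replace across all pages with plain Unix tools.
replace_all() {  # usage: replace_all OLD NEW
  # grep -l lists only the files containing OLD; sed rewrites each one
  for f in $(grep -l "$1" pages/*.txt 2>/dev/null); do
    sed "s/$1/$2/g" "$f" > "$f.tmp" && mv "$f.tmp" "$f"
  done
}
```

No wiki feature needed: because the pages are ordinary files, the ordinary tools already compose.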
It could also create neat, unobvious leveraging opportunities. The
different scripts might interact with each other to enable new
things that we cannot think of right now.