I wrote many lines of Tcl back in the early .com days; we had an in-house server stack based on Apache, similar in certain ways to AOLserver, but doing ORM work that people only rediscovered a decade later with Ruby on Rails.
I only used it a few times after that, to script WebSphere via Jacl.
Nowadays, however, I don't have any reason to touch it either.
Programming doesn't have to be hard. As in speech, the more one says, the less one means.
I know shell isn't going to win this popularity contest, but a return to it is what's badly needed in CS today. Instead of attempting to recreate the shell in C#'s or Java's supplied libraries and subsequently becoming frustrated when interaction with the "outside world" is clumsily accomplished through an FFI pinhole, just use the shell as it was intended: as a lingua franca between utilities.
Write what requires Prolog in Prolog, what requires C in C, what requires awk in awk, and so on. Use flat-file databases such as starbase or /rdb and avoid data prisons such as MSSQL, Oracle, MySQL, etc. Make all of these utilities return sane exit statuses and spit out JSON-formatted output. Finally, tie it all together with shell. If you need a UI, code it as a thin layer in Tk, Python/Tk, ANSI C/GTK, or consider pdcurses, etc. Profile your program and find any weak links in the chain. Recode in a lower-level language only when needed.
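A minimal sketch of that glue layer (the utilities and JSON fields below are invented for illustration; in practice each stage would be its own program, written in whatever language suits it):

```shell
#!/bin/sh
set -e   # a stage failing outright stops the whole program

# stand-in for a real collector (might be written in C, awk, Prolog, ...);
# it emits one JSON object per line and exits nonzero on failure
fetch_temps() {
  printf '{"city":"Oslo","temp_c":-3}\n'
  printf '{"city":"Lima","temp_c":24}\n'
}

# awk as the filter stage: keep records whose temp_c is above zero
warm_cities() {
  awk -F'"temp_c":' '$2 + 0 > 0'
}

# shell as the glue: compose the stages into a program
fetch_temps | warm_cities
```

Each stage stays independently testable from the command line, which is the point: the shell only sequences and connects.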
Weighing the tradeoffs of adding language features is a sign of a false dilemma; rather than a single bigger or smaller language, what is actually needed is a set of more specific languages that speak to each other through a lingua franca. Tcl accomplishes this communication through a string representation, PowerShell through an object representation, and so on. Again, rather than choosing one solution over another, use them all where they work best. This is where Unix got it right all those years ago; Unix isn't just a slightly more stable platform for running today's bloated and monolithic software. Rather, it's an elegant system for connecting maintainably small utilities, and the shell glues those utilities together into programs. Such an approach combines the best of high- and low-level programming, reuse and specificity, tradition and novelty.
If you add nc between the pipes you've implemented an ESB :)
But shh, don't tell anyone how easy it is; if everyone figures out just how easy and scalable shell scripts are, it will ruin the magic we bring to over-engineered enterprise projects.
If any here think fleitz is joking, please pick up a used copy of Manis, Schaffer, and Jorgensen's "Unix Relational Database Management". The authors build a relational database with little more than awk, cat, grep, od, sed, spell, tail, and a bit of Bourne shell glue. After covering the basics of tables and relational theory, they build a small-business accounting system. The database became a commercial product (/rdb) and is still sold and used today (allegedly in hospitals).
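The flavor of that approach can be sketched in a few lines. This is loosely modeled on the idea, not /rdb's actual file format: here a "table" is a tab-separated file with a header row, and a join is a short awk program (file names and columns are invented):

```shell
join_demo() {
  tmp=$(mktemp -d)
  printf 'id\tname\n1\tada\n2\tgrace\n' > "$tmp/people"
  printf 'id\tlang\n1\ttcl\n2\tawk\n'   > "$tmp/langs"

  # pass 1 (NR==FNR): load the langs table into an array keyed by id
  # pass 2: append the matching lang column to each people row
  # (the header rows join too, since "id" keys the header of langs)
  awk -F'\t' -v OFS='\t' '
    NR == FNR { lang[$1] = $2; next }
    { print $0, lang[$1] }
  ' "$tmp/langs" "$tmp/people"

  rm -r "$tmp"
}
join_demo
```

Because the tables are plain text, every other Unix tool (grep, sed, sort, cut) works on them for free; that is what makes the "little more than awk and cat" claim plausible.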
As a very superficial example of this: pianobar is a fantastic terminal-based Pandora client that will happily read input from a FIFO. I bound keys to issue various commands through the FIFO on both my laptop and my desktop. Then I kept finding my hands on my laptop keyboard while my music was playing from my desktop (or vice versa). A simple application of netcat meant my commands could be typed at either keyboard.
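The underlying pattern is tiny. A generic sketch (not pianobar's actual configuration; the FIFO path and the 'next' command below are invented stand-ins):

```shell
fifo_demo() {
  fifo=$(mktemp -u)      # pick an unused path, then make it a FIFO
  mkfifo "$fifo"

  # a reader standing in for the player: act on each command received
  while read -r cmd; do
    echo "got: $cmd"
  done < "$fifo" &

  # a writer standing in for a keybinding firing a command
  printf 'next\n' > "$fifo"

  wait                   # let the reader drain the FIFO and exit at EOF
  rm -f "$fifo"
}
fifo_demo
```

Bridging keyboards is then just pointing netcat at the FIFO: listen with nc on the machine running the player, redirect its output into the FIFO, and pipe keystrokes into nc from the other machine.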
That's a rant I mostly agree with, but I see a couple ways it could be mapped onto the original phrase. Did you mostly mean "sophisticated language features don't span executables anyway"? or "you don't need a sophisticated understanding of what's inside each box that a shell operates on"? or something I'm missing?
http://www.yosefk.com/blog/i-cant-believe-im-praising-tcl.ht...
Shells partition away sophistication.