Once upon a time I used https://github.com/wting/autojump as a way for my systems to help me quickly navigate to my favorite Directories Of Interest. Basically, it did (and similar tools still do) the following:
- cd is instrumented to capture directory locations each time one visits a new directory, and store them in a wee sort of database
- an alternative “cd” command is introduced that attempts to Do What I Mean: it takes the input and picks the stored directory that best matches, with a bias towards frequently used directories
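That mechanism can be sketched in a few lines of shell. This is a toy illustration with made-up names (`_record_dir`, `j`) and a naive “path count” file format; the real tools also weight by recency and cope with paths containing spaces:

```shell
# Toy frequency-biased jump: a file of "path count" lines stands in
# for the database; _record_dir is meant to be called from a cd hook
# (PROMPT_COMMAND in Bash, chpwd in zsh).
DB="${HOME}/.jumpdb"

_record_dir() {
    [ -f "$DB" ] || : > "$DB"
    awk -v d="$PWD" '
        $1 == d { $2++; seen = 1 }
        { print }
        END { if (!seen) print d, 1 }
    ' "$DB" > "$DB.tmp" && mv "$DB.tmp" "$DB"
}

j() {    # jump to the most frequently visited directory matching $1
    target=$(grep -- "$1" "$DB" | sort -k2 -rn | head -n1 | cut -d' ' -f1)
    [ -n "$target" ] && cd "$target"
}
```

The real databases are cleverer (autojump decays old entries; fasd keeps a combined “frecency” score), but the shape is the same: a hook that records, and a matcher that ranks.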
autojump was written in Python, which is no grand problem; but I did some poking around and discovered a newer tool, https://github.com/clvv/fasd, which has similar capabilities, perhaps more, and a slightly smaller footprint, being implemented in “POSIX shell,” so it can happily run on common shells such as Bash and (my fave) Zsh.
So far, I have just been using the “zz” functionality that picks the seemingly-most-relevant directory. It does a fine job of this.
It is doubtless a good idea to poke some more at this; running “fasd a b c” searches the recorded files and directories for the highest-relevance entries matching “a”, “b”, and “c”, fairly successfully. Throwing multiple strings at it pulls up an interesting list:
cbbrowne@karush ~> fasd tmux conf
Without much effort, this locates tmux configuration files; that’s looking pretty attractive…
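Digging further into fasd turns up a stock alias set, plus an `-e` flag that runs a command on the best match instead of printing it. These are as I recall them from the fasd README, so double-check the flags against your copy:

```shell
# fasd's stock aliases (from its README, as I remember them):
alias a='fasd -a'        # any (files and directories)
alias d='fasd -d'        # directories only
alias f='fasd -f'        # files only
alias z='fasd_cd -d'     # cd to the best-matching directory
alias zz='fasd_cd -d -i' # cd, but pick interactively from a list
alias v='f -e vim'       # open the best-matching file in vim

v tmux conf              # one step from the search above to an editor
```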
I have been poking for a while at the Oh shell, presented at BSDCan 2018. It observes that a bunch of things about shells tend to be painful, which has led to a whole bunch of shells that are (exceedingly thin) veneers over other programming languages, with the natural result that they attract no general interest.
Here is the set of problems that Michael MacInnis pokes at:
- Undefined variables – Bash has an option (set -u) to gripe about these, but it’s not the default
- variadic functions
- word splitting versus lists
- return values 0-255 – somewhat intentional, so that functions look like processes
- global variables are mostly all that’s available
- little modularity is possible because everything is in the global environment. This is somewhat like a worse version of Lisp dynamic scope
- tortured syntax, particularly for variable expansions/rewrites
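A couple of these pains (word splitting versus lists, and undefined variables) fit in a short Bash demonstration; `set -u` is the opt-in strictness the first bullet alludes to:

```shell
#!/usr/bin/env bash
count_args() { echo $#; }

files="one two three"     # a "list" faked as a single string

count_args $files         # unquoted: word splitting yields 3 arguments
count_args "$files"       # quoted: 1 argument -- same variable!

arr=(one two three)       # real lists exist, but only in bash/zsh
count_args "${arr[@]}"    # 3 arguments, no re-splitting surprises

set -u                    # now expanding an undefined variable is an error
echo "${typo_var-fallback}"   # an explicit default is still allowed
```

Whether `$files` means one thing or three depends on punctuation at every use site, which is exactly the sort of trap a richer type system dissolves.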
He presents a variety of mechanisms to try to solve these problems:
- same syntax for code and data (conses)
- richer set of data types (strings, symbols, a number tower, lists, and some more sophisticated bits)
- first class environment via define/export
- Kernel-like fexprs – enabled by the first-class environment. See John N. Shutt’s thesis, “vau: the ultimate abstraction”
- support dynamic communication patterns – see Squeak (Pike and Cardelli)
The shell is implemented in Go, making it pretty natural to do the “dynamic communication pattern” bit via goroutines. It is mostly an implementation of Scheme wearing a largely Unix-y syntax. The main places where the Scheme shows through (that I have thus far noticed) are:
- It has a preference for prefix notation for arithmetic rather than infix
- The “:” character indicates subsumed evaluation of blocks, which is loosely like a (let (()) ()) structure.
I’m not yet sure that it’s enough “cleaner” than other shells to be worth going all in on. The modularity aspects would prove quite interesting if libraries of code using them were to emerge. The absence of such libraries for the existing shells is unfortunate: you can certainly find a zillion extensions for Bash and Zsh, but in the absence of deep modularity, the connections can only be very shallow. You’re just a captured environment variable away from making that extension blow up…
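That last worry is easy to make concrete (hypothetical extension names; any two scripts sourced into one shell share the same flat namespace):

```shell
# ext_one stashes state in a global; ext_two, written independently,
# happens to use the same name -- sourcing both silently corrupts it.
ext_one_setup() { tmp="ext_one's precious state"; }
ext_two_run()   { tmp=$(date +%s); }   # innocently reuses "tmp"

ext_one_setup
ext_two_run
echo "$tmp"    # ext_one's state is gone, and nothing warned us
```

With first-class environments of the kind Oh proposes, each extension could close over its own private state instead of scribbling in the shared one.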