Using Bash to create a scalable environment
"This is the Unix philosophy: Write programs that do one thing and do it well. Write programs to work together. Write programs to handle text streams, because that is a universal interface." (Doug McIlroy)
Bash (and its derivatives) has been present in the history of computing for more than 30 years, and although it has never been considered a pivotal language for innovation, it has helped many developers make their work less monotonous.
It is both a command-line interpreter (a Unix shell) and a scripting language for (in principle) Unix-based environments. Like most Unix shells, it supports wildcard matching on filenames, pipes, command substitution, variables, and control structures for condition testing and iteration. Keywords, syntax, dynamically scoped variables, and other basic language features are copied from sh. Other features, such as command history, are copied from csh and ksh. Bash is a POSIX-compliant shell, but with various extensions.
The shell’s name is an acronym for Bourne Again Shell, a pun on the name of the Bourne shell it replaces and the notion of being “born again”.
This is a short introduction to a project I have been working on privately for some time, with the goal of organizing functions, aliases, and environment variables in a way that makes sense and can scale.
I’ve been using the bash terminal for a number of years to shorten tasks I would otherwise do through a GUI, but as the projects I work on grow in complexity, so does the number of commands I have to use to compile, package, and upload code to test devices (I work in networking). As I described in a previous article, software development takes a lot of creative energy, and every time we spend 10 seconds trying to type or remember some complex command in the terminal, we lose some of the energy that helps us make relevant decisions.
Time management is crucial to being a successful developer, not only to deliver work on time, but also to ensure quality and optimization when fixing a problem.
The math is simple: if I lose 8-10 seconds every time I have to test a change, and I write a significant amount of code every 10-15 minutes, I lose between 20 and 29 hours a year just on mechanical work, and on projects like the ones I work on, those numbers can double or triple.
Yes, it may seem obsessive-compulsive to count this time, but the time lost on mechanical work is not what matters; what matters is the time lost for truly creative work.
The idea was simple: build a project that met the following requirements:
- Be scalable, allowing separation of concerns (functions for networking are not the same thing as aliases for git).
- Be easily exportable to other platforms (operating systems, servers, Raspberry Pis, etc.), so its processing has to end in a single exportable file (envrc).
- Be easy to adapt: when you need to add a new command or configuration, its place should make sense from the start.
Choosing the organization of the files was not so difficult once I had a clear idea; the result was the following:
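A sketch of that layout (the file names here are illustrative, not the project's exact contents):

```
bash-environment/
├── Makefile
├── variables/
│   └── paths.sh
├── aliases/
│   ├── git.sh
│   └── navigation.sh
└── functions/
    └── networking.sh
```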
Scripts are internally organized into: environment variables, aliases, and functions. For example:
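For instance, a hypothetical aliases/git.sh might look like this (every name below is an illustration, not the project's actual code):

```shell
#!/usr/bin/env bash
# aliases/git.sh -- one topic per file, with each file grouped
# into environment variables, aliases, and functions.

# Environment variables
export GIT_EDITOR="vim"

# Aliases
alias gs='git status'
alias gl='git log --oneline --graph'

# Functions
# gpush: push the current branch to origin (hypothetical helper)
gpush() {
  git push origin "$(git rev-parse --abbrev-ref HEAD)"
}
```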
The special touch
What makes this an interesting project is that no matter how many files, functions, and environment variables are added, the Makefile converts all of them into a single file with a ‘merge’ function:
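A minimal sketch of what that merge step does, expressed as plain shell (the directory and file names are assumptions; in the project this logic lives behind a Makefile target):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Demo of the merge step: build a tiny throwaway layout,
# then concatenate every topic script into a single envrc.
# All file and directory names here are illustrative.
workdir=$(mktemp -d)
cd "$workdir"

mkdir -p variables aliases functions
echo 'export PROJECT_ROOT="$HOME/work"' > variables/paths.sh
echo "alias gs='git status'"            > aliases/git.sh
echo 'myip() { hostname -I; }'          > functions/networking.sh

# The merge itself: a fixed order, so variables are defined
# before the aliases and functions that may use them.
{
  echo "# Generated file, do not edit by hand"
  cat variables/*.sh aliases/*.sh functions/*.sh
} > envrc

cat envrc
```

The real Makefile target can then install the resulting envrc wherever you keep it.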
The envrc file is temporarily generated and then copied to the preferred directory (in my case /etc).
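To pick the file up in every new shell, one guarded source line in ~/.bashrc is enough; the demo below uses a temporary directory as a stand-in for /etc:

```shell
#!/usr/bin/env bash
# Demo of loading the generated file: "install" a tiny envrc into a
# temporary directory standing in for /etc, then source it the way
# a line in ~/.bashrc would.
install_dir=$(mktemp -d)
echo 'export DEMO_VAR="loaded"' > "$install_dir/envrc"

# The equivalent line in ~/.bashrc would use /etc/envrc:
if [ -f "$install_dir/envrc" ]; then
  . "$install_dir/envrc"
fi

echo "$DEMO_VAR"
```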
And to finish
With bash you can control virtually any part of the running operating system; it is the gateway to it. Having a personal, adaptable development environment will always let you do your work with as few non-creative activities as possible.
The complete project is published on GitHub and you can find it at the following link.