These may be objectively superior (I haven't tested), but I have come to realize (like so many others) that if you ever change your OS installation, set up VMs, or SSH anywhere, preferring these is just an uphill battle that never ends. I don't want to have to set these up in every new environment I operate in, or even use a mix of these on my personal computer and the traditional ones elsewhere.
Learn the classic tools, learn them well, and your life will be much easier.
Some people spend the vast majority of their time on their own machine. The gains of convenience can be worth it. And they know enough of the classic tools that it's sufficient in the rare cases when working on another server.
Not everybody is a sysadmin manually logging into lots of independent, heterogeneous servers throughout the day.
Yeah, this is basically what I do. One example: I use neovim with a bunch of plugins as a daily driver, but whenever I land on a server that doesn't have it or my settings/plugins, it isn't a huge problem to run vim or even vi; most stuff works the same.
Same goes for a bunch of other tools that have "modern" alternatives but the "classic" ones are already installed/available on most default distribution setups.
That goes against the UNIX philosophy IMO. Tools doing "one thing and doing it well" also means that tools can and should be replaced when a superior alternative emerges. That's pretty much the whole point of simple utilities. I agree that you should learn the classic tools first as it's a huge investment for a whole career, but you absolutely should learn newer alternatives too. I don't care much for bat or eza, but some alternatives like fd (find alt) or sd (sed alt) are absolute time savers.
Some are so vastly better that it's worth whatever small inconvenience comes with getting them installed. I know the classic tools very well, but I'll prefer fd and ripgrep every time.
+100
This is one of the reasons I really like Nix: my setup works basically everywhere (as long as the host OS is either Linux or macOS, but those are the only 2 environments I care about). I don't even need root access to install Nix, since there are multiple ways to install it rootless.
But yes, in the eventual case that I don't have Nix I can very much use the classic tools. It is not a binary choice, you can have both.
> Learn the classic tools, learn them well, and your life will be much easier.
Agreed, but that doesn't stop you from using/learning alternatives. Just use your preferred option, based on what's available. I realise this could be too much to apply to something like a programming language (despite this, many of us know more than one) or a graphics application, but for something like a pager, it should be trivial to switch back and forth.
And when those classic tools need a little help:
Awk and sed.
I like the idea of new tools though. But knowing the building blocks is useful. The “Unix power tools” book was useful to get me up to speed.. there are so many of these useful mini tools.
Miller is one I’ve made use of (it also was available for my distro)
I tend to use some of these "modern" tools if they are a drop-in replacement for existing tools.
E.g. I have ls aliased to eza as part of my custom set of configuration scripts. eza pretty much works as ls in most scenarios.
If I'm in an environment which I control and is all configured as I like it, then I get a shinier ls with some nice defaults.
If I'm in another environment then ls still works without any extra thought, and the muscle memory is the same, and I haven't lost anything.
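A minimal sketch of that kind of guarded alias (assuming a bash/zsh rc file; the fallback behaviour is the point, not the exact alias):

if command -v eza >/dev/null 2>&1; then   # only alias when eza is actually on PATH
    alias ls='eza'
fi   # otherwise plain ls, and the same muscle memory, keeps working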
If there's a tool which works very differently to the standard suite, then it really has to be pulling its weight before I consider using it.
> that if you ever change your OS installation
apt-get/pacman/dnf/brew install <everything that you need>
You'll need to install those and other tools (your favorite browser, your favorite text editor, etc.) anyway if you're changing your OS.
> or SSH anywhere
When you connect through SSH you don't have a GUI, but that's not a reason to avoid using GUI tools, for example.
> even use a mix of these on my personal computer and the traditional ones elsewhere
I can't see the problem, really. I use some of those tools and they are convenient, but it's not as if I can't work without them. For example, bat: it doesn't replace cat, it only outputs data with syntax highlighting. It makes my life easier, but if I don't have it, that's OK.
> When you connect through SSH you don't have a GUI, but that's not a reason to avoid using GUI tools, for example.
One major difference can emerge from the fact that using a tool regularly inevitably builds muscle memory.
You’re accustomed to a replacement command-line tool? Then your muscle memory will punish you hard when you’re logged into an SSH session on another machine because you’re going to try running your replacement tool eventually.
You’re used to a GUI tool? Will likely bite you much less in that scenario.
> You'll need to install those and other tools (your favorite browser, your favorite text editor, etc.) anyway if you're changing your OS.
The point is that sometimes you're SSHing to a lightweight headless server or something and you can't (or can't easily) install software.
I do prefer some of these tools, due to a much better UX, but the only one I do install on every Unix box is ripgrep.
I wanted to say we should just stick with what Unix shipped forever. But doesn't GNU already violate that idea?
IMO this is very stupid: don't let past dictate future. UNIX is history. History is for historians, it should not be the basis that shapes the environment for engineers living in present.
The point is that we always exist at a point on a continuum, not at some fixed time when the current standard is set in stone. I remember setting up Solaris machines in the early 2000s with the painful SysV tools that they came with and the first thing you would do is download a package of GNU coreutils. Now those utils are "standard", unless of course you're using a Mac. And newer tools are appearing (again, finally) and the folk saying to just stick with the GNU tools because they're everywhere ignore all of the effort that went into making that (mostly) the case. So yes, let's not let the history of the GNU tools dictate how we live in the present.
Well, even “Unix” had some differences (BSD switches vs SysV switches). Theoretically, POSIX was supposed to smooth that out, but the differences never went away. Today, people are more likely to be operating in a GNU Linux environment than anything else (that's just a market-share fact, not a moral judgement, BSD lovers). Thus, for most people, GNU is the baseline.
Never will I ever set up tools and my home environment directly on the distro, only in a rootfs that I can proot/toolbx/bwrap into. Not only do I not want to set everything up again on a different computer, but distro upgrades have nuked "fancy" tools enough times that it isn't worth it.
I started a new job and spent maybe a day setting up the tools and dotfiles on my development machine in the cloud. I'm going to keep it throughout my employment so it's worth the investment. And I install most of the tools via the Nix package manager so I don't have to compile things or figure out how to install them on a particular Linux distribution.
This is how I feel as well. I spent some time "optimizing" my CLI with Oh My Zsh etc. when I was young.
Only to feel totally handicapped when logging into a BusyBox environment.
I'm glad I learned how to use vi, grep, sed..
My only change to an environment is the keyboard layout. I learned Colemak when I was young. Still enjoying it every day.
I know my way around vi well enough, because although XEmacs was my editor during the 1990s when working on UNIX systems, when visiting customers there was a very high probability that they only had ed and vi installed on their servers.
Many folks nowadays don't get how lucky they are, not having to do UNIX development on a time-sharing system, although cloud systems kind of replicate the experience.
Agreed, but some are nice enough that I'll make sure I get them installed where I can. 'ag' is my go-to fast grep, and I get it installed on anything I use a lot.
I have some of these tools; they are not "objectively superior". A lot of them make things prettier with colors, bar graphs, etc. That is nice on a well-configured terminal, not so much in a pipeline. Some of them are full TUIs, essentially graphical tools that run in a terminal rather than traditional command-line tools.
Some of them are smart, but sometimes I want dumb: for example, ripgrep respects gitignore, and often I don't want that. Though in this case there is an option to turn it off (-uuu). That's a common theme with these tools too; they try to be smart by default and you need options to make them dumb.
So no, these tools are not "objectively superior", they are generally more advanced, but it is not always what you need. They complement classic tools, but in no way replace them.
For some people the "uphill battle" is the fun part
Along those lines, Dvorak layouts are more efficient, but I use QWERTY because it works pretty much everywhere. (Are small changes like AZERTY still a thing? Certainly our French office uses an "international" layout, and generally the main pains internationally are "@" being in the wrong place and \ not working -- for the latter you can use user@domain when logging into a Windows machine, rather than domain\user.)
I've been using Dvorak for 24 years. 99% of the time I'm using my own machines, so it's fine. For the other 1% I can hunt-and-peck QWERTY well enough.
"I don't want to be a product of my environment. I want my environment to be a product of me."
so right.
As someone who logs into hundreds of servers in various networks, from various customers/clients, there is so little value in using custom tooling, as they will not be available on 90% of the systems.
I have a very limited set of additional tools I tend to install on systems, and they are in my default ansible-config, so will end up on systems quickly, but I try to keep this list short and sweet.
95% of the systems I manage are debian or ubuntu, so they will use mostly the same baseline, and I then add stuff like ack, etckeeper, vim, pv, dstat.
"servers" is the key word here. Some of the tools listed on that page are just slightly "improved" versions of common sysadmin utilities, and indeed, those are probably not worth it. But some are really development tools, things that you'd install on the small number of machines where you do programming. Those might be.
The ones that leap out at me are ripgrep (a genuinely excellent recursive grepper), jq (a JSON processor - there is no alternative to this in the standard unix toolkit), and hyperfine (benchmarking).
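For anyone who hasn't used jq, a tiny illustrative filter; the file name and field names here are made up:

jq -r '.items[] | select(.active) | .name' data.json   # print the names of active items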
Another reason emacs as an OS (not fully, but you know) is such a great way to get used to things you have on systems. Hence the quote: "GNU is my operating system, linux is just the current kernel".
As a greybeard Linux admin, I agree with you though. This is why, when someone tells me they are learning Linux, the first thing I tell them is to just type "info" into the terminal and read the whole thing; that will put them ahead of 90% of admins. What I don't say is why: because knowing what tooling is available as a built-in that you can modularly script around, and that already has good docs, is basically the Linux philosophy in practice.
Of course, we remember the days when systems only had vi and not even nano was a default, but since these days we use idempotent CI/CD configs, adding a TUI editor of choice should be trivial.
I wish there was an additional column in the table that says "what problem does it solve". Oh, and "it's written in Rust" does not count.
“It’s written in Rust”
Actual LOL. Indeed. I was working for a large corporation at one point and a development team was explaining their product. I asked what its differentiators were versus our competitors. The team replied that ours was written in Go. #facepalm
The Rust rewrites can become tiresome, they have become a meme at this point, but there are really good tools there too.
An example from my personal experience: I used to think that oxipng was just a faster optipng. I took a closer look recently and saw that it is more than that.
See: https://op111.net/posts/2025/09/png-compression-oxipng-optip...
That is a differentiator if your competitors are written in Python or Ruby or Bash or whatever. But yeah obviously for marketing to normal people you'd have to say "it's fast and reliable and easy to distribute" because they wouldn't know that these are properties of Go.
You can write slow unmaintainable brittle garbage in any language though. So even if your competition is literally written in Bash or whatever you should still say what your implementation actually does better - and if it's performance, back it up with something that lets me know you have actually measured the impact on real world use cases and are not just assuming "we wrote it in $language therefore it must be fast".
No. The differentiator is whatever benefits such an implementation might deliver (e.g., performance, reliability, etc.). Customers don’t start whipping out checkbooks when you say, “Ours is written in Go.”
That is what the post you're responding to is saying.
Also using a non GPL license does not count.
Many of the entries do include this detail — e.g. "with syntax highlighting", "ncurses interface", and "more intuitive". I agree that "written in rust", "modern", and "better" aren't very useful!
Some of this just makes me think that they are compared against the wrong tool though. E.g.
> cat clone with syntax highlighting and git integration
doesn't make any sense because cat is not really meant for viewing files. You should be comparing your tool with the more/less/most family of tools, some of which can already do syntax highlighting or even more complex transforms.
Yup, I made that same point in another comment. Out of interest, though, how do you get syntax highlighting from any of those pagers? None of them give it to me out of the box.
A lot of those tools are also usable on windows thats why i like them.
I haven't tried delta, but for diffs i swear by difftastic:
https://difftastic.wilfred.me.uk/
It's a huge improvement over purely character-based diffs.
I basically live in the terminal. However, every single one of these tools offers a solution to a problem that I don't have; aren't installed on my system; and mysteriously have many tens of thousands of github stars.
I genuinely don't know what is going on here.
> I basically live in the terminal. However, every single one of these tools offers a solution to a problem that I don't have; aren't installed on my system; and mysteriously have many tens of thousands of github stars.
> I genuinely don't know what is going on here.
I basically live in my music library. However, every single pop artist offers songs that I don't like, are not in my library, and mysteriously have many millions of albums sold.
I genuinely don't know what is going on here.
Joking aside, have you ever tried to use some of these tools? I used to not understand why people were using vim until I really tried it.
> Joking aside, have you ever tried to use some of these tools
No.
> I used to not understand why people were using vim until I really tried it.
There's your problem. I respectfully suggest installing Emacs.
The core Unix toolset is so good that you can easily get by with it. Many of these tools are better, but still not necessary, and they certainly aren't widely available by default.
How would you filter and transform a large JSON file, without jq?
Out of curiosity, how would you recursively grep files ignoring (hidden files [e.g., `.git`]), only matching a certain file extension? (E.g., `rg -g '*.foo' bar`.)
I use the command line a lot too and this is one of my most common commands, and I don't know of an elegant way to do it with the builtin Unix tools.
(And I have basically the same question for finding files matching a regex or glob [ignoring the stuff I obviously don't want], e.g., `fd '.foo.*'`.)
Depends on how big the directory is. If it only contains a few files, I'd just enumerate them all with `find`, filter the results with `grep`, and perform the actual `grep` for "bar" using `xargs`:
find . -type f -name "*.foo" | grep -v '/\.' | xargs grep bar
(This one I could do from muscle memory.)
If traversing those hidden files/directories were expensive, I'd tell `find` itself to exclude them. This also lets me switch `xargs` for `find`'s own `-exec` functionality:
find . -type f -not -path '*/\.*' -name "*.foo" -exec grep bar {} +
(I had to look that one up.)
Thanks, yeah, this is a good example of why I prefer the simpler interface of `rg` and `fd`. Those examples would actually be fine if this were something I only did once in a while (or in a script). But I search from the command line many times per day when I'm working, so I prefer a more streamlined interface.
For the record, I think `git grep` is probably the best builtin solution to the problem I gave, but personally I don't know off-hand how to only search for files matching a glob and to use the current directory rather than the repository root with `git grep` (both of which are must haves for me). I'd also need to learn those same commands for different source control systems besides git (I use one other VCS regularly).
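For what it's worth, git grep does accept pathspecs, so something along these lines may cover the glob part (a sketch to verify against your git version; pathspecs and the search scope are relative to the current directory by default):

git grep bar -- '*.foo'   # search tracked *.foo files under the current directory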
grep -ri foo ./*
Hits in hidden files is not really a pain point for me
Curious if that answers the "I genuinely don't know what is going on here" then? Not searching hidden files (or third-party dependencies, which `rg` also does automatically with its ignore parsing) isn't just a nice to have, it's mandatory for a number of tasks a software engineer might be performing on a code base?
That doesn't apply to the very specific case for which the parent asked a solution.
They tend to be popular with the "rewrite it in rust/go" crowd as far as I can tell. Or in other words, you are no longer part of the cool kids.
I've seen an online radio player in Go which was unusably slow on my Atom N270 due to the badly coded ANSI audio visualization FX using floating-point math. Meanwhile, with Cava or another visualizer and mpd+mpc I could do the same using 200x less resources.
I always enjoy these lists. I think most folks out there could probably successfully adopt at least one or two of these tools. For me, that’s ripgrep and jq. The former is a great drop-in replacement for grep and the latter solves a problem I needed solving. I’ll try out a few of the others on this list, too. lsd and dust both appeal to me.
I just enjoy seeing others incrementally improve on our collective tool chest. Even if the new tool isn’t of use to me, I appreciate the work that went into it. They’re wonderful tools in their own right. Often adding a few modern touches to make a great tool just a little bit better.
Thank you to those who have put in so much effort. You’re making the community objectively better.
I think many of us linux admins have such a list. Mine in particular is carefully crafted around GPL-izing my stack as much as possible. I really like the format of this ikrima.dev one though! The other stuff is great too, worth a peruse.
This is a 2023 article. As with most “modern tools”, half of them probably already have some newer, shinier and more trendy replacements.
There's a lot of tools here. Half still leaves plenty of value.
I find the opposite to be true. Most of these are really just reinventing the wheel of foundational GNU tools that are really powerful provided one has spent some time on them.
Small note that a lot of these tool makers allow sponsorship on GitHub. I use bat / fd almost every day. Happy to support https://github.com/sponsors/sharkdp#sponsors
the second item is
exa modern replacement for ls/tree, not maintained
"not maintained" doesn't smell "modern" to me...
like good open source, it's now forked by a community instead of having only a single maintainer
eza: https://github.com/eza-community/eza
The README has an ad at the top.
Yeeeah, nope.
For a cloud-based terminal emulator that heavily focuses on AI, no less. And they have the stomach to call it "for developers".
The tool itself has no ads. What's wrong with a README having an ad?
everything.
So much talk about paying open-source developers, and when someone actually does something about it and tries to make some money, it's again not good enough.
Damned if you do and damned if you don't.
That's the Lindy effect: old tools like ls last because they've already lasted, while modern ones often don’t stick around long enough to.
https://en.wikipedia.org/wiki/Lindy_effect
Literally the next line lists its replacement, eza.
On the contrary, that's exactly what “modern” sounds like. I wonder when all those tools will go unmaintained. Coreutils, with all their problems, have been maintained since before the authors of many of the listed tools were born.
Many are available on Windows too.
I know I have hyperfine, fd, and eza on my Windows 11, and maybe some more I cannot remember right now.
They are super easy to install too, using winget.
Every time such a list is posted, it tends to generate a lot of debate, but I do think there are at least 2 tools that are really a good addition to any terminal:
`fd`: first, I find that the argument semantics are way better than `find`'s, but that is more a bonus than a real killer feature. Now, it being much, much faster than `find` on most setups is something I would consider a valuable feature. But the killer feature for me is the `-x` argument. It allows calling another command on each individual search result, which `find` can also do with `xargs` and co. But `fd` provides a very nice placeholder syntax[0], which removes the need to mess with `basename` and co. to parse the filename and make a new one, and it executes in parallel. For example, it makes converting a batch of images a fast and readable one-liner: `fd -e jpg -x cjxl {} {.}.jxl`
`rg` a.k.a. `ripgrep`: honestly, it is just about the speed. It is so much faster than `grep` when searching through a directory that it opens up a lot of possibilities. For example, searching for `isLoading` on my frontend (~3444 files) is instant with rg (less than 0.10s) but takes a few minutes with grep.
But there is one other thing that I really like with `ripgrep`, and that I think should be a feature of any "modern" CLI tool: it can format its output as JSON. Not that I am a big fan of JSON, but at least it is a well-defined exchange format. "Classic" CLI tools just output in a "human-readable" format which might happen to be "machine-readable" if you mess with `awk` and `sed` enough, but that makes piping and scripting just that much more annoying and error-prone. Being able to output JSON, `jq` it and feed it to the next tool is so much better and feels like the missing link of the terminal.
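A small sketch of what that enables, piping ripgrep's JSON events into jq (the field names follow `rg --json`'s output as I remember it, so worth double-checking against your version):

rg --json isLoading | jq -r 'select(.type == "match") | "\(.data.path.text):\(.data.line_number)"'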
The big advantage of the CLI is that it is composable and scriptable by default. But it is missing a common exchange format to pass data, and this is what you have to wrangle with a lot of the time when scripting. Having JSON, never mind all the gripes I have with the format, really joins everything together.
Also, honorable mentions for `zellij`, which I find to be a much saner alternative to `tmux` UX-wise, and the `helix` text editor, which for me is neovim but with, again, a better UX (especially for beginners) and a lot more batteries-included features, while remaining faster (in my experience) than nvim with the matching plugins for feature parity.
EDIT: I would also add difftastic ( https://github.com/Wilfred/difftastic ), which is a syntax-aware diff tool. I don't use it much, but it does make some diffs so much easier to read.
[0] https://github.com/sharkdp/fd?tab=readme-ov-file#placeholder...
> But the killer feature for me is the `-x` argument. It allows calling another command on each individual search result, which `find` can also do with `xargs` and co. But `fd` provides a very nice placeholder syntax[0], which removes the need to mess with `basename` and co. to parse the filename and make a new one, and it executes in parallel. For example, it makes converting a batch of images a fast and readable one-liner: `fd -e jpg -x cjxl {} {.}.jxl`
That was inherited from find, it has "-exec". Even uses the same placeholder, {}, though I'm not sure about {.}
`find` only supports `{}`; it does not support `{/}`, `{//}`, `{.}` etc., which is why you often need to do some parsing magic to replicate basic things such as "the full path without the extension", "only the filename without the extension", etc.
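For reference, the usual find workaround for the jxl example above is to shell out and lean on parameter expansion, roughly like this (a sketch; it runs sequentially, unlike fd -x):

find . -type f -name '*.jpg' -exec sh -c 'cjxl "$1" "${1%.jpg}.jxl"' _ {} \;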
I briefly resisted the notion that fd and ripgrep were useful when a friend suggested them.
Then I tried them and it was such a night and day performance difference that they're now immediate installs on any new system I use.
Make sure to check out f2, the batch renaming CLI tool. It's perfectly honed and lubricated magic: https://github.com/ayoisaiah/f2
It got featured here on HN a few weeks ago.
I’m on a Mac, and some of the default tooling feels dated: GNU coreutils and friends are often stuck around mid-2000s versions. Rather than replace or fight against the system tools, I supplement them with a few extras. Honestly, most are marginal upgrades over what macOS ships with, except for fzf, which is a huge productivity boost. Fuzzy-finding through my shell history or using interactive autocompletion makes a noticeable difference day to day.
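For anyone curious, the fzf history/completion bindings come from its shell integration; with recent fzf releases that is a single line in the rc file (older installs ship key-binding scripts to source instead, so treat this as a sketch):

source <(fzf --zsh)   # in ~/.zshrc; Ctrl-R then becomes a fuzzy history search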
>some of the default tooling feels dated: GNU coreutils and friends are often stuck around mid-2000s versions
That’s because they’re not GNU coreutils, they’re BSD coreutils, which are spartan by design. (FWIW, this is one of my theories for why Linux/GNU dominated BSD: the default user experience of the former is just so much richer, even though the system architecture of the latter is arguably superior.)
duf is pretty good for drive space; it has some nice colours and graphs. But it's also not as useful for feeding into other tools.
btop has been pretty good for watching a machine to get an overview of everything going on; the latest version has cleaned up how the lazy CPU process listing works.
zoxide is good for cd-ing around the system to the same places. It remembers directories so you avoid typing full paths.
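For context, a typical zoxide setup is just an init line in the shell rc, after which a short fragment jumps to the best-ranked matching directory (a sketch assuming bash):

eval "$(zoxide init bash)"   # in ~/.bashrc
z proj                       # later: cd to the highest-ranked directory matching "proj"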
It would be good to have an indicator of whether it's available with your distro by default, or what package you'll need to install, since all tools are only as useful as they are available…
Modern doesn't always mean better. A better replacement for mplayer was mpv, and in some cases mplayer was faster than mpv (think about legacy machines).
- bat it's a useless cat. Cat concatenates files. ANSI colour breaks that.
- alias ls='ls -Fh', problem solved. Now you have * for executables, / for directories and so on.
- ncdu it's fine, perfect for what it does
- iomenu it's much faster than fzf and it almost works the same
- jq it's fine, it's a good example of a new Unix tool
- micro it's far slower than even vim
- instead of nnn, sff https://github.com/sylphenix/sff with soap(1) (xdg-open replacement) from https://2f30.org creates a mega-fast environment. Add MuPDF and sxiv, and nnn and friends will look really slow compared to these.
Yes, you need to set config.h under both sff and soap, but they will run much, much faster than any Rust tool on legacy machines.
> bat it's a useless cat. Cat concatenates files. ANSI colour breaks that.
It's useless as a cat replacement, I agree. The article really shouldn't call it that, although the program's GitHub page does self-describe it as "a cat clone". It's more of a syntax highlighter combined with a git diff viewer (I do have an issue with that; it should be two separate programs, not one).
> bat it's a useless cat
I can't see bat as a "useless cat" or a replacement for cat, except for reading source code in the terminal. It's more like a less with syntax highlighting, or a read-only vim.
I agree with this. cat is great for "cat-ing"; bat is great for throwing shit on the terminal in a fashion that makes it semantically easier to reason about. Two different use cases.
There's ccze which colorizes stuff without creating a supposed cat(1) replacement.
Part of the problem is “naming/marketing.” Bat compares ITSELF to cat, not to more/less. IMO, this confuses the issue.
I think that's because it's super common to use cat to quickly view a file. It has the nice property of using your terminal's scrollback rather than putting you into a pager application. For that use-case it is an alternative to cat.
That said, I've never really cared much about missing syntax highlighting for cases where I'm viewing file contents with cat. So the tool doesn't really serve a purpose for me and instead I'll continue to load up vim/neovim if I want to view a file with syntax highlighting.
tldr is an incredible tool and 95% of the time I'll quickly find what I'm looking for there instead of having to search through the man page.