
I think the main problem with writing large programs as bash scripts is that shell scripting languages were never really designed for complexity. They excel at orchestrating small commands and gluing together existing tools in a quick, exploratory way. But when you start pushing beyond a few hundred lines of Bash, you run into a series of limitations that make long-term maintenance and scalability a headache.

First, there’s the issue of readability. Bash's syntax can become downright cryptic as it grows. Variable scoping rules are subtle, error handling is primitive, and string handling quickly becomes messy. These factors translate into code that’s harder to maintain and reason about. As a result, future maintainers are likely to waste time deciphering what’s going on, and they’ll also have a harder time confidently making changes.
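A classic example of the string-handling mess is word splitting on unquoted variables. A toy sketch (the filename is made up for illustration):

```shell
#!/usr/bin/env bash
file="my report.txt"

# Unquoted: the shell splits $file on whitespace, so downstream
# code sees two words instead of one filename.
set -- $file
echo "unquoted: $# words"    # unquoted: 2 words

# Quoted: treated as a single argument, as intended.
set -- "$file"
echo "quoted: $# args"       # quoted: 1 args
```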

Next, there’s the lack of robust tooling. With more mature languages, you get static analysis tools, linters, and debuggers that help you spot common mistakes early on. For bash, most of these are either missing or extremely limited. Without these guardrails, large bash programs are more prone to silent errors, regressions, and subtle bugs.

Then there’s testing. While you can test bash scripts, the process is often more cumbersome. Complex logic or data structures make it even trickier. Plus, handling edge cases—like whitespace in filenames or unexpected environment conditions—means you end up writing a ton of defensive code that’s painful to verify thoroughly.

Finally, the ecosystem just isn’t built for large-scale Bash development. You lose out on modularity, package management, standardized dependency handling, and all the other modern development patterns that languages like Python or Go provide. Over time, these deficits accumulate and slow you down.

I think using Bash for one-off tasks or simple automation is fine — it's what it’s good at. But when you start thinking of building something substantial, you’re usually better off reaching for a language designed for building and maintaining complex applications. It saves time in the long run, even if the initial learning curve or setup might be slightly higher.



Using ShellCheck as a linter can catch a lot of the common footguns, and there are a LOT of footguns and unexpected behaviours that can catch out even experienced Bash writers. However, Bash/shell occupies a unique place in the hierarchy of languages: it's available almost everywhere and will still be around in 30 years. If you want a program that will run almost everywhere and still run in 30 years' time, then shell/Bash is a good choice.


I'd almost always prefer C99 to shell for anything more than 100 lines of code or so. There is even a project I saw here recently that can bootstrap tcc in pure shell (which can then be used to bootstrap gcc). I'm somewhat skeptical that Bash will still be used for anything but legacy scripts in 30 years, despite its impressive longevity to this point, but I could sadly be proven wrong.


So, if you wanted to write something that you would be pretty sure could easily run on machines in 30 years time, what would you use?

I don't think C99 would be a good choice, as processors will likely be different in 30 years' time. If you had your program on e.g. a USB stick and managed to load it onto a machine, it would only run if the machine had the same architecture. Even nowadays, you'd run into difficulties with ARM and x86 differences.

Some kind of bytecode language might seem better (e.g. java), but I have my doubts about backwards compatibility. I wonder if Java code from 20 years ago would just run happily on a new Java version. However, there's also the issue of Java not being installed everywhere.


> I wonder if Java code from 20 years ago would just run happily on a new Java version.

Absolutely.


That's good to know. I haven't touched Java myself in years, but at work I hear of developers complaining that our code runs on Java 11 and they haven't been given the time to move it to a more recent version.

Personally, I've encountered great difficulties with some old SAN software that required a Java 6 web plugin that I couldn't get running on anything other than Internet Explorer - I kept an XP VM with the correct version just for that. I suspect a large part of the problem was that the software incorrectly attempted to check that the version was at least 6, but failed when the version was newer (they obviously didn't test it against later releases).


Dealing with this at work right now. Digging through thousands of lines of Bash. This script wasn’t written a long time ago, so no clue why they went with Bash.

The script works but it always feels like something is going to break if I look at the code the wrong way.


If you have thousands of lines of bash, don't like maintaining it, but don't necessarily want to rewrite the whole thing at once, that's what https://www.oilshell.org/ is for!

See my comment here, with some details: https://news.ycombinator.com/item?id=42354095

(I created the project and the wiki page. Right now the best bet is to join https://oilshell.zulipchat.com/ if it interests you. People who want to test it out should be comfortable with compiling source tarballs, which is generally trivial because shells have almost no dependencies.)

The first step is:

    shopt --set strict:all  # at the top of the file
Or, to run under bash:

    shopt -s strict:all 2>/dev/null || true
And then run with "osh myscript.bash"

OSH should run your script exactly the same as bash, but with better error messages, and precise source locations.

And you will get some strictness errors, which can help catch coding bugs. It's a little like ShellCheck, except it can detect things at runtime, whereas ShellCheck can't.


Bash/ksh have -x as a debug/tracing argument.
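e.g. (the exact trace format can vary slightly between shells):

```shell
#!/usr/bin/env bash
set -x                  # trace each command to stderr before it runs
name="world"
echo "hello $name"
# stderr shows:
#   + name=world
#   + echo 'hello world'
```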



