- cross-posted to:
- programmerhumor@lemmy.ml
That’s why I use nushell. Very convenient for writing scripts that you can understand. Obviously, it cannot beat Python in terms of prototyping, but at least I don’t have to relearn it every time.
We have someone at work who uses it and he’s constantly having tooling issues due to compatibility problems, so… yeah.
I’m sure it’s fine for sticking in the shebang and writing your own one-off personal scripts, but I would never actually main it. Too much ecosystem relies on bash/posix stuff.
So the alternative is:
- either an obtuse script that works everywhere, or
- a legible script that only works on your machine…
> a script that only works on your machine
That’s why docker exists :D
I am of the opinion that production software shouldn’t be written in shell languages. If it’s something that needs to be redistributed, I would write it in Python or something.
On a more serious note, NOTHING with more than a little complexity should be written in shell scripts imo. For that, Python is the best, primarily due to how fast it is to prototype stuff in it.
I tend to write anything for distribution in Rust or something that compiles to a standalone binary. Python does not an easily redistributable application make lol
Yeah but then you either need to compile and redistribute binaries for several platforms, or make sure that each target user has rust/cargo installed. Plus some devs don’t trust compiled binaries in something like an npm package
For a bit of glue, a shell script is fine. A start script, some small utility gadget…
With python, you’re not even sure that the right version is installed unless you ship it with the script.
I try to write things to be cross-platform; with node builds, I avoid anything using shell scripting so that we can support Windows builds as well. As such, I usually write the deployment scripts in Node itself, but sometimes python if it’s supported by our particular CI/CD pipeline
I keep forgetting windows exists.
Most common development platform in the world
I quit using it in the WfW days and never looked back.
Ruby and calling bash like this
`cat a.txt`
Nu is great. I’ve been using it for many years. Clearly a superior shell. The only problem is that it constantly has breaking changes, so you need to update your modules frequently.
Not a problem for me in Nix, seems like a skill issue /j
They’ve slowed down with those a bit recently, haven’t they?
Yesterday, I upgraded from `0.101.0` to `0.102.0`, and `date to-table` was replaced (actually improved) with `into record`; however, this was not documented well in the error. I had to research for 5 to 10 minutes, which does not sound like much, but if you get this every second version, the time adds up quickly.

It actually had been deprecated beforehand; you should have gotten a warning. The deprecation cycle certainly is quite short. I’m still on `0.100.0`; if I were to upgrade now, I’d jump over the version with the warning.
Yes, I switched to an older version and there was the warning. However, there was no warning on `0.101.0` whatsoever, so upgrading just one patch version broke my master module. Sometimes I skip some versions, but I am certain that this time I jumped from a version below `0.100.0` straight to `0.101.0`, and here we are, without any deprecation warning.
Not really. They’ve been on the stabilising path for about two years now, removing stuff like dataframes from the default feature set to be able to focus on stabilising the whole core language, but 1.0 isn’t out yet and the minor version just went three digits.
And it’s good that way. The POSIX CLI is a clusterfuck because it got standardised before it got stabilised.
`dd`’s syntax is just the tip of the iceberg there; you gotta take out the nail scissors and manicure the whole lawn before promising that things won’t change.

Even in its current state it’s probably less work for many scripts, though. That is, updating things, especially if you version-lock (hello, NixOS), will be less of a headache than writing `sh` could ever be. Nushell is a really nice language: occasionally a bit verbose, but never in the boilerplate-for-boilerplate’s-sake way, rather in the “in two weeks I’ll be glad it’s not Perl” way. Things like command-line parsing are ludicrously convenient (though please, nushell people, land collecting repeated arguments into lists).

Fully agree on this. I’m not saying it’s bad. I love innovation, and that is what I love about Nushell. I’m just saying that using it at work might not always be the best idea. ;)
Unironically love powershell
For a de facto Windows admin, my PowerShell skills are… embarrassing lol, but I’m getting there!
Incredibly true for me these days. But don’t fret: shellcheck and tldp.org are all you need. And maybe that one Stack Overflow answer about how to get the running script’s directory.
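For reference, the incantation from that Stack Overflow answer usually looks something like this (a sketch; it assumes bash and a script that isn’t being run through symlinks):

```bash
# Resolve the directory the current script lives in.
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
echo "running from: $SCRIPT_DIR"
```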
I relate so much! Replying in French for my first reply on Lemmy just to see how it gets handled!
Realizing now that language selection is mainly for people filtering. It’d be cool if it auto-translated for people who need it.
Yes, I understand too. What need is there to comment in English all the time?
In an ideal world, everything would be translated automatically from the original language into the reader’s language, and vice versa.
Wouldn’t that make us slow and *flojonazos*? (not a real word if you translate it, more like slang for being really lazy)
There’s always the old piece of wisdom from the Unix jungle: “If you write a complex shell script, sooner or later you’ll wish you’d written it in a real programming language.”
I wrote a huge PowerShell script over the past few years. I was like “Ooh, guess this is a resume item if anyone asks me if I know PowerShell.” …around the beginning of the year I rewrote the bloody thing in Python and I have zero regrets. It’s no longer a Big Mush of Stuff That Does a Thing. It’s got object orientation now. Design patterns. Things in independent units. Shit like that.
I initially read “UNIX jungle” as “UNIX jingle” and thought I had been really missing out!
You have, look up the SuSE songs.
I consider python a scripting language too.
They’re all programming languages, they all have their places.
All scripting languages are programming languages but not all programming languages are scripting languages
I use it for scripting too. I don’t need Python as much as before nowadays.
Bash was the first language I learned, and I got pretty decent at it. Now what happens is: I think of a tiny script I need to write, I start writing it in Bash, I hit some string manipulation, and I say fuck this shit and rewrite it in Python lol
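For anyone curious what that string-manipulation pain looks like, here is a small sketch of the parameter-expansion syntax bash offers (the `path` value is just an example):

```bash
path="/var/log/app/error.log"
echo "${path##*/}"      # strip longest prefix up to '/': error.log
echo "${path%.*}"       # strip shortest '.*' suffix: /var/log/app/error
echo "${path/log/LOG}"  # replace first match: /var/LOG/app/error.log
```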
It seems like it does stuff differently for the sake of it being different.
It’s more like bash did it one way and everyone who came after decided that was terrible and should be done a different way (for good reason).
Looking right at you, `-eq`, and your weird-ass syntax: `if [[ $x -eq $y ]]`
That was the point where I closed the bash tutorial I was on and decided to just use Python and `subprocess.run()`.
You better not look at powershell in that case :p
`-eq` is like an infix operator, sitting between operands, but it’s dashed like a flag, so it looks like it should come before its arguments. Very odd.
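For comparison, a quick sketch of the equality checks bash supports; all three lines test the same thing, which is part of why the syntax is so easy to mix up:

```bash
x=3; y=3
[ "$x" -eq "$y" ] && echo "POSIX test"   # infix operator dressed up like a flag
[[ $x -eq $y ]] && echo "bash test"      # same operator inside bash's [[ ]]
(( x == y )) && echo "arithmetic"        # the C-style form bash also offers
```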
So true. Every time I have to look up how to write a bash for loop. Where does the semicolon go? Where is the newline? Is it terminated with `done`? Or with `end`? The worst part with bash is that when you do it wrong, most of the time there is no error, but something completely wrong happens.

I can only remember this because I initially didn’t learn about `xargs`, so any time I need to loop over something I tend to use `for var in $(cmd)` instead of `cmd | xargs`. It’s more verbose but somewhat more flexible IMHO. So I run loops a lot on the command line, not just in shell scripts.
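A sketch of those two looping patterns; `list_hosts` is a hypothetical command that prints one item per line:

```bash
# for-loop version: word-splits the output, lets you run a whole block per item
for host in $(list_hosts); do
    ping -c1 "$host" >/dev/null || echo "$host is down"
done

# xargs version: terser, but limited to invoking a single command per item
list_hosts | xargs -n1 ping -c1
```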
It all makes sense when you think about the way it will be parsed. I prefer to use newlines instead of semicolons to show the blocks more clearly.
```bash
for file in *.txt
do
    cat "$file"
done
```
The `do` and `done` serve as the loop block delimiters, such as `{` and `}` in many other languages. Without them, the shell parser couldn’t know where stuff starts/ends.

Edit: I agree that the `then`/`fi`, `do`/`done`, and `case`/`esac` pairs are very inconsistent.

Also, to fail early and raise errors on uninitialized variables, I recommend adding this to the beginning of your bash scripts:
set -euo pipefail
Or only this for regular sh scripts:
set -eu
- `-e`: Exit on error
- `-u`: Error on access to an undefined variable
- `-o pipefail`: Abort the pipeline early if any part of it fails

There is also `-x`, which can be very useful for debugging, as it shows a trace of every command and its result as it is executed.

> set -euo pipefail
Fun fact: if you’re forced to write against POSIX shell, you aren’t allowed to use these options, since they’re not a thing, which is (part of) the reason why, for example, Google doesn’t allow any shell language but bash, lol.
Btw, all three set options given above are included in POSIX since 2024: https://pubs.opengroup.org/onlinepubs/9799919799/
Ooh, you’re totally right!! I forgot about that since it’s not in the older versions.
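A minimal sketch of what the `set -euo pipefail` options discussed above actually buy you; each commented line would abort the script on its own:

```bash
#!/usr/bin/env bash
set -euo pipefail

echo "$UNDEFINED"   # -u: exits with "unbound variable" instead of expanding to ""
false | true        # -o pipefail: the pipeline fails even though 'true' succeeds
missing_command     # -e: any unchecked non-zero exit stops the script
echo "never reached"
```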
Knowing that a bash script I wrote around 5 years ago is still running the entirety of my high school lab makes me feel sorry for the poor bastard who will need to fix those hieroglyphs as soon as some package update breaks the script. I hate that I used bash, but it was the easiest option at the time on that desolate server.
Bash scripts survive because often they are the easiest option on an abandoned server.
I don’t normally say this, but the AI tools I’ve used to help me write bash were pretty much spot on.
Yes, with respect to the grey bearded uncles and aunties; as someone who never “learned” bash, in 2025 I’m letting a LLM do the bashing for me.
Until the magic incantations you don’t bother to understand don’t actually do what you think they’re doing.
Sounds like a problem for future me. That guy hates me lol
Yeah fuck that guy
Yes, I have never written a piece of code that didn’t do what I thought it would before LLMs, no sir.
I wonder if there’s a chance of getting `rm -rf /*` or zip bombs. Those are definitely in the training data, at least.

The classic `rm -rf $ENV/home`, where `$ENV` can be empty or contain spaces, is definitely going to hit someone one day.
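For what it’s worth, bash has a built-in guard against that exact footgun; a hedged sketch (`ENV` here stands for whatever variable the script uses):

```bash
# ${VAR:?message} aborts the script if VAR is unset or empty,
# and quoting prevents word splitting on spaces.
rm -rf "${ENV:?ENV must be set}/home"
```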
In fairness, this also happens to me when I write the bash script myself 😂
Yeah, an LLM can quickly parrot some basic boilerplate that’s shown up in its training data a hundred times.
~~If~~ When the script gets too complicated, AI could also convert it to Python. I tried it once at least, and it did a pretty good job, although I had to tell it to use some dedicated libraries instead of calling programs with subprocess.
For building a quick template that I can tweak to my needs, it works really well. I just don’t find it to be an intuitive scripting language.
To be honest, I agree, and I thought we would be using something more intuitive by now.
Everything is text! And different programs output in different styles. And certain programs can only read certain styles. And certain programs can only convert from some into others. And don’t get me started on `IFS`.
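For anyone who hasn’t hit the `IFS` problem yet, a small sketch of how word splitting silently changes under you:

```bash
files="a.txt b.txt"
for f in $files; do echo "$f"; done   # default IFS: splits on whitespace, two iterations

IFS=','                               # something earlier in the script changed IFS...
for f in $files; do echo "$f"; done   # ...now ONE iteration: "a.txt b.txt"
unset IFS                             # restore default splitting behaviour
```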
I think the cool kids are using Nu now.
Clearly you don’t write enough bash scripts.
Or scripts for basically any other variant of the Bourne shell. They are, for the most part, very cross compatible.
That’s the only reason I’ve ever done much of anything in shell script. As a network administrator, I’ve worked with many network appliances running on some flavor of Unix, and the one language I can count on to always be available is bash. It has been well worth knowing for just that reason.
I wrote a script to do backups on an ESXi host. It uses BusyBox’s ash; one thing I learned after spending hours debugging my scripts was that ash does not support arrays, so you have to do everything with temporary files.
There actually is an array in any POSIX shell: you get one array per file/function. It just feels bad to use it. You can abuse `set -- 1 2 3 4` to act as a proper array. You can then use `for` without `in` to iterate over it: `for i; do echo "$i"; done`. Use `shift <number>` to pop items off.

If I really have to use something more complex, I’ll reach for `mkfifo` instead, so I can guarantee the data can only be consumed once without manipulating entries.
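A minimal sketch of that positional-parameters trick; the syntax is POSIX-compatible, so it should work in ash and friends too:

```bash
set -- foo "bar baz" qux     # the "array": one per file/function

for item; do                 # 'for' without 'in' iterates over "$@"
    printf '%s\n' "$item"
done

shift 2                      # pop the first two items off
printf 'remaining: %s\n' "$@"
```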
Cool, good to know.
When I bash my head into a wall, does that count?
Only if you scripted it
Enough is enough
I’ve had enough of these motherfucking scripts on this motherfucking PC!
PSA: Run ShellCheck on your shell scripts. It turns up a shocking number of programming errors. https://www.shellcheck.net/
I wish it had a more comprehensive autocorrect feature. I maintain a huge bash repository and have tried to use it, but it commonly makes mistakes. None of us maintainers have time to rewrite the scripts to match standards.
I honestly think autocorrecting your scripts would do more harm than good. ShellCheck tells you about potential issues, but it’s up to you to determine the correct behavior.
For example, how could it know whether `cat $foo` should be `cat "$foo"`, or whether the script actually relies on word splitting? It’s possible that `$foo` intentionally contains multiple paths.

Maybe there are autofixable errors I’m not thinking of.
FYI, it’s possible to gradually adopt ShellCheck by setting `--severity=error` and working your way down to warnings and so on. Alternatively, you can add one-off `# shellcheck disable=SC1234` comments before offending lines to silence warnings.

> For example, how could it know whether `cat $foo` should be `cat "$foo"`, or whether the script actually relies on word splitting? It’s possible that `$foo` intentionally contains multiple paths.

Last time I used ShellCheck (yesterday, funnily enough), I had written `ports+=($(get_elixir_ports))` to split the input, since `get_elixir_ports` returns a string of space-separated ports. It worked exactly as intended, but ShellCheck still recommended making the splitting explicit rather than implicit. The ShellCheck docs recommended:

```bash
IFS=" " read -r -a elixir_ports <<< "$(get_elixir_ports)"
ports+=("${elixir_ports[@]}")
```
Then you’ll have to find the time later when this leads to bugs. If you write against bash while declaring it POSIX shell, and then a random system’s `sh` doesn’t implement a certain thing, you’ll be SOL. Or what exactly do you mean by “match standards”?
Thank you for this. About a year ago I came across ShellCheck thanks to a comment just like this on Reddit. I also happened to be getting towards the end of a project which included hundreds of lines of shell scripts across dozens of files.
It turns out that despite my workplace having done quite a bit of shell scripting for previous projects, no one had heard of ShellCheck. We had been using similar analysis tools for other languages but nothing for shell scripts. As you say, it turned up a huge number of errors, including some pretty spicy ones when we first started using it. It was genuinely surprising to see how many unique and terrible ways the scripts could have failed.
Meh. I had a bash job for 6 years. I couldn’t forget it if I wanted to. I imagine most people don’t use it enough for it to stick. You get good enough at it, and there’s no need to reach for python.
I have a confession to make: Unless shell script is absolutely required, I just use Python for all my automation needs.