• coldsideofyourpillow@lemmy.cafe
    link
    fedilink
    English
    arrow-up
    20
    arrow-down
    1
    ·
    edit-2
    4 days ago

    That’s why I use nushell. Very convenient for writing scripts that you can understand. Obviously, it cannot beat Python in terms of prototyping, but at least I don’t have to relearn it every time.

    • expr@programming.dev
      link
      fedilink
      arrow-up
      2
      ·
      4 days ago

      We have someone at work who uses it and he’s constantly having tooling issues due to compatibility problems, so… yeah.

    I’m sure it’s fine for sticking in the shebang and writing your own one-off personal scripts, but I would never actually main it. Too much of the ecosystem relies on bash/POSIX stuff.

    • AnUnusualRelic@lemmy.world
      link
      fedilink
      English
      arrow-up
      23
      ·
      4 days ago

      So the alternative is:

      • either an obtuse script that works everywhere, or
      • a legible script that only works on your machine…
      • shortrounddev@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        4 days ago

        I am of the opinion that production software shouldn’t be written in shell languages. If it’s something that needs to be redistributed, I would write it in Python or something

        • coldsideofyourpillow@lemmy.cafe
          link
          fedilink
          English
          arrow-up
          1
          ·
          3 days ago

          On a more serious note, NOTHING with more than a little complexity should be written in shell scripts imo. For that, Python is the best, primarily due to how fast it is to prototype stuff in it.

        • Hexarei@programming.dev
          link
          fedilink
          arrow-up
          3
          ·
          4 days ago

          I tend to write anything for distribution in Rust or something that compiles to a standalone binary. Python does not an easily redistributable application make lol

          • shortrounddev@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            edit-2
            4 days ago

            Yeah but then you either need to compile and redistribute binaries for several platforms, or make sure that each target user has rust/cargo installed. Plus some devs don’t trust compiled binaries in something like an npm package

        • AnUnusualRelic@lemmy.world
          link
          fedilink
          arrow-up
          4
          ·
          4 days ago

          For a bit of glue, a shell script is fine. A start script, some small utility gadget…

          With Python, you’re not even sure that the right version is installed unless you ship it with the script.

    • Akito@lemmy.zip
      link
      fedilink
      English
      arrow-up
      12
      ·
      4 days ago

      Nu is great. Been using it for years now. Clearly a superior shell. The only problem is that it constantly makes breaking changes, so you frequently need to update your modules.

        • Akito@lemmy.zip
          link
          fedilink
          English
          arrow-up
          4
          ·
          4 days ago

          Yesterday, I upgraded from 0.101.0 to 0.102.0 and date to-table was replaced equivalently (actually better) by into record, but this was not documented well in the error message. I had to research for 5 to 10 minutes, which does not sound like much, but if you get this every second version, the time adds up quickly.

            • Akito@lemmy.zip
              link
              fedilink
              English
              arrow-up
              1
              ·
              4 days ago

              Yes, I switched to an older version and there was the warning. However, there was no warning on 0.101.0 whatsoever, so upgrading just one patch version broke my master module.

              Sometimes I skip some versions, so to be certain I checked: I jumped from < 0.100.0 straight to 0.101.0, and here we are, without any deprecation warning.

        • barsoap@lemm.ee
          link
          fedilink
          arrow-up
          4
          ·
          edit-2
          4 days ago

          Not really. They’ve been on the stabilising path for about two years now, removing stuff like dataframes from the default feature set to be able to focus on stabilising the whole core language, but 1.0 isn’t out yet and the minor version just went three digits.

          And it’s good that way. The POSIX CLI is a clusterfuck because it got standardised before it got stabilised. dd’s syntax is just the tip of the iceberg there; you gotta take out the nail scissors and manicure the whole lawn before promising that things won’t change.

          Even in its current state it’s probably less work for many scripts, though. That is, updating things, especially if you version-lock (hello, nixos), will be less of a headache than writing sh could ever be. nushell is a really nice language, occasionally a bit verbose, but never in the boilerplate-for-boilerplate’s-sake way — more in the “in two weeks I’ll be glad it’s not perl” way. Things like command line parsing are ludicrously convenient (though please, nushell people, land support for collecting repeated arguments into lists).

          • Akito@lemmy.zip
            link
            fedilink
            English
            arrow-up
            1
            ·
            4 days ago

            Fully agree on this. I’m not saying it’s bad. I love innovation, and that’s what I love about Nushell. Just saying that using it at work might not always be the best idea. ;)

    • cm0002@lemmy.worldOP
      link
      fedilink
      arrow-up
      5
      ·
      4 days ago

      For a de facto Windows admin, my PowerShell skills are…embarrassing lol, but I’m getting there!

  • synae[he/him]@lemmy.sdf.org
    link
    fedilink
    English
    arrow-up
    4
    ·
    4 days ago

    Incredibly true for me these days. But don’t fret, shellcheck and tldp.org are all you need. And maybe that one Stack Overflow answer about how to get the running script’s directory.

  • Pixelbeard@lemmy.ca
    link
    fedilink
    Français
    arrow-up
    3
    ·
    4 days ago

    I relate so much! Answering in French for my first reply on Lemmy just to see how it gets handled!

    • Pixelbeard@lemmy.ca
      link
      fedilink
      English
      arrow-up
      3
      ·
      4 days ago

      I relate so much! Answered in French for my first Lemmy reply just to see how it’s handled.

      Realizing now that language selection is mainly for people filtering. It’d be cool if it auto-translated for people that need it.

      • Pixelbeard@lemmy.ca
        link
        fedilink
        arrow-up
        2
        ·
        2 days ago

        In an ideal world, everything would be translated automatically from the original language into the reader’s language, and vice versa.

        • admin@sh.itjust.works
          link
          fedilink
          arrow-up
          1
          ·
          2 days ago

          Wouldn’t that make us slow and flojonazos? (not a real word if you translate, more like slang meaning to be really lazy)

  • umbraroze@lemmy.world
    link
    fedilink
    arrow-up
    29
    arrow-down
    1
    ·
    4 days ago

    There’s always the old piece of wisdom from the Unix jungle: “If you write a complex shell script, sooner or later you’ll wish you had written it in a real programming language.”

    I wrote a huge PowerShell script over the past few years. I was like “Ooh, guess this is a resume item if anyone asks me if I know PowerShell.” …around the beginning of the year I rewrote the bloody thing in Python and I have zero regrets. It’s no longer a Big Mush of Stuff That Does a Thing. It’s got object orientation now. Design patterns. Things in independent units. Shit like that.

  • JTskulk@lemmy.world
    link
    fedilink
    English
    arrow-up
    15
    ·
    3 days ago

    Bash was the first language I learned, and I got pretty decent at it. Now what happens is I think of a tiny script I need to write, I start writing it in Bash, I have to do string manipulation, I say fuck this shit and rewrite it in Python lol

    • _stranger_@lemmy.world
      link
      fedilink
      arrow-up
      23
      ·
      4 days ago

      It’s more like bash did it one way and everyone who came after decided that was terrible and should be done a different way (for good reason).

      Looking right at you, -eq, and your weird-ass syntax:

      if [[ $x -eq $y ]]
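      A sketch of the quirk being pointed at: inside `[[ ]]`, bash reserves `<` and `>` for string comparison, so numbers need the `-eq`/`-gt` family (or an arithmetic `(( ))` context). Values below are made up for illustration:

```shell
#!/usr/bin/env bash
x=10
y=9

# Numeric comparison: -eq, -ne, -lt, -le, -gt, -ge
if [[ $x -gt $y ]]; then
    echo "numeric: 10 > 9"
fi

# Inside [[ ]], < and > compare strings, so "10" sorts before "9"
if [[ $x < $y ]]; then
    echo 'string: "10" < "9"'
fi

# Arithmetic context (( )) accepts the familiar operators
if (( x > y )); then
    echo "arithmetic: 10 > 9"
fi
```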

  • Gobbel2000@programming.dev
    link
    fedilink
    arrow-up
    12
    ·
    4 days ago

    So true. Every time, I have to look up how to write a bash for loop. Where does the semicolon go? Where is the newline? Is it terminated with done? Or with end? The worst part with bash is that when you do it wrong, most of the time there is no error; something completely wrong just happens.

    • qjkxbmwvz@startrek.website
      link
      fedilink
      arrow-up
      2
      ·
      edit-2
      4 days ago

      I can only remember this because I initially didn’t learn about xargs — so any time I need to loop over something I tend to use for var in $(cmd) instead of cmd | xargs. It’s more verbose but somewhat more flexible IMHO.

      So I run loops a lot on the command line, not just in shell scripts.
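      A sketch of the two patterns being contrasted (the file names are made up; note that both variants split on whitespace, so both break on paths containing spaces):

```shell
#!/usr/bin/env bash
# Variant 1: word-splitting a command's output in a for loop
for f in $(printf 'a.txt b.txt c.txt'); do
    echo "processing $f"
done

# Variant 2: piping through xargs; -n1 passes one argument per invocation
printf 'a.txt\nb.txt\nc.txt\n' | xargs -n1 echo processing
```

Both variants print the same three `processing …` lines; the for-loop body is where the extra flexibility (conditions, multiple commands) comes in.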

    • ClemaX@lemm.ee
      link
      fedilink
      arrow-up
      13
      ·
      edit-2
      4 days ago

      It all makes sense when you think about the way it will be parsed. I prefer to use newlines instead of semicolons to show the blocks more clearly.

      for file in *.txt
      do
          cat "$file"
      done
      

      The do and done serve as the loop block delimiters, like { and } in many other languages. Without them, the shell parser couldn’t know where the block starts and ends.

      Edit: I agree that the then/fi, do/done, and case/esac pairs are very inconsistent.

      Also, to fail early and raise errors on uninitialized variables, I recommend adding this to the beginning of your bash scripts:

      set -euo pipefail
      

      Or only this for regular sh scripts:

      set -eu
      

      -e: Exit on error

      -u: Error on access to undefined variable

      -o pipefail: Abort pipeline early if any part of it fails.

      There is also -x that can be very useful for debugging as it shows a trace of every command and result as it is executed.
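      A minimal sketch showing those flags in action (behavior as documented in the bash manual; the variable names are made up):

```shell
#!/usr/bin/env bash
set -euo pipefail

greeting="hello"
echo "$greeting"            # fine: the variable is defined

# With pipefail, the pipeline's exit status reflects the failing left side,
# so the || branch runs here
false | true || echo "pipefail caught the failing pipeline"

# echo "$undefined"         # would abort the script: -u rejects unset variables
# false                     # would abort the script: -e exits on failure
```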

  • brokenlcd@feddit.it
    link
    fedilink
    arrow-up
    14
    ·
    4 days ago

    Knowing that a bash script I wrote around 5 years ago is still running the entirety of my high school lab makes me feel sorry for the poor bastard who will need to fix those hieroglyphs as soon as some package breaks the script. I hate that I used bash, but it was the easiest option at the time on that desolate server.

    • formulaBonk@lemm.ee
      link
      fedilink
      English
      arrow-up
      8
      ·
      4 days ago

      Bash scripts survive because often times they are the easiest option on an abandoned server

  • perishthethought@lemm.ee
    link
    fedilink
    English
    arrow-up
    51
    arrow-down
    5
    ·
    4 days ago

    I don’t normally say this, but the AI tools I’ve used to help me write bash were pretty much spot on.

    • marduk@lemmy.sdf.org
      link
      fedilink
      arrow-up
      25
      arrow-down
      5
      ·
      4 days ago

      Yes, with respect to the grey bearded uncles and aunties; as someone who never “learned” bash, in 2025 I’m letting a LLM do the bashing for me.

    • SpaceNoodle@lemmy.world
      link
      fedilink
      arrow-up
      17
      ·
      edit-2
      4 days ago

      Yeah, an LLM can quickly parrot some basic boilerplate that’s shown up in its training data a hundred times.

    • ewenak@jlai.lu
      link
      fedilink
      arrow-up
      1
      ·
      4 days ago

      ~~If~~ When the script gets too complicated, AI could also convert it to Python.

      I tried it once at least, and it did a pretty good job, although I had to tell it to use some dedicated libraries instead of calling programs with subprocess.

    • henfredemars@infosec.pub
      link
      fedilink
      English
      arrow-up
      6
      ·
      4 days ago

      For building a quick template that I can tweak to my needs, it works really well. I just don’t find it to be an intuitive scripting language.

    • JackbyDev@programming.dev
      link
      fedilink
      English
      arrow-up
      6
      ·
      4 days ago

      Everything is text! And different programs output in different styles. And certain programs can only read certain styles. And certain programs can only convert from some into others. And don’t get me started on IFS.
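      A small sketch of how IFS drives that text splitting (standard bash; the CSV string is just an example):

```shell
#!/usr/bin/env bash
csv="red,green,blue"

# With the default IFS (space, tab, newline), the string stays one word
set -- $csv
echo "$# field(s) with default IFS"

# Overriding IFS just for read splits on commas instead
IFS=',' read -r first second third <<< "$csv"
echo "$first / $second / $third"
```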

    • CrazyLikeGollum@lemmy.world
      link
      fedilink
      English
      arrow-up
      17
      ·
      4 days ago

      Or scripts for basically any other variant of the Bourne shell. They are, for the most part, very cross compatible.

      • Tinidril@midwest.social
        link
        fedilink
        English
        arrow-up
        11
        ·
        4 days ago

        That’s the only reason I’ve ever done much of anything in shell script. As a network administrator, I’ve worked with many network appliances running on some flavor of Unix, and the one language I can count on to always be available is bash. It has been well worth knowing for just that reason.

      • BeigeAgenda@lemmy.ca
        link
        fedilink
        arrow-up
        2
        ·
        4 days ago

        I wrote a script to do backups on an ESXi host; it uses BusyBox’s ash. One thing I learned after spending hours debugging my scripts was that ash does not support arrays, so you have to do everything with temporary files.

        • YouAreLiterallyAnNPC@lemmy.world
          link
          fedilink
          arrow-up
          4
          ·
          4 days ago

          There actually is an array in any POSIX shell. You get one array per file/function; it just feels bad to use it. You can abuse set -- 1 2 3 4 to act as a proper array. You can then use for without in to iterate over it.

          for i; do echo "$i"; done

          Use shift <number> to pop items off.

          If I really have to use something more complex, I’ll reach for mkfifo instead so I can guarantee the data can only be consumed once without manipulating entries.
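          The trick above can be sketched like this (plain POSIX sh; the values are made up):

```shell
#!/bin/sh
# The positional parameters act as the one "array" per file/function
set -- apple banana cherry
echo "count: $#"

# 'for' without 'in' iterates over the positional parameters
for item; do
    echo "item: $item"
done

# shift pops items off the front
shift 2
echo "remaining: $*"
```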

    • ethancedwards8@programming.dev
      link
      fedilink
      English
      arrow-up
      6
      ·
      4 days ago

      I wish it had a more comprehensive autocorrect feature. I maintain a huge bash repository and have tried to use it, but it commonly makes mistakes. None of us maintainers have time to rewrite the scripts to match standards.

      • I honestly think autocorrecting your scripts would do more harm than good. ShellCheck tells you about potential issues, but it’s up to you to determine the correct behavior.

        For example, how could it know whether cat $foo should be cat "$foo", or whether the script actually relies on word splitting? It’s possible that $foo intentionally contains multiple paths.

        Maybe there are autofixable errors I’m not thinking of.

        FYI, it’s possible to gradually adopt ShellCheck by setting --severity=error and working your way down to warnings and so on. Alternatively, you can add one-off # shellcheck disable=SC1234 comments before offending lines to silence warnings.
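        A tiny sketch of why that particular quoting fix can’t be automated: both behaviors below are legitimate depending on intent (the value of foo is made up):

```shell
#!/usr/bin/env bash
foo="a.txt b.txt"

# Unquoted: word splitting yields two arguments (maybe what was intended)
set -- $foo
echo "$# arguments when unquoted"

# Quoted: one argument containing a space (maybe what was intended instead)
set -- "$foo"
echo "$# argument when quoted"
```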

        • UndercoverUlrikHD@programming.dev
          link
          fedilink
          arrow-up
          4
          ·
          4 days ago

          For example, how could it know whether cat $foo should be cat "$foo", or whether the script actually relies on word splitting? It’s possible that $foo intentionally contains multiple paths.

          Last time I used ShellCheck (yesterday funnily enough) I had written ports+=($(get_elixir_ports)) to split the input since get_elixir_ports returns a string of space separated ports. It worked exactly as intended, but ShellCheck still recommended to make the splitting explicit rather than implicit.

          The ShellCheck docs recommended

           IFS=" " read -r -a elixir_ports <<< "$(get_elixir_ports)"
          ports+=("${elixir_ports[@]}")
          
      • stetech@lemmy.world
        link
        fedilink
        arrow-up
        3
        ·
        4 days ago

        Then you’ll have to find the time later when this leads to bugs. If you write against bash while declaring it POSIX shell, but then a random system’s sh doesn’t implement a certain thing, you’ll be SOL. Or what exactly do you mean by “match standards”?

    • HyperMegaNet@lemm.ee
      link
      fedilink
      arrow-up
      8
      ·
      4 days ago

      Thank you for this. About a year ago I came across ShellCheck thanks to a comment just like this on Reddit. I also happened to be getting towards the end of a project which included hundreds of lines of shell scripts across dozens of files.

      It turns out that despite my workplace having done quite a bit of shell scripting for previous projects, no one had heard of ShellCheck. We had been using similar analysis tools for other languages but nothing for shell scripts. As you say, it turned up a huge number of errors, including some pretty spicy ones when we first started using it. It was genuinely surprising to see how many unique and terrible ways the scripts could have failed.

  • jkercher@programming.dev
    link
    fedilink
    English
    arrow-up
    7
    ·
    4 days ago

    Meh. I had a bash job for 6 years. I couldn’t forget it if I wanted to. I imagine most people don’t use it enough for it to stick. You get good enough at it, and there’s no need to reach for python.

  • wwb4itcgas@lemm.ee
    link
    fedilink
    English
    arrow-up
    1
    ·
    8 hours ago

    I have a confession to make: Unless shell script is absolutely required, I just use Python for all my automation needs.