There exists a peculiar amnesia in software engineering regarding XML. Mention it in most circles and you will receive knowing smiles, dismissive waves, the sort of patronizing acknowledgment reserved for technologies deemed passé. “Oh, XML,” they say, as if the very syllables carry the weight of obsolescence. “We use JSON now. Much cleaner.”

  • AnitaAmandaHuginskis@lemmy.world · 13 days ago

    I love XML, when it is properly utilized. Which, in most cases, it is not, unfortunately.

    JSON > CSV though, I fucking hate CSV. I do not get the appeal. “It’s easy to handle” – NO, it is not. It’s the “fuck whoever needs to handle this” of file “formats”.

    JSON is a reasonable middle ground, I’ll give you that

    • unique_hemp@discuss.tchncs.de · 13 days ago

      CSV >>> JSON when dealing with large tabular data:

      1. Can be parsed row by row
      2. Does not repeat column names on every row, unlike JSON’s list-of-records layout, which is bigger and slower to parse

      1 can be solved with JSONL, but 2 is unavoidable.
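
      A rough sketch of the streaming point with Python’s stdlib (the file names and the process() handler are hypothetical stand-ins):

      import csv
      import json

      def process(record):
          # Stand-in for real per-row work (hypothetical).
          print(record)

      # CSV streams one row at a time; the whole file never has to fit in memory.
      with open("data.csv", newline="") as f:
          for row in csv.reader(f):
              process(row)

      # JSONL gives JSON the same property: one complete document per line.
      with open("data.jsonl") as f:
          for line in f:
              if line.strip():
                  process(json.loads(line))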

      • flying_sheep@lemmy.ml · 12 days ago

        No:

        • CSV isn’t good for anything unless you exactly specify the dialect. CSV is unstandardized, so you can’t parse arbitrary CSV files correctly (see the sketch after this comment).
        • you don’t have to serialize tables to JSON in the “list of named records” format

        Just use Zarr or the like for array data. A table with more than 200 rows isn’t “human readable” anyway.
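
        To make the dialect point concrete, a rough sketch with Python’s csv module, pinning every format parameter explicitly (the file name is hypothetical):

        import csv

        # Parsing is only reliable once the dialect is spelled out; none of
        # these defaults can be assumed for an arbitrary "CSV" file.
        with open("export.csv", newline="", encoding="utf-8") as f:
            reader = csv.reader(
                f,
                delimiter=",",      # some locales export with ";" instead
                quotechar='"',
                doublequote=True,   # "" inside a quoted field means a literal "
                escapechar=None,
            )
            for row in reader:
                print(row)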

      • entwine@programming.dev · 12 days ago

        {
            "columns": ["id", "name", "age"],
            "rows": [
                [1, "bob", 44], [2, "alice", 7], ...
            ]
        }
        

        There ya go, problem solved without the unparseable ambiguity of CSV
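
        A rough sketch of reading that layout back with the stdlib (the file name is hypothetical):

        import json

        # Column names appear exactly once; rows are plain positional arrays.
        with open("table.json") as f:
            table = json.load(f)

        # Rebuild per-row dicts only where named access is actually needed.
        records = [dict(zip(table["columns"], row)) for row in table["rows"]]
        print(records[0])  # {'id': 1, 'name': 'bob', 'age': 44}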

        Please stop using CSV.