

Dear Google: we wouldn’t have to do this if you weren’t such a shit company.
Oh, you weren’t aware that you’re a shit company? You legitimately believe you’re a positive force for the world? Well that’s your own damn fault.


What a great idea! Just claim your product is healthy to people who don’t care about their health!


I had the same idea to write my CV in LaTeX, but then realized it’s not such a great idea. I wanted to keep it down to 2 pages, so I ended up having to do a lot of manual formatting (font size, margins, spacing), and the whole point of LaTeX is that you’re supposed to let the typesetter do the formatting for you. So I switched back to LibreOffice.
But if I had a long-form CV, i.e. an academic-style CV where you list all publications, conference talks, etc. with no regard to length, then LaTeX would be ideal for that.


I haven’t read through the other responses in the thread, but I don’t think it’s the slightly old software that’s the problem. I think it has more to do with using older kernels, meaning that the latest hardware won’t always be supported (on the stable branch at least - there’s always testing and unstable too of course which may have better hardware support).
That may have changed with recent releases though - I haven’t used Debian for several years now. But if your hardware is supported then it’s a pretty solid choice.
Other people sometimes mention that Debian isn’t as beginner-friendly as Ubuntu or Mint, but my experience has been similar to yours - I found Debian to be more user-friendly than Ubuntu, for example. Assuming the hardware works, of course - if it doesn’t, then it’s obviously a worse choice.


The number-one frustration, cited by 45% of respondents, is dealing with “AI solutions that are almost right, but not quite,” which often makes debugging more time-consuming. In fact, 66% of developers say they are spending more time fixing “almost-right” AI-generated code.
Not surprising at all. When you write code, you’re actually thinking about it. And that’s valuable context when you’re debugging. When you just blindly follow snippets you got from some random other place, you’re not thinking about it and you don’t have that context.
So it’s easy to see how this could lead to a net productivity loss. Either spend more time writing the code yourself and less time debugging, or let something else write it quickly and spend a lot of time debugging. And on top of it all, edge cases get no consideration, and valuable design-requirement context can be lost too.


Yeah, I’ve seen a lot of those videos where they do things like {} + [], but why would anyone care what JS does in that case? Unless you’re a shit-ass programmer, you’re never going to be running code like that.
By this same logic, memory safety issues in C/C++ aren’t a problem either, right? Just don’t corrupt memory or dereference null pointers. Only “a shit-ass programmer” would write code that does something like that.
Real code has complexity. Variables are written to and read from all sorts of places and if you have to audit several functions deep to make sure that every variable won’t be set to some special value like that, then that’s a liability of the language that you will always have to work around carefully.


Another thing: he confirms something I was worried about in his comments on parallelism / Python without the Global Interpreter Lock (aka GIL): some developments in the language serve the big companies rather than the community and open-source projects. For example, lock-less multi-threading in Python mostly serves the largest companies while having little value for small projects.
Absolutely agree. The significance of the GIL is heavily overstated, in my opinion. There’s a narrow set of use-cases where it matters, i.e. if you must use threads and something like multiprocessing or a message queue (e.g. Celery) doesn’t do what you need. Those circumstances are pretty rare, in my experience at least.
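To illustrate with a minimal sketch (the names are made up, nothing from the article): CPU-bound work can already be spread across cores with the stdlib’s multiprocessing, no free threading required:

    # Each worker is a separate process with its own interpreter,
    # so CPU-bound work scales across cores despite the GIL.
    from multiprocessing import Pool

    def cpu_bound(n: int) -> int:
        return sum(i * i for i in range(n))  # stand-in for real work

    if __name__ == "__main__":
        with Pool() as pool:  # defaults to one worker per CPU core
            print(pool.map(cpu_bound, [2_000_000] * 8))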


One principle I try to apply (when possible) comes from when I learned Haskell: keep the low-level logical computations of your program in pure, stateless functions. If their inputs are the same, they should always yield the same result. Then pass the results up to the higher level and perform your stateful transformations there.
An example would be: do I/O at the high level (file, network, database I/O), and only do very simple data transformations at these levels (avoid it altogether if possible). Then do the majority of the computational logic in lower level, modular components that have no external side effects. Also, pass all the data around using read-only records (example: Python dataclasses with frozen=True) so you know that nothing is being mutated between these modules.
This boundary generally makes it easier to test computational logic separately from stateful logic. It doesn’t work all the time, but when you can structure a program this way, it becomes much easier to understand.
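A minimal sketch of that shape (the Order domain here is made up):

    from dataclasses import dataclass

    @dataclass(frozen=True)  # read-only record: nothing mutates in transit
    class Order:
        price: float
        quantity: int

    def total(orders: tuple[Order, ...]) -> float:
        # Pure core: same inputs, same result, no side effects.
        return sum(o.price * o.quantity for o in orders)

    def main() -> None:
        # Stateful shell: file/network/DB I/O belongs up here.
        orders = (Order(9.99, 2), Order(4.50, 1))  # e.g. parsed from a file
        print(total(orders))

    if __name__ == "__main__":
        main()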


Interesting, I had never heard of ccache before, though yes, all good build systems (CMake, Ninja, etc.) should cache intermediate object files.
But the projects I was working on were so large that even the binaries and unit-test executables would take ~20 seconds just to link. Unfortunately, you can’t use caching to alleviate that build-time cost.


I think it’s just because it is always recommended as an “easy” language that’s good for beginners.
The only other thing it has going for it is that it has a REPL (and even that was shit until very recently), which I think is why it became popular for research.
If that’s the case, then why didn’t JavaScript take its place instead? It’s arguably even better than Python in both of those areas…


Agreed. I have seen a lot of Python code that was really painful to massage back into a more structured object hierarchy. Java certainly does a bit better in that respect, and as a language it does a much better job of encouraging good practices, but I think it’s also largely due to the kinds of people who use each language.


Sure, but as with all things, it can depend on a lot of factors. All code needs some degree of testing, though one could certainly argue that Python needs more than Java and Java needs more than Rust/Haskell/etc. So you could argue that the productivity gain of using Python is offset by the productivity loss of extra testing. It’s still hard to say which one wins out in the end.


It can scale though. It parallelizes really well if you use queuing systems to distribute the load. You just have to make sure that the hot loops are in C/C++, and it is very easy to interface with compiled binaries via the C API.
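For instance, with ctypes from the standard library (a sketch - libhot.so and its dot() function are hypothetical stand-ins for your compiled hot loop):

    import ctypes

    # Hypothetical shared library exposing:
    #     double dot(const double *a, const double *b, size_t n);
    lib = ctypes.CDLL("./libhot.so")
    lib.dot.restype = ctypes.c_double
    lib.dot.argtypes = (ctypes.POINTER(ctypes.c_double),
                        ctypes.POINTER(ctypes.c_double),
                        ctypes.c_size_t)

    n = 3
    Vec = ctypes.c_double * n
    a, b = Vec(1.0, 2.0, 3.0), Vec(4.0, 5.0, 6.0)
    print(lib.dot(a, b, n))  # the loop runs in C, not Python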


Python’s type system is dramatically better than JavaScript’s though. Python gives you a TypeError for something like '' + True, while the JavaScript equivalent ('' + true) silently coerces it into the string "true". Also look at == vs === in JavaScript. That’s a big part of why TypeScript caught on so quickly. Python has found a better balance between productivity and error safety.
In my opinion, the biggest shortcoming of Python’s dynamic typing is that you need very thorough test coverage to be sure your code doesn’t have semantic errors (a lot more than in, say, Java). That has gotten better with the introduction of type hints, though I don’t have much experience with them, so I can’t say by how much.
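To make the contrast concrete (a small sketch; the greet() function is made up):

    # Python raises at runtime where JavaScript silently coerces:
    try:
        "" + True
    except TypeError as e:
        print(e)  # can only concatenate str (not "bool") to str

    # With type hints, a checker like mypy can flag this before runtime:
    def greet(name: str) -> str:
        return "Hello, " + name

    # greet(True)  # a type checker rejects this: bool is not str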


Having worked on large C++ projects, I’d say the solution to this issue is a good IDE with an LSP. But it’s quite a regrettable situation, because setting up an LSP can be difficult, and they also tend to break quite easily. Knowing how to make an LSP work is an important productivity skill in those kinds of positions.


Distribution usually isn’t considered a strong point for Python, though.
It depends. If it’s a simple script with no external dependencies, then it’s very easy to distribute. But if your application has external dependencies and you’re trying to install it directly on the host (without Docker or similar technologies), then yes, it’s harder than just shipping an executable or a .jar file. The fact that Python’s standard library is so comprehensive helps a lot here, but only up to a certain point.
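For the easy case, a sketch using the stdlib’s zipapp (the paths are made up): a dependency-free project can be bundled into a single runnable file:

    import zipapp

    # Bundle a package directory (containing a __main__.py) into one
    # file that runs anywhere a matching interpreter is installed.
    zipapp.create_archive("myapp/", "myapp.pyz")
    # then:  python myapp.pyz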


Yeah, you can use it both for full applications (web or desktop) as well as simple scripts. The flow of getting from something simple to a full-blown application is pretty smooth and natural - better than starting out in Java or C++ and taking at least an hour before you get anything that runs.


My opinion: Python may not be the best at everything it does, but it’s in the top 3-5 languages in most of the areas mentioned in this thread: ease of learning, the breadth of its standard library, scripting, and how smoothly a quick script can grow into a full application.
It will always be a practical choice for those reasons. There are probably a lot more as well that I can’t think of at the moment.


Pathlib is very nice indeed, but I can understand why a lot of languages don’t do something similar. There are major challenges in implementing it - cross-platform behavior is a big one, for example. File permissions on Unix systems and Windows don’t map perfectly from one to the other, which can be a maintenance burden.
But I do agree. As a user, it feels great to have. And yes, also in general, the things Python does with its standard library are definitely the way things should be done, from a user’s point of view at least.
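For a taste of why it feels so nice (a small sketch; the path is made up):

    from pathlib import Path

    # The same code works with POSIX and Windows path conventions.
    config = Path.home() / ".config" / "myapp" / "settings.toml"
    print(config.suffix)       # .toml
    print(config.parent.name)  # myapp
    if config.exists():        # no os.path.join / os.stat juggling
        print(config.read_text())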


My current favorite game (Valheim) is Swedish, I believe.