Short disclosure, I work as a Software Developer in the US, and often have to keep my negative opinions about the tech industry to myself. I often post podcasts and articles critical of the tech industry here in order to vent and, in a way, commiserate over the current state of tech and its negative effects on our environment and the Global/American sociopolitical landscape.
I’m generally reluctant to express these opinions IRL as I’m afraid of burning certain bridges in the tech industry that could one day lead to further employment opportunities. I also don’t want to get into these kinds of discussions except with my closest friends and family, as I could foresee them getting quite heated and lengthy with certain people in my social circles.
Some of these negative opinions include:
- I think that the industries based around cryptocurrencies and other blockchain technologies have always been, and have repeatedly proven themselves to be, nothing more or less than scams run and perpetuated by scam artists.
- I think that the AI industry is particularly harmful to writers, journalists, actors, artists, and others. This is not because AI produces better pieces of work, but rather due to misanthropic viewpoints of particularly toxic and powerful individuals at the top of the tech industry hierarchy pushing AI as the next big thing due to their general misunderstanding or outright dislike of the general public.
- I think that capitalism will ultimately doom the tech industry, as it reinforces poor system design that de-emphasizes maintenance and maintainability in favor of the "move fast and break things" mentality that still pervades many parts of tech.
- I think we’ve squeezed as much capital out of advertising as is possible without completely alienating the modern user, and we risk creating strong anti-tech sentiment among the general population if we don’t figure out a less intrusive way of monetizing software.
You can agree or disagree with me, but in this thread I’d prefer not to get into arguments over the particular details of why any one of our opinions are wrong or right. Rather, I’d hope you could list what opinions on the tech industry you hold that you feel comfortable expressing here, but are, for whatever reason, reluctant to express in public or at work. I’d also welcome an elaboration of said reason, should you feel comfortable to give it.
I doubt we can completely avoid disagreements, but I’ll humbly ask that we all attempt to keep this as civil as possible. Thanks in advance for all thoughtful responses.
I think that the industries based around cryptocurrencies and other blockchain technologies have always been, and have repeatedly proven themselves to be, nothing more or less than scams run and perpetuated by scam artists.
Can you please expand on this and help me out here?
I’m coming across people who are true believers in crypto and while I insist it’s a scam and it’s destroying the fucking planet, they go down the rabbit hole into places I can’t follow because I’ve literally not had the interest nor desire to read up on crypto.
They keep saying that what’s really destroying the planet is the existing financial system, with all of the logistics involved in keeping it up, as opposed to the crypto farms adding to the demand on the electric grid. They say that’s the goal, to replace the existing financial system’s energy demand with crypto, but again, it’s only added to it. Another talking point is that in the case of a global climate catastrophe there will be pockets of electricity and crypto servers somewhere on the planet, and that while crypto will remain, all the other financial systems will disappear.
They also seem to think it’s somehow the fix for workplace bureaucracy and everything else in sight.
Please impart some knowledge.
Bitcoin and all similar crypto were intentionally designed to be self-deflating. It won’t replace finance; it’s speedrunning the same problems. The reason almost every country on earth switched to fiat/self-inflating currencies is that the best way to “invest” a deflating currency is to stash it and forget about it.
Please explain like I’m a bean
Why deflation is bad: deflation means that as time goes on, the same amount of money is worth more. This means that a viable way to “invest” the money is simply to hold onto it. Say there is yearly deflation of 4%: just holding cash gains you about 4% in purchasing power per year, so any investment that doesn’t beat that is losing you money relative to hoarding. Additionally, intelligent consumers will cut down on purchases, since they can buy more for less later. This leads to economic slowdowns and can self-compound if suppliers decide to lower prices.
This is one reason why countries like inflation: it encourages spending and investment.
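To make that concrete, here’s a tiny back-of-the-envelope sketch. All the numbers (a 4% deflation rate, widget prices, the cash pile) are made up purely for illustration:

```python
# Toy illustration of why deflation rewards hoarding and delaying purchases.
DEFLATION = 0.04          # hypothetical: prices fall 4% per year
price_of_widget = 100.0   # today's price
cash = 1000.0             # money stuffed under the mattress

for year in range(1, 6):
    price_of_widget *= (1 - DEFLATION)
    print(f"Year {year}: widget costs {price_of_widget:6.2f}, "
          f"cash buys {cash / price_of_widget:5.2f} widgets")

# The same 1000 in cash buys 10.0 widgets today but ~12.3 widgets after 5 years.
# Doing nothing "earns" ~4% a year in purchasing power, so any risky investment
# has to clear that hurdle, and consumers are rewarded for waiting to spend.
```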
Bitcoin and similar crypto rely on mining to validate all previous coins and transactions, and each new coin is exponentially more expensive to produce than the last. Therefore Bitcoin wealth is extremely stratified toward early adopters who built up a collection before the value became this obscene.
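For context on the “stratified toward early adopters” bit, here’s a quick sketch of Bitcoin’s actual issuance schedule (50 BTC initial block reward, halving every 210,000 blocks, roughly every four years); the loop bound of 10 eras is just to keep the output short:

```python
# Bitcoin's issuance schedule: the per-block reward halves every 210,000 blocks,
# so each era mints half as many coins for roughly the same mining effort.
reward = 50.0
total = 0.0
for era in range(10):
    minted = 210_000 * reward
    total += minted
    print(f"Era {era}: {reward:9.4f} BTC/block, "
          f"{minted:12,.0f} minted, {total:12,.0f} total")
    reward /= 2

# Half of all bitcoin that will ever exist was minted in the first era
# (the first ~4 years), which is why holdings skew so heavily toward
# whoever was mining or buying early.
```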
What about the new sentiment pushing a switch back to the gold standard? Is that a pipe dream? Aren’t there some major backers of this idea who hold it to be viable?
Complete pipe dream. A commodity-backed currency means the currency issuer loses control of inflation/deflation to the production of said commodity. For a commodity-backed currency to maintain its value, the commodity stores owned by the issuer have to grow in proportion to monetary demand (usually GDP growth).
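For a rough sense of why, here’s a toy sketch using the textbook quantity-theory identity MV = PY. All the numbers (money supply, velocity, growth rate) are made up for illustration:

```python
# Toy quantity-theory sketch (MV = PY) of a strict gold standard.
M = 1000.0   # money supply, pinned to a fixed gold stock
V = 2.0      # velocity of money (held constant here)
Y = 500.0    # real output

for year in range(1, 6):
    Y *= 1.03                 # the economy grows 3% a year
    P = (M * V) / Y           # price level implied by MV = PY
    print(f"Year {year}: price level {P:.3f}")

# With the gold stock (and thus M) frozen, the price level has to fall ~3% a
# year to satisfy the identity -- i.e. built-in deflation unless the issuer's
# gold reserves grow as fast as the economy does.
```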
Much of what we do and have built is overpriced and useless bullshit that doesn’t make anybody better off.
We are inventing solutions and products to manage other solutions and products to manage other solutions and products to…etc etc.
Websites used to be static HTML pages with some simple graphics, images, and some embedded stuff. Now, you need to know AWS for your IaaS, Kubernetes to manage your scaling and container orchestration for the thousands of Docker containers that you use to compose your app written in some horrific pile of JavaScript-related web stacks like Node.js, TypeScript, React, blah blah blah…
Then you need a ton of other third-party components that handle authentication, databases, backups, monitoring, signaling, account creation/management, logging, billing, etc., etc.
It’s circles within circles within circles, and all that to make a buggy, overpriced, clunky web app.
The same is true for IT: massive software suites where most people in the company use 10% of the functionality, for stupid shit.
I’m all for advancing technology, I love technology, it’s my job and my hobby.
But the longer I work in this industry, the more I get this sick feeling that we missed the train a long time ago. Buying brand-new $1,500 laptops every 3 years so that most of our users can send emails, browse the web, and type up the occasional memo.
100% agree with everything you said. I used to absolutely love technology and the Internet, but I’m definitely feeling a lot less interested in it all as the years go on.
Being techy I’m often asked to help out with systems / computer related stuff at work, and I just can’t for the life of me fathom why we’ve got 5 different systems all frankensteined together trying to talk to each other, instead of just one fucking system that does it all.
I learned the other day that our company spent something like 100 million on this prototype system that ended up being totally scrapped. We’ve now integrated all sorts of AI shite and switched to Microsoft purely because of Copilot, which, I can honestly say, is a flaming pile of utter shit that never does what you need it to.
The whole industry is in shambles at the moment. I wish all this AI and crypto shit would just disappear, along with the majority of programming languages and frameworks, and other bloated bullshit and just take us back to simpler times.
An inability to understand that ‘e-mail’ doesn’t get an S is not how I guessed you work in a lot of Azure.
Few things would make me happier than to never log into an Azure instance ever again lol.
It’s one of the reasons I enjoy working on open source. Sure the companies that pay the bills for that maintenance might not be the ones you would work for directly but I satisfy myself that we are improving a commons that everyone can take advantage of.
I told my lib colleague about how many software creators provide their stuff and its source code for free, and he could barely get why. I also told him that, historically, many nations just left their research and findings publicly available for people to learn from, and he couldn’t grasp why that was either.
He does truly believe the profit motive is the only (best?) way to advance science.
Yes and no. On a lot of the projects I work on, the majority of the engineers are funded by companies which have very real commercial drivers to do so. However, the fact that the code itself is free (as in freedom) means that everyone benefits from the commons, and as a result interesting contributions come up which aren’t on the commercial roadmap. Look at git, a source control system Linus built because he needed something to maintain Linux in and didn’t like any of the alternatives. It scratched his itch, but is now the basis for a large industry of code forges with git at their heart.
While we have roadmaps for features we want they still don’t get merged until they are ready and acceptable to the upstream which makes for much more sustainable projects in the long run.
Interestingly, while we have had academic contributions, there are a lot more research projects that use the public code as a base but never upstream the work, because the focus is on getting the paper/thesis done. Code can work and prove the thing they’re investigating but still need significant effort to get merged.
You’re becoming an old man yelling at clouds. People said all the same shit about websites back in the 90s. They said the same shit about personal computers in offices in general versus mainframe systems. Unless your software is going to be responsible for actual lives, it’s better to get something buggy out on time than to drag things out like Star Citizen, soaking up money for no returns.
A very large portion (maybe not quite a majority) of software developers are not very good at their jobs. Just good enough to get by.
And that is entirely okay! Applies to most jobs, honestly. But there is really NO appropriate way to express that to a coworker.
I’ve seen way too much “just keep trying random things without really knowing what you’re doing, and hope you eventually stumble into something that works” attitude from coworkers.
I read somewhere that everyone is bad at their job. When you’re good at your job you get promoted until you stop being good at your job. When you get good again, you get promoted.
I know it’s not exactly true but I like the idea.
They call that the Peter Principle, and there’s at least one Ig Nobel Prize winning study which found that it’s better to randomly promote people rather than promote based on job performance.
I don’t want to get promoted… Once my job isn’t mainly about programming anymore (in a pretty wide sense though), I took a wrong turn in life 😅
maybe not quite a majority
VAST majority. This is 80-90% of devs.
I actually would go further and say that collectively, we are terrible at what we do. Not every individual, but the combination of individuals, teams, management, and business requirements mean that collectively we produce terrible results. If bridges failed at anywhere near the rate that software does, processes would be changed to fix the problem. But bugs, glitches, vulnerabilities etc. are rife in the software industry. And it just gets accepted as normal.
It is possible to do better. We know this, from things like the stuff that sent us to the moon. But we’ve collectively decided not to do better.
The tech industry is so very capitalistic; so many companies see devs as min-max churn machines. Tech debt? Nah, FEATURES! AI! MODERNITY! That new dev needs to be trained in the basics and best practices? Sorry, that’s not within scope.
Main difference is, a bridge that fails physically breaks, takes months to repair, and risks killing people. Your average CRUD app… maybe a dev loses a couple of hours figuring out how to fix live data for the affected client, the bug gets fixed, and everybody goes on with their day.
Remember that we almost all code to make products that will make a company money. There’s just no financial upside to doing better in most cases, so we don’t. The financial consequences of most bugs just aren’t great enough to make the industry care. It’s always about maximizing revenue.
maybe a dev loses a couple of hours figuring out how to fix live data for the affected client, the bug gets fixed, and everybody goes on with their day.
Or thousands of people get stranded at airports as the ticketing system goes down or there is a data breach that exposes millions of people’s private data.
Some companies have been able to implement robust systems that can take major attacks, but that is generally because they are more sensitive to revenue loss when these systems go down.
That’s why I don’t work on mission critical stuff.
If my apps fail, some Business Person doesn’t get to move some bits around.
A friend of mine worked in software at NASA. If her apps failed, some astronaut was careening through space 😬
Yup, this is exactly it. There are very few software systems whose failure does not impact people. Sure, it’s rare for it to kill them, but they cause people to lose large amounts of money, valuable time, or sensitive information. That money loss is always, ultimately, paid by end consumers. Even in B2B software, there are human customers of the company that bought/uses the software.
I’m not sure if you’re agreeing or trying to disprove my previous comment - IMHO, we are saying the exact same thing. As long as those stranded travelers or data breaches cost less than the missed business from not getting the product out in the first place, from a purely financial point of view, it makes no sense to withhold the product’s release.
Let’s be real here, most developers are not working on airport ticketing systems or handling millions of users’ private data, and the cost of those systems failing isn’t nearly as dramatic. Those rigid procedures civil engineers have to follow come from somewhere, and it’s usually not from any individual engineer’s good will, but from regulations and procedures written from the blood of previous failures. If companies really had to feel the cost of data breaches, I’d be willing to wager we’d suddenly see a lot more traction over good development practices.
… If companies really had to feel the cost of data breaches, I’d be willing to wager we’d suddenly see a lot more traction over good development practices.
that’s probably why downtime clauses are a thing in contracts between corporations; it sets a cap at the amount of losses a corporation can suffer and it’s always significantly less than getting slapped by the gov’t if it ever went to court.
I’m just trying to highlight that there is a fuzzier middle ground than a lot of programmers want to admit. Also, a lot of regulations for that middle ground haven’t been written; the only attention that middle ground has gotten has been when companies have seen failures hit their bottom line.
I’m not saying the middle ground doesn’t exist, but that said middle ground visibly doesn’t cause enough damage to businesses’ bottom line, leading to companies having zero incentive to “fix” it. It just becomes part of the cost of doing business. I sure as hell won’t blame programmers for business decisions.
It just becomes part of the cost of doing business.
I agree with everything you said except for this. Oftentimes it isn’t the companies that have to bear the costs, but their customers or third parties.
Managers decided that by forcing people to deliver before it’s ready. It’s better for the company to have something that works but with bugs, rather than delaying projects until they are actually ready.
In most fields where people write code, writing code is just about gluing stuff together, and code quality doesn’t matter (simplicity does though).
Game programmers and other serious large app programmers are probably the only ones where it matters a lot how you write the code.
Kind of the opposite actually.
The Business™️ used to make all decisions about what to build and how to build it, shove those requirements down and hope for the best.
Then the industry moved towards Agile development where you put part of the product out and get feedback on it before you build the next part.
There’s a fine art to deciding which bugs to fix and when. Most companies I’ve worked with aren’t very good at it to begin with. It’s a special skill to learn and practice.
Agile is horrible though. It sounds good in theory but oh my god its so bad.
It’s usually the implementation of Agile that’s bad.
The Manifesto’s organizing principles are quite succinct and don’t include a lot of the things that teams dislike.
We follow these principles:
- Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
- Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.
- Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
- Business people and developers must work together daily throughout the project.
- Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
- The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
- Working software is the primary measure of progress.
- Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
- Continuous attention to technical excellence and good design enhances agility.
- Simplicity--the art of maximizing the amount of work not done--is essential.
- The best architectures, requirements, and designs emerge from self-organizing teams.
- At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.
I think it’s definitely the majority. The problem is that a lot of tech developments, new language features, and frameworks then pander to this lack of skill, and those new things become buzzwords that are required at most new jobs.
So many things could be got rid of if people would just write decent code in the first place!
IT is slowly starting to get regulated like a real engineering field, and that’s a good development.
I’m sad that I missed my opportunity to take a PE exam in software engineering.
It’s all trash. Everything normal people use on a daily basis is pure dumpster fire level garbage with massive, HEINOUS, unforgivable amounts of tracking built in.
They know all of this. They just don’t care.
Most programmers suck ass, including myself
Fair, but it’s also just a way of saying that programming isn’t a task for humans. (At least not in the correctness aspect)
Partially, it’s not really a job for computers either
Please stop with the AI pushing. It’s a solution looking for a problem, and a waste in 90% of cases.
Most of the high visibility “tech bros” aren’t technical. They are finance bros who invest in tech.
Most of them speak the jargon, but can’t explain what it means
I think that the AI industry is particularly harmful to writers, journalists, actors, artists, and others. This is not because AI produces better pieces of work, but rather due to misanthropic viewpoints of particularly toxic and powerful individuals at the top of the tech industry hierarchy pushing AI as the next big thing due to their general misunderstanding or outright dislike of the general public.
I’m a writer and my work is increasingly making me use AI to do things. I’m 98% sure I’m just training this thing to replace me at this point, and am planning accordingly.
I really don’t get the use of AI to replace creative roles. At most, I’ve used it as a sort of “lorem ipsum” generator for various placeholders. I think AI’s true value is in understanding the sometimes overwhelming amount of documents, records, datasets, and databases that organizations can amass. Being able to have an AI help sift through the garbage is really helpful, actually.
I’ve seen governments using it to do things like handle access-to-information type requests or help patent examiners find relevant patents: those uses make a lot of sense.
No class consciousness. Too many tech workers think they’re rugged individuals that can negotiate their own contracts into wealth.
Working for free on nights and weekends to “hit that deadline” is not good. You’re just making the owners rich, and devaluing labor. Even if you own a lot of equity, it’s not as much as the owners.
And then there’s bullshit like return to office mandates and people are like “oh no none of us want to do this but there’s no organized mechanism to resist”
Join Tech Workers Coalition
We need to be able to talk about unionizing without fear of repercussions.
Are you a member? How is it?
There’s no formal membership but yeah, I’ve been involved in it for 6 years. It depends a lot on the chapter you’re in, so some are more oriented towards community building and socializing, some others are more focused on direct organizing support or political stuff. In each country the legal framework, the political landscape, and the culture are different, so chapters end up looking very different from each other.
Gotcha, all that makes sense. 6 years seems like a good sign. Thanks for the info, definitely checking the org out.
All software should be open source
All software should be released as a common good that cannot be captured by corporations. Otherwise it’s just free labor for Amazon, Google and Facebook
For the sake of humanity
Companies don’t know how to interview. I don’t need someone to walk me through a sorting algorithm. I need someone who will be responsive and interested in the problems we actually face.
Also, any number of interviews that is more than one is too many interviews.
Not sure I agree with that. I mean, OK, I recently had three interviews for a company where each interviewer asked me almost the same questions. That was clearly a waste.
At my place, we do a 30-minute introductory call with the boss first, to quickly weed out unfit candidates and not waste employee and interviewee time on interviews. If that’s OK, then there are three interviews of 45-60 minutes: one with the product owner that focuses on soft skills and team fit, one with the team you’re applying to, and one with the other team (like frontend or backend) covering more technical things, and also just whether you’d like to work with this person.
No amount of interviewing will ever guarantee that things work out, and unfit people can slip through the cracks. And I hate wasting time in tons of interviews. But I’d also not want to work at a place where I know my coworkers were hired after just an hour of quick chatting. That’s so little time to get an idea of a person, to spot any red flags. Heck, the “tell me a bit about yourself” section of an interview is already 15 minutes and not usually very helpful.
Advertising passed that point a long time ago. Practically everyone is alienated as a result.