

It doesn’t say that. The potential fine is the higher of 23M or 10%, not that 23M is 10%.


Yeah, maybe the contractor thought he’d get more work fixing it, but he was long gone by the time I got it so I never met him.
One of the bugs I got was a performance issue: the search didn’t work. With about 600,000 assets in the database, it would time out searching for one by exact match on ID. It took 45 minutes to return 1 result.
I got dumped with fixing some bugs in a project written by a contractor who had literally done this but with extra steps.
The backend was SQL Server and C#/ASP.
There was an API endpoint that took JSON and used XSLT to transform it to XML, then called the stored procedure specified in the request, passing the XML as a parameter.
The stored procedure then queried the XML for parameters, executed the query, and returned the results as XML.
Another XSLT transformed that back to JSON and returned it to the client.
It was impressive how little C# there was.
Despite holding all the business logic, the SQL was not in source control.
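Roughly this shape, if anyone wants to picture it. All the names here are made up and it skips the JSON-to-XML XSLT step, but the pass-through idea is the same:

```csharp
// Sketch of the pass-through pattern described above. Names are hypothetical,
// not the original code; the real project also ran the JSON through XSLT first.
using System.Data;
using Microsoft.Data.SqlClient;

public static class PassThroughApi
{
    // procName comes straight from the request; payloadXml is the request
    // body after it has been converted to XML.
    public static string Execute(string connectionString, string procName, string payloadXml)
    {
        using var conn = new SqlConnection(connectionString);
        using var cmd = new SqlCommand(procName, conn)
        {
            CommandType = CommandType.StoredProcedure
        };
        cmd.Parameters.Add("@payload", SqlDbType.Xml).Value = payloadXml;

        conn.Open();
        // The stored procedure shreds the XML for its parameters, runs the
        // actual query, and hands back the result as XML. All the business
        // logic lives on the SQL side.
        var resultXml = cmd.ExecuteScalar() as string ?? "<result/>";
        return resultXml; // a second XSLT turns this back into JSON for the client
    }
}
```

Letting the request pick which stored procedure runs is its own can of worms on top of everything else.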


Came here to say this. The scariest encounter I had was earlier this year with a stag. He was standing on a footpath; it was dusk and he was in shadow, so I didn’t see him until I was 5m away. I’m 1.9m and he was looking down at me, with another 1m of antlers on top. Then my dog started barking and he just turned and walked away into the trees.
The same dog once tried to fight a pair of geese. She’s a similar size to them; they didn’t back down.
Had a 3-year-old one this week: a loop that builds a list of messages to send to a queue for another service to consume, then calls BatchPublish.
Only BatchPublish was inside the loop, so instead of sending n messages it sends 1+2+3+…+n.
We never noticed before because n was never more than 100, and the consuming service is idempotent so the duplicate messages don’t cause issues. The total is n(n+1)/2, so n=100 is 5,050 messages. That’s only about 4 minutes of work. Also, that code only runs at 1am.
Recently n has been hitting 1000, which produces around 500,000 messages; that takes a few hours to clear and triggers an alarm for delayed processing.
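The shape of it, roughly, with a counter standing in for the real queue client (BatchPublish is the only name taken from the actual code):

```csharp
// Minimal reproduction of the bug described above: the batch publish call
// sits inside the loop, so each iteration republishes everything built so far.
using System;
using System.Collections.Generic;

class FakeQueue
{
    public int Published { get; private set; }
    public void BatchPublish(IReadOnlyCollection<string> batch) => Published += batch.Count;
}

class Program
{
    static void Main()
    {
        const int n = 100;
        var queue = new FakeQueue();
        var messages = new List<string>();

        for (int i = 1; i <= n; i++)
        {
            messages.Add($"message {i}");
            queue.BatchPublish(messages); // bug: inside the loop, resends the whole list every time
        }

        Console.WriteLine(queue.Published); // 1 + 2 + ... + n = n(n+1)/2 = 5050 for n = 100

        // The fix is simply to move BatchPublish(messages) after the loop,
        // so exactly n messages are sent once.
    }
}
```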


Was wondering about this. I’m in the UK; I could just make my own instance, I’m the only user so I verify my own age, and federate with everyone. All good? ¯\_(ツ)_/¯


Yeah, people trying to downplay it as “just a tweet” are also pieces of shit. The way she said it is irrelevant.
What is relevant is that while she was calling for hotels full of people to be set on fire, there were people trying to set hotels on fire.
She called for people to be murdered, then others attempted it, and she got off lightly.
At work people think I’m some kind of wizard with git.
I tell them I’ve been using it every day for 15 years and that I read the freely available book on the website. I link them to it and mention that the first 3 chapters probably cover 90% of their normal usage, so they should just read that.
They won’t do it. I don’t get it. Something about written words is scary to them.


Now apply this to literally everything else. There’s a tech company inserting itself into every industry that worked fine without it, extracting money from both sides.
My local pizza place is 40% more expensive on the takeaway apps, or I can just phone them directly.


Not in the US; our water infrastructure was sold off in the 90s, but that makes sense. It was probably something similar. They held us to it though, so they overpaid for hardware beyond their needs and we forced the software to run slower.


That would make sense. I hadn’t put that together, but they had a lot of embedded control systems. This was water treatment, entirely separate from the control systems, but I can see them having that as a standard requirement.


Did a project several years ago where the customer required that the server we delivered specifically for the project never use more than 50% CPU or RAM. There were no requirements about how fast it actually performed its intended function, just that it could only utilise half the available resources while doing it.


This is my experience. It saves a bit of typing sometimes, but that’s probably cancelled out by the time spent correcting it, rewriting nonsense it produced, and reviewing my coworkers’ PRs where they didn’t notice the nonsense.
My university would keep past exam papers in the library. This was apparently a little-known fact, but somehow we discovered it, went and got them, and used them as the basis for revision.
Turns out our professors were lazy and used the same exam every year. Does that count as cheating?


I did the same but with milk. My job at the time supplied coffee but not milk, so the fridge was full of 1-pint bottles with names written on them. There was never enough space, and people got territorial over their 5 square inches of fridge. There was a milk club where people pooled together to buy milk for their group.
I couldn’t face dealing with that so opted out and drank it black. That was 15 years ago.
Some time later that employer realised they could solve a great many staff disputes for the low price of 20 pints of milk a week and started supplying it. No idea why it took so long.
My first job was about 200 people, but there was a satellite office with 3 people. Similar story: someone left and they tried to replace him for the same salary. The job ad was for project manager/lead dev/office manager/customer support and user training.
They actually hired someone, who lasted 6 weeks.


Yeah, humans regularly deliver stuff wrong on our street; there is no way robots will manage. I get packages for both my neighbours and they get mine more often than correct deliveries, and one of my neighbours is a business.
Set push.autoSetupRemote to true in your git config (git config --global push.autoSetupRemote true) and it’ll do it the first time.


A great example of this is TSA luggage locks. Mandated backdoor, master keys leaked by the company that makes them, and now anyone can open any TSA-approved lock.
I speedran this. First job right out of uni: the team lead went on holiday 2 weeks later and never came back. Everyone else was gone within 3 months.