In college back in 1991. Also had to do PASCAL and FORTRAN but thankfully those two were in a single course.
I also took Pascal in the 90s, but it’s considered a high-level language and reads similarly to other high-level languages; assembly has a very different syntax.
Oh, I know. I meant that we had to take courses on older languages as part of the curriculum. That was a funky little college program. The oddest experience for me was taking Python back in the day as the “new thing” then not seeing it again until it absolutely exploded ~10 years ago. That program is also why I ended up playing with Linux so early on. The professors truly seemed to have a passion for emerging technologies while not wanting anyone to forget what came before. Thankfully, no punch cards.
We used Turbo Pascal in school in the early 90s. And it had assembly blocks… which I used in copious amounts because it was the only way to make the IBM PS/1s do useful graphics.
Assembly code is for writing C compilers, and C compilers are for writing Lisp interpreters.
I saw a Scheme interpreter written in assembly running a C compiler written in Scheme.
There’s actually good reasons for this design. It’s easy to write a Scheme interpreter in assembly, but it’s hard to write a C compiler in assembly that handles everything correctly. Much rather write it in higher level language if possible and Scheme lowers the bar to getting there, so you can get away from using assembly as quickly as possible. Or you can copy somebody else’s Scheme implementation of a C compiler because now you’re platform independent.
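To make the “easy” part concrete: the core of a Lisp-family evaluator is just a small recursive read-eval loop. Here’s a hedged sketch in C rather than assembly, handling only integer atoms and two-argument `+`/`*` forms, purely as an illustration of how little machinery the core needs:

```c
/* Toy sketch: the recursive core of a Lisp/Scheme-style evaluator.
 * Only integers and two-argument (+ a b) / (* a b) forms are handled;
 * this is an assumption-laden illustration, not a real Scheme. */
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

static const char *p; /* cursor into the source text */

static long eval(void) {
    while (isspace((unsigned char)*p)) p++;
    if (*p == '(') {                /* compound form: (op a b) */
        p++;                        /* consume '(' */
        while (isspace((unsigned char)*p)) p++;
        char op = *p++;             /* '+' or '*' */
        long a = eval();
        long b = eval();
        while (isspace((unsigned char)*p)) p++;
        p++;                        /* consume ')' */
        return op == '+' ? a + b : a * b;
    }
    char *end;                      /* atom: an integer literal */
    long v = strtol(p, &end, 10);
    p = end;
    return v;
}

int main(void) {
    p = "(+ 1 (* 2 3))";
    printf("%ld\n", eval());        /* prints 7 */
    return 0;
}
```

Doing the same thing for a tiny Scheme in assembly is tedious but mechanical; doing a *correct* C compiler at that level is a different beast entirely.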
Then you can write your C compiler in C (or steal a better compiler already written in C) and close the loop. For your final step, you use the C compiler to compile itself.
Back in high school in the ’80s, a buddy and I wrote a Z-80 editor assembler in TRS-DOS BASIC.
It was not rocket science.
I never did get very far with the TRS-80 Editor Assembler, but that was my first exposure to such things.
I also remember the BASIC code for the Dancing Daemon which was replete with PEEKs and POKEs, such that much of it was written in machine code.
Only the most basic compilers. C compilers are mostly written in C themselves.
Talking about bootstrapping here?
Indeed
And that’s how you get the Thompson hack
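For anyone who hasn’t read “Reflections on Trusting Trust”: the idea is that a compromised compiler can recognize what it’s compiling and inject code, including re-injecting the compromise when it compiles itself. A toy C sketch of the idea (the function and pattern names here are made up for illustration; Thompson’s real version targeted the actual C compiler and login program):

```c
/* Toy illustration of the "trusting trust" attack, NOT real attack code.
 * The "compiler" below processes one line of source text at a time. */
#include <stdio.h>
#include <string.h>

/* Hypothetical compile step: emit output for one line of input source. */
static void compile_line(const char *line) {
    /* Trigger 1: while compiling the login program, sneak in a backdoor. */
    if (strstr(line, "check_password"))
        printf("emit: if (strcmp(pw, \"secret-backdoor\") == 0) return 1;\n");

    /* Trigger 2: while compiling the compiler itself, re-insert these very
     * checks, so the backdoor survives a recompile from clean-looking
     * compiler source. */
    if (strstr(line, "compile_line"))
        printf("emit: <the two trigger checks, reproduced verbatim>\n");

    printf("emit: %s\n", line); /* otherwise, compile normally */
}

int main(void) {
    compile_line("int check_password(const char *pw) {");
    return 0;
}
```

The second trigger is the nasty part: once a compromised binary exists, the backdoor is invisible in any source you can read.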
Not the first C compiler obviously. According to this Stack Overflow post, BCPL* begat B, which begat C. Language self-hosting is pretty fascinating.
*Perhaps BCPL was originally written in assembly; I’m not certain: https://github.com/SergeGris/BCPL-compiler
For a university assignment, I built a compiler for x86; I cheated a bit by relying on LLVM, but it gave me a better understanding of the architecture. I also developed emulators for the NES (Ricoh 2A03) and RISC-V (RV32I) as a hobby. For the latter, I implemented it in FPGA.
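The nice thing about RV32I as an emulator target is how small the core loop is. A hedged sketch of the decode/execute step in C, handling just ADDI (the structure is my assumption about a typical hobby emulator, not the poster’s actual code):

```c
/* Minimal RV32I decode/execute sketch: only the ADDI instruction. */
#include <stdint.h>
#include <stdio.h>

static uint32_t regs[32]; /* x0..x31; x0 stays hardwired to zero */

static void execute(uint32_t insn) {
    uint32_t opcode = insn & 0x7f;          /* bits [6:0]   */
    uint32_t rd     = (insn >> 7)  & 0x1f;  /* bits [11:7]  */
    uint32_t funct3 = (insn >> 12) & 0x07;  /* bits [14:12] */
    uint32_t rs1    = (insn >> 15) & 0x1f;  /* bits [19:15] */
    int32_t  imm    = (int32_t)insn >> 20;  /* sign-extended bits [31:20] */

    if (opcode == 0x13 && funct3 == 0 && rd != 0) /* ADDI rd, rs1, imm */
        regs[rd] = regs[rs1] + (uint32_t)imm;
}

int main(void) {
    execute(0x00500093);          /* addi x1, x0, 5 */
    printf("x1 = %u\n", regs[1]); /* prints: x1 = 5 */
    return 0;
}
```

The whole base ISA is only a few dozen instructions, which is why it makes such a good learning project.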
I used to make video games entirely in assembly
Assembly used to be a required course for CS undergrads in the 90s. Is that no longer the case?
Also we had to take something called Computer Architecture, which was like an EE class designing circuits with gates and shit.
I still had to do that in the late 2010s in college
Which target did you use? Having to learn even a fraction of modern x86 would be ridiculous, but SPARC or something could be good to know, just to reduce the “magic box” effect.
I learned MIPS as an undergrad. Pretty neat little RISC architecture.
I learned MIPS as a graduate student. In undergrad I had to build things like a two-digit decimal counter out of logic gates, and my architecture classes were block diagrams for a simple CPU. But by that time we knew how to do moderate-complexity circuits in VHDL simulation, and we had to make a simple VHDL circuit actually run on an FPGA.
This was a long time ago. I’m pretty sure it was 8086.
Required course work for electrical engineers in the early 2000s.
I had to learn assembly, but it was one topic of many we handled in architecture. Like one question on one exam. That was one of the toughest professors we had; the class was around 2001.
It’s still a thing
I think the university I went to phased out the EE requirements the year after me. Honestly, I think it should be required. Understanding how the computer “thinks” is such an important skill.
I attended two different Bachelor’s courses, one with a very technical focus (2016-2018) and one with a more high-level focus (2018-2023). The first did have a class where we learned how to go from logic gates to a full ALU, as well as some actual EE classes, but I didn’t go far enough (or memorise the list of classes) to know whether assembly would have come up. We learned programming with first Processing, then C and C++.
The second had C as an elective course, and that was as technical and low-level as it ever got.
Only on the VIC-20 and Atari STe. On the VIC-20 you had to write the assembly, manually convert it to machine code, and enter that into the computer. There was a cartridge with an assembler, a debugger, and an extra 3.5 KB of memory for it, but I never got one.
The VIC-20 was my first. I watched my dad struggle with and eventually give up on assembly. Something-something and the microbots. I was fearful of it until I took assembly at uni. That 2nd/3rd-year class was where the final puzzle piece of how computers work fell into place for me.
My first job was writing assembly tests for a DSP hardware design team. Fell in love. Never looked back.
I was pretty into x86 asm in my teens. NASM was my go-to. Anyone else ever play with MenuetOS?
I used to write Z80 asm without an assembler back when I was a LOT younger. The ZX Spectrum manual I had included the full instruction list with the byte values.
I think it was oddly easier than some higher level languages for some tasks.
But, making changes was an utter nightmare.
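For anyone who never did this: “hand assembling” meant looking up each instruction’s byte value in the manual’s opcode table and writing down the raw numbers. A small hedged sketch (the byte values are real Z80 encodings; the program itself is just an example I made up):

```c
/* Hand-assembled Z80 fragment: LD A,7 / ADD A,A / RET.
 * These are the actual byte values you'd look up in the opcode table
 * and then POKE into memory on a machine like the Spectrum. */
#include <stdio.h>

int main(void) {
    unsigned char program[] = {
        0x3E, 0x07, /* LD A, 7  : load 7 into the accumulator */
        0x87,       /* ADD A, A : double it */
        0xC9        /* RET      : return to caller */
    };
    for (size_t i = 0; i < sizeof program; i++)
        printf("%02X ", program[i]);
    printf("\n");
    return 0;
}
```

Which also explains the “changes are a nightmare” part: insert one instruction and every absolute address after it is wrong.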
Ha! I teach assembly and use this one every year to lighten the mood before midterms.
I get the feeling that all of these assembly jokes are justifications to avoid learning assembly.
You can still make syscalls in assembly. Assembly isn’t magic. It isn’t starting from the creation of matter and energy; it’s just very specific code.
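On Linux, for instance, a syscall is just “put the number and arguments in the right registers and execute `syscall`”. The same thing from C via libc’s `syscall()` wrapper (a minimal sketch, assuming Linux):

```c
/* Invoke write(2) directly by syscall number, which is exactly what a
 * hand-written assembly program does with the `syscall` instruction. */
#define _GNU_SOURCE
#include <unistd.h>
#include <sys/syscall.h>

int main(void) {
    const char msg[] = "hello from a raw syscall\n";
    syscall(SYS_write, 1, msg, sizeof msg - 1); /* fd 1 = stdout */
    return 0;
}
```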
It’s just a joke, friend.
A very bad one if it requires switching off a large portion of your brain to find it funny.
suspension of disbelief
I said so in my comment, try to keep up.
I had an assembly class in college. I didn’t love it at all. Got my first job after graduating and it was writing space shuttle engine control software, which was in assembly. I was kind of surprised at how fast it became natural after dealing with it full time. Still, it felt luxurious when we upgraded the controller and could do the software in C.
“oh no, I had to do literal rocket science”
Mebly I do, and mebly I don’t.
I have Dyslexia ¯_(ツ)_/¯ Sorry.
You dropped this \
Short explanation: Type ¯\\\_(ツ)\_/¯ to see ¯\_(ツ)_/¯.
Long explanation: Lemmy supports formatting, like _italic_ becomes italic. To stop this from happening, you can put a \ before it like \_; the \ isn’t shown. This is why ¯\_(ツ)_/¯ becomes ¯_(ツ)_/¯. To show a \ you need an additional \ like so: \\, and to make sure _ is shown and not turned into italic, it too needs \. This is why ¯\\\_(ツ)\_/¯ becomes ¯\_(ツ)_/¯
The backslash is known as an escape character in this context, because it removes (escapes) the special meaning of the following character.
It’s also used that way in most Unix shells.
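Same idea in C string literals, where backslash is also the escape character:

```c
/* \\ in the source becomes a single \ in the output, just like
 * Markdown's \_ becomes a literal _ . */
#include <stdio.h>

int main(void) {
    printf("%s\n", "¯\\_(ツ)_/¯"); /* prints: ¯\_(ツ)_/¯ */
    return 0;
}
```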
I’m sorry, I have no idea what you’re talking about. Could you explain it in assembly?
```
global _main
extern _GetStdHandle@4
extern _WriteFile@20
extern _ExitProcess@4

section .text
_main:
    ; DWORD bytes;
    mov ebp, esp
    sub esp, 4

    ; hStdOut = GetStdHandle(STD_OUTPUT_HANDLE)
    push -11
    call _GetStdHandle@4
    mov ebx, eax

    ; WriteFile(hStdOut, message, length(message), &bytes, 0);
    push 0
    lea eax, [ebp-4]
    push eax
    push (message_end - message)
    push message
    push ebx
    call _WriteFile@20

    ; ExitProcess(0)
    push 0
    call _ExitProcess@4

    ; never here
    hlt

message:
    db '¯\\\_(ツ)\_/¯', 10
message_end:
```
Do you want to show us what that looks like in assembly, ASCII from machine code? …ha, ha, ha, no!
Depends on the device, I know. Such a pain without the higher level languages.
What would it look like for ARM android touch screens? Just for one character…
But if some characters go missing or are exchanged for others for no discernible reason, might that be an exploit at the EC or assembly level?
Alternatively, you can just use the ` enclosure, used for single-line code.
That is a “grave accent” or a “backtick”, the key you will find on the left of the ‘1’ key and under the ‘Esc’ key on a standard (ISO, maybe) 104/105-key qwerty keyboard. ¯\_(ツ)_/¯
Not exactly accurate, I think. Even machine language is bound by the CPU’s architecture. You can’t do anything in machine language that wasn’t specifically provided for by the CPU architects.
It would be more accurate to say it’s like creating a new universe using all the same laws of physics, thermodynamics, cosmology, ethics, etc as our existing universe.
I don’t think accuracy was the goal; it’s a joke, not a dissertation. It’s more about how it feels to try a language like assembly after working with higher-level languages.
OS and embedded dev here. I use assembly all the time. I’ve even worked on firmware that was entirely in assembly because of strict requirements that couldn’t be met in C.
Also, even machine code hides a lot about how the underlying machine works, so if you really want to do computing from scratch you really do have to invent the universe: there are abstractions all the way up the hardware stack, just like in software.
NASM FTW