I see your pitiful vibe coding and raise you… SPITE CODING 👿
Spite coding is almost the original coding.
I have definitely gone into a rageful fugue state and woken up a week later after reworking an entire code base from being an inconsistent mess of slop…
…into actually having a common library of functions instead of just rewriting slightly different versions of them 8 times, with those functions instantiated only for the classes that actually need them…
…rewriting every variable name and function name to an actually consistent and intelligible naming scheme…
… and finally, moving a whole bunch of shit out of some kind of global ‘think’ type loop that doesn’t actually need to be called or checked every goddamned microsecond.
Done that more than once actually.
Never look inside ‘baby’s first video game mod’ code, unless you have healthy blood pressure.
But uh yeah, spite, hatred, and anger are indeed powerful motivators for making good code, lol.
… so many idiots just jam everything into a global, called every tick loop, and then claim that it just can’t be optimized, because “the game engine just can’t handle it”…
I spent a good fraction of my career taking over and trying to fix code bases that my company refused to scrap and replace outright because they didn’t want to admit the code was worthless. Complete rewrites would have taken maybe a tenth of the time I spent.
My favorite thing to encounter (which was nearly universal) was the phenomenon of a young programmer fresh out of college encountering SQL for the first time, deciding he hated it, and writing a huge mess of code to handle auto-generating the necessary SQL. I remember taking over one C# application that had classes named “AND.cs” and “OR.cs” which just took a String as a parameter and returned that String with " AND " and " OR " appended to it, respectively. In about an hour, I replaced three months of this guy’s work that had bottlenecked the project with like five SQL statements.
It’s insane to think what the civil engineering world would be like if it had the career structure of the software world.
Holy shit, have we worked with the same guy?
This guy’s code once fired a 125 mph knuckleball a foot above a 10-year-old kid’s head. Probably not the same guy.
The person in the video, known online as Tsoding (or to some as “mista azozin”), was writing a music visualizer using raylib, a library for writing video games. raylib doesn’t ship any UI-handling code, meaning he had to manage the UI himself. He likes doing a little bit of trolling, so that’s why he picked that title.
Tsoding does by far the most entertaining recreational programming sessions I have ever seen on the Internet, so, despite them being quite long (about two hours), I recommend you watch at least a little bit of his videos/streams if you have time.
- YouTube channel: https://www.youtube.com/channel/UCrqM0Ym_NbK1fqeQG2VIohg
- Twitch channel: https://twitch.tv/tsoding
If you’re interested specifically in this video:
- The video: https://www.youtube.com/watch?v=SRgLA8X5N_4
- The program’s code: https://github.com/tsoding/musializer
- The playlist containing the making of the program: https://www.youtube.com/playlist?list=PLpM-Dvs8t0Vak1rrE2NJn8XYEJ5M7-BqT
I moved from Visual Basic (3 no less!) to C because I needed to optimize the performance of a software synthesis (like, sound synthesis) application I was developing at the time (mid-1990s). It boggles my mind to this day how much fucking work you had to do just to create a simple window in C. It instantly made clear why UIs at the time were so bad and I went back to Visual Basic for the UI with a compiled C DLL to do the heavy lifting.
There’s no excuse for why UIs are still so bad today.
The “excuse” is more or less the 20 or so UI toolkits that have been built as replacements and then died. I think Microsoft alone is responsible for 5 over the life of Windows.
We’ve more or less kinda settled on HTML only because it’s already widespread. But it’s not perfect, so: more standards for the standards pile. Don’t worry, React will end up buried by the next thing on the pile eventually.
> We’ve more or less kinda settled on HTML
It’s funny, one of the modern UI glitches that I hate the most is when a long bit of text is just truncated with ellipses instead of the whole thing being shown and you have to hold the mouse over to get it in a tooltip, or shudder actually click on the thing. HTML is great at word-wrapping and allowing the whole UI to “flow” with variable heights and widths as necessary - and yet that is never allowed to happen in apps.
this is like when I built that web server in x86 assembly lol.
I once wrote a web app in C, but this terrifies even me… though Tsoding, the guy in the video, did that, too…
I bet that thing was fast!
I mean, just because you implemented something in a low-level lang doesn’t mean you’re gonna have the fastest implementation. Even in high-level langs, there’s usually heavy optimization involved in things that are done all the time (e.g. web servers).
Who do you think is better at writing assembly? @[email protected] or a modern compiler with hundreds of contributors?
It’s definitely not me
Data visualization ≠ UI and signal processing is traditionally done in C
That looks like buttons in the thumbnail, on the left of the visualisation.
I’d say that’s enough to call it UI.

UI. User Interface. The bridge between a system and a user. So anything, literally any information transfer from the user to the system OR from the system to the user, is a User Interface.
A definition so broad as to be useless.
Is it a UI when someone calls memcpy to move data from a file to a screen buffer?
This isn’t hard, you’re just trying to make it out to be.
Memcpy from a file to a screen buffer is as much a UI as pouring water in a pot is a soup.
No it isn’t.
A command line literally is a UI.
You seem to be confusing GUI and UI?
You seem to be confusing C stdlib with a CLI?
why would you take the least charitable interpretation? there is no need to be hostile.
and the answer, of course, is that it can be, as long as the information copied is meaningful for displaying to the user.
you’re basically asking the equivalent of whether putting things into an array is an algorithm, which of course has the answer “it can be, depending on how you put it in”. so basically, the operation you’re highlighting is not the point.

I did not make this definition. However, this does not give you the freedom to make up your own definition and treat it as a fact. Don’t spread wrong information.
Umm this is just being retro. Like using a film camera.
Raw film is objectively higher quality than raw digital. Are you saying that C is objectively higher quality than Rust?
Except film isn’t objectively better…
What’s this dude talking about?! Everyone knows no one hates React like people who code in React 😂 No one is gonna get pissed off watching this.
Do React devs really hate React?
I used to be a React dev. The only thing I hated more than React was my boss.
I love this guy he’s a fucking freak of nature
What’s React?
React is what powers your Windows Start Menu.
Wtf is that real
Yes, as well as parts of the settings menu. What’s not to love about constantly loading and unloading JavaScript just by clicking around in native apps? CPU spikes are good for your health.
I’m an elitist asshole and I hate that people say “react dev” when really it’s “web dev that uses react”
Is this distinction really all that useful?
I suppose you could write a react app that doesn’t use “the web”? But you still might just say they are a react developer.
Speaking of coding out of spite, is nobody going to mention that his C code features a struct with over 20 fields in it?

That’s not uncommon, is it?
Not really, but I’d probably try to organize those into sub structures where it made sense. A data structure holding the UI state and FFT data all flat is kinda messy imo since it becomes unclear what is actually required where.
Oh man. You should see the source code for IOS (the Cisco one not Apple).
Spent 5 years working on it out of college. I think it’s the most cursed code base you can imagine.
Not necessarily because of the massive struct defs everywhere. They are kinda needed when you’re running an entire OS as basically a set of interacting Linux processes pretending to be an OS.
At some point Cisco realized they could not compete without putting a Linux kernel at their base. So they basically just copy-pasted the old IOS code written in the early 90s and put it into a set of Linux processes.
To be clear, it’s not just the front end. They didn’t really change the code much from the old IOS. It’s a clusterfuck of interprocess communication hacks that probably seemed like a good idea at the time.
It is a massive pain in the ass to code because you’re basically doing everything on the Linux kernel and then frustratingly have to write the CLIs for IOS just so Cisco can continue to sell their proprietary OS with some of the most unnecessary hardware locks. Massive learning curve for any new engineer.
Literally, no one on the entire switching team knew how to send a message from a specific process to the IOS process. I had been assigned something that needed it. So I somehow figured it out and was “the guy” for that for the time I spent there.
Fuck. I’m gonna start ranting more if I go any further. But yeah, sometimes you need a massive struct because some idiot decided that forcing a closed source CLI on the market is a good idea for profits.
Definitely not a good idea for coding. But you learn quickly that no one actually cares about good code in this industry. There is no time for it. There is no reason for it. Just spit out garbage until it works and your manager won’t care.
If you want clean code. Go write an open source project or a personal project.