• 1 Post
  • 24 Comments
Joined 3 years ago
Cake day: July 4th, 2023



  • You don’t even need to ban it. You just need to open up the marketplace to alternatives. Cory Doctorow has been pushing the idea that the best retaliation is to simply repeal anti-circumvention laws and allow companies to begin chipping away at the walled gardens of the tech giants. For example, John Deere famously puts software locks on its tractors so that even simple repairs require the owner to pay for a technician to come out and “authorize” the newly installed part, or else the machine will refuse to start. This system could almost certainly be bypassed, but right now the law not only allows manufacturers to lock their tractors, it also forbids anyone else from unlocking them. If the EU simply repealed the law that bans circumvention, then some clever EU citizen could legally reverse engineer the software running on those tractors and start a business selling unlocking software. They could make it a one-time purchase at 10x the cost of an official tech visit and make money hand over fist while still saving their customers time and money in the long term.
    And of course it’s not just tractors. Make a third-party app store for the iPhone that charges half of Apple’s commission. Make a tool that allows seamless account migration from Google to the independent cloud provider of your choice. A huge amount of corporate rent-seeking is enabled by anti-circumvention laws.



  • The bloodletting will hit hardest in back-office operations, risk management, and compliance

    Compliance? Really? They’re going to put the non-deterministic hallucination engines in charge of compliance?

    I can only hope that the first time this goes south, the regulators understand how important it is to make an example for everyone else: a company cannot escape liability for its actions just because the AI system that took those actions wasn’t explicitly instructed to do so.



  • An excellent talk from Doctorow, as always. A lot of it I’ve heard before, but this part was new to me.

    The thing is, software is not an asset, it’s a liability. The capabilities that running software delivers – automation, production, analysis and administration – those are assets. But the software itself? That’s a liability. Brittle, fragile, forever breaking down. …
    Now, obviously, tech bosses are totally clueless when it comes to this. They really do think that software is an asset. That’s why they’re so fucking horny to have chatbots shit out software at superhuman speeds. That’s why they think it’s good that they’ve got a chatbot that “produces a thousand times more code than a human programmer.”
    Producing code that isn’t designed for legibility and maintainability, that is optimized, rather, for speed of production, is a way to incur tech debt at scale.




  • Every time I see a headline like this I’m reminded of the time I heard someone describe the modern state of AI research as equivalent to the practice of alchemy.

    Long before anyone knew about atoms, molecules, atomic weights, or electron bonds, there were dudes who would just mix random chemicals together in an attempt to turn lead to gold, or create the elixir of life or whatever. Their methods were haphazard, their objectives impossible, and most probably poisoned themselves in the process, but those early stumbling steps eventually gave rise to the modern science of chemistry and all that came with it.

    AI researchers are modern alchemists. They have no idea how anything really works and their experiments result in disaster as often as not. There’s great potential but no clear path to it. We can only hope that we’ll make it out of the alchemy phase before society succumbs to the digital equivalent of mercury poisoning because it’s just so fun to play with.



  • I’m hopeful that when the bubble pops it’ll be more like the dot-com crash, which is to say that the fallout will mostly be of the economic variety rather than the Superfund variety. Sure, that’ll still suck in the short term. But it would ideally lead to the big players and VC firms backing away and leaving behind an oversupply of infrastructure and talent that can be soaked up at fire-sale prices by the smaller, more responsible companies that are willing to stick out the downturn and do the unglamorous work of developing this technology into something that’s actually sustainable and beneficial to society.

    That’s my naive hope. I do recognize that there’s an unfortunately high probability that things won’t go that way.






  • I looked into it a while ago but I gave up on the idea after realizing how few programs can actually run on one. There’s no “reverse VM” software that allows you to seamlessly combine multiple physical machines into one virtual one. Each application has to be specifically designed to take advantage of running on a cluster (see the sketch below for what that looks like in practice). If you’re writing your own code, or if you have a specific project in mind that you know supports cluster computing, then by all means go for it, but if you’re imagining that you’d build one and use it for gaming or video editing or some other resource-intensive desktop application, unfortunately it doesn’t work like that.

    Edit: I dug up a link to the post I made about it in /c/linux. There’s some good discussion in there if you’d like to learn more: https://lemmy.world/post/11528823
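
    To make that concrete, here’s a minimal sketch of what “designed for a cluster” ends up meaning in code. This is my own illustration using mpi4py (the original discussion doesn’t name a library); the work-splitting logic below is exactly the part a normal desktop application simply doesn’t have, which is why you can’t just point a game or a video editor at a pile of machines.

    ```python
    # Minimal sketch of cluster-aware code using mpi4py (illustrative only).
    # Launched with something like: mpirun -n 4 python sum_squares.py
    # Each process (rank) computes its own slice of the work -- nothing is
    # distributed automatically; the program has to be written this way.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # which process am I?
    size = comm.Get_size()   # how many processes are in this job?

    N = 1_000_000
    # Each rank handles every size-th number starting at its own rank.
    local_sum = sum(i * i for i in range(rank, N, size))

    # Combine the partial results on rank 0.
    total = comm.reduce(local_sum, op=MPI.SUM, root=0)

    if rank == 0:
        print(f"sum of squares below {N}: {total}")
    ```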


  • Check how nearby colleges and universities dispose of used assets. The state school near me maintains a very nice website where they auction off everything from lab equipment to office furniture. It’s also where all their PCs go when they hit ~5 years old and come up in the IT department’s refresh cycle. Only problem in my case is that they tend to auction stuff in bulk. You can get a solid machine for $50 to $100, but only if you’re willing to pay $500 to $1000 for a pallet of 10.


  • The AI hype has become so stifling that I can certainly understand the knee-jerk reaction that many are having, but at the end of the day it’s just math. It’s a tool which, like all tools, can be used for good or bad. If you’re interested, I’d highly recommend looking into the story of AlphaFold. You can find some really good long form articles that have been written about it. It’s not only a truly impressive technical achievement, but also one of the few technological breakthroughs in recent memory that is (at least as far as I can tell) truly, indisputably, good for humanity.


  • Your thoughts are very similar to mine. The usefulness of machine learning to bridge the gap between the endlessly messy real world and the strictly regimented digital one can’t be overstated, but throwing all your problems into an LLM chatbot is never going to yield good results. Case in point is the AlphaFold project, which used AI to basically solve one of the hardest problems in modern biochemistry. They didn’t do it by asking ChatGPT to “please fold this protein for me”; they did it by assembling a multi-disciplinary team with a deep technical understanding of the problem and building a fully custom machine learning system designed from the ground up for the sole purpose of predicting protein structures. That’s applied AI. All these so-called AI startups whose product is basically a coat of paint and a custom system prompt on the ChatGPT API are nothing more than app developers chasing the latest trend (a sketch of what that “product” amounts to is below).

    Incidentally, I had a very similar experience to your “Exchange Server / Exchange Online” problem. I was asked to make updates to an old VBS code base, and VBS is a language that I have neither experience with nor interest in learning, so I was using a chatbot to try to save a few minutes on some super simple beginner-level questions. It kept confidently spitting out answers in VBA, which is similar to, but very much not the same as, VBS.
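
    For anyone who hasn’t seen one of these “products” from the inside, here’s a hypothetical sketch of what the entire thing often amounts to. The OpenAI client call is real; the product name, prompt text, and model choice are made up for illustration.

    ```python
    # Hypothetical "coat of paint" AI startup: the whole product is a hard-coded
    # system prompt wrapped around the ChatGPT API. Names and prompt text are
    # invented for illustration.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    SYSTEM_PROMPT = (
        "You are ContractPal, a friendly assistant that reviews contracts. "
        "Always answer in bullet points."
    )

    def contractpal(user_text: str) -> str:
        # One API call; everything else is branding and a landing page.
        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": user_text},
            ],
        )
        return response.choices[0].message.content
    ```

    Compare that with AlphaFold, where the system itself had to be built around the structure of the problem.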