• 0 Posts
  • 6 Comments
Joined 1 year ago
Cake day: March 3rd, 2024

  • “Generally, what happens to these wastes today is they go to a landfill, get dumped in a waterway, or they’re just spread on land,” said Vaulted Deep CEO Julia Reichelstein. “In all of those cases, they’re decomposing into CO2 and methane. That’s contributing to climate change.”

    Waste decomposition is part of the natural carbon cycle. Burning fossil fuels isn’t. We should not be suppressing part of the natural cycle so we can supplant it with our own processes. This is Hollywood accounting applied to carbon emissions, and it’s not going to solve anything.


  • A balloon full of helium has more mass than a balloon without helium, but less weight

    That’s not true. A balloon full of helium has more mass and more weight than a balloon without helium. Weight depends only on the mass of the balloon plus helium and the local gravitational field; it doesn’t go down because the gas inside is light.

    The balloon full of helium displaces far more air than the empty balloon, simply because it is inflated. That displaced air weighs more than the balloon and the helium inside it combined, so the balloon floats due to buoyancy from the atmosphere. Its weight is the same regardless of the medium it’s in; the net force on it is not.
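
    The arithmetic above can be sketched with some illustrative numbers (a 0.012 m³ party balloon, a 2 g rubber envelope, air at ~1.2 kg/m³, helium at ~0.18 kg/m³; all of these values are assumptions, not from the comment):

    ```python
    # Buoyancy sketch: weight vs. weight of displaced air, assumed values.
    G = 9.81               # m/s^2, gravitational acceleration
    RHO_AIR = 1.2          # kg/m^3, air density near sea level
    RHO_HE = 0.18          # kg/m^3, helium density at room temperature
    VOLUME = 0.012         # m^3, assumed inflated balloon volume
    ENVELOPE_MASS = 0.002  # kg, assumed mass of the rubber envelope

    mass = ENVELOPE_MASS + RHO_HE * VOLUME  # total mass of balloon + helium
    weight = mass * G                       # weight: same in any medium
    buoyancy = RHO_AIR * VOLUME * G         # weight of the displaced air

    net_upward = buoyancy - weight
    print(f"weight = {weight:.3f} N, buoyancy = {buoyancy:.3f} N")
    print("floats" if net_upward > 0 else "sinks")
    ```

    With these numbers the buoyant force (~0.14 N) exceeds the weight (~0.04 N), so the balloon floats, even though inflating it added both mass and weight.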




  • Part of the reason that this jailbreak worked is that the Windows keys, a mix of Home, Pro, and Enterprise keys, had been trained into the model, Figueroa told The Register.

    Isn’t that the whole point? They’re using prompting tricks to tease out the training data. This has been done several times with copyrighted written works. That’s the only reasonable way ChatGPT could produce valid Windows keys. What would be the alternative? ChatGPT somehow reverse engineered the algorithm for generating valid Windows product keys?


  • The technological progress LLMs represent has come to completion. They’re a technological dead end. They have no practical application because of hallucinations, and hallucinations are baked into the very core of how they work. Any further progress will come from experts learning from the successes and failures of LLMs, abandoning them, and building entirely new AI systems.

    AI as a general field is not a dead end, and it will continue to improve. But we’re nowhere near the AGI that tech CEOs keep promising LLMs are on the verge of reaching.