kersploosh@sh.itjust.works to Programmer Humor@programming.dev · 2 months ago
It works tho (video) · 35 comments · 1783 upvotes
Jankatarch@lemmy.world · 42 upvotes · edited, 2 months ago
I am actually pretty ok with this type of "messing around" usage. On the condition they also stop killing the environment to train and run these stupid things.
entropicdrift@lemmy.sdf.org · 13 upvotes · 2 months ago
Yeah, if they were just running it locally off a GPU, it would be cooler.
psud@aussie.zone · 7 upvotes · 2 months ago
Running an LLM isn't expensive, whether locally or in the cloud; all the cost is in the training.
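An aside on entropicdrift's point about running these locally off a GPU: below is a minimal sketch of single-GPU local inference using the Hugging Face transformers pipeline. The model name is only an illustrative placeholder (any small open checkpoint would do); this is an assumption-laden sketch, not anything the thread itself specifies.

```python
# Minimal sketch: local text generation on a single GPU with Hugging Face transformers.
# The model name is an example placeholder; swap in any small local checkpoint.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # illustrative model choice
    device=0,  # 0 = first CUDA GPU; use device=-1 to run on CPU instead
)

result = generator("Why is LLM inference cheaper than training?", max_new_tokens=64)
print(result[0]["generated_text"])
```

Once the weights are downloaded, nothing leaves the machine, which is the "cooler" setup the comment is gesturing at: the big one-time energy cost was the training run, and a single local forward pass is comparatively cheap.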