misk@sopuli.xyz to Technology@lemmy.world · English · 1 day ago
Alibaba Releases Advanced Open Video Model, Immediately Becomes AI Porn Machine
www.404media.co
AbsoluteChicagoDog@lemm.ee · 7 hours ago
What's the URL for this generator so I can avoid it?

brucethemoose@lemmy.world · edited 27 minutes ago
This is Lemmy. Why not self-host the generation? :)

SippyCup@feddit.nl · 6 hours ago
Definitely don't click on this link, otherwise you might try to install an AI locally.

madcaesar@lemmy.world · 4 hours ago
Depraved! Disgusting! I'd never! Unless??? (👁 ͜ʖ👁)

morrowind@lemm.ee · 3 hours ago
Good luck trying to run a video model locally, unless you have top-tier hardware.

brucethemoose@lemmy.world · 3 hours ago
1.4B should be surprisingly doable, especially once quantization and optimized kernels are figured out. HunyuanVideo can already run on a 12GB desktop 3060.
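For a rough sense of why a 1.4B-parameter model is plausible on a consumer card, here is a back-of-the-envelope sketch of the weight footprint at different precisions. This is illustrative arithmetic only, not a benchmark: real inference also needs memory for activations, latent video buffers, and framework overhead, so actual VRAM use is higher than these numbers.

```python
def weight_footprint_gb(params: float, bits_per_param: int) -> float:
    """Approximate weight memory in GB at a given precision.

    params: number of model parameters (e.g. 1.4e9 for a 1.4B model)
    bits_per_param: 16 for fp16, 8 for int8, 4 for 4-bit quantization
    """
    return params * bits_per_param / 8 / 1e9


PARAMS = 1.4e9  # the 1.4B model mentioned above

for label, bits in [("fp16", 16), ("int8", 8), ("4-bit", 4)]:
    print(f"{label}: {weight_footprint_gb(PARAMS, bits):.2f} GB")
```

At fp16 the weights alone are about 2.8 GB, and quantization shrinks that further, which is why a 12GB GPU like the 3060 is in the right ballpark once optimized kernels keep the rest of the pipeline's memory use in check.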