smusamashah · 2 years ago
Someone also improved speed of LCM recently.

Code: https://github.com/discus0434/faster-lcm

Blog post (in Japanese): https://zenn.dev/discus0434/articles/12427b887b4082

I have no idea how this can be used yet, but they claim 26 fps on an RTX 3090.

cchance · 2 years ago
I'm sorry ... WHAT! I pay attention to the SD subreddits and no one saw this, or somehow it got swept under the rug somewhere lol... That's insane!

I wonder if SDXL Turbo + LCM will be a thing, to get to realtime generation

dragonwriter · 2 years ago
> I wonder if SDXL Turbo + LCM will be a thing

SDXL Turbo works best (at least from my trials today) with the LCM sampler, producing better results in fewer iterations with it than it does with Euler A.

smusamashah · 2 years ago
This was one of the top posts on reddit yesterday, not as much as turbo though.
dragonwriter · 2 years ago
Using the ComfyUI workflow [0], I'm getting really impressive results (obviously not as quick as single-step, but still very fast [1]) at 768x768, 10 steps, using the LCM sampler instead of Euler Ancestral and setting CFG to 2.0 instead of 1.0.

[0] drop this image on the comfyui canvas: https://comfyanonymous.github.io/ComfyUI_examples/sdturbo/sd...

[1] On a 3080Ti laptop card
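For readers who prefer scripting to the node canvas, the settings described above can be sketched with the Hugging Face diffusers library. This is a minimal, untested sketch: the `stabilityai/sdxl-turbo` model id, fp16 on CUDA, and the exact kwarg mapping to the ComfyUI workflow are assumptions; adjust for your hardware.

```python
# Settings mirroring the parent comment's ComfyUI workflow:
# 768x768, 10 steps, CFG 2.0 (instead of the single-step defaults).
TURBO_SETTINGS = {
    "width": 768,
    "height": 768,
    "num_inference_steps": 10,
    "guidance_scale": 2.0,
}

def generate(prompt: str):
    # Imports kept inside the function so the sketch loads without a GPU stack.
    import torch
    from diffusers import AutoPipelineForText2Image

    pipe = AutoPipelineForText2Image.from_pretrained(
        "stabilityai/sdxl-turbo", torch_dtype=torch.float16, variant="fp16"
    ).to("cuda")
    return pipe(prompt, **TURBO_SETTINGS).images[0]
```

Note that StabilityAI's own examples run Turbo at 1 step with CFG disabled; the higher step count and CFG 2.0 here follow the parent comment's quality-over-speed trade-off.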

tetris11 · 2 years ago
I love that they embed entire workflows into the meta of their images
dragonwriter · 2 years ago
Combined with the ComfyUI Manager extension, which provides an index of custom node packages and can install the ones missing from a loaded workflow, it makes it very easy to get up and running with a new workflow.
dbreunig · 2 years ago
Reading through the comments, it's hard to not think of "Pygmalion and Galatea."
colesantiago · 2 years ago
I'm so glad this is now available for free, without needing to ask an artist or be charged extra for different concepts.

The barrier is really being lowered and this is beautiful.

What a great time to be alive.

dragonwriter · 2 years ago
Bad news for that: it is only free for noncommercial use, and this isn't just a temporary early-release thing, but the new general direction for StabilityAI:

https://twitter.com/EMostaque

LordDragonfang · 2 years ago
Slightly off-topic, but holy crap, some of these use cases he retweeted are bonkers:

https://twitter.com/toyxyz3/status/1729922123119104476

https://nitter.net/toyxyz3/status/1729922123119104476

cchance · 2 years ago
I mean, the fact SD has shown it's possible just means that other research groups can also use the same concept...

Someone on Reddit was actually pointing to a model that's narrower in scope than SDXL but was trained by a single guy on an A100, so there's no reason we can't expect other groups to pop up, or maybe a consortium of freelancers from the fine-tuning community to get together and start their own base model.

nico · 2 years ago
Really impressive. Is that video sped up at all? Crazy fast!
yreg · 2 years ago
Not sped up.

Here's a live demo, but you need to register an account.

https://clipdrop.co/stable-diffusion-turbo

simbolit · 2 years ago
You need to provide an email address and click on a link. Emails from grr.la work.

Yes, it is "register an account", but it's the lowest-friction signup I know of.

cchance · 2 years ago
Nope, and this is just the beginning. Imagine a year from now, or once the people who brought us LCM, ControlNet, and IPAdapter start looking at the possibilities, not to mention fine-tunes on Turbo.
pilotneko · 2 years ago
It's not sped up. I tried it locally last night and it was just as fast, running on Windows 11 with an RTX 3090.
wccrawford · 2 years ago
I just tried it locally with a 3070 and it was about 3 seconds per render. I'm far from great at this stuff and it was my first use of ComfyUI, so I don't know if that number could be improved on my setup.
pbalcer · 2 years ago
On my machine with an AMD RX 7900XT, it takes ~0.17s per image. Are you using the SDTurboScheduler node?
Rastonbury · 2 years ago
Doesn't seem that crazy; 3070s can do 20-step 768px in like 6 to 9 seconds.
acheong08 · 2 years ago
I really want to know if it would work on a CPU. I don’t have the money for a graphics card
NietTim · 2 years ago
You can already do that using existing models, but instead of generating 1 image in a few seconds, it will take at least a minute; perhaps SDXL Turbo brings that down.
sebzim4500 · 2 years ago
You can run these models on CPU but it would be much slower than the demo.
unaindz · 2 years ago
Ideally you would run SDXL Turbo with OpenVINO optimizations. I'm not aware of any project that supports both, but maybe there is something.
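For CPU-only inference, one plausible route is OpenVINO via Hugging Face's optimum-intel package. This is a hedged sketch, not a verified recipe: whether `stabilityai/sdxl-turbo` exports cleanly through this pipeline, and the single-step/no-CFG settings, are assumptions. Expect it to be far slower than the GPU demos in this thread.

```python
def generate_on_cpu(prompt: str, steps: int = 1):
    # Import inside the function so the sketch loads without optimum-intel installed.
    from optimum.intel import OVStableDiffusionXLPipeline

    # export=True converts the PyTorch weights to OpenVINO IR on first load.
    pipe = OVStableDiffusionXLPipeline.from_pretrained(
        "stabilityai/sdxl-turbo", export=True
    )
    # Turbo-style settings: very few steps, classifier-free guidance disabled.
    return pipe(prompt, num_inference_steps=steps, guidance_scale=0.0).images[0]
```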
StantheBrain · 2 years ago
Any help? When loading the graph, the following node types were not found: SDTurboScheduler