> (Unlike say LLMs where GPT-4 is clearly dominating for now.)
A lot of this comes from people comparing GPT-4 to, e.g., LLaMA-7B, because that's what fits in memory on their laptop. You can run LLaMA-65B instead, and it's dramatically better, but it needs about 128 GB of RAM, and the hardware to run it fast is expensive.
And GPT-4 has even more parameters than that, but that's not a matter of tooling; someone just needs to release a public model with more parameters.
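The ~128 GB figure roughly matches what the raw weights imply. A back-of-envelope sketch (the parameter counts are the published model sizes; the bytes-per-weight values are assumptions about the inference format, fp16 vs. 4-bit quantized):

```python
# Rough memory needed just to hold the weights in RAM
# (ignores KV cache, activations, and runtime overhead).

def weight_memory_gb(n_params: float, bytes_per_weight: float) -> float:
    """Gigabytes of memory for the model weights alone."""
    return n_params * bytes_per_weight / 1e9

models = {"LLaMA-7B": 7e9, "LLaMA-65B": 65e9}
for name, n in models.items():
    fp16 = weight_memory_gb(n, 2.0)  # 16-bit floats: 2 bytes per weight
    q4 = weight_memory_gb(n, 0.5)    # 4-bit quantization: 0.5 bytes per weight
    print(f"{name}: ~{fp16:.1f} GB fp16, ~{q4:.1f} GB at 4-bit")
```

Which is also why 4-bit quantized 65B builds (e.g., via llama.cpp) can fit on a high-RAM consumer machine, at some cost in quality.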
That's part of the point though. I get better results from Stable Diffusion on my PC than out of DALL-E 2. (I still have some credits there, but little reason to use them.)
I can't do that with LLaMA-65B. (Although, to be fair, 128 GB of RAM is not that much.) But I suspect it's still far less capable than GPT-4, is it not?
It's a popular GUI for Stable Diffusion models with many extensions. As the sibling comment points out, everyone calls it that because it's the handle of the original maintainer. (As in: Which web UI? Auto1111's.)
It would be cool to see that become a GIMP plugin next; that would be a more direct alternative to the generative-fill workflow in Photoshop.
"Dropped" in the context of new albums/music being released means "made available"; however, in the context of software features (or general English vernacular), it means "removed".
Examples:
"When Weird Al's song 'White and Nerdy' dropped, I stopped everything so I could listen to it."
"In the latest news, Microsoft has dropped the ability to log on locally to your PC. All logins require internet connectivity and a MS Account."
This isn't a gripe about HN 'headline' rules; it's a complaint about misuse of slang. (Oh, and get off my lawn too, please.)
Haven't looked up the etymology, but I've usually heard it in reference to the recording industry. I assume it refers either to dropping a record onto the platter (or the needle onto a record), or to "dropping off" new releases at a record store.
I read "dropped" as some feature they'd had and removed 168 hours ago, causing fallout and more anger at their subscription model. A bit of an overloaded term, I guess.
Same. I saw a Reddit post last night from someone complaining that the generative fill wasn't working on their machine, so when I saw this headline, I first thought they'd rolled it back in a sloppy way.
The work done by the Photoshop devs is extraordinary. The artistry by some of the people creating these illustrations is similarly excellent.
These bullshit copy-paste threads from AI "influencers" and devrel hacks are a scourge: bandwagon "content" from people who produce nothing of value themselves, other than tricking people into buying what I can only assume are $500 video courses repackaging six-month-old blog articles from someone else.
Not to be all hipster, but outpainting was available in DALL-E in April 2022. Impressive, yes, but not really all that novel. I did this a year ago: https://www.artstyle.ai/uncropping-movie-posters/
Cool stuff. Did the OP think this was the best list aggregating uses of Generative Fill? Or is this kind of a dig at the AI "influencers" jumping on everything that's released these days? Or is there a better list? I see some links showing up in comments here.
After seeing things go from zero to Stable Diffusion and beyond, the fact that GenAI can do x, y, and z seems like a given eventuality.
I remember when CS came out with the magnetic lasso tool; overnight, people could save hours and hours in their work. This reminds me of that.
https://github.com/Mikubill/sd-webui-controlnet/discussions/...
So far it seems that the OSS diffusion models + tooling that we can run locally keep being state of the art. It makes me so happy.
(Unlike say LLMs where GPT-4 is clearly dominating for now.)
Since we have capable local hardware, I'll propose this as an alternative once we get an estimate of our Firefly costs.
Here are n [jawdropping|amazing|stunning] examples of what people have done with it.
- https://twitter.com/maddoxrules/status/1663715966755430401
https://www.reddit.com/r/ChatGPT/comments/13wfaqg/photoshop_...
https://share.getcloudapp.com/2NupkrOg