When I saw this, I figured it was a clear step back from simply using plugins in the main ChatGPT view. It's basically plugins, but with extra prompting and you can only use one at a time.
But if you look at projects like Autogen ( https://github.com/microsoft/autogen ), you see one master agent coordinating other agents which have a narrower scope. But you have to create the prompts for the agents yourself.
This GPTs setup will crowd-source a ton of data on creating agents which serve a single task. Then they can train a model that's very good at creating other agents. Altman nods to this immediately after the GPTs feature is shown, repeating that OpenAI does not train on API usage.
Prediction: next year's dev day, which Altman hints will make today's conference look "quaint" by comparison, will basically be an app built around the autogen concept, but which you can spin up using very simple prompts to complete very complex tasks. Probably on top of a mixture of GPT 4 & 5.
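For what it's worth, the master-agent-coordinating-workers pattern is simple enough to stub out in plain Python. This is purely illustrative, not Autogen's actual API: the "agents" are just system prompts, and the model calls are canned stand-ins for real chat-completion requests.

```python
def fake_llm(system_prompt: str, message: str) -> str:
    # Stand-in for a real hosted chat-completion call.
    if "router" in system_prompt:
        return "math" if any(c.isdigit() for c in message) else "writer"
    if "math" in system_prompt:
        return str(eval(message))  # toy only; never eval untrusted input
    return f"Here is a draft about: {message}"

# Each worker agent is just a narrow-scope prompt.
AGENTS = {
    "math": "You are a math agent. Return only the numeric answer.",
    "writer": "You are a writing agent. Draft short copy.",
}
ROUTER = "You are a router agent. Reply with the name of the best worker."

def coordinate(task: str) -> str:
    # The master agent's only job: pick a narrow-scope worker, delegate.
    worker = fake_llm(ROUTER, task)
    return fake_llm(AGENTS[worker], task)

print(coordinate("2 + 3"))  # routed to the math agent
```

The real version replaces `fake_llm` with hosted model calls; the point is just that the "master" agent does nothing but route between narrowly prompted workers, which is the part GPTs will crowd-source data for.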
The GPT Store will prove to be an interesting moderation and quality control experiment for OpenAI. Apple/Google have spent a lot of time and money on both of those things and they still have issues, and that's not even accounting for the fact that AI growth hackers will be the primary creators of GPTs. And a revenue sharing agreement will provide even more incentive for the traditional App Store marketing shenanigans.
The more elaborate the moderation tooling, the faster the race to the bottom gets as revenue drives black-hat marketers to become experts in circumventing AI moderation. It could end up being Google “S”EO all over again where we eventually ended up with very little real information in the results.
Tickets filed by plugin developers (including requests to publish or update a plugin) are handled by bots. The only contact I've had with a human there has been via backchannels.
Not sure about currently - but budget increases seemed to be manually reviewed in a queue. I had to increase monthly budget amounts twice and it took like 10-30 hours for each approval, interesting to see how their processes advance.
I don't think the target market for this is people looking for extremely knowledgeable LLMs that can handle deep technical tasks, given that you can't even fine-tune these models.
I'd guess this is more of an attempt to poach the market of companies like character.ai. The market for models with a distinct character/personality is absolutely massive right now (see: app store rankings) and users are willing to spend insane amounts of money on it (in part because of the "digital girlfriend" appeal).
Can you elaborate a bit on why you think the market for distinct character/personality models is massive right now?
I ask only because I've been asked by a fairly well-known GAI company to help them 'personality-ize' some of their models and I'm trying to understand who else is doing it, where, and why – namely because I'd like to keep doing it.
Notice how it's "digital girlfriend" but not "digital boyfriend" that insane users will spend insane amounts of money on. So sad. But maybe it will distract the incels from harassing real women.
So these "GPTs" are the combination of predefined prompts and custom API access? Not customly trained LLMs?
If so, I guess you can make such a "GPT" on your own server and independent from a specific LLM service by using a prompt like
...you have available an API "WEATHER_API". If you need the
weather for a given city for a given day, please say
WEATHER_API('berlin', '2022-11-24')
and I will give you a new prompt including the data you
asked for...
Or is there some magic done by OpenAI which goes beyond this?
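The loop described above can be sketched in a few lines of Python. Both the model and the weather API are stubbed out here; `WEATHER_API` and its call format come from the prompt above, and every other name is illustrative rather than anything OpenAI actually does.

```python
import re

# Parse tool calls of the form WEATHER_API('city', 'date') out of replies.
TOOL_PATTERN = re.compile(r"WEATHER_API\('([^']+)',\s*'([^']+)'\)")

def fake_llm(prompt: str) -> str:
    # Stand-in for a call to GPT-4 or any other hosted LLM.
    if "RESULT:" in prompt:
        return "It will be 4°C and cloudy in berlin on 2022-11-24."
    return "I need more data: WEATHER_API('berlin', '2022-11-24')"

def fake_weather_api(city: str, date: str) -> str:
    return f"{city} {date}: 4°C, cloudy"  # canned data

def run(user_prompt: str) -> str:
    # Ask the model; if it emits a tool call, execute it and re-prompt
    # with the result appended, exactly as the comment above describes.
    reply = fake_llm(user_prompt)
    match = TOOL_PATTERN.search(reply)
    if match:
        city, date = match.groups()
        result = fake_weather_api(city, date)
        reply = fake_llm(f"{user_prompt}\nRESULT: {result}")
    return reply

print(run("What's the weather in berlin on 2022-11-24?"))
```

Swap `fake_llm` for a request to any hosted LLM and `fake_weather_api` for a real HTTP call and you have a provider-independent "GPT" in the sense the comment describes.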
If you want to be independent for academic/personal reasons, sure you can.
If you want reasoning capabilities that Open Source hasn't matched before today, and I'm guessing just got blown out of the water again today... there's no reason to bother.
> If you want to be independent for academic/personal reasons,
Or for reasons of OpenAI's usage policy limits (either because of your direct use, or because of the potential scope of use you want to support for downstream users of your GPT, etc.). Yes, OpenAI's model is the most powerful around. Yes, it would be foolish not to take advantage of it, if you can, in your custom tools that depend on some LLM. Depending on your use, it may not make sense to be fully and exclusively dependent on OpenAI, though.
You don't need to use an open source LLM for the approach I described. You can still send the prompts to OpenAI's GPT-4 or any other LLM which is available as a service.
Agreed, but one is a recognizable trademark (I assume) and one is a generic catch-all term that means a lot of things and would probably be worse from a branding/marketing perspective.
Not even. An agent should primarily operate in some kind of self-prompting loop. Afaik you can't specify complex, branching looping behaviors for GPTs.
They are trying to trademark "GPT" and "GPTs". In order to do that, they need to use the mark widely and specifically. I don't believe the trademark has gone through yet, so to get it approved they'll use this name everywhere, then prevent others from using it. That's the game.
Barrier to entry for commercial or useful GPTs/Plugins/"Agents" is almost non-existent, since it's just a str.concat(hiddenprompt, user_prompt): the secret sauce (i.e. the weights, chat timeout, and context length) is already generated/limited by OpenAI, and they already have the content moderation/"HR dept" baked in at the weights level. So even if one were to create a "story writer helper" GPT, I don't see how it would be of any value generating new, unique, and interesting content beyond the prompt recipes we already have on reddit /r/chatgpt ("here's 1000 prompts for every use case") that create Netflix-like plots (inclusively diverse casting across ethnicities and orientations, socially conscious storylines, modern jargon-filled dialogue, themes of empowerment, progressive characters, and non-traditional relationship dynamics).
This will most likely be like the Google Play store, with 99% of GPTs being repackaged public prompts.
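The str.concat characterization above is close to literal: a "GPT" reduces to a closure over a hidden system prompt in front of a hosted model call. A hypothetical sketch, assuming nothing about OpenAI's actual internals (`call_model` here is a stand-in for the real chat-completions request, not a real API):

```python
def make_gpt(hidden_prompt: str):
    # A "GPT" is essentially a closure over a hidden system prompt;
    # the platform supplies the weights, context limits, and moderation.
    def gpt(user_prompt: str) -> str:
        full_prompt = hidden_prompt + "\n\n" + user_prompt
        return call_model(full_prompt)  # hosted LLM call happens here
    return gpt

def call_model(prompt: str) -> str:
    # Stand-in for the actual hosted-model call; returns canned output.
    return f"[model output for {len(prompt)} chars of prompt]"

story_helper = make_gpt("You are a story writer helper. Always ...")
print(story_helper("Write a heist plot."))
```

Everything of value (the weights, the context window, the moderation) lives behind `call_model`, which is exactly the parent comment's point about where the secret sauce sits.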
I wonder how much money I could make making "GPTs" full time. Barrier to entry is nonexistent, so if this becomes something people seriously use, I imagine the highest-revenue ones will be advertised externally or have some proprietary info/access.
OpenAI in a weird way has mediocre marketing. The examples they use for DALL-E 3 are way worse than the average ones I see people cooking up on Twitter/Reddit. They only seem to demo the most vaguely generic implementations of their app. Even their DevDay logo is just 4 lines of text.
not much based on what OpenAI has been doing lately, using their own customers as product research and then copying the best ideas. OpenAI pretty much has to keep a huge lead in model capabilities or developers are going to stop using them for this reason
basically copying the Microsoft strategy of Embrace, Extend, Extinguish. Makes sense they took so much funding from Microsoft
I'm more confused how the revenue share works. Do they get part of my ChatGPT subscription fee? Am I paying extra? Per bot? Per amount of time I consult with the bot?
I'm certain of one thing: when I'm making a GPT, I save, I update, and it completely disregards the established rules, making me max out my requests very fast. It also gives general answers when I ask for specifics, and I only have Plus. Regarding the general answers: I then went to Bard and got exactly what I was looking for after one request. This is well-known information that's been online for years, information that Bing certainly has. It was making me angry; basically I was being charged for it doing nothing, over and over.
Thoughts on Zapier trying to become OpenAI faster than OpenAI can become Zapier? There will always be a long tail of APIs that folks want integrated, but the most popular APIs perhaps number only a few hundred (Google Calendar and Slack, for example).
So many Zapier integrations are half-baked. A lot of them are good for reacting to events but not for searching for data (e.g. you can use Zapier to react to a Jira ticket change, but you can't use Zapier to query Jira for ticket info).
And seeing how OpenAI is moving up the value chain, what's the guarantee they won't come up with an in-house competitor to the bot that was built on their platform?
Guarantee? That's one of the most important aspects of bothering to build out a platform: eat the ecosystem to add incremental value to the platform, bolster the moat.
The guarantee is that they will clone and extinguish most of the best bots/tools/services riding on top of GPT. It's what platforms overwhelmingly tend to do.
If your thing is a near-touch to GPT, depends on it, is of medium complexity or lower, and is very popular/useful, they will build your thing into GPT eventually.
On the flip side, if you're Salesforce and using GPT to augment something about a major product, you're not facing a serious threat of OpenAI trying to become the next big CRM company.
This process will work similarly to how it did with Windows, Google, AWS, etc.
I think something could be said for "virality" as well - could easily see some entertainment or lifehack themed templates blowing up on TikTok. No one wants to post the output of the lame, less popular template on their story!
> I wonder how much money I could make making "GPTs" full time
I don’t get why people are thinking along these lines at all. Like, if you don’t own and control the LLM yourself, what makes you so sure OpenAI will allow you to make money at all? They could make advertising externally or hosting external marketplaces against the TOS. They could copy your GPT and put their “official” version at the top of the store page. Just because a technology is powerful does not necessarily mean you can make money off of it.
> what makes you so sure OpenAI will allow you to make money at all?
They might not. But if they do, I'd imagine there are a lot of people who will try. And as long as you're not dependent on the income stream they provide, you don't have much to worry about if it gets shut off.
Imagine a "GPT" that could generate websites and provide you with a live deployment as you change it using natural language. A website builder GPT that is primed to output and design in a decent way, that has all the prep beforehand to use particular libraries, and integrations with something like Render.
The GPTs making the most money will be made by larger companies who advertise their use and maybe make them a funnel to an in-app integration, or GPTs that are made effective by proprietary information.
"Example GPTs are available today for ChatGPT Plus and Enterprise users to try out including Canva and Zapier AI Actions." And yet, as a paying ChatGPT Plus customer, neither the Canva nor the Zapier AI Actions link works for me; I get a "GPT inaccessible or not found" error for both.
> Example GPTs are available today for ChatGPT Plus
or
> Starting today, no more hopping between models; everything you need is in one place.
Neither of which are true. I'm a paying user and I have access to neither. They do this _all the time_. They announce something "available immediately" and it trickles out a week or more later. If they want to do gradual rollouts (which is smart) then they should say as much.
> We believe the most incredible GPTs will come from builders in the community. Whether you’re an educator, coach, or just someone who loves to build helpful tools, you don’t need to know coding to make one and share your expertise.
“Please work for us for free, while we keep all the product of your work for ourselves like we did with the content we scraped on the internet.”
Isn’t that the game they all play? Amazon Marketplace. Apple App Store. Let the guinea pigs run. See which one gets the furthest. Then take away its lunch.
Also RIP ChatGPT plugins
The ban on "adult themes" is part of the reason people use services other than OpenAI for that kind of thing in the first place.
However, creating something like this previously required a Jupyter notebook; now you just ask for it. That makes it accessible to 10x more people.
Sorry, but I won't bite on the marketplace; platforms normally wipe out the top agents with official ones, like Apple did with native apps.
It’s crowdsourced AGI
The lack of transparency for how the product works behind the scenes will most likely make it difficult to build something effectively.
I saw that announcement and my immediate thought was "God yet another thing passive income youtubers will be shilling soon"
In general, I was a little confused by this. Sam's demo of creating a GPT didn't seem particularly exciting either.
I’d pay for one that was good at programming rubber ducking
There are specific sub-tasks that everyone would pay for to make their lives easier. This marketplace is trying to make that efficient
Study the Poe ecosystem, and look at YouTube or Reddit.
. . .
EDIT: adding the detail from my comment 2 below, I'm referring to...
Bot creator monetization:
https://developer.poe.com/resources/creator-monetization
Earn money when:
- your bot brings a user to Poe for the first time and they eventually subscribe
- your bot brings a user back to Poe and they eventually subscribe
- your bot’s paywall is seen just before a user subscribes
- users send messages to your bot (starting soon)
It takes significant effort to come up with good use cases, build the prompts, and advertise the bots.
So a company can get a lot of value by going after this up and coming type of "content creator".
Anyone with factual data (proprietary or not) is now just an input away from AI / GPTs.
Data (or a new foundational model) is now the moat.
People would pay for that.
Sh--... I better build it!
Edit: here is the message I get:
Your access to custom GPTs isn’t ready yet. We’re rolling this feature out over the coming days. Check back soon.
OpenAI really is next level parasitism.