Readit News
sholladay · 2 years ago
I want ChatGPT for Family.

The free version gets a lot of use around here but the most powerful feature is the ability to search the web, which is only available to paid users. I pay $20/month for myself and I’d happily pay a bit more for the whole family, but not $20/month per person - it adds up. Family members end up asking to borrow my phone a lot to use it.

Give me a 3-4 person plan that costs $30-$40/month. You’re leaving money on the table!

freediver · 2 years ago
At Kagi we plan to offer this for $20/mo and 6 family members included. You get both paid search (much better than openai bing) + AI (gpt-3.5-turbo/claude-instant). If you need gpt-4 it will be an optional $15/mo upgrade per family member.
idiotsecant · 2 years ago
This is not a comment on the Kagi service, but more a comment on transitions in general. I have tried Kagi and I think it's great. I really want to use Kagi. I want to support Kagi. I have a mental stickynote that says 'start using Kagi on everything'. Every time I sit down to do some tasks, it just falls to the bottom of the to-do pile because I feel like there are so many devices I now need to go through and update. Google really has a powerful advantage by bundling search with the browser product. Isn't that what got Microsoft into antitrust trouble? How is it allowed?
uneoneuno · 2 years ago
I've been playing around with the assistant stuff and adding !expert to my searches to see what the LLM spits out first as a quick check. I'd love if I could get my custom assistant to work - sounds like a lot of fun to be had there.
fouc · 2 years ago
Is gpt-3.5-turbo/claude-instant better than the model the free tier of ChatGPT uses? FWIW, in my testing dolphin-2.5-mixtral-8x7b was clearly better than free-tier ChatGPT.
freedomben · 2 years ago
Nice, I'm looking forward to that! You guys have some pretty outstanding AI chops going. I've been really impressed!
twodayslate · 2 years ago
Sounds great. Will that plan also have access to the Search API which is currently restricted to Teams plans?
hmottestad · 2 years ago
I haven't found the web search feature particularly useful or helpful. Far too many sites are blocking the ChatGPT bot. I also find that ChatGPT isn't getting any better search results than I would if I searched for something myself. Quality of the results varies a lot too, and ChatGPT doesn't really seem to be able to distinguish between high-quality content and not-so-high-quality content.

For software development I find that Phind is pretty good at combining search results with GPT-4 in a way that increases the quality of the result.

Maybe OpenAI can convince the Bing team to index everything using their embeddings. If ChatGPT could also read the text directly from Bing instead of having to "surf the web", it would be able to consume several search results at the same time. In the future I could even see Bing et al. running an LLM over all text when indexing a page to extract info and stats like a summary, keywords, truthfulness, usefulness, etc.

freedomben · 2 years ago
Same experience. 90% of the time I ask it to summarize something, it can't because it's blocked. At least it has the decency to tell me that it's blocked rather than just failing silently (which is what Kagi does - love Kagi, but that's a minor improvement they could make).

This is where I suspect Bard is going to be an absolute beast of a product. The ability to quickly and thoroughly consume a bunch of hits, find the best ones, and summarize them is something Google (and increasingly, Kagi) is uniquely positioned to do.

osigurdson · 2 years ago
I feel that LLMs have the potential to reorganize the web. Instead of being ad sponsored, raw, high quality data will be priced and aggregated.
rachele_ · 2 years ago
A workaround for this is to print the site as a PDF and upload it to GPT.
lhl · 2 years ago
ChatGPT's web search is interminably slow and I've added to my custom prompt to not do web searches unless explicitly asked. However, I'd give Perplexity.ai a try - I've found it to be incredibly fast and useful (funnily enough, they largely also use Bing for search retrieval results) and if you pay for Pro (which I do now), you can also use GPT-4 for generating responses.
ringofchaos · 2 years ago
I also had a good experience with the default free version of Phind. I was facing an issue in a Python framework, which turned out to be a bug. Phind was able to pinpoint the GitHub discussion where the issue was raised and also suggested workaround code based on that GitHub issue. No other free AI tools were able to do this.
px43 · 2 years ago
I have a custom GPT for telling my 3 year old bedtime stories. It's super cute to listen to the two of them collaborate back and forth where my kid will add new characters (friends from school, or stuffed animals) and new wacky twists to their adventures, and the storyteller GPT will come up with a new revised version.

It would be pretty rad if she could just have the app on her tablet with a family plan. She doesn't use it quite enough to justify her own subscription, and it would be especially nice if we could share GPTs across devices, so she gets the ones I make for her but doesn't get flooded with my work- or research-related GPTs.

krzyk · 2 years ago
Oh, how does your 3 year old interact with GPT?

BTW, I once read that someone made automated generation of bedtime stories (with his children as the main characters) using the OpenAI API and speakers - I was quite amazed (not a thing I would do, but a nice use of GPT).
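A setup like the one described above can be sketched in a few lines. This is a hypothetical reconstruction, not the commenter's actual script: the model name, prompt wording, and the macOS `say` command are all assumptions.

```python
# Sketch: generate a bedtime story via the OpenAI API and read it aloud.
# Model, prompt, and TTS command are assumptions, not from the comment.
import subprocess

def build_prompt(child_name, characters):
    """Compose a story request with the child as the protagonist."""
    cast = ", ".join(characters)
    return (
        f"Tell a short, gentle bedtime story where {child_name} is the hero. "
        f"Include these characters: {cast}. Keep it under 300 words."
    )

def tell_story(child_name, characters):
    # Requires `pip install openai` and OPENAI_API_KEY in the environment.
    from openai import OpenAI
    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": build_prompt(child_name, characters)}],
    )
    story = resp.choices[0].message.content
    # Read it over the speakers; swap `say` for espeak or similar on Linux.
    subprocess.run(["say", story])
    return story
```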

taylorhou · 2 years ago
ummm how do i get this? i've got a 5, 3, and 1 year old and would love this
siva7 · 2 years ago
I'm certain that they will soon release anything that promises more subscriptions. ChatGPT for Family, ChatGPT for Gov, and so on...
whycome · 2 years ago
ChatGPT for Kids™
worldsayshi · 2 years ago
There seems to be a plethora of somewhat-competitive ChatGPT alternatives that do search the web at this point, though. Maybe try phind.com?

(Although I haven't yet myself tried any alternative that is clearly on par with ChatGPT 4)

unnouinceput · 2 years ago
Can't you use the same account on multiple phones, though? I thought this was a no-brainer.
lhnz · 2 years ago
This is probably correct but I'd prefer that family don't read the conversations I've had, as even if I'm not saying anything too private, it feels too intrusive (it'd be a bit like reading my inner thoughts).
sholladay · 2 years ago
As a general rule, I don’t share account access. I can count on one hand the number of times I’ve made an exception to that rule and it was always for something relatively benign like Spotify. Privacy isn’t the only reason to avoid sharing, either.

I don’t even like that when my family picks up the remote, Apple TV assumes it’s me using the TV. They watch something and mess up my Up Next history and recommendations. I wish it supported using a PIN. I’ve thought about getting rid of the remote to force everyone to use their phone as a remote, because then it detects who is using it and automatically switches accounts. But that means everyone has to have an iPhone and have their phone charged, etc. Getting rid of the remote just for my convenience seems too inconsiderate.

Deleted Comment

londons_explore · 2 years ago
Can't you just share the login details?
selfportrait · 2 years ago
Sharing the same space and turning off/on the custom instructions is also very annoying.
teleforce · 2 years ago
Agreed that the most powerful feature is the ability to search the web. This feature single-handedly makes ChatGPT a very potent Google Search alternative, but without the dreaded advertisements.

Deleted Comment

eru · 2 years ago
Bing's version can search the web.
cyanydeez · 2 years ago
I guarantee you they aren't leaving money on the table. They're running the same techno-capitalist playbook.

They want you hooked on apps, the API, etc., before the real costs are brought in. They likely should be charging anywhere from $50-100 depending on hours.

minimaxir · 2 years ago
A notable feature here is "no training on your business data or conversations" which really shouldn't have to be a feature. (requests using the ChatGPT API already aren't trained upon)
alwa · 2 years ago
Similarly you can opt your individual account out on the ChatGPT side. [0] Although by default they do seem to use both vanilla and Plus conversations for training.

[0] https://privacy.openai.com/policies

MacsHeadroom · 2 years ago
If you opt out you lose access to basically every feature you're paying for. No conversation history, no access to plugins, etc.
ttul · 2 years ago
I don’t agree with you here. OpenAI should be free to train on your data assuming you agreed to that in the terms of service (and yes, you did). If they ask for a little more money in exchange for not having that valuable information in trade, that seems fair.

If you want an entirely free and open LLM experience, you can also run one of the ever-improving open source models on your own hardware. But for many if not most companies, paying $25/mo per user for something as amazing as ChatGPT-4 is a bargain.

unshavedyak · 2 years ago
> I don’t agree with you here. OpenAI should be free to train on your data assuming you agreed to that in the terms of service (and yes, you did). If they ask for a little more money in exchange for not having that valuable information in trade, that seems fair.

Yea, another way to word it would be to imagine that they _only_ had a more expensive "no train" option. Now ask if it would be okay to offer a lower priced but yes-train version.

andrewstuart2 · 2 years ago
This is ChatGPT, not OpenAI's API with the gpt4 models. This is allowing your team to use chat.openai.com together, rather than having to build or deploy your own with the API.
ashu1461 · 2 years ago
Yes, but isn't it the case that even in ChatGPT no training is done?
sjwhevvvvvsj · 2 years ago
For some reason “Confidentiality Tax for Small Business” has less of a ring to it than “Teams”.
ParetoOptimal · 2 years ago
Does anyone really trust openai isn't training on their data given their views on copyright?

It would make more sense for them to just train on it anyway.

phh · 2 years ago
My personal guess is that they don't put it in the training data, BUT they still have humans read what you send, to see 1. what innovative uses they can try to copy/integrate; 2. how to optimize (both in score and throughput) for their consumers' usage.
dvngnt_ · 2 years ago
If their EULA says one thing and they do another, that opens them up to huge liabilities. Though anything is possible.
johnfn · 2 years ago
Google trains on all your searches. Why is OpenAI held to a higher standard?
matheusmoreira · 2 years ago
Google should not be allowed to do that either.
dmonitor · 2 years ago
OpenAI's product might just print out the info you put in verbatim to another user if asked politely
wildpeaks · 2 years ago
Note that "no training on your data" is only for Team and Enterprise: https://openai.com/chatgpt/pricing
lhl · 2 years ago
You can make a privacy request for OpenAI to not train on your data here: https://privacy.openai.com/

Alternatively, you could also use your own UI/API token (API calls aren't trained on). Chatbot UI just got a major update released and has nice things like folders, and chat search: https://github.com/mckaywrigley/chatbot-ui
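For anyone curious what the bring-your-own-token route looks like: since the API keeps no server-side conversation history, the client has to hold the transcript itself. A minimal sketch, where the class design and model name are my assumptions:

```python
# Minimal ChatGPT-style client over the API (API calls aren't trained on).
# The API is stateless, so the chat history lives only in this object.
class LocalChat:
    def __init__(self, model="gpt-3.5-turbo"):
        self.model = model
        self.history = []  # list of {"role": ..., "content": ...} dicts

    def add_user(self, text):
        """Append a user turn; returns the full message list for the API call."""
        self.history.append({"role": "user", "content": text})
        return self.history

    def ask(self, text):
        # Requires `pip install openai` and OPENAI_API_KEY in the environment.
        from openai import OpenAI
        messages = self.add_user(text)
        reply = OpenAI().chat.completions.create(
            model=self.model, messages=messages
        ).choices[0].message.content
        self.history.append({"role": "assistant", "content": reply})
        return reply
```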

happytiger · 2 years ago
It should be opt-out by default, not opt-in.
dan_bez · 2 years ago
No training on the API either. I integrated it with Telegram over a year ago - for convenience rather than for cost savings. I've been paying $2 per month on average ever since, and "no training on your data" is included.
queueueue · 2 years ago
The API is not used for training purposes either. https://openai.com/enterprise-privacy
londons_explore · 2 years ago
I suspect that user data isn't really valuable to train on anyway - the data will be full of users lying to the bot to try to manipulate it.

But "we won't train from your data" is a powerful marketing line, and differentiator between classes of customer, even if they have no intention to train from the data of anyone.

ta988 · 2 years ago
A major change is that you cannot opt out of having your conversations used for training unless you have a Team account, which is pretty costly for a single person.
tedsanders · 2 years ago
According to this, you can still opt out of training, but you have to turn off history: https://help.openai.com/en/articles/7730893-data-controls-fa...
OJFord · 2 years ago
That's been true for at least a month, not new with (though it may have been in anticipation of) teams support.

Deleted Comment

emsign · 2 years ago
Sneaky buggers
ec109685 · 2 years ago
This link lets individuals opt out: https://privacy.openai.com/policies
dizzydes · 2 years ago
OpenAI understand their tech lead isn't a sustainable moat, so are going for network effects. Similar to Slack Connect (shared channels).
weatherlite · 2 years ago
I've heard the no-moat theory before and I don't get it. The open source models are about a year or two behind the latest ChatGPT in terms of quality. That means companies will always be willing to pay a premium to use ChatGPT rather than rely on open source. So even if/when Google and Apple (and perhaps Meta) catch up in terms of AI quality, there's still so much money to be made for OpenAI.

One interesting by-product of late-game capitalism like this: as more and more jobs get destroyed by AI, so will subscriptions. So it might be a mixed bag in the end for the tech giants if there's no real economy left to buy the products, but we're a long way from there.
hackerlight · 2 years ago
I think no moat vs moat is a false dichotomy. They have a moat (better researchers and data) and are about to make it even better (network effects).
goatlover · 2 years ago
It was one researcher's opinion at a competing company, and everyone treated it as fact.
phillipcarter · 2 years ago
Yeah, this is something I've been saying as well. Their true "moat" is their network of people who know and understand how to use their tech.
ttul · 2 years ago
It’s the “we will make this so easy for you that you never want to switch” moat. Definitely akin to Slack, which also has the integration glue to keep you on their platform. Even though there are many Slack alternatives now that are really great, most companies on Slack will opt to stay there rather than invest in migrating.
wand3r · 2 years ago
Adjacent question, leaving aside the value proposition: do companies pay for 1000 seats like this? I didn't realize Slack is $5 per user per month. Do they discount for bulk, or are companies paying $5k/month ($60k/year)? These subscriptions must really add up.

On All In, they discussed the leverage from AI tools (they probably also meant open source), and one of the companies just rolled their own instance of a big monthly SaaS product because it was such a big expense for the startup.

mikepurvis · 2 years ago
I'm not really in the know, but I bet the enterprise discounts don't kick in until you're at the tens of thousands of users. In any case, $60k sounds like a lot as a top-line figure to some bean-counter, but all these sales pitches follow the same basic pattern:

- This is an essential, best-in-class tool. You wouldn't deny your employees a laptop or a free lunch, would you?

- $5/user/mo is a bargain compared to the hassle of building/hosting this yourself, punching holes in your firewall every time you need to receive a webhook, dealing with security and auth issues.

- $60k is half the cost of someone you don't need to hire on your in-house IT team. Does it make sense yet?
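The seat math above is easy to sanity-check. Only the $5/user/month list price comes from the thread; the discount parameter is a hypothetical illustration of how bulk deals might be modeled.

```python
# Back-of-the-envelope SaaS seat math. The $5/user/month list price is from
# the thread; the discount rate is a made-up example, not a real Slack tier.
def annual_seat_cost(seats, per_seat_monthly=5.0, discount=0.0):
    """Yearly bill for a flat per-seat subscription, with an optional discount."""
    return seats * per_seat_monthly * 12 * (1 - discount)

print(annual_seat_cost(1000))                 # 1000 seats at list -> 60000.0
print(annual_seat_cost(15000, discount=0.2))  # hypothetical 20% bulk deal
```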

rrr_oh_man · 2 years ago
> I bet the enterprise discounts don't kick in until you're at the tens of thousands of users

I'll take that bet ;) Not really sure about OpenAI, but you can absolutely negotiate with almost any company.

marpstar · 2 years ago
This is why the price of "Enterprise" level of SaaS is always "Contact Us". Contract deals (i.e. "lock-in") are negotiated behind the scenes.
ren_engineer · 2 years ago
You'd be amazed at how many startups waste hundreds of thousands (and millions) of dollars buying seats for tools that barely anybody uses. Interest rate increases have made VC-backed startups get a little smarter, but a few years ago it was really bad. Similar to how tons of startups burn huge amounts of money on AWS due to laziness.
Aeolun · 2 years ago
Yeah, companies really do. Once a year our company gets a really large bill (15k users, several services).

The thing is, those same people need to be paid, and that's a much (100x) larger bill, so the extra amount doesn't really signify.

reallymental · 2 years ago
Ok, so there are now 2 tiers where they don't use our data to train the model?

The higher bandwidth is clearly there to entice new customers, but the question remains: what happens to the old ChatGPT Plus users? Do their quotas get eaten up by these new teams?

yawnxyz · 2 years ago
Looks like the $20/month PLUS plan DOES use your data to train the model now... (they seem to have removed that "feature" from the list in the side-by-side comparison)
Metricon · 2 years ago
Currently, if you disable chat history, you'll see this message:

Chat History is off for this browser. When history is turned off, new chats on this browser won't appear in your history on any of your devices, be used to train our models, or stored for longer than 30 days. This setting does not sync across browsers or devices.

obmelvin · 2 years ago
AFAIK, Plus has always trained on your conversation data. Enterprise and the API do not.
tempestn · 2 years ago
There used to be a form you could submit asking them not to train on your data. Absent some communication to the contrary I would hope that continues to be respected.
castles · 2 years ago
It's not super obvious, but even with Plus you can opt-out of training.

Aside: If you can see other colleagues' interactions with the custom/private GPTs, it could be quite an efficient way to share knowledge, especially for people in disparate time zones.

reaperman · 2 years ago
> what happens to the old ChatGPT Plus users? Do their quotas get eaten up by these new teams?

This is probably run on Microsoft servers (Azure, basically), not OpenAI servers, so it shouldn't directly compete for capacity. This is more of a "the pie got bigger" situation.

Deleted Comment