So I can opt out of training, but they still save the conversation? Why can't they just not use my data when I pay? I'm tired of paying and then having them steal my information anyway. Tell you what: create a free tier where data harvesting is the cost of the service. If you pay, no data harvesting.
Storing the data is not the same as stealing. It's helpful for many use cases.
I suppose they should have a way to delete conversations, though.
Looks great, but it's kind of buggy:
- I can't figure out how to toggle extended thinking
- Have to click in the text box to write, not just anywhere in the Claude panel
- Have to click to reject edits
The Gemini API has a canonical implementation of structured outputs where you can instead pass the JSON schema as a separate parameter to control the grammar more closely. However, this setting reorders the schema's fields alphabetically before generation, which is undesirable: the order of fields in a JSON schema is often deliberate, chosen precisely to steer what the model generates first.
You can specify ordering in the Gemini API with propertyOrdering:
"propertyOrdering": ["recipeName", "ingredients"]
If you remember using AOL or AIM (AOL Instant Messenger), there were sound effects for various "events," like "Welcome" or "You've got mail" when a new email arrived.
AOL and AIM had "buddy lists," and there were sound effects when buddies came online or offline: a knocking sound and a door-closing sound.
In the early 2000s, when cable and DSL were becoming more widespread, it became cool to leave your AOL/AIM account connected all the time. This generally meant a computer left running somewhere in the house, usually a bedroom or living room. People would leave "Away Messages," sort of like a status on a social media timeline. I think Jack Dorsey said turning AIM away messages into a timeline was one of his original inspirations for making a social media app. Anyway.
So someone opens Visual Basic and writes a little script. The script goes to the privacy preferences of their own account, checks "Don't allow anyone to see me online," and clicks Apply.
Then it checks "Allow everyone to see me online," clicks Apply again, and loops.
What does this do for everyone on your buddy list?
They hear a constant rotation of WAV files: BuddyIn.wav, BuddyOut.wav, over and over.
You can hear it in the first few seconds of this video: https://www.youtube.com/watch?v=AQjfU4g6_SQ
Much hilarity ensued.
> Build context for the work you're doing. Put lots of your codebase into the context window.
If you don't say that, what do you think happens as the agent works on your codebase?
If you're asking the agent to do large chunks of work, this gets better results than telling it to blindly go forth and do work; Anthropic's best-practices guide says as much.
If you're asking the agent to create one method that accomplishes X, this isn't useful.
> Note: This tutorial uses our smallest, fastest, and cheapest model, Claude 3 Haiku. Anthropic has two other models, Claude 3 Sonnet and Claude 3 Opus, which are more intelligent than Haiku, with Opus being the most intelligent.