from google import genai
from google.oauth2 import service_account

# Service-account credentials with the scopes needed for Vertex / Generative Language
creds = service_account.Credentials.from_service_account_file(
    SA_FILE,
    scopes=[
        "https://www.googleapis.com/auth/cloud-platform",
        "https://www.googleapis.com/auth/generative-language",
    ],
)

# Point the client at Vertex AI instead of the public Gemini API endpoint
client = genai.Client(
    vertexai=True,
    project=PROJECT_ID,
    location=LOCATION,
    http_options={"api_version": "v1beta1"},
    credentials=creds,
)
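Once the client is built, calls look the same as with the plain Gemini SDK; a quick sketch (the model name here is just an example):

response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Say hello",
)
print(response.text)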
That `vertexai=True` does the trick - you can use the same code without this option, and you will not be using "Vertex". Also note that with Vertex I am providing a service account rather than an API key, which should improve security and performance.
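For contrast, the non-Vertex path is just the API-key client; roughly (a sketch, the key variable is a placeholder):

client = genai.Client(api_key=GEMINI_API_KEY)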
For me, the main aspect of "using Vertex", as in this example, is the fact that the Start AI Cloud Credit ($350K) is only usable under Vertex. That is, one must use this platform to benefit from this generous credit.
Feels like the "Anthos" days for me, with Google now pushing their enterprise-grade MLOps platform, but all in all I am grateful for their generosity and the great Gemini model.
Also, at $700, independent reviews are a must.
For the pump kit - this too looks interesting, but it requires (way) more detail. At the very least, a list of supported machines and, again, a video or two of an actual retrofit. Dimensions, voltage (!), etc.
I’ve been using Cursor and I’m kind of disappointed. I get better results just going back and forth between the editor and ChatGPT.
I tried localforge and aider, but they are kinda slow with local models.