tpetry commented on Bunny Database   bunny.net/blog/meet-bunny... · Posted by u/dabinat
js4ever · 10 days ago
Does the Pro plan without Argo give you better peering on Cloudflare?
tpetry · 10 days ago
The reverse. Argo gives better peering than any paid plan. It's the reason for the product's existence: they can use more costly peering that they couldn't use with their free egress model.
tpetry commented on Bunny Database   bunny.net/blog/meet-bunny... · Posted by u/dabinat
badlibrarian · 11 days ago
Adding my voice to the chorus here: they've established a pattern of introducing new features and never really getting them past the 80% point. No qualms with the CDN; it's a sweet spot among providers. But their other offerings have been frustrating me for years now.
tpetry · 11 days ago
Can you share some anecdotes?
tpetry commented on Bunny Database   bunny.net/blog/meet-bunny... · Posted by u/dabinat
fspoettel · 11 days ago
A couple of reasons:

- The free CDN is basically unusable with my ISP, Telekom Germany, due to a long-running and well-documented peering dispute. This is not necessarily an issue with Cloudflare itself, but it means that I have to pay for the Pro plan for every domain if I want a functioning site in my home country. The $25 per domain/project adds up.

- Cloudflare recently had repeated, long outages that took down my projects for hours at a time.

- Their database offering (D1) had some unpredictable latency spikes that I never managed to fully track down.

- As a European, I'm trying to minimize the money I spend on US cloud services and am actively looking for European alternatives.

tpetry · 11 days ago
You don't have to get the Pro plan to solve the Deutsche Telekom issues. You can also use their Argo product for $5/month - but that only makes sense if your egress costs wouldn't exceed the Pro plan's pricing.
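For a rough back-of-the-envelope comparison: the $25 Pro plan and the $5/month Argo base fee come from this thread, while the $0.10/GB Argo rate below is an illustrative assumption, not confirmed pricing.

```ts
// Break-even between Argo (base fee plus per-GB charge) and the flat Pro plan.
// NOTE: argoPerGbUsd is an illustrative assumption, not confirmed pricing.
const proPlanUsd = 25;     // Pro plan per domain per month (from the thread)
const argoBaseUsd = 5;     // Argo base fee per month (from the comment above)
const argoPerGbUsd = 0.1;  // assumed per-GB Argo charge

const breakEvenGb = (proPlanUsd - argoBaseUsd) / argoPerGbUsd;
console.log(`Argo is cheaper below roughly ${breakEvenGb} GB of routed traffic per month`); // ~200
```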
tpetry commented on Hypergrowth isn’t always easy   tailscale.com/blog/hyperg... · Posted by u/usrme
flkiwi · 12 days ago
Interesting post. I appreciate their candor and self-criticism, but, as a customer, I'm consistently surprised by how robust Tailscale ends up being, and how rarely I've experienced an issue that actually broke my tailnet. The sort of downtime that might keep me from accessing the admin tool or something else like that is rare enough, but my nodes have almost (?) never failed to talk to each other. Pretty great.

Caveat: I have a very small tailnet (<100 nodes). Anyone running with thousands of nodes may have a very different experience where inconvenience might be existential.

tpetry · 11 days ago
The reason for that is that all nodes talk to each other peer-to-peer. There is no central server in the data path, unlike many other VPN solutions. So even if Tailscale went down for days, you wouldn't have any downtime between your nodes.
tpetry commented on Updates to our web search products and Programmable Search Engine capabilities   programmablesearchengine.... · Posted by u/01jonny01
nemosaltat · 22 days ago
> Kagi

This seems to be true, but more indirectly. From Kagi’s blog [0], which is a follow-up to a Kagi blog post from last year [1].

[0]> Google: Google does not offer a public search API. The only available path is an ad-syndication bundle with no changes to result presentation - the model Startpage uses. Ad syndication is a non-starter for Kagi’s ad-free subscription model.[^1]

[0]> The current interim approach (current as of Jan 21, 2026)

[0]> Because direct licensing isn’t available to us on compatible terms, we - like many others - use third-party API providers for SERP-style results (SERP meaning search engine results page). These providers serve major enterprises (according to their websites) including Nvidia, Adobe, Samsung, Stanford, DeepMind, Uber, and the United Nations.

I’m an avid Kagi user, and it seems like Kagi and some other notable interested parties have _already_ been unable to get what they want/need with Google’s index.

[0]> The fact that we - and companies like Stanford, Nvidia, Adobe, and the United Nations - have had to rely on third-party vendors is a symptom of the closed ecosystem, not a preference.

Hopefully someone here can clarify for me, or enumerate some of these “third-party vendors” who seem like they will/might/could be directly affected by this.

[0] https://blog.kagi.com/waiting-dawn-search
[1] https://blog.kagi.com/dawn-new-era-search

[0]> [^1]: A note on Google’s existing APIs: Google offers PSE, designed for adding search boxes to websites. It can return web results, but with reduced scope and terms tailored for that narrow use case. More recently, Google offers Grounding with Google Search through Vertex AI, intended for grounding LLM responses. Neither is general-purpose index access. Programmable Search Engine is not designed for building competitive search. Grounding with Google Search is priced at $35 per 1,000 requests - economically unviable for search at scale, and structured as an AI add-on rather than standalone index syndication. These are not the FRAND terms the market needs.

tpetry · 22 days ago
I believe they're trying to indirectly say that they use SerpApi or a similar product that scrapes Google search results. And other big names use it too, so it must be OK...

That must be the reason they limit the number of searches you can do on the starter plan: every SerpApi call costs money.

tpetry commented on Show HN: EuConform – Offline-first EU AI Act compliance tool (open source)   github.com/Hiepler/EuConf... · Posted by u/hiepler
hash872 · a month ago
Glad to see future builders focusing on bureaucratic compliance first & foremost. It's a stirring vision. This is a great European VC on Twitter you may want to tag about your project; he invests solely in GDPR-compliant European tech: https://x.com/compliantvc
tpetry · a month ago
You know that this is a parody account?
tpetry commented on Biscuit is a specialized PostgreSQL index for fast pattern matching LIKE queries   github.com/CrystallineCor... · Posted by u/eatonphil
fwip · 2 months ago
The GitHub repo is about two weeks old and there's a single author - if I were you, I'd let it cook for a while longer.
tpetry · 2 months ago
In my experience, you wait for the next two major PG releases. If it's actively maintained, they'll support them quickly. If not, that's how you can tell it's abandoned…
tpetry commented on I spent a week without IPv4 (2023)   apalrd.net/posts/2023/net... · Posted by u/mahirsaid
rockskon · 2 months ago
The second address is invalid. You can only use :: once per address.

Edit: Whoops. Didn't read what the above post was in response to. My bad.

tpetry · 2 months ago
That's exactly what the question was about, and they explained why it is invalid…
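A quick way to see the single-"::" rule in practice, using Node's built-in address parser (a minimal sketch; the 2001:db8 addresses are documentation examples):

```ts
import { isIPv6 } from "node:net";

// "::" compresses exactly one run of zero groups; allowing it twice would make
// the address length ambiguous, so a second "::" renders the address invalid.
console.log(isIPv6("2001:db8::1"));    // true  (single "::")
console.log(isIPv6("2001:db8::1::2")); // false (two "::" runs)
```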
tpetry commented on Building a Toast Component   emilkowal.ski/ui/building... · Posted by u/FragrantRiver
anilakar · 2 months ago
Scrolling that website on mobile is really choppy.
tpetry · 2 months ago
Perfectly smooth on iOS for me.
tpetry commented on Anthropic acquires Bun   bun.com/blog/bun-joins-an... · Posted by u/ryanvogel
reactordev · 2 months ago
Bun is such a great runtime. If you haven't tried it, try it. It's got bells and whistles.

This will make sure Bun is around for many, many, years to come. Thanks Anthropic.

Why Bun?

Easy to set up and go: bun run <something.ts>

Bells and whistles. (SQL, Router, SPA, JSX, Bundling, Binaries, Streams, Sockets, S3)

TypeScript supported. (No need for tsc, Bun can transpile for you; see the sketch after this list)

Binary builds. (single executables for easy deployment)

Full Node.js Support. (The whole API)

Full NPM Support. (All the packages)

Native modules. (90% and getting better thanks to Zig's interop)

S3 File / SQL Builtin. (Blazingly Fast!)

You should try it. Yes, others do these things too, but we're talking about Bun.
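To illustrate the "run TypeScript directly" point above, here is a minimal sketch using Bun's built-in Bun.serve HTTP server (the file name is made up):

```ts
// hello.ts: run with `bun run hello.ts`; no separate tsc step is needed
const server = Bun.serve({
  port: 3000,
  fetch(_req: Request): Response {
    return new Response("Hello from Bun!");
  },
});

console.log(`Listening on http://localhost:${server.port}`);
```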

tpetry · 2 months ago
It's not 100% Node.js compatible. I see enough non-green dots in their own official report: https://bun.com/docs/runtime/nodejs-compat

And even for packages with full support, you can find many GitHub issues where Bun behaves differently, which leads to some bugs.
