Posted by jmadeano a month ago
Launch HN: Tweeks (YC W25) – Browser extension to deshittify the web (tweeks.io/onboarding...)
Hey HN! We’re Jason & Matt and we’re building Tweeks (https://tweeks.io), a browser extension that lets you modify any website in your browser to add functionality, filter/highlight, re-theme, reorganize, de-clutter, etc. If you’ve used Violentmonkey/Tampermonkey, Tweeks is like a next‑generation userscript manager. Instead of digging through selectors and hand‑writing custom JS/CSS, describe what you want in natural language and Tweeks plans + generates your edits and applies them.

The modern web is so full of clutter and junk (banners, modals, feeds, and recommendations you didn't ask for). Even a simple Google search is guarded by multiple ads, an AI overview, a trending-searches module, etc. before you even see the first real blue link.

Every day there's a new Lovable-like product (make it simple to build your own website/app) or a new agentic browser (AI agents click around and browse the web for you), but we built Tweeks to serve the middle ground: most of our time spent on the web is on someone else's site (not our own), and we don't want to offload everything to an agentic browser. We want to be able to shape the entire web to our own preferences as we browse.

I spent years working on recommendation systems and relevance at Pinterest, and understand how well-meaning recommendations and A/B tests can lead to website enshittification. No one sets out to make UX worse, but optimizing for an “average” user is not the same as optimizing for each individual user.

I’ve also been hacking “page fixers” for as long as I can remember: remove a login wall here, collapse cookie banners there, add missing filters/highlights (first with F12/inspect element, eventually graduating to advanced Greasemonkey userscripts). Tweeks started as a weekend prototype that turned simple requests into page edits but unexpectedly grew into something people kept asking to share. We hope you’ll like it too!

How it works: Open the Tweeks extension, type your request (e.g. “hide cookie banners and add a price/quality score”), and submit. Upon submission, the page structure is captured, an AI agent reviews the structure, plans changes, and returns deterministic transformations (selectors, layout tweaks, styles, and small scripts) that run locally. Your modifications persist across page loads and can be enabled/disabled, modified, and shared.
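To make that concrete: the output of a generation is a small static bundle, roughly shaped like this (an illustrative sketch, not our exact schema, and the selectors here are made up):

    {
      "name": "Declutter search results",
      "match": "https://www.google.com/search*",
      "css": "#trending-module, .promo-banner { display: none !important; }",
      "js": "document.querySelectorAll('[data-ad-slot]').forEach(el => el.remove());"
    }

The extension stores that bundle locally and re-applies it on every matching page load, with no further server calls needed.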

Here are a bunch of one‑shot examples from early users:

YouTube: Remove YouTube Shorts. Demo: http://youtube.com/watch?v=aL7i89BdO9o. Try it yourself: http://tweeks.io/share/script/bcd8bc32b8034b79a78a8564

Hacker News: Filter posts by title/URL or points/comments, modify header and text size. Demo: http://youtube.com/watch?v=cD5Ei8bMmUk. Try it yourself: http://tweeks.io/share/script/97e72c6de5c14906a1351abd (filter), http://tweeks.io/share/script/6f51f96c877a4998bda8e781 (header + text).

LinkedIn: Keep track of cool people (extracts author data and sends a POST request to a server). Demo: http://youtube.com/watch?v=WDO4DRXQoTU

Reddit: Remove sidebar and add a countdown timer that shows a blocking modal when time is up. Demo: http://youtube.com/watch?v=kBIkQ9j_u94. Try it yourself: http://tweeks.io/share/script/e1daa0c5edd441dca5a150c8 (sidebar), http://tweeks.io/share/script/c321c9b6018a4221bd06fdab (timer).

New York Times Games: Add a Strands helper that finds all possible words. Demo: http://youtube.com/watch?v=hJ75jSATg3Q. Try it yourself: http://tweeks.io/share/script/7a955c910812467eaa36f569

Theming: Retheme Google to be a 1970s CLI terminal. Demo: http://youtube.com/shorts/V-CG5CbYJb4 (oops, sorry, a YouTube Short snuck back in there). Try it yourself: http://tweeks.io/share/script/8c8c0953f6984163922c4da7.

We just opened access at https://tweeks.io. It’s currently free, but each use costs tokens so we'll likely need to cap usage to prevent abuse. We're more interested in early feedback than your money, so if you manage to hit the cap, message us at contact@trynextbyte.com or https://discord.gg/WucN6wpJw2, tell us how you're using it/what features you want next, and we'll happily reset it for you.

Btw if you do anything interesting with it, feel free to make a shareable link (go to ‘Library’ and press ‘share’ after generating) and include it in the comments below. It’s fun to see the different things people are coming up with!

We're rapidly shipping improvements and would love your feedback and comments. Thanks for reading!

freshtake · a month ago
This looks cool and could be a much-needed step towards fixing the web.

Some questions:

[Tech]

1. How deep does the modification go? If I request a tweek to the YouTube homepage, do I need to re-specify or reload the tweek to have it persist across the entire site (deeply nested pages, iframes, etc.)?

2. What is your test and eval setup? How confident are you that the model is performing the requested change without being overly aggressive and eliminating important content?

3. What is your upkeep strategy? How will you ensure that your system continues to work as intended after site owners update their content in potentially adversarial ways? In my experience LLMs do a fairly poor job at website understanding when the original author is intentionally trying to mess with the model, or has overly complex CSS and JS.

4. Can I prompt changes that I want to see globally applied across all sites (or a category of sites)? For example, I may want a persistent toolbar for quick actions across all pages -- essentially becoming a generic extension builder.

[Privacy]

5. Where and how are results being cached? For example, if I apply tweeks to a banking website, what content is being scraped and sent to an LLM? When I reload a site, is content being pulled purely from a local cache on my machine?

[Business]

6. Is this (or will it be) open source? IMO a large component of empowering the user against enshittification is open source. As compute commoditizes it will likely be open source that is the best hope for protection against the overlords.

7. What is your revenue model? If your product essentially wrestles control from site owners and reduces their optionality for revenue, your arbitrage is likely to be equal or less than the sum of site owners' loss (a potentially massive amount to be sure). It's unclear to me how you'd capture this value though, if open source.

8. Interested in the cost and latency. If this essentially requires an LLM call for every website I visit, this will start to add up. Also curious if this means that my cost will scale with the efficiency of the sites I visit (i.e. do my costs scale with the size of the site's content).

Very cool.

Cheers

jmadeano · a month ago
> 1. How deep does the modification go? If I request a tweek to the YouTube homepage, do I need to re-specify or reload the tweek to have it persist across the entire site (deeply nested pages, iframes, etc.)?

If you're familiar with Greasemonkey, we work similarly to the @match metadata. A given script could target a specific page (https://www.youtube.com/watch?v=cD5Ei8bMmUk), all videos (https://www.youtube.com/watch*), all of YouTube (https://www.youtube.com/*), or all domains (https:///). During generation, we try to infer your intent based on your request (and you can also manually override with a dropdown).
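For the Greasemonkey-familiar, those granularity levels map to metadata roughly like this (a sketch; our internal pattern format differs slightly, and `*://*/*` is the standard all-sites match pattern):

    // ==UserScript==
    // @name    Scope examples (sketch)
    // one specific video page:
    // @match   https://www.youtube.com/watch?v=cD5Ei8bMmUk
    // all video pages:
    // @match   https://www.youtube.com/watch*
    // all of YouTube:
    // @match   https://www.youtube.com/*
    // every site:
    // @match   *://*/*
    // ==/UserScript==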

> 2. What is your test and eval setup? How confident are you that the model is performing the requested change without being overly aggressive and eliminating important content?

Oh boy, don't get me started. We have not found a way to fully automate evals yet. We can automate "is there an error?", "does it target the right selectors?", etc. But the requests are open-ended, so there are a million "correct" answers. We keep a growing set of "tough" requests, and when we ship a major change, we sit down, generate them all, and click through to manually check pass/fail. We built tooling around this so it is actually pretty quick, but we're definitely thinking about better automation.
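The automatable part is conceptually something like this (a hypothetical jsdom sketch with made-up `selectors`/`code` fields; our real harness is more involved):

    // Smoke test: run a generated script against a saved page capture and
    // verify it executes cleanly and its selectors still resolve.
    const fs = require("fs");
    const { JSDOM } = require("jsdom");

    function smokeTest(savedHtmlPath, script) {
      const dom = new JSDOM(fs.readFileSync(savedHtmlPath, "utf8"), {
        runScripts: "outside-only",
      });
      // Which of the script's selectors no longer match anything?
      const missing = script.selectors.filter(
        (sel) => !dom.window.document.querySelector(sel)
      );
      // Does the script body throw when executed against the page?
      let error = null;
      try {
        dom.window.eval(script.code);
      } catch (e) {
        error = e;
      }
      return { missing, error };
    }

What we can't automate is the open-ended judgment call: "did this actually do what the user meant?"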

This is also where more users come in. Hopefully you'll complain to us if it doesn't work, and we'll get a better sense of what to improve!

> 3. What is your upkeep strategy? How will you ensure that your system continues to work as intended after site owners update their content in potentially adversarial ways? In my experience LLMs do a fairly poor job at website understanding when the original author is intentionally trying to mess with the model, or has overly complex CSS and JS.

Great question. The good news is that there are things like aria labels that are pretty consistent. If the model picks the right selectors, it can be pretty robust to change. Beyond that, hopefully it is as easy as one update request ("this script doesn't work anymore, please update the selectors"). Though we can't really expect each user to do that, so we are thinking of an update system where, e.g., if you install/copy script A and the original script A is later updated, you can pull in that update. The final stage is an intelligent system where the script heals itself (every so often it assesses the site, sees if selectors have changed, and fixes itself) -> that is more long-term.
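To make the selector point concrete (illustrative snippets, not from a real generated script):

    // Brittle: minified utility classes churn on every deploy.
    document.querySelector(".x1a2b3c > .q9z8y7")?.remove();

    // More robust: semantic attributes like aria-label tend to survive redesigns.
    document.querySelector('[aria-label="Sponsored"]')?.remove();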

> 4. Can I prompt changes that I want to see globally applied across all sites (or a category of sites)? For example, I may want a persistent toolbar for quick actions across all pages -- essentially becoming a generic extension builder.

Yes, if the domain is https:/// it applies to all sites, so you can think of this as a meta-extension builder. E.g. I have a timer script that applies across Reddit, LinkedIn, Twitter, etc. and keeps me focused.
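A stripped-down version of that timer, to give a flavor (the generated script is more elaborate):

    // Runs on every site via an all-domains match. After 25 minutes of
    // browsing, overlay a blocking modal.
    const LIMIT_MS = 25 * 60 * 1000;
    const start = Date.now();
    const timer = setInterval(() => {
      if (Date.now() - start < LIMIT_MS) return;
      if (document.getElementById("tw-focus-modal")) return; // already shown
      const modal = document.createElement("div");
      modal.id = "tw-focus-modal";
      modal.textContent = "Time's up - take a break.";
      modal.style.cssText =
        "position:fixed;inset:0;z-index:2147483647;display:flex;" +
        "align-items:center;justify-content:center;font-size:2rem;" +
        "color:#fff;background:rgba(0,0,0,.85)";
      document.body.appendChild(modal);
      clearInterval(timer);
    }, 1000);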

> 5. Where and how are results being cached? For example, if I apply tweeks to a banking website, what content is being scraped and sent to an LLM? When I reload a site, is content being pulled purely from a local cache on my machine?

There is a distinction. When you generate a tweek, the page is captured and sent to an LLM. There is no way around this. You can't generate a modification for a site you cannot see.

The result of a generation is a static script that applies to the page across reloads (unless you disable it). When you apply a tweek, everything is local, there is no dynamic server communication.

Hopefully that is all helpful! I need to get to other replies, but I will try to return to finish up your business questions (those are the most boring anyway)

-- Edit: I'm back! --

> 6. Is this (or will it be) open source? IMO a large component of empowering the user against enshittification is open source. As compute commoditizes it will likely be open source that is the best hope for protection against the overlords.

It is very important to me that people trust us. I can say that we don't do X, Y, Z with your data and that using our product is safe, but trust is not freely given (nor should it be). We have a privacy policy, we have SOC 2, and in theory, you could even download the extension and dig into the code yourself.

Open source is one way to build trust. However, I also recognize that many of these "overlords" you speak of are happy to abuse their power. Who's to say we don't open our code, only to have e.g. OpenAI fork it for their own browser? Of course, we could use restrictive licenses, but lawsuits haven't been particularly protective of copyright lately. I am interested in open-sourcing parts of our code (and there certainly is hunger for it in this post), but I am cognizant that there is a lot that goes into that decision.

> 7. What is your revenue model? If your product essentially wrestles control from site owners and reduces their optionality for revenue, your arbitrage is likely to be equal or less than the sum of site owners' loss (a potentially massive amount to be sure). It's unclear to me how you'd capture this value though, if open source.

The honest answer is TBD. I would push back on your claim that we wrestle control from site owners and reduce their optionality for revenue. While there likely will be users who say "hide this ad" (costing the site revenue), there are also users who say "move this sidebar from left to right" or "I use {x} button all the time but it is hidden three menus in; place it prominently for easy access". I'd argue the latter cases are not negative for the site owners; they could be positive sum. Maybe we even see a trend that 80% of users make this UX modification on Z site. We could go to Z site and say, "Hey, you could probably make your users happy if you made this change". Maybe they'd even pay us for that insight?

Again, the honest answer is that I'm not certain about the business model. I am a lover of positive sum games. And in the moment, I am building something that I enjoy using and hopefully also provides value to others.

> 8. Interested in the cost and latency. If this essentially requires an LLM call for every website I visit, this will start to add up. Also curious if this means that my cost will scale with the efficiency of the sites I visit (i.e. do my costs scale with the size of the site's content).

As I noted above, this does not require an LLM call for every website you visit. You are correct that that would bankrupt us very quickly! An LLM is only involved when you actively start a generation/update request. There is still a cost and it does scale with the complexity of the site/request, but it is infinitely more feasible than running on every site.

In the future, we may extend functionality so that the core script that is generated can itself dynamically call LLMs on new page loads. That would enable things like "filter political content from my feed", which requires test-time LLM compute to dynamically categorize on each load (it can't be hard-coded in a once-generated static script). That would likely have to be done locally (e.g. Google actually packages Gemini Nano into the browser) for both cost and latency reasons. We're not there yet, and there is a lot you can do with the extension today, but there are definitely opportunities to build really cool stuff, way beyond Greasemonkey.
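A rough sketch of what that could look like (hypothetical: `classifyLocally` is a stand-in for an on-device model call, and `[data-feed-item]` is an illustrative selector):

    // Dynamic filter: classify each feed item on page load and hide matches.
    async function filterFeed(filterPrompt) {
      for (const item of document.querySelectorAll("[data-feed-item]")) {
        const matches = await classifyLocally(filterPrompt, item.innerText);
        if (matches) item.style.display = "none";
      }
    }
    filterFeed("political content");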

Wow, you really put me to work with this comment. Appreciate all the great questions!

potatowaffle · a month ago
I love the idea and the execution. The onboarding experience is great as well. Thanks for sharing. I am curious about SOC 2: how much effort did you put in to acquire it, and what made you decide to pursue it?
danudey · a month ago
> We could go to Z site and say, "Hey, you could probably make your users happy if you made this change". Maybe they'd even pay us for that insight?

My honest opinion:

1. No site would pay for that insight

2. Every site should pay for that insight

Part of the problem is that a lot of companies fall into one of two categories:

1. Small companies that don't have the time/energy/inclination to make changes, even if they're simple; often they're not even the ones making the website itself, and they aren't going to want to pay the company who made the site originally to come back and tweak it based on what a small, self-selecting group of users decided to change.

2. Large companies who, even if they did care about what that small, self-selecting group of users wanted to change, have so many layers between A and Z that it's nearly impossible to get anything done without a tangible business need. No manager is going to sign off on developer and engineer time and testing because 40% of 1% of their audience moves the sidebar from one side to the other.

Also:

1. Designers are opinionated and don't want some clanker telling them what they're doing wrong, regardless of the data.

2. Your subset of users may have different goals or values; maybe the users more likely to install this extension and generate tweaks don't want to see recommended articles or video articles or 'you may like...' or whatever, but most of their users do and the change would turn out to be a bad one. Maybe it would reduce accessibility in some way that most users don't care about, etc.

If I had to pick a 'what's the value of all this', I would say that it's less about "what users want from this site" vs. "what users want from sites". For example, if you did the following:

1. Record all the prompts that people create that result in tweaks that people actually use, along with the category of site (banking, blogs, news, shopping, social media, forums); this gives you a general selection of things that people want. Promote these to other users to see how much mass appeal they have

2. Record all the prompts that people create that result in tweaks that people don't actually use; this gives you a selection of things that people think they want but it turns out they don't.

3. Summarize those changes into reports.

Now you could produce a 'web trend report' where you can say:

1. 80% of users are making changes to reduce clutter on sites

2. 40% of users are disabling or hiding auto-play videos

3. 40% of people in countries which use right-to-left languages swap sidebars from one side to another even on left-to-right-language websites

4. The top 'changed' sites in your industry are ... and the changes people make are ...

5. The top changes that people make to sites in your industry are ... and users who make those changes have a 40% lower bounce rate / 30% longer time-on-site / etc. than users who don't make those changes.

On top of that, you could build a model trained on those user prompts that companies could pay (somehow?) to run their sites through, getting suggestions for changes that would satisfy these apparent user needs or preferences without sacrificing their own goals for the website - e.g. users remove auto-playing videos because they're obnoxious, but the company is trying to promote its video content, so maybe the model could find a middle ground that presents the video in a way that's less obnoxious but still generates engagement.

That's what I think anyway, but I'm not in marketing or whatever.

thefourthchime · a month ago
Can you answer question 7?
glenstein · a month ago
Looks great, and a brilliant idea to bring back the Greasemonkey way of doing things. Also, perhaps the first practical use case for LLM-in-the-browser I've seen in the wild (sidebars or AI startpages are very half-posterior'd ideas for what AI in the browser should mean imo).

Like some others here, Firefox is my daily driver, and I'd look forward to anything you could bring our way.

jmadeano · a month ago
Thanks! I've tried my share of agentic browsers, sidebars, etc. Most of them don't work that well, and even as they get better, I am just generally not sold on the vision. Sure, there are some "chores" I need to do on the web that I wouldn't mind automating/offloading, but I also genuinely enjoy browsing the web. I don't want a future where AI agents do all the browsing for us.

So we built this to hopefully make browsing the web more enjoyable for us humans that remain :)

And I'm with you on Firefox. I'd love to be able to go back to Firefox as my daily driver. Will try to prioritize it!

bambax · a month ago
> bring back the Greasemonkey way of doing things

Greasemonkey still works great, no?

glenstein · a month ago
It certainly may, I'm not sure. I think the ecosystem was at its apex when userscripts.org had a browseable library of scripts that even laypeople could install with a click. It was like a second ecosystem of browser extensions.

My understanding is that it's a bit more of a fragmented ecosystem now but I could be wrong.

charlesabarnes · a month ago
It's a great idea, but I'm cautious about installing it because I don't know how you'd monetize this for the long haul. I'd love to hear your thoughts on local models vs something hosted for this.
jmadeano · a month ago
I'm a big fan of local myself, but unfortunately the local models aren't there yet. Even among the closed-source models, many surprisingly struggle with relatively simple requests in this domain.

Don't get me wrong, there are a lot more iterations of tool + prompt + agent flow updates we can and will do to make things even better, and the models will keep getting better themselves, but the task is non-trivial. If you download the raw HTML of a webpage, it's a messy jungle, and it's frankly impressive that the models are capable of doing anything useful with it.

fwip · a month ago
Especially with the permissions you necessarily grant to this extension! The easiest way to monetize this is to sell it to somebody who will exfiltrate all your banking data with an invisible auto-update.
jmadeano · a month ago
Totally hear you on the permissions/access. I'd love to request fewer permissions, but the Chrome store doesn't support that kind of granular permission granting.

In order for us to be able to execute your scripts that do powerful things (send notifications, save to local storage, download things, etc.), our extension needs to have those permissions itself. Google doesn't have any way for us to say our extension itself only requires permissions x, y, z but give this user script permissions j, k, l.
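Concretely, the manifest has to declare the superset of capabilities up front, something like this (an illustrative MV3 snippet, not our exact manifest):

    {
      "manifest_version": 3,
      "permissions": ["notifications", "storage", "downloads", "scripting"],
      "host_permissions": ["<all_urls>"]
    }

There's no per-userscript scoping below that.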

Your browsing/page data is yours. That data is only accessed when you explicitly request to generate a script (i.e. can't generate a script to modify a page without seeing that page).

jdprgm · a month ago
I had basically this exact idea too a few months ago and at the time already found a few implementations attempting it. https://robomonkey.io/ being one example I found so didn't pursue it further.

Also it turns out LLMs are already very good at just generating Violentmonkey scripts for me with minimal prompting. They're also great for quickly generating full-blown minimal extensions with something like WXT when you run into userscript limitations. These are kind of the perfect projects for coding with LLMs, given the relatively small context of even a modest extension, and certainly of a userscript.

I am a bit surprised YC would fund this as I think building a large business on the idea will be extremely difficult.

One angle I was/am considering that I think could be interesting would be truly private and personal recommendation systems using LLMs that build up personal context on your likes/dislikes and that you fully control and can own and steer. Ideally local inference, and basically an algo that has zero outside business interests.

jmadeano · a month ago
Great minds think alike :) I think it is important for users to have more control over how they browse the internet, so I'm happy to see others building in the space!

> Also it turns out LLMs are already very good at just generating Violentmonkey scripts for me with minimal prompting. They're also great for quickly generating full-blown minimal extensions with something like WXT when you run into userscript limitations.

We've thought about full-blown extensions and maybe we'll get there, but I'd wager that there is a gap between users who would install/generate a userscript vs a full-blown extension. Also, a one-click-install userscript is much simpler to share vs a full Chrome store submission/approval (the approval time has been a pain for many developers I've talked with). With that said, this is early days and we're still figuring out what people want.

> One angle I was/am considering that I think could be interesting would be truly private and personal recommendation systems using LLMs that build up personal context on your likes/dislikes and that you fully control and can own and steer. Ideally local inference, and basically an algo that has zero outside business interests.

I've definitely considered the idea of your own personal, tunable recommendation system that follows you across the web. And I have some background there (I worked on recommendation systems at Pinterest), but recommendation systems are very data-hungry (unless we regress to the XGBoost days), and the task of predicting whether a user will like an image (binary) is vastly easier than operating over an entire page UI. Definitely not impossible, but we aren't there yet. For now, I just want to make it super easy for you to generate your own useful page mods.

AJ007 · a month ago
Maybe I'm going too far from the tipping point of "this is easy" when it actually isn't, but the ability to clone an open source project now, modify some part of it, and then compile it locally seems like the future. This is almost trivial to do now.

Why not do the same for the web?

Without going off on a rant about all of the user-hostile bullshit that's being shoved down our throats right now, I think one inevitable outcome of AI is that users are going to need defensive local AI agents protecting them and fact-checking data. This is the trojan horse for the big tech companies that rely on ad revenue and dark patterns to manipulate their users: if they provide the AI agents, they will be obviously inferior and not super-intelligent, just like when Google's early public image-gen model was making images of ethnically and gender-diverse Nazi soldiers, etc.

gagik_co · a month ago
Yeah, I have thought about a more user-friendly Violentmonkey before. This sort of thing just needs to be open source and non-profit; there isn't even much upkeep to it. At what point will the investors want some form of return?

This is built from the system that created enshittification in the first place; a cleaner web is definitely not going to come from a startup.

ggsp · a month ago
Great idea, great execution on your landing page (the onboarding experience is really well done) and great job on answering questions in this thread. Also, +1 on building a Firefox version.

Since I also have to use Chrome for an extension I'm developing, I pinned Tweeks and will likely reach for it every so often to actually test how well it does, but the demos definitely impressed me.

Out of curiosity, how much, if any, of this did you vibe code?

jmadeano · a month ago
> Great idea, great execution on your landing page (the onboarding experience is really well done)

Thank you! As others pointed out here, we admittedly didn't invest much in the "landing page" aspect, but I did work hard to make a great onboarding experience. Glad it shone through.

> Since I also have to use Chrome for an extension I'm developing

We're in the same boat, I wouldn't be using Chrome if not for this extension. Great to see HN has a strong cohort of Firefox users!

> Out of curiosity, how much, if any, of this did you vibe code?

A lot of the elements of the extension, backend, and even the onboarding page integrations push at the boundaries of what tools like Codex and Claude Code can do right now.

We do believe in the tech (in some regards, the extension is powered by similar tech), and we are power users of both, but we also know when Claude Code has said "You're absolutely right" one too many times and we need to dig in and get our hands dirty.

pkamb · a month ago
The extension I've always wanted is one that makes every link to a modern story on the New York Times, CNN, ESPN, etc. load using their same websites from like 2004.

https://web.archive.org/web/20041207071752/http://www.cnn.co...

Make every new page I load look like this, or a slightly cleaned up or mobile-specific version.

jmadeano · a month ago
I tried "Make this page look like the 2004 version of CNN" and it did update the theming to be older but it didn't have the true reference, so it just made up the old style.

We don't currently have a way to provide a reference during generation, but if I were you, I'd personally try downloading the old page archive and a new page archive, throw them both in codex/claude code as context and see what it can come up with. Wouldn't be surprised if it could do a decent job writing a converter.

Somewhat tangential: I have had luck with more generic retheming, e.g. "Turn Google into a 1970s-era CLI" (https://www.tweeks.io/share/script/8c8c0953f6984163922c4da7) or "Turn LinkedIn into a 90s-era Neocities site". These aren't the most useful, but they are fun!

eejdoowad · a month ago
Cool idea and onboarding experience. I spun up Chrome to demo it, and although it's got the rough edges of a prototype, the potential is there.

I created a rule to remove thumbnails and shorts from YouTube, and after a few failed attempts, it succeeded! But there were massive tracts of empty space where the images were before. With polish and an accessible way to find and apply vetted addons so that you don't have to (fail at) making your own, I would consider using it.

My daily driver is Firefox, where I've set up custom uBlock Origin cosmetic rules to personalize YouTube by removing thumbnails, short, comments, images, grayscaling everything except the video, etc. My setup works great for me, but I can't easily share it with other people who would find it useful.
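For anyone curious, the cosmetic rules look roughly like this (the element names are from memory and may drift as YouTube updates its markup):

    www.youtube.com##ytd-thumbnail
    www.youtube.com##ytd-reel-shelf-renderer
    www.youtube.com##ytd-comments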

jmadeano · a month ago
> Cool idea and onboarding experience. I spun up Chrome to demo it, and although its got the rough edges of a prototype, the potential is there.

I'd love to hear more about the rough edges. We're working hard to polish everything up! Would you be willing to share the script you generated so that I can take a closer look? And any other suggestions are welcome :)

> an accessible way to find and apply vetted addons so that you don't have to (fail at) making your own

This is on the immediate roadmap. We just shipped V0 of the share/profile and sharing + discoverability are going to play an important role in upcoming launches.

Let's say it perfectly one-shotted your request for YouTube: would you be more likely to generate more scripts yourself, or still lean toward vetted and relevant existing tweeks?

abe94 · a month ago
Very cool - I've wanted something like this for a while. I currently use a patchwork of site specific extensions, so will definitely give this a go

Something in a similar vein that I would love would be a single feed you have control over, powered by an extension. You can specify in plain english what the algorithm should exclude / include - that pulls from your fb/ig/gmail/tiktok feeds.

jmadeano · a month ago
> I currently use a patchwork of site specific extensions, so will definitely give this a go

Hopefully we can help you trim down the patchwork. If they're broadly useful, I'm happy to assist with the creation to share with others. Just let me know!

> You can specify in plain english what the algorithm should exclude / include - that pulls from your fb/ig/gmail/tiktok feeds.

This is the holy grail. Admittedly, we aren't there yet, but something like that might be possible as we keep building.