I don't think your account is AI based on just these few comments, but I would point out that most rubrics one might use to decide what is "obviously AI" would probably end up flagging the way you talk.
If there were a truly accurate tell, some algorithm you could feed a few sentences and have it tell you "yep, this is 100% AI", then sure, use that. But I don't know that you could realistically build such a machine, especially when it comes to generated text.
Once men turned their thinking over to machines
in the hope that this would set them free.
But that only permitted other men with machines
to enslave them.
...
Thou shalt not make a machine in the
likeness of a human mind.
-- Frank Herbert, Dune
You won't read, except the output of your LLM. You won't write, except prompts for your LLM. Why write code or prose when the machine can write it for you?
You won't think or analyze or understand. The LLM will do that.
This is the end of your humanity. Ultimately, the end of our species.
Currently the Poison Fountain (an anti-AI weapon, see https://news.ycombinator.com/item?id=46926439) feeds 2 gigabytes of high-quality poison (free to generate, expensive to detect) into web crawlers each day. Our goal is a terabyte of poison per day by December 2026.
Join us, or better yet: deploy weapons of your own design.
What's more, if I even touch them while scrolling, it triggers the "download app" screen, even when I don't explicitly tap. This is new as of a few weeks ago.
The corporation you're citing named "Pangram" cannot confirm anything of the sort. They only make claims, like the ones in your screenshot.
Indeed, this very "citation" of the AI-generated output of Pangram Inc.'s product is a good example of outsourcing work to an LLM without verifying it.
Are there specific parts of the article that are inaccurate or misleading? If so, please say; it would be very interesting and would add to the discussion.
Also, most of the suggestions in the AI-generated section are simply useless. While I think this law is terrible, the suggestions completely contradict what the lawmakers intend. I'll explain what I mean using some of the suggestions provided.
> Narrow the Scope to Intent, Not the Tool
This is essentially a suggestion to throw out the entire law as written. Sure, but this is meaningless advice to lawmakers.
> Drop Mandatory File Scanning
This is the same suggestion as before but rephrased.
> Exempt Open-Source and Offline Toolchains
This is asking them to create a massive loophole in their own law, rendering it useless. Once again, it is essentially asking them to throw out the entire law.
> Add safe harbor for sellers and educators who don’t modify equipment or participate in unlawful manufacture.
Two fundamentally different concepts are jammed into one idea here. Do you want to add a safe harbor for sellers who don't modify equipment, or do you want to throw out the entire law and have it not apply to anybody who doesn't participate in unlawful manufacture? These are very different ideas; it makes no sense to treat them as one cohesive concept.
All of these are signals that not much thought went into this. If a human had used AI for ideas and writing assistance, but participated in the writing process as an active contributor, I think they would have caught things like this. I don't think they would have chosen to make multiple bullet points semantically identical. I think they would have chosen to actually cite specific aspects of the law and propose concrete solutions.
Another example: one of their suggestions is to expand the working group to include specific members, which is genuinely a fairly good idea. Having actually read the law, I would have cited the specific passage, which requires that the working group "SHALL INCLUDE EXPERTS IN ADDITIVE MANUFACTURING TECHNOLOGY, ARTIFICIAL INTELLIGENCE AND DIGITAL SECURITY, FIREARMS REGULATION, PUBLIC SAFETY, CONSUMER PRODUCT SAFETY, AND ANY OTHER RELEVANT DISCIPLINES DETERMINED BY THE DIVISION TO BE NECESSARY TO PERFORM THE FUNCTIONS PRESCRIBED HEREIN." I would question: who do they consider to be experts in additive manufacturing? Why does it seem that the working group will be far more heavily weighted towards policy experts rather than 3D printing experts? The article suggests that "standards will default to large vendors", yet there is no evidence here that vendors will be included at all.