As the jewelry industry repositions around the uniqueness of natural diamonds I would expect to see more promotion of this kind of socially responsible production.
What I find most interesting is the weight put on the ethical side. I think it’s overstated. When the issue became big with the Blood Diamond movie, sales of lab-grown did not markedly increase. It took another decade or so for them to become prevalent. What changed over that time is the price; IIRC, lab-grown was priced comparably to natural around the time the movie came out. Ethics were not compelling enough for most people at that price. When prices dropped to about 50% of natural, the choice became much more compelling. Now that it’s around 10%, buying natural isn’t even a real consideration for many people.
Anyways, I think people use the Blood Diamond talking point as a socially acceptable reason (it’s what they tell their parents and grandparents who might judge them), but in reality it’s almost completely a financial decision. If the tables were turned and natural diamonds became 1/10th the cost of lab-grown, the market would flip back practically overnight.
Data that can be used against my children is another.
My late wife had MS. It took her. Insurance companies would love that data to load against anything my kids do.
There are other issues, but the fact is that companies will use DNA and every other data point they can to maximise what they take in, and to minimise, with loaded terms, what they might, just might, maybe, pay out.
It's not about the now.
It's about the later.
So it’s pointless in the end.
I only use AI for small problems rather than letting it orchestrate entire files.
Then it became hip, and people would hand-roll machine-specific assembly code. Later on, it became too onerous when CPU architecture started to change faster than programmers could churn out code. So we came up with compilers, and people started coding at a higher level of abstraction. No one lamented the lost art of assembly.
Coding is just a means to an end. We’ve always searched for better and easier ways to convince the rocks to do something for us. LLMs will probably let us jump another abstraction level higher.
I too spent hours in the early days looking for the right PHP or Perl snippet to do something. My hard-earned bash-fu is mostly useless now. Am I sad about it? Nah. Writing bash always sucked, who am I kidding. Same with regex: I never learned it properly, and it doesn’t appeal to me. So I’m glad these whatever-machines are helping me do this grunt work.
There are sides of programming I like, and implementation isn’t one of them. Once upon a time I couldn’t have cared less about the binary streams ticking through the CPU. Now I’m excited about the probable prospect of not having to think as much about “higher-level” code either, and jumping even higher.
To me, programming is more like science than art. Science doesn’t care how much profundity we find in the process. It moves on to the next thing for progress.
In its current state, AI in my workflow is a decent replacement for a search engine and Stack Overflow. But it has far greater pitfalls, as OP pointed out (it presents its code as if it were always 100% accurate and will “fake” APIs).
It’s sad because I miss printed content like tech magazines.