Readit News
throwawaybbq1 commented on Section 174 is reversed, mostly   newsletter.pragmaticengin... · Posted by u/jawns
throwawaybbq1 · 5 months ago
Does this mean a huge hiring uptick in the US and a reversal of layoffs? I do think this law contributed to the weak job market. Will undoing it get us back to where we were?
throwawaybbq1 commented on H-1B program grew 81 percent from 2011 to 2022   twitter.com/USTechWorkers... · Posted by u/DonnyV
jjtheblunt · 5 months ago
As an American who worked in Silicon Valley for years, but is not from there, I'd say it's also the "captive real estate customers" created by H-1B holders.
throwawaybbq1 · 5 months ago
Can you clarify this? Are you saying non-H-1Bs would not pay SV's crazy-high housing rents?
throwawaybbq1 commented on Launch HN: Vassar Robotics (YC X25) – $219 robot arm that learns new skills    · Posted by u/charleszyong
yardie · 6 months ago
Of course this arrives right after I ordered all the electronic parts and just kicked off the 24+ hour 3D print job to complete my SO-Arm101.

But I’m rooting for you!

throwawaybbq1 · 6 months ago
Curious where you sourced the parts? In Canada, shipping kills it for me. When I priced out the robot + electronics + $100 in shipping, I'm at around $700, a far cry from the $100 on the "sticker".
throwawaybbq1 commented on Kirin demos "electric salt spoon" at CES   techcrunch.com/2025/01/05... · Posted by u/interestica
throwawaybbq1 · a year ago
My taste buds have become extremely muted after covid bouts and other sinus issues. Wonder if this would be helpful.
throwawaybbq1 commented on Dockworkers at ports from Maine to Texas go on strike   apnews.com/article/port-s... · Posted by u/mikeocool
navane · a year ago
We've had stagnant incomes for the last 50 years. The fruits of automation are not shared with the workers.

The small minority that holds a country by the balls is not the unions but the owning class. The 2008 crash that put the whole world into a decade-long recession is collateral damage.

throwawaybbq1 · a year ago
In the US? I don't think what you are saying is supported by real data. My understanding is that US workers did see an improvement in incomes in the last decade, but Canadian workers did not.

What makes life better for everyone is competition. Canada's stagnation can be summed up in a single phrase: lack of competition. Generally, the US has been a free-for-all when it comes to competition, and hence its populace enjoys some of the best living standards.

I'll also relate my experience riding the subway in Asia vs. Manhattan. Asian transit seems space-age compared to what we have in the West. I think UBI won't save us, as the income must come from somewhere, and hiking taxes kills incentives. The better way is to have more freedom and efficiency, in my humble opinion.

throwawaybbq1 commented on Dockworkers at ports from Maine to Texas go on strike   apnews.com/article/port-s... · Posted by u/mikeocool
dopamean · a year ago
I'm about as pro union as anyone I know. I'm not blind to the problems with unions but I do think that generally the benefits outweigh the costs. This situation has me pretty torn however. I specifically take issue with the demand for a complete ban on automation. This just seems unreasonable and anti progress. I understand that automation costs jobs and a union's primary responsibility is to protect the jobs of its members but what are we really supposed to do here? Continue with antiquated processes that affect an entire economy just to protect the jobs of a relatively small number of people?

I get that that's unfortunate and perhaps a very serious problem for people working in that industry but what choice do we really have? I'm reminded of something a teacher said to me in high school about how one day many of my classmates and I would have jobs that didn't exist when he was a kid. Isn't that how this works? As time goes on some jobs go away and new jobs come about and there is some pain in the interim? I'm all for figuring out some way to ease that pain for the people in the transitional period but I don't know who's really responsible for that.

throwawaybbq1 · a year ago
In the limit, this is like arguing against the use of wheels. Automation improves labor productivity. Economies that have not invested in capital have seen labor productivity and incomes stagnate. This is a current debate in Canada (especially when compared with the US's productivity and income gains over the last decade). Canada does have strong unions, so I wonder if this is related.

Another thing that seems troubling is how a small group of people can hold a majority of the country by the bXlls. Given that this is an election year, I can see this turning into a huge fiasco. The rest of the economy is collateral damage.

throwawaybbq1 commented on GGUF, the Long Way Around   vickiboykis.com/2024/02/2... · Posted by u/Tomte
rahimnathwani · 2 years ago
Why are you horrified?

In designing software, there's often a trade off between (i) generality / configurability, and (ii) performance.

llama.cpp is built for inference, not for training or model architecture research. It seems reasonable to optimize for performance, which is what ~100% of llama.cpp users care about.

throwawaybbq1 · 2 years ago
GGUF files seem to be proliferating. I think some folks (like myself) incorrectly assume the format has more portability/generalizability than it actually has. Hence, the horror!
throwawaybbq1 commented on GGUF, the Long Way Around   vickiboykis.com/2024/02/2... · Posted by u/Tomte
rahimnathwani · 2 years ago
It seems like a lot of innovation is around training, no? GGML (the library that reads GGUF format) supports these values for the required 'general.architecture':

  llama
  mpt
  gptneox
  gptj
  gpt2
  bloom
  falcon
  rwkv

throwawaybbq1 · 2 years ago
I've also been trying to figure out GGUF and the other model formats going around. I'm horrified to see there are no model architecture details in the file! As you say, it seems they are hard-coding the above architectures as constants. If a new hot model comes out, one would need to update the reader code (which implements the new model architecture). Am I understanding this right?
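
For what it's worth, here's a minimal sketch of what that metadata looks like on disk, assuming the v2/v3 header layout described in the GGUF spec (magic, version, tensor count, KV count, then length-prefixed key/value pairs). The architecture lives in a plain string key like 'general.architecture', so the reader still has to ship code for every architecture it can run:

  # Peek at a GGUF header -- illustrative only, not a full parser.
  import struct
  import sys

  def peek_gguf(path):
      with open(path, "rb") as f:
          magic = f.read(4)  # should be b"GGUF"
          if magic != b"GGUF":
              raise ValueError(f"not a GGUF file: {magic!r}")
          version, = struct.unpack("<I", f.read(4))
          n_tensors, = struct.unpack("<Q", f.read(8))
          n_kv, = struct.unpack("<Q", f.read(8))
          print(f"GGUF v{version}: {n_tensors} tensors, {n_kv} metadata keys")
          # Each metadata entry is a length-prefixed UTF-8 key followed by
          # a value type tag and the value; read just the first key here.
          key_len, = struct.unpack("<Q", f.read(8))
          key = f.read(key_len).decode("utf-8")
          value_type, = struct.unpack("<I", f.read(4))
          print(f"first key: {key!r} (value type id {value_type})")

  if __name__ == "__main__":
      peek_gguf(sys.argv[1])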

I'm also a bit confused by the quantization aspect. This is a pretty complex topic. GGML seems to use 16-bit as per the article. If I were pushing it to 8-bit, would I actually see a size improvement in the GGML file? The article says they encode quantization versions in that file. Where are they defined?
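
Naively (assuming the weights dominate the file and quantization only changes the bits per weight, ignoring per-block scale factors and any mixed-precision layers), the file size should scale with the bit width:

  # Rough size estimate for a hypothetical 7B-parameter model, ignoring
  # per-block scales and metadata overhead.
  def approx_size_gb(n_params, bits_per_weight):
      return n_params * bits_per_weight / 8 / 1e9

  for bits in (16, 8, 4):
      print(f"{bits:>2}-bit: ~{approx_size_gb(7e9, bits):.1f} GB")
  # -> 16-bit: ~14.0 GB, 8-bit: ~7.0 GB, 4-bit: ~3.5 GB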

throwawaybbq1 commented on ChatGPT went berserk   garymarcus.substack.com/p... · Posted by u/RafelMri
mark_l_watson · 2 years ago
Mixtral 8x7b continues to amaze me, even though I have to run it with 3-bit quantization on my Mac (I only have 32 GB of memory). When I run this model on commercial services with 4 or more bits of quantization, I definitely notice, subjectively, better results.

I like to play around with smaller models and regular app code in Common Lisp or Racket, and Mistral 7b is very good for that. Mixing and matching old fashioned coding with the NLP, limited world knowledge, and data manipulation capabilities of LLMs.

throwawaybbq1 · 2 years ago
This is neat to know. On Ollama, I see mistral and mixtral. Is the latter one the MoE model?
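
From the model library it looks like it is: the mixtral tag points at the 8x7B mixture-of-experts model, while mistral is the dense 7B. A quick way to poke at both with the official Python client, assuming a local Ollama server and that both models have been pulled (e.g. ollama pull mixtral):

  # Query both models via the Ollama Python client -- assumes a local
  # Ollama server and that the models have already been pulled.
  import ollama

  for model in ("mistral", "mixtral"):
      reply = ollama.chat(
          model=model,
          messages=[{"role": "user", "content": "Describe yourself in one sentence."}],
      )
      print(model, "->", reply["message"]["content"])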

u/throwawaybbq1

Karma: 305 · Cake day: June 15, 2017