Tesla did not release any new cars, except for the Cybertruck, for how long? 5 years? 10 years?
Their lineup was great initially, and there was zero competition. Now there is a lot of competition, and their lineup has not changed at all.
Their car business is dying. That's why they are trying to become an AI & robotics company.
Edit: Here is a good link to follow the sales data - for many countries, it's reported daily. https://eu-evs.com/brandCharts/TESLA/ALL_DAILY/QoQ-Chart
https://www.wsj.com/lifestyle/careers/computer-science-major...
"Between 2018 and 2023, the number of students majoring in computer and information science jumped from about 444,000 to 628,000."
Around 40% of MIT graduates are now in CS https://alum.mit.edu/slice/conversation-new-computing-dean-a...
Further, COVID has reduced a lot of friction for remote work, so now there is also global competition for these jobs.
CS (along with ECE/EECS) degrees have been watering down their curricula for a decade by reducing the number of hardware, low-level, and theory courses that remain requirements abroad.
Just take a look at the curriculum changes for the CSE major (course 6-3) at MIT in 2025 [0] versus 2017-22 [1] versus pre-2017 [2] - there is a steady decrease in the amount of table-stakes EE/CE content like circuits, signals, computer architecture, and OS dev (all of which are building blocks for cybersecurity and ML) and an increased amount of math.
Nothing wrong with increasing the math content, but reducing the ECE content in a CSE major is bad given how tightly coupled software is with hardware. We are now at a point where an entire generation of CSE majors in America does not know what a series or parallel circuit is.
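For reference, the series/parallel distinction in question is one-line arithmetic. A minimal sketch (resistor values made up for illustration):

```python
def series(resistances):
    # Series: resistances simply add.
    return sum(resistances)

def parallel(resistances):
    # Parallel: reciprocals add, then invert the total.
    return 1 / sum(1 / r for r in resistances)

print(series([100, 220]))    # 320 ohms
print(parallel([100, 100]))  # 50.0 ohms
```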
And this trend has been happening at every program in the US over the past 10 years.
I CANNOT JUSTIFY building a new grad pipeline in cybersecurity, DevSecOps, CloudSec, MLOps, Infra Silicon Design, or ML Infra with people who don't understand how a jump register works, the difference between BPF and eBPF, or how to derive a restricted Boltzmann machine (for my ML researcher hires) - not because they need to know it on the job, but because it betrays a lack of fundamental knowledge.
I can find new grad candidates with a similar profile at a handful of domestic CS programs (Cal included). But (Cal specific) someone with a BA CS from LAS who never touched CS152, CS161, CS162, or CS168 isn't getting hired into the early-career pipeline for a security startup if they took CS160, CS169L, or CS169A instead because those are "easier", and isn't getting hired as a junior MLE if they didn't take the more theoretical undergrad ML classes at Cal like CS182, CS185, CS188, and CS189. It's even worse if they are a BA DS without a second fundamentals-heavy major like AMATH or IEOR.
[0] - https://eecsis.mit.edu/degree_requirements.html#6-3_2025
[1] - https://eecsis.mit.edu/degree_requirements.html#6-3_2017
[2] - https://www.scribd.com/document/555216170/6-3-roadmap
-------------
Edit: can't reply so replying here
> Give me a new grad with strong fundamentals, a love of programming, and an interest in the domain and I'll teach them in six months whatever they missed in college that's relevant to the job
I 100% agree. A lot of core foundational classes that, at the very least, build a problem-solving mindset are either no longer offered or have had their curriculum and content severely reduced.
> until the implication that it's learning the nitty-gritty details that's important.
Not what I meant. What I mean is you can't understand or ramp up on (e.g.) eBPF without understanding how the Linux kernel, syscalls, and registers work. If you don't have the foundations down, I can't justify spending $120k base plus 30% in benefits and taxes hiring you out of college.
> These are kind of oddly specific criteria

I'm giving random examples from individual portfolio companies.
> Are those really things you think new grads need to know
This is the kind of curriculum a new grad from Cal (be they on F1 OPT or a citizen) is competing with when my portfolio companies hire new grads.
TAU - https://exact-sciences.m.tau.ac.il/yedion/2021-22/computer_s...
IITD - https://www.cse.iitd.ac.in/academics/btech_links/curriculum....
Uniwersytet Warszawski - https://informatorects.uw.edu.pl/en/programmes-all/IN/S1-INF...
Babeş-Bolyai University - https://cci.ubbcluj.ro/wp-content/uploads/2022/04/Curricula-...
There is a level of mathematical and hardware-software maturity built into top programs abroad that makes it hard to justify hiring new grads domestically.
In Israel, India, much of Eastern Europe, and China - all universities follow the same curriculum as defined by their Ministries of Education.
Ask Opus or Gemini 2.5 Pro to write a plan. Then ask the other to critique it and fix mistakes. Then ask Sonnet to implement it.
Are you aware of any user interfaces that expose some limited ChatGPT-like functionality, but internally use llm? This is for my non-techie wife.
I'll switch to o4-mini when I'm writing code, but otherwise 4.1-mini usually does a great job.
Fun example from earlier today:
    llm -f https://raw.githubusercontent.com/BenjaminAster/CSS-Minecraft/refs/heads/main/main.css \
      -s 'explain all the tricks used by this CSS'
That's piping the CSS from that incredible CSS Minecraft demo - https://news.ycombinator.com/item?id=44100148 - into GPT-4.1 mini and asking it for an explanation. The code is clearly written but entirely uncommented: https://github.com/BenjaminAster/CSS-Minecraft/blob/main/mai...
GPT-4.1 mini's explanation is genuinely excellent: https://gist.github.com/simonw/cafd612b3982e3ad463788dd50287... - it correctly identifies "This CSS uses modern CSS features at an expert level to create a 3D interactive voxel-style UI while minimizing or eliminating JavaScript" and explains a bunch of tricks I hadn't figured out.
And it used 3,813 input tokens and 1,291 output tokens - https://www.llm-prices.com/#it=3813&ot=1291&ic=0.4&oc=1.6 - that's 0.3591 cents (around a third of a cent).
I get that you want to be "open", but is everyone involved in these transactions OK with them being shared? Even if they are, this doesn't seem like a good idea security-wise. I see partial account numbers and other IDs/numbers that I assume you'd prefer not be public, regardless of how insensitive they may seem now.
    EXPENSIFY, INC. VALIDATION XXXXXX5987 THE HACK FOUNDATION          +$0.89
    FRONTING $10,000 TO CHRIS WALKER FOR GITHUB GRANTS MADE
      FROM PERSONAL ACCOUNT                                       -$10,000.00
    CHECK TO LACHLAN CAMPBELL                                        +$800.00
    Transfer to Emma's Earnings                                    -$1,923.08
Not sure if all the organizations using their software know this.