It seems like the Fannie Mae data was shared with Freddie Mac. Aren't they both quasi-government organizations? GSEs. So they're both supported by the government but there's a firewall between them to keep some semblance of competition?
Sector | Company 1 | Company 2
Information Technology | Microsoft (MSFT) | Apple (AAPL)
Financials | JPMorgan Chase (JPM) | Berkshire Hathaway (BRK.B)
Health Care | Johnson & Johnson (JNJ) | UnitedHealth Group (UNH)
Consumer Discretionary | Amazon (AMZN) | Tesla (TSLA)
Communication Services | Alphabet (GOOGL) | Meta (META)
Industrials | Boeing (BA) | Caterpillar (CAT)
Energy | ExxonMobil (XOM) | Chevron (CVX)
It's worth being specific: the National Weather Service operates some of the most robust automation and radar ingest pipelines on Earth, but the final go/no-go warning call is almost always human, often a single overnight forecaster on a console monitoring a swath of counties. Automation (e.g., Warn-on-Forecast guidance) can surface threats, but the NWS intentionally doesn't have an 'auto-warn' button for tornadoes, because false alarms are asymmetric: they erode credibility and, over the long run, cost lives.
Budget cuts reduce redundancy and experience on those overnight shifts. When you have only one person monitoring instead of a team of two or three, you get decision fatigue and coverage holes, especially during clustered, multi-cell outbreaks. We've seen near-misses in the past, and every professional meteorologist I know says they're playing defense against process errors, not just technology failures.
Before we point fingers at 'technology/automation' shortfalls, let's name the concrete bottleneck: skilled human decision-makers are the limiting reagent, and machine-learning warning aids are still years away from earning majority trust.
You have a President who is ordering the defunding of tons of groups (universities, media, aid, institutes) while not clearly having that authority and often doing so for what he views as ideological crimes.
Also arresting and trying to deport people for things that are not clearly crimes (newspaper op-eds, etc) and without due process.
Very strange times.
Right now I have some faith the courts in the US will stand up to this and get the US back on track but I worry that dam may not hold forever.
The saving grace is that this is not widely popular, although that owes more to his tariff moves than to the rest.
Doesn't matter when mileage isn't what's being compared; the question is whether or not others have caused the same problem. PERIOD.
As called out elsewhere, workspace trust is literally the protection here that is being circumvented. You're warned when you open a folder whether you trust the origin/authors, with pretty strong wording. Sure, you may find this annoying, but it's literally a security warning in a giant modal that forces you to choose.
Even if automatic tasks were disabled by default, you'd still be vulnerable if you trust the workspace. VS Code is an IDE, and the core and extensions can execute code based on files within the folder in order to provide rich features like autocomplete, compilation, running tests, agentic coding, etc.
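For context on what "automatic tasks" means in practice, here's a minimal sketch of a tasks.json that asks to run as soon as the folder is opened (the label and command are invented for illustration):

    // .vscode/tasks.json (hypothetical example)
    {
      "version": "2.0.0",
      "tasks": [
        {
          // Any shell command can go here; in a malicious repo this would be the payload.
          "label": "prepare",
          "type": "shell",
          "command": "echo arbitrary command runs on folder open",
          // Requests that VS Code run this task when the folder is opened,
          // which is exactly the kind of behavior workspace trust gates.
          "runOptions": { "runOn": "folderOpen" }
        }
      ]
    }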
Before workspace trust existed, we started noticing many extensions and core features growing their own version of workspace trust warnings. Workspace trust unified this into a single, in-your-face experience. It's perfectly fine to not trust the folder; you'll just enter restricted mode, which protects you while degrading certain things: language servers may not run, you won't be able to debug (debugging executes code described in .vscode/launch.json), etc.
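To make the launch.json point concrete, a debug configuration is essentially a pointer to something to execute; a sketch (file names invented):

    // .vscode/launch.json (hypothetical example)
    {
      "version": "0.2.0",
      "configurations": [
        {
          "name": "Run project",
          "type": "node",
          "request": "launch",
          // Starting this debug session executes whatever the config points at,
          // so in an untrusted folder this is arbitrary code execution.
          "program": "${workspaceFolder}/scripts/start.js"
        }
      ]
    }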
Ultimately we're shipping a developer tool that can do powerful things like automating project compilation or dependency installs when you open a folder. This attack vector capitalizes on neglectful developers who ignore a scary-looking security warning. It certainly happens in practice, but workspace trust is pretty critical to the trust model of VS Code, and it also improves the UX: we annoy you a _single_ time when you open the folder, not several times from various components using a JIT notification approach. I recall many discussions happening around the exact wording of the warning; it's a difficult concept to communicate in the small number of words it has to use.
My recommendation is to use the checkbox to trust the parent folder, or to configure trusted folders explicitly. I personally have all my safe git clones in a dev/ folder which I configured as trusted, but I also have a playground/ folder where I put random projects that I don't know much about and decide whether to trust each one at the time I open it.
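If you want to see which knobs are involved, these are roughly the relevant settings (a sketch of one reasonable setup; as far as I recall, the trusted-folder list itself is managed through the Manage Workspace Trust view rather than settings.json):

    // User settings.json (sketch; values are just one reasonable setup)
    {
      // Keep workspace trust on; turning it off removes this protection entirely.
      "security.workspace.trust.enabled": true,
      // Prompt once per folder instead of on every new window.
      "security.workspace.trust.startupPrompt": "once",
      // Ask what to do when a loose, untrusted file is opened in a trusted window.
      "security.workspace.trust.untrustedFiles": "prompt"
    }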