I think it’s complicated and more tied in with your personal ambitions and passions than money. I imagine that most people making stupid money at the forefront of LLM development are probably very passionate about their work. I imagine that will make the career/person side of things much more intertwined than say… making stupid money working in finance.
It’s fairly easy to deal with bad upper management, poor organisational decision making, political infighting, silly bureaucracy, pseudo-work and a range of other nonsense when you’re mostly in it for the tech and go home at 4pm (or whenever you go home in your country). It’s quite another thing if you actually care about the mission.
OpenAI's dual structure also makes things super complicated and adds fuel to the fire. Like, Altman doesn't own anything but has some weird VC fund, plus all the fuckery around employee compensation.
I think you underestimate the power of cults of personality and the extreme tech bubble people in Silicon Valley live in. People who are backing these individuals genuinely want to work with them, be around them, and most are hoping to become them one day. The percentage of those who deal with their shenanigans just for the money is likely very low.
You don't threaten to walk out in retaliation while vague-tweeting heart emojis because you think your CEO is a mere sociopath. IMO you do it because (a) you won't build generational wealth without him, (b) you feel there's an existential risk to humanity under any other leadership, or (c) you're a pragmatist whose social rank is proportional to how enthusiastic you are about the emperor's fashion.
(OpenAI feels cultish to me as an outsider, some North Korea shit going on there, us against the world)
I can see how it looks cultish. I know people who work there, and I know someone who managed to escape (with outside assistance) from Scientology, and… if they're a cult, it's one about how dangerous AI could be, not how great Altman is.
But both of these examples are mere anecdotes; I have no formal qualifications to discuss cults, and contact with only two current staff members.
I'm sure there is some truth to it. But as soon as Sam Altman and Elon Musk get mentioned, this entire forum gets hijacked by hating on them instead of having an interesting discussion.
With all the negative press Altman is getting, I wonder if he’ll step down at some point to appease public opinion and run things from the background instead. Seems somewhat reasonable but unlikely with his ego.
> It seems pretty obvious that he's a dark triad type personality
The obvious moustache-twirling villains mostly end up in prison almost immediately, precisely because they're so obviously bad.
The subtle and careful ones, whose manipulation normal people don't notice, are generally the ones who get furthest; the darkness isn't shown to those who must not see it.
This is why so many in the US blame all of Congress except their own representative.
1. You assume that what the press reports and public opinion are the same thing. They're not.
2. You assume the majority of the public actually has an opinion about, or time to think about, whether Sam should run OpenAI. They don't.
3. You assume public opinion is an accurate metric of justice. It's not. We have a judicial system for that.
4. You assume Sam makes decisions through his ego and not any other system. You don't know that. Your ego thinks it's true. But honestly, you have zero information unless you are really close to Sam, in which case you have no reason to make this comment.
I believe everything she said about Sam. I don't doubt that he exaggerated how robust their safety processes were, or announced products without telling them first, or used company resources to benefit himself personally while denying he had a financial stake in the company.
I honestly believe that employees did talk about his toxic behavior; the role of a successful startup CEO can be filled with surprisingly little people-management skill.
But the board handled this horribly; it's unsurprising they lost the fight, and frankly, listening to her podcast, the main question I have is why she was on the board in the first place. The level of articulation, the weak framing of events, etc., are what I'd expect to hear out of a middle-level HR rep, not someone on the board of a billion+ dollar company.
She got played from day one and is mad because she was the last one to figure it out. Sam brought her on for academic credibility, and she didn't realize how powerless the role was until she made the mistake of attempting to flex that power. She was outmaneuvered at every step and had zero business playing in that space; hard elbows are going to be thrown, and this is now how she's dealing with it.
This interview is exactly why she never should have been in that role to begin with.
It's surprisingly a bad look for her. Plus all the criticism to AI itself. She seems set on slowing down AI development, which is clearly at odds with what the company is doing. Why would she event be involved in this company let alone in the board.
> But the board handled this horribly; it's unsurprising they lost the fight, and frankly, listening to her podcast, the main question I have is why she was on the board in the first place. The level of articulation, the weak framing of events, etc., are what I'd expect to hear out of a middle-level HR rep, not someone on the board of a billion+ dollar company.
Your comment sounds more like ad hominem and character assassination than an actual analysis of the accusations.
Only recently did OpenAI become a billion-dollar company. A few years ago it was scrambling for financial support. Nevertheless, this is immaterial to the fact that there's a solid accusation of a toxic work culture and shady practices that motivated firing the CEO.
https://news.ycombinator.com/item?id=40506582 - 427 comments
I think most people would, as was evident from the number of people who backed his return as CEO.
Not that every one of those people knew this, but there certainly were people who did.
> On the podcast, Toner attributed Altman's swift return to employees being told that the company would collapse without him.
> Additionally, once a potential return seemed likely, employees feared retaliation from Altman if they did not support him, she said.
These employees are smart enough to decide for themselves how critical they believe Altman is.
Regardless of who said or thought that... why would it?
Engineers need to take a good look in the mirror and stop idolizing "visionary jerk" figures.
It seems pretty obvious that he's a dark triad type personality and will keep going as long as possible.
I am a member of the public, and I consider Sam Altman power-hungry, egotistical, and narcissistic.
My opinion has been formed through accounts in the press.