Rough math plugging in public #s and comments here:
- All stock deal at Aug 2021 val of 38B (1B ARR)
- Assume rev doubled to 2B (which may even be aggressive)
- SaaS multiples have compressed ~6x since Aug 2021
- 38B x 2 / 6 = $12.7B
- 12.7B / 38B * 1.3B (announced price) ≈ 434M effective price
- Assume 100M to pref stock
--> Comes out to 334M, with a chunk of that (1/3? 1/4?) potentially subject to earn out
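Same back-of-envelope math spelled out as a tiny Python sketch; every input is just one of the rough assumptions above, none of them confirmed figures:

  # Effective value of an all-stock deal struck at a stale Aug 2021 valuation.
  aug_2021_valuation = 38e9        # buyer's last-round valuation (1B ARR at the time)
  arr_then, arr_now = 1e9, 2e9     # assume revenue doubled since then
  multiple_compression = 6         # SaaS multiples down ~6x since Aug 2021

  implied_valuation_now = aug_2021_valuation * (arr_now / arr_then) / multiple_compression
  headline_price = 1.3e9           # announced all-stock price at the Aug 2021 valuation
  effective_price = headline_price * implied_valuation_now / aug_2021_valuation
  pref_carveout = 100e6            # assume ~100M goes to preferred stock first

  print(implied_valuation_now / 1e9)              # ~12.7 (B)
  print(effective_price / 1e6)                    # ~433 (M), i.e. the ~434M above
  print((effective_price - pref_carveout) / 1e6)  # ~333 (M), part possibly subject to earn-out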
It makes sense to use these tools, but we need to remember that we revert to our own ability level when they are offline. We should still invest in our own human skills and not assume these tools will always be available. There will be minor outages like this one, and in the case of cyber attacks, war, or other major disruptions, they could be offline for much longer.
The same offline risk applies to all tech: navigation, generating energy, finding food & water.
And as others have noted, like other personal tools, AI will become more portable and efficient (see the progress on self-hosted, minimal, efficiently trained models like Vicuna, which claims ~92% parity with OpenAI's flagship model).
Right now you can't get a local LLM with performance similar to GPT-4, no matter how much you pay.
Anything you can run on-site isn't really even close in terms of performance.
The ability to fine-tune to your workplace's terminology and document set is certainly a benefit (rough sketch of what that looks like below), but for many use cases it doesn't outweigh the performance difference.
https://lmsys.org/blog/2023-03-30-vicuna/
https://www.semianalysis.com/p/google-we-have-no-moat-and-ne...
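On the fine-tuning point, a minimal sketch of what "tune a self-hosted model on your own documents" can look like, assuming the Hugging Face transformers/peft/datasets stack with LoRA adapters; the base model name, internal_docs.jsonl, and the hyperparameters are all placeholders, not recommendations:

  # Hypothetical: LoRA fine-tuning of a self-hosted model on an internal document dump.
  from datasets import load_dataset
  from peft import LoraConfig, get_peft_model
  from transformers import (AutoModelForCausalLM, AutoTokenizer,
                            DataCollatorForLanguageModeling, Trainer, TrainingArguments)

  base = "openlm-research/open_llama_7b"   # placeholder: any permissively licensed base model
  tokenizer = AutoTokenizer.from_pretrained(base)
  model = AutoModelForCausalLM.from_pretrained(base)

  # Only the small low-rank adapter weights get trained; the base model stays frozen.
  model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

  # internal_docs.jsonl is a placeholder: one {"text": ...} record per internal document.
  docs = load_dataset("json", data_files="internal_docs.jsonl", split="train")
  docs = docs.map(lambda x: tokenizer(x["text"], truncation=True, max_length=1024),
                  remove_columns=docs.column_names)

  trainer = Trainer(
      model=model,
      args=TrainingArguments(output_dir="out", per_device_train_batch_size=1,
                             num_train_epochs=1, learning_rate=2e-4),
      train_dataset=docs,
      data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
  )
  trainer.train()
  model.save_pretrained("out/lora-adapter")  # adapters never leave your own hardware

The point isn't this exact stack; it's that the whole loop, data included, runs on hardware you control.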
While consumers are happy to get their data mined to avoid paying, businesses are the opposite: willing to pay a lot to avoid feeding data to MSFT/GOOG/META.
They may give assurances on data protection (even here, GitHub Copilot's ToS has sketchy language around storing derived data), but they can't get around the fundamental problem that their products need user interactions to work well.
So it seems that with Big Tech LLMs there's an inherent tension between product competitiveness and data privacy, which makes them incompatible with the enterprise.
Biz ideas along these lines:
- Help enterprises set up, train, maintain own customized LLMs
- Security, compliance, monitoring tools
- Help AI startups get compliant with enterprise security
- Fine-tuning service
For the search engine (as opposed to GSuite), Google's customers are actually the advertisers, not the users. This is not meant to be a cynical take, just financial reality. The advertisers are the ones paying billions into Google's revenue to maintain the expensive data centers and the petabytes of storage behind YouTube videos. Because the money flows from advertisers to Google, the advertisers are the ones that caused the Adpocalypse[1].
The web surfers querying the search engine are users or consumers, not paying customers. Not sure how a users' union would have much leverage, since they don't pay. If they're unhappy, they can switch to another search engine (e.g. Bing) or exert pressure indirectly (e.g. by boycotting advertisers, which is what causes an Adpocalypse).
[1] https://www.google.com/search?q=youtube+advertisers+adpocaly...