There are always opportunities, at least in the USA.
I started programming in 1985 and thought I was very likely too late. I was the same age as Chris Espinosa, who was Apple's head of documentation at age 14. At that time it was still not perfectly clear that PCs would be regarded as essential to business. I had no one to talk to about computers and programming. I was in a very rough spot in my life and had to teach myself everything, while not being sure I'd have a job in 5 years.
A decade later in 1996 I started at Microsoft, which virtually all of my peers thought had run its course. They were selling stock like crazy. By the time I left 4 years later the stock had gone up 1300%. Last I checked it's gone up yet another 2500% or so--I kept my 1,000 ESPP shares mostly out of nostalgia and gratitude to a company that was so good to me.
I bought a website after the first dot com bust of 2001, monetized it by hiring well, and it provided a very good living for well over two decades after that.
This is an incredible time to start a web-based business because there's a surfeit of free and almost free tools for development all the way to deployment.
It's interesting to hear your experience at this point in time. Mine was later but noticeably different.
I started coding in the 90's as a teen releasing shareware, with my first gig at Adobe in 2001, where they paid me $14/hr as an intern. Even though these were rough times for tech, my CS peers at UW and I felt a lot of optimism and enthusiasm about what could be built, because the framework still supported idealists and we cared less about the economy (and we were definitely naive). Both the researcher and entrepreneurial types were very curious about inventing new paradigms.
When I talk to younger people in CS now, the enthusiasm seems to be split between 'pure researcher' types and 'entrepreneurial/builder' types, with the latter's interest concentrated on whatever is booming (like AI), and less on what can be built than on what will be able to raise large sums or attract more users. I'll caveat this by saying I don't know to what extent the people I talk to are a biased sample, but I do wonder if there are fewer people willing to explore new frontiers now.
One major difference between now and then is the fraction of US market cap that is tech and, to a lesser extent, tech's importance in the economy. I wonder if this established-leader position somehow makes people less optimistic and less willing to explore?
> When I talk to younger people in CS now, the enthusiasm seems to be split between 'pure researcher' types and 'entrepreneurial/builder' types
In my view the pure research types who land jobs have always been maybe 1%? UW is research heavy, don’t forget. And at the time I was at Microsoft almost no one had plans to quit and start their own company. Before I started there in 1996 I figured about 80% would--an early indicator of how distorted my view was from press reports and my own biases. It seems to me that these days the vast majority are happy to be drones. Or am I way off?
In my time most of the employees were very smart students who just kind of fell into programming and were hooked in by Microsoft’s pioneering efforts at sending recruiters to colleges (rare at the time) and especially less-obvious choices such as Waterloo and Harvard.
I'm 16 and I want to grow up and revive the concept of a fixed-function graphics card to make graphics cards cheaper and more focused on gaming. I might even look into acquiring the 3dfx brand and reviving it with that goal. AI and crypto whatever might be the buzzwords but they don't interest me.
I think it is because the geeks "won". A lot of geeks found each other during the early web days. Nowadays it's not a stigma to be a geek. Everything is more mature and commercially integrated, including mopping up geek-ly minded people into commercial ventures. It's harder today to have an outsized impact from a bedroom and no budget. (Not in absolute numbers of how often this happens, but in relative terms! Many more people are at it nowadays, so some still slip through with a runaway success from nothing.)
> A decade later in 1996 I started at Microsoft, which virtually all of my peers thought had run its course.
People thought Microsoft had run its course, one year after the release of Windows 95?
Just as MS Word was dominating WordPerfect? And the likes of AutoCAD were dropping support for AIX, Solaris etc? When Photoshop had just released for Windows, and Apple's market share had dropped to about 3% ?
Wow! I know they were kinda late to online stuff, but still - strange how these things turn out, in hindsight!
> People thought Microsoft had run its course, one year after the release of Windows 95?
Not sure if I made this clear, but I am referring to industry publications, and to what seemed to be a vast majority of my colleagues at Microsoft. Experienced business watchers who could read an SEC filing, and who had a mature grasp of business, knew better, but it wasn’t sexy to keep printing opinions like that.
I’m sure Department of Justice problems weren’t helping either. So these kids were exercising stock options like crazy because of the bad press.
These were people in their mid 20s who had entered these amazing jobs just out of college, and who had experienced several years of astonishing growth. They were both internally and externally very good at taking criticism. I think they just overcorrected at the first sign of trouble. I was a decade older, and had been studying business on my own for about 20 years by that point.
I had also worked for many companies that were nowhere near as well put together as Microsoft. Microsoft makes its share of mistakes, but never in ways that jeopardize the company’s existence. That may sound trivial, but if you’ve ever run a company, you’ll understand that it’s a surprisingly rare trait. Just managing not to fuck up too badly is exceedingly rare in the business world, and I think they didn’t quite understand that. Because they were open to criticism, they really couldn’t understand how special the company was at that time. Since they came in fresh from college, they had nothing to compare their jobs to. To them it was what any job should be.
If you lived through it at a young age, you were on Slashdot or similar places, and Linux was riding high on the server (it was absolutely massacring Unix), and for kids at college, Linux + Samba was a quite powerful combination compared to even a few years before (think: Novell NetWare).
Few college kids have experience with really massive enterprise setups and solutions, and so those were just entirely discounted out of hand.
> I started programming in 1985 and thought I was very likely too late.
Lots of skilled programmers started coding when they were kids. But that does not mean starting to learn coding when you are 30 years old won't make you an excellent programmer: actually, you can learn even faster at that age because concepts like logic, dividing an issue into several small ones, and iterating step by step until you reach the final solution are more intuitive to you than to a kid.
I agree that this is a weak argument. I think a stronger argument is simply to say: every year we see people start independent, bootstrapped businesses. There's no reason to believe the potential here has changed. The only way to include yourself in that set is to try, and never give up.
EDIT: I will say, it may be a powerful story for anybody experiencing self doubt. Different people need different kinds of motivation.
I think the point is that he thought it was too late in 1985, then people thought the same ten years later (1996), then the original article (from 2014) tries to explain how 2014 is not too late either (“the best time is now”).
In other words, it is always easy to think it is too late when looking back. But it is never too late if we try to look forward and imagine the things that have not been invented yet and which will be around in 10-20 years' time.
I like people with your attitude! Leaves more for people willing to work and to learn things. I upvoted your comment out of pity.
That era represented the collapse of 8 bit microcomputers. They were huge, then nothing. Silicon Valley had boomed, then gone bust. No one had much use for computers except brave financial analysts willing to use the toylike Apple ][ for primitive spreadsheets. No one took the Mac seriously until PostScript and laser printers happened.
In October 1987 Black Monday hit, causing the worst market crash since the Depression. I hit up a bunch of angel investors—a term that didn’t even exist—to raise money for a batch file compiler. A compiler! That supported me and several employees for years.
Point is that you can always make opportunities happen even in hard times.
While I agree with the spirit of the post, I think that there are better and worse times to start something new, and in retrospect 2014 seems like it was one of those worse times. The period from 2014 to 2024 was an era where the sheer gravity of the big tech platforms crushed out innovative startups left and right. People with an extreme focus on product polish could succeed, like Slack (est. 2013) and Discord (est. 2015), but it feels like most of the tech-sphere was either just working on half-hearted products towards an inevitable acqui-hire, or fervently trying to mainstream blockchain in a wishful attempt to create an entirely separate, more open ecosystem.
Yeah, there are, but it's hard to know where in the cycle you are. So best just to have fun & create great stuff. If you focus on making people's lives better, you never really go wrong in the long term.
The large IT companies are acting in ways that are completely against innovation and self-improvement. I expect them to freeze up like the old giant corporations that they are.
But the change only started after the US decided to change its free-money-for-rich-people policy, and that was last year.
Well, a startup (ish…) is threatening Google, and VC money is much much tighter. So I’d say the Silicon Valley era is officially over. I hope y’all enjoyed it! I for one am a bit excited to see what’s the next big consumer idea to capture the zeitgeist.
The last thing Slack had was an extreme focus on polish. As a chat system, it's hardly functional and far less so than IRC, which came before. Slack managed to sell chat to big corporations; that was its innovation.
2013 was a great time to join the fray. I think the clarifying point is that you are working on the machine; you are not embodying the success or failure of the machines of that time.
Not to seem pessimistic, but in the 10 years since this article what have we really gained on the internet that we didn’t have then? Seems like we got a lot more social media and some failed promises from crypto. This is barring the current AI stuff, since it’s still really shaking out.
There was the mobile and cloud boom, which resulted in more digital payments (more difficult for crime and corruption) and online-to-offline stuff like ride sharing and e-commerce. Plus a ton of advancements in logistics, especially in developing countries.
I think most of these changes didn't affect developed nations so much, it's probably still good old Walmart and Amazon. But they were lifechanging to developing nations. We had some advancements in the rights of factory workers as they had to match gig workers, and crime dropped drastically in some places because it just wasn't worth it anymore when you could climb out of poverty by delivering food.
Mobile and cloud were well alive and booming a decade before 2014 (I guess you could very technically argue not cloud, but S3 buckets on AWS caught fire almost instantaneously).
Not sure about e-commerce being a last-ten-years thing; if anything, we kind of lost a lot of e-commerce to the consolidation of online shopping into online markets like Amazon, Etsy, Alibaba, etc. Also, online shopping software itself has consolidated up into the big tech companies, and small vendors are somewhat forced into the Amazon pseudo-monopoly.
I'm a painter pursuing traditional-style work. The education system absolutely failed me, and I have seen it fail countless others with the same desire.
The last 10 years have seen a renaissance of academic painting information and education, and social media, particularly Instagram, has been the fuel.
There is so much now that simply wasn't there then. My life would be so much different and happier if I were coming of age now with those desires instead of a decade ago. Nonetheless, it is still dramatically improved by the advancements I described.
I no longer feel so alone. I suspect many people with different niches are enjoying richer lives like I am, due to the last 10 years of internet.
I had a similar thought but challenged myself to think about the other side. Using this list of many companies and thinking about those founded after 2012 (because it takes a couple of years to enter the mainstream), we can see that there's quite a bit of opportunity. https://en.wikipedia.org/wiki/List_of_largest_Internet_compa....
Social media for sure, but also the entirety of AR/VR. AI, and not just GPTs: recommendation, detection, sentiment analysis, and data mining are all things that are 'new'. We can also think about things like online banking or healthcare apps that didn't exist. I was still sending checks in 2014, and I certainly wasn't doing video visits with my doctor. As someone who is middle-aged, when I ask younger people, they point out how much opportunity there is as a counterpoint to the cynicism I'm seeing here on HN.
> but in the 10 years since this article what have we really gained on the internet that we didn’t have then
Figma comes to mind as an obvious standout example. We didn’t even have the technology to support that level of multiplayer computationally heavy UI in the browser back in 2014. No native apps had collaboration that smooth either.
Collaborative [text] document editing in general is a good example. So mundane these days in all the big web-based work apps that we don’t even notice it anymore.
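For concreteness, here is a toy TypeScript sketch of my own showing the kind of merge rule that makes collaborative editing possible: every replica converges to the same text because merging is just a set union. This is not how Figma, Google Docs, or any real editor implements it (production systems use operational transforms or purpose-built text CRDTs); it only illustrates the convergence idea.

    // Toy illustration only: a convergent, insert-only shared string.
    // Real collaborative editors use operational transforms or text CRDTs;
    // this is a deliberately simplified sketch, not any product's algorithm.
    type Char = { id: string; ts: number; site: string; ch: string };

    class ToyDoc {
      private chars = new Map<string, Char>();
      private clock = 0;
      constructor(private site: string) {}

      insert(ch: string): void {
        const ts = ++this.clock;
        const id = `${this.site}:${ts}`;
        this.chars.set(id, { id, ts, site: this.site, ch });
      }

      // Merging is a set union, so replicas converge no matter the order
      // in which they receive each other's edits.
      merge(other: ToyDoc): void {
        for (const c of other.chars.values()) this.chars.set(c.id, c);
        this.clock = Math.max(this.clock, other.clock); // Lamport-style clock bump
      }

      text(): string {
        return [...this.chars.values()]
          .sort((a, b) => a.ts - b.ts || a.site.localeCompare(b.site))
          .map(c => c.ch)
          .join("");
      }
    }

    const alice = new ToyDoc("alice");
    const bob = new ToyDoc("bob");
    for (const ch of "hi ") alice.insert(ch);
    for (const ch of "yo") bob.insert(ch);
    alice.merge(bob);
    bob.merge(alice);
    console.log(alice.text() === bob.text()); // true: both replicas render the same string

It converges but interleaves concurrent typing ("hyio " here), which is exactly the problem real OT and text-CRDT algorithms spend their complexity on.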
I believe I recall Quake 3 running in the browser a few years before 2014. I imagine if we had the technology to run that, we had what was needed to render a WYSIWYG UI editor with multiple cursors.
Figma is a good product and being in the browser was a huge booster for its adoption but as a product, technically it could have existed 15 years ago as an installable program. With the same success? Probably not.
What made those products boom was cheap, high-quality video calls (so, bandwidth), which are essential for working in real-time collaboration.
India - and probably many other countries outside of the Americentric West zone - achieved complete adoption of digital (mobile) payments, shot up internet connectivity to the moon (to the point that it now has the largest population of netizens in the world), made huge strides in FinTech and Digital Learning, achieved complete adoption of digital commerce including app-based ride-hailing and food delivery, and saw the blossoming of a gigantic app/digital services-based economy.
Life has changed radically here as compared to 2014.
Zoom? Try having a video conference call 10 years ago. I remember going to an HP corporate office (in Taiwan, if I remember correctly) in 2012 where they proudly demonstrated a video conference room that worked well. Setting up calls using WebEx back then was slow, had poor performance, and usually somebody failed to join and had to call in.
The main thing that happened here was bandwidth. Back then you'd be lucky if someone had the 2 Mbps needed for a reasonable HD call. The underlying technology has not really changed since then, though.
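The 2 Mbps figure checks out as a back-of-envelope estimate (my own rough numbers, not from the thread): 720p at 30 fps with an H.264-era compression ratio of roughly 0.1 bits per pixel lands in that range.

    // Rough bitrate estimate for an HD call; every constant here is a ballpark assumption.
    const width = 1280, height = 720, fps = 30;
    const bitsPerPixel = 0.1; // typical-ish for H.264 on talking-head video
    const mbps = (width * height * fps * bitsPerPixel) / 1e6;
    console.log(`~${mbps.toFixed(1)} Mbps`); // ~2.8 Mbps, the same order of magnitude as the 2 Mbps above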
In about 2005, while I was IT for a startup, I worked with a counterpart at an investor (Deloitte maybe?) whose job was largely coordinating video calls in advance of meetings. We’d exchange IP addresses and SIP addresses and whatnot so that our Polycom could talk to their Cisco or whatever it all was. She and I would join 30 minutes beforehand to make sure it all worked, stand by during the meeting in case there was trouble, then come back after the execs were finished and shut it all down. The calls were at least as good as Zoom or Teams today, at the expense of two IT people and dedicated equipment. The execs never knew it not to work great. A much simpler time. We have come a long way.
Tinder was technically launched in 2012, but the swiping thing came out in 2013. These tools are an essential part of connection for a large group of people. And you can't really handwave social media aside; it's been absurdly relevant since 2014. The Apple Watch launched in 2015 and has led a revolution in wearable internet-connected tech. 4K TVs, and subsequently high-quality streaming, weren't a thing until after 2012.
Amongst the other examples: IoT. 10 years ago it was still in its infancy. As an example, Amazon didn't acquire Ring until 2018. Beyond just smart home stuff, payments like Apple Pay and Android Pay impact daily life. The EV boom is also a massive part of IoT. Every TV is now a smart TV (which is miserable, but that's a whole other discussion).
And an IoT world has nearly as many drawbacks as it does benefits, but I think it's hard to argue it hasn't changed the way we interact with the internet in our day to day lives.
I don't think the things you're referring to have had nearly the impact you're claiming they had. They've just taken things we already had and made them slightly more convenient.
Ring is just doorbell cameras; Japan has had these for decades. Smart home is a complete failure; nobody uses Alexa or Siri or OK Google seriously. EVs are just cars but slightly different. Smart TVs just simplify the process of having to buy a Chromecast or plug a laptop into your TV.
Discord and Slack figured out messaging, and the web development tooling now is infinitely better than it was in 2014. On the front end, HTML5 and responsive web design were still new in 2013, and React came out in 2013/stabilized in 2015/released Hooks in 2019.
On the backend, Next.js brought us SSR in 2016, MongoDB brought us document databases (winning “fastest growing database engine” in 2013 and 2014), and Docker brought us disposable containers in 2013.
The list is stacked towards older tech, but that's maybe because recent tech hasn't proven itself yet: Svelte (2020) is still maturing AFAICT, and I've never heard of Vite (2021) or SolidJS (2022). I personally think many exciting non-AI trends are also ramping up now, such as Local-First development (formalized in May 2023).
I think that the economy and innovation in general were curtailed due to the climaxing rampant corruption in the US, but the internet is something of an exception IMO.
This is all talking about developer abilities, of course: the constraints of corrupt capitalism mean that many of the actual sites released in the past decade have been terribly bloated and over-featured. But I think that’s partially a consequence of businesses moving everything into mobile-first websites and apps -- you’re gonna see a lot more low-margin websites.
The best time to plant a tree is twenty years ago. The second best time is today.
I don't know why anyone would sit around moping about "If only this were thirty years ago!" If it seems like your idea would be "easy" or something -- if only it were thirty years ago -- then most likely it's because we are where we are and you know what you know now that you wouldn't have known then.
It's like all the people who say things like "Youth is wasted on the young" and "I wish I had known what I know now back when I was seventeen." Yeah, you know it at all because you aren't seventeen.
I do, because figuring out what is going to be big 30 years from now is hard (arguably impossible to do deterministically), but knowing what is big now is easy, and wishing you could go back and capitalize on that is natural.
A lot of people also spend scads of time contemplating how they would spend the money if they won the lottery and it's generally a waste of time. They probably aren't going to win the lottery and it usually isn't used as a stepping stone to making real life decisions about, say, what kind of career they want to pursue or how they wish to spend their retirement.
I used to do that sort of thing and then began going "If I value x, why sit around telling myself I can't possibly have that or some piece of that without winning the lottery? Why not do what I can to actually work towards that to some degree, in some fashion?"
So you may get it. That's fine. I don't and that's not commentary on anyone but me and how my mind works.
If this were thirty years ago instead of today, the opportunity to build X[1] would be wide open, but you'd have to build it with the tools that were available 30 years ago. That might not be nearly as easy as building it with today's tools.
[1] "X" meaning "variable to be substituted for" rather than "company that made a bizarre naming choice for no apparent reason".
Also you have the benefit of clear hindsight - if I go back 30 years I'm going to be developing an iPod or something, not any of the flops from 1994 (even though people might note that there are flops in 1994 that had something quite similar become a big hit in 2004...).
It's like the people who talk about how obvious it was to invest in Apple or Microsoft or BRK or whatever "if they'd been there at that time" and yet by definition the companies that will be referred to like that in 20 years time almost certainly exist today, but - how do you find them?
I feel like the last 10 years have been a maturing phase. There weren’t a lot of opportunities for young upstarts without funding. It seemed like it was the era of big money innovation, burning mountains of cash trying to stake the last few open claims on the app and web ecosystem. And there was the crypto stuff, yuck, many went to prison.
But the real revolution is AI. Thinking back to 2014 and peering forward, it’s unfathomable. If there had been a sci-fi movie, I would have thought it unrealistic. I still think I have no idea where this will take us or how much our industry will change. What a great time to be an entrepreneur.
I'm young, and talking about years like 1985 feels like talking about some other reality, but to me it still feels like the two years aren't comparable.
Whenever I read about the history of computers and software in the 80s, it feels like there are always mentions of relatively new companies forging their path, new hardware manufacturers and software developers shaping the brand new home computer industry. Sure, there were some old giants on the playing field, but it almost sounds like anyone with a decent idea could carve out a space for themselves.
2014 though? There were a bunch of opportunities back then, obviously, but it was already deep into the megacorp consolidation era, when large software or web companies owned dozens of unrelated services and had secured an almost endless influx of cash. There were startups, and a few of them became extremely successful, but it feels like most of them were either doomed to fail or be bought out by a larger company. I feel like in this last decade, internet innovation has mostly slowed - the internet of 2004 was extremely different from the internet of 2014, yet 10 more years have passed since then and it doesn't feel like as much has changed.
Maybe it's just my imaginary rose-tinted view of the past or something, but it feels like it's harder than ever for a startup to compete with the big players now. The only big exception I can think of is video games - self-publishing anything was probably almost impossible in the 80s, but nowadays we have an endless sea of development and publishing tools for independent developers with fresh new ideas.
Perhaps, there's a completely new field on the horizon that will level the playing field once more, putting everyone back at square one. I think that some industries could get squeezed dry until the next big innovation comes along, or people move onto some other new space.
Boy have we been in a holding/consolidation pattern. The age of massification has been upon us; getting everyone else online has been the effort, the way to rise. Free online services we can connect to from anywhere have been an amazing change, a totally new expectation that totally changes how computing situates itself in our lives.
At much cost to figuring out further places to pioneer, I feel. We need new cycles with new energy, where lots of people are trying stuff out again. Computing & tech should not stay well settled; we should be trying crazy stuff. It feels like a lifetime ago that Tim O'Reilly was preaching "follow the alpha geeks": look for those very capable folks doing excellently for themselves. That ability to trendspot & try lots of things has been somewhat crowded out by these huge scales & systems, but I believe a return to personal-ity has to crop up again sometime. We'll find some fertile terrains where new things are happening again.
2014 was when things were actually really settling into place, when the pioneering stage was really giving way to some settlers (and town planners, see: https://blog.gardeviance.org/2015/03/on-pioneers-settlers-to...). There's a lot of roughness, but tech getting its mojo back may not be far off. It's a challenge though; the reality of creating ever-connected, ever-online systems is hard. We have an amazing amount of server capacity we can fit in a pizza box, at incredibly low $/performance, but we're unpracticed at doing it without pain at smaller scales, by ourselves, anew. Trying to create new generative platforms that serve as a basis to let more value spring up: it needs a strong foundation, so that it can keep "creating more value than it captures," another O'Reilly-ism.
As someone who has used the Internet since 1985, I constantly find myself reminded of the fact that the Internet isn't just port 80. The Internet is so much more than just the web, and when someone comes up with a cross platform, powerful application which uses some other port and some other protocol, it will be just as functional on the Internet as any other Web Browser.
We could just as easily produce clients which exchange Lua bytecode. In fact, we do (games, etc.)... but we could just as easily build an execution environment (which is what browsers are) that allows a much broader range of application capabilities than the browser.
This, then, is what I have in mind when I think "I've been on the Internet too long, I've become jaded": actually, the browser is just the lowest common denominator. As soon as some other client application is built which pushes user experience beyond what the browser is capable of, the Internet will still be there to host it.
And I find that inspiring, even after 40 years of crud.
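To make the "browsers are just one execution environment" idea concrete, here is a minimal sketch of my own (TypeScript on Node, evaluating plain JS source rather than the Lua bytecode mentioned above) of a client that pulls code off the network and runs it in a constrained context. The endpoint URL is hypothetical.

    // Toy sketch only: any client can be an execution environment for code it
    // fetches over the network; the browser is simply the most ubiquitous one.
    // A real system would add signing, capability limits, quotas, and so on.
    import vm from "node:vm";

    async function runRemote(url: string): Promise<void> {
      const source = await (await fetch(url)).text();           // Node 18+ global fetch; could arrive over any port/protocol
      const sandbox = vm.createContext({ print: console.log }); // expose only what we choose
      vm.runInContext(source, sandbox, { timeout: 1000 });      // cap runtime at one second
    }

    // Hypothetical endpoint serving a script that calls print(...)
    runRemote("https://example.com/app.js").catch(console.error);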
> As soon as some other client application is built which pushes user experience beyond what the browser is capable of
The problem is that the browser is so good at being a ubiquitous, open-source, open-standards-based, cross-platform, near-instant-loading secure remote code execution environment. As Brendan Eich says, always bet on JS and WASM. I would extend that to -- always bet on browsers.
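That bet is cheap to take: loading and calling WebAssembly from JS is a few lines of standard browser API. A sketch, assuming a hypothetical add.wasm that exports an add(a, b) function:

    // Minimal Wasm loading via the standard WebAssembly browser API.
    // "add.wasm" and its exported add() are stand-ins for illustration.
    async function loadAdd(): Promise<(a: number, b: number) => number> {
      const { instance } = await WebAssembly.instantiateStreaming(
        fetch("add.wasm"), // must be served as application/wasm
        {}                 // imports the module needs would go here
      );
      return instance.exports.add as (a: number, b: number) => number;
    }

    loadAdd().then(add => console.log(add(2, 3))); // 5, crossing the JS<=>Wasm boundary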
With the amount of change that has occurred in browser technology over the last 30 years, I strongly think the future looks like more improvement in browser technology (and perhaps deprecation/removal of old standards) than an entirely new paradigm. I know various early web pioneers want to build a post-Web application delivery system (notably Tim Berners-Lee and Douglas Crockford), but given the accumulated legacy of modern browsers and web apps and the ongoing investments into browser technology, I don't see how any post-Web system could gain any traction within the next 25 years.
But yes - your point still stands. If anything better than browsers ever appears, the internet will indeed still be there to host it.
1/ As a guest in current browsers acting as hosts, for the zero-install ubiquity of the Web. This would require WebGPU, Wasm, fast JS<=>Wasm calls, and installable PWAs at least. Gatekeepers, notably Apple, can and will hinder. Once established, this new hosted app could be nativized to drop the browser dependency.
2/ As a new obligate (not facultative) browser-like app in a new device, e.g., XR eyewear. The device would have to take off at very large scale, and its maker(s) would want to do things not easily done via (1). The extant Web would be mapped onto in-world surfaces, but the new device and its content stack would support things that go way beyond evolutionary improvements to the Web.
I used to think (2) might happen with Oculus being acquired by Meta, or by Apple eyewear, but I'm less optimistic now. The Big Tech players, especially Apple, can build nice h/w, but it won't get to scale quickly enough to subsume the Web.
> I started coding in the 90's as a teen releasing shareware
What kind of shareware did you publish?
> the enthusiasm seems to be split between 'pure researcher' types and 'entrepreneurial/builder' types
The market seems like a bad fit for pro-humanity technology—researchers at least have that (albeit fading) potential in their back pocket.
> People thought Microsoft had run its course, one year after the release of Windows 95?
Windows 2000 was really important for MS staying relevant in anything outside consumer. It really set up the next two decades.
Actually if everyone just does that, nobody will need to work!
Bitcoin's up 10x in the last 4 years.
Broadcom's up 6x in the last 4 years, 21x in 10 years.
Facebook/META's up 5x in the last 2 years, 20x in 11 years.
Uber's up 3.5x in 2 years.
Tesla was up 11x between 2019 and 2021.
Microsoft's up 10x in the last 9 years.
Amazon's up 11x in the last 10 years.
Apple's up 10x in the last 10 years.
Oracle's up 3x in the last 10 years.
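Those multiples are easier to compare once annualized; the conversion is just multiple^(1/years) - 1. A quick pass over a few of the figures above (my own arithmetic):

    // Annualized growth rate implied by "up Nx in Y years".
    const cagr = (multiple: number, years: number) => Math.pow(multiple, 1 / years) - 1;

    const claims: [string, number, number][] = [
      ["Bitcoin", 10, 4],
      ["Broadcom", 21, 10],
      ["Meta", 20, 11],
      ["Microsoft", 10, 9],
      ["Amazon", 11, 10],
      ["Apple", 10, 10],
    ];

    for (const [name, multiple, years] of claims) {
      console.log(`${name}: ~${(100 * cagr(multiple, years)).toFixed(0)}% per year`);
    }
    // Bitcoin ~78%, Broadcom ~36%, Meta ~31%, Microsoft ~29%, Amazon ~27%, Apple ~26% per year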
> cheap, high-quality video calls (so, bandwidth), which are essential for working in real-time collaboration
Games have generally been way ahead of business apps in this sense.
"...constraints of corrupt capitalism" caused websites to be boated?
Edit: plus flexbox
> If there had been a sci-fi movie, I would have thought it unrealistic.
I recall that movie seemed cute but probably technologically impossible. Now it’s reality for many users of AI girlfriends.
The future is (with hope) exciting!