What's an area that people think is up and coming? (e.g. like social networks were in 2004, mobile apps in 2010, or vlogging in 2014)
I've finished several projects simultaneously and I'm looking to work in an area with lots of users, but as yet few producers. Wouldn't even need to involve programming, but probably would need to be online, as I'm pretty introverted!
- Neural networks / ML (e.g. GPT-2): definitely nowhere near its potential for being applied to a wide variety of areas. Find a niche you like and apply it there.
- Security / Privacy (e.g. Telegram): rapidly growing demand pretty much everywhere. Bonus points if you can make your product great for standard users and at the same time hackable/customizable for people who want that. Capitalize on both legs of the Pareto distribution.
That all being said, if you are ambitious and talented without an all-consuming passion for software, I highly recommend you find something you can work on in hardware. Since the '70s or so, most industries have been basically frozen, besides computer hardware/software. Yet in the meantime, materials science and engineering design have advanced considerably, and both form the basis for innovation in new technologies. This is why SpaceX was able to build components 10-100x cheaper than the leading suppliers in the early 2000s.
I work at a startup in nuclear fission, specifically because this tech is at <1% of its potential right now. The same could be said for many other areas.
Here are some ideas you might find interesting that I think could work in the next decade or so:
- Supersonic air travel
- Electric air travel
- Nano/micro-scale metallurgy and materials for industry
- Biological materials
- Gut/microbiome
- Genetic engineering
- Nuclear fission / fusion
- Carbon capture
- Cross-laminated timber (CLT) for construction
- Indoor farming / optimizing farming in general
- Synthetic meat / meat alternatives
Wind, solar, and energy storage still have huge room for improvement.
Direct solar or wind -> liquid fuel will be essential to adapting fast.
Windmills can drive production of liquid ammonia on farms (needing only air and water as inputs), for use directly as fertilizer and fuel, without a grid connection and without blocking the sunlight the plants need. Ammonia is not a very dense fuel (e.g. terrible for aircraft), but that doesn't matter for farm machinery.
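Rough numbers on the density point, since it decides which machines ammonia suits. The figures below are approximate lower heating values and liquid densities, back-of-envelope only:

```python
# Why ammonia is workable for tractors but a poor fit for aircraft.
# Approximate lower heating values (MJ/kg) and liquid densities (kg/L).
fuels = {
    "ammonia (liquid NH3)": (18.6, 0.68),
    "kerosene (Jet-A)":     (43.0, 0.80),
    "diesel":               (43.0, 0.84),
}

for name, (mj_per_kg, kg_per_l) in fuels.items():
    print(f"{name:22s} {mj_per_kg:5.1f} MJ/kg  {mj_per_kg * kg_per_l:5.1f} MJ/L")

# Ammonia carries ~40% of kerosene's energy per kilogram and roughly a
# third per litre: a heavy penalty in the air, a non-issue for a tractor
# that refuels at the barn.
```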
Direct solar -> hydrogen has been demonstrated, with a bio-reaction converting hydrogen + CO2 -> liquid fuel suitable for aircraft. Direct hydrogen-fueled aircraft are feasible, and more efficient than kerosene-fueled ones, but the design cycle is too long.
Wind turbines will be wearing out as their blades erode. No-moving-parts screens might be the next generation, extracting power by releasing ions for the wind to carry against an electric field. The old towers will still be useful, and the rare earths can be recovered for other uses, probably vehicle motors.
Batteries are a very material-intensive storage medium. Underwater compressed-air storage needs no exotic materials or tech, and the water pressure at depth minimizes the strength of materials required, apart from the piping.
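Back-of-envelope on why depth helps (idealized isothermal numbers; real round-trip efficiency would be lower):

```python
import math

# The water column is the pressure vessel: ideal isothermal energy
# recoverable per cubic metre of air stored at a given depth.
RHO_SEAWATER = 1025.0  # kg/m^3
G = 9.81               # m/s^2
P_ATM = 101_325.0      # Pa

def kwh_per_m3(depth_m: float) -> float:
    p = P_ATM + RHO_SEAWATER * G * depth_m   # absolute pressure at depth
    return p * math.log(p / P_ATM) / 3.6e6   # W = p*V*ln(p/p0), with V = 1 m^3

for depth in (100, 300, 500):
    print(f"{depth:3d} m: {kwh_per_m3(depth):.1f} kWh/m^3")
# ~0.7, ~3.0, ~5.6 kWh/m^3: modest per volume, but the "tank" can be a
# flexible bag anchored to the seabed, so volume is cheap.
```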
We need to replace huge amounts of refrigeration equipment with versions that don't rely on HFCs, and get the existing HFCs incinerated. One gram of HFC traps as much heat as 2,500 g of CO2 and lasts centuries in the atmosphere; once vented, it cannot be recaptured. Ammonia-cycle systems need to be made safe enough for general use, and HFC versions outlawed.
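To put the 2,500x figure in context (the charge sizes below are rough, illustrative round numbers):

```python
GWP_HFC = 2500  # g CO2-equivalent per g of HFC, per the figure above

charges_kg = {"domestic fridge": 0.10, "car air conditioner": 0.7}  # rough typical charges
for device, kg in charges_kg.items():
    print(f"Venting one {device} ({kg} kg) ~= {kg * GWP_HFC / 1000:.2f} t CO2e")
# Under a kilogram of vented gas is on the order of a tonne of CO2 --
# which is why recovery and incineration matter so much.
```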
Without massive progress in the next decade, civilization will probably collapse by 2035.
This is death-cult rhetoric with no scientific basis whatsoever.
I don't believe most would agree here. There are a lot of problems humanity faces, and humanity is a lot of different people. Those still living in extreme poverty today probably don't care about our worries about climate disruption.
> Arguably, working on anything that doesn't help there is not just wasted effort, but actively harmful.
Why is this?
> Direct hydrogen-fueled aircraft are feasible, and more efficient than kerosene-fueled ones, but the design cycle is too long.
This is really interesting, but it doesn't strike me as true based on my background. Do you have anything I could read on this? I'd be very interested.
> Without massive progress in the next decade, civilization will probably collapse by 2035.
This I really don't agree with or understand.
I'm very worried about emissions, but I don't see why civilization would collapse any time soon, if at all. 2035 is very soon.
Our climate and ecology are certainly at risk, but I think the biggest threat in the world right now is the rapid rise of fascist China. They're challenging the notion that free speech and democracy are required for capitalism and economic gain.
China has grown so emboldened under President Xi that they're no longer content to just alter or buy out our media companies. They're flat out dictating marching orders to Western organizations and asking for employees that oppose their mandates to be fired. They're kidnapping foreign nationals and holding them hostage on trumped-up charges. They've grown beyond stealing our ideas - now they're trying to supplant them.
That doesn't even begin to capture the things they're doing within their own borders. Surveillance state, social credit, travel limitations, Uyghur detention camps, supposed organ harvesting, Hong Kong / Tibet / Taiwan, ... In China it's actually 1984, and they're teaching the world that it works. If they win this battle, I worry we might wind up facing similar prospects in our future.
> Without massive progress in the next decade, civilization will probably collapse by 2035.
That's a little bit overblown. Want to make a bet on longbets.org? I'll happily donate to a green cause.
This is a joke not backed by any scientific consensus or evidence. Maybe some more instability in some countries, but nothing close to the collapse of civilization.
We need to provide energy to the network of machines that will produce the basic supplies to cover humanity's fundamental needs.
To expand on your key ideas: the list looks pretty comprehensive, but if you're also interested in making the world a better place, have a look at 80,000 Hours [1]. They've been thinking about this question for at least 8 years and their work is quite extensive. They have a simple quiz that might be an interesting starting point [2].
[1] https://80000hours.org/key-ideas/
[2] https://80000hours.org/career-quiz/#/
I honestly feel like reading this is a moment I will recall in about 8 years and think: "Damn, I wish I'd listened to that comment about hardware from johnmorrison".
- Keeping production ML models fed with clean, high-quality, low-latency feature data.
- Understanding the impact of an ML intervention on the overall system or business process, in aggregate and for all the relevant subpopulations.
- Understanding why scores may be drifting; these are very challenging alerts to investigate (a minimal drift check is sketched after this list).
- Resource efficiency at very high QPS or in mobile contexts.
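To make the drift point concrete, here's a minimal sketch of one common first check: the population stability index (PSI) between a reference score window and the current one. The bin count, thresholds, and synthetic "scores" below are all illustrative, not a standard:

```python
import numpy as np

def psi(reference: np.ndarray, current: np.ndarray, bins: int = 10) -> float:
    """Population stability index over reference-quantile bins.
    Common rule of thumb: <0.1 stable, 0.1-0.25 investigate, >0.25 drifted."""
    cuts = np.quantile(reference, np.linspace(0, 1, bins + 1))[1:-1]  # interior bin edges
    ref_frac = np.bincount(np.searchsorted(cuts, reference), minlength=bins) / len(reference)
    cur_frac = np.bincount(np.searchsorted(cuts, current), minlength=bins) / len(current)
    eps = 1e-6  # avoid log(0) on empty bins
    return float(np.sum((cur_frac - ref_frac) * np.log((cur_frac + eps) / (ref_frac + eps))))

rng = np.random.default_rng(0)
ref = rng.normal(0.0, 1.0, 50_000)  # last month's model scores (logit scale)
cur = rng.normal(1.0, 1.0, 50_000)  # after a feature pipeline change
print(f"PSI = {psi(ref, cur):.3f}")  # comfortably above 0.25 -> worth investigating
```

A number like this only tells you *that* the distribution moved, not *why*; the hard part is slicing the same statistic by feature and subpopulation to find the cause.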
But Telegram offers just as much security and privacy as Slack.
2. I have my own issues with Telegram (the refusal to introduce sustainable funding, the pushing of cryptocurrencies, possibly also marketing it as more than it is), but apart from that there's a lot to like about it. I think it's a decent starting point if someone with strong cryptographic skills wants to take a decent client and go on to provide something better (either introduce end-to-end encryption, or improve, document, and provide some way to verify the current solution).
1. Technology that helps doctors/practices service more elderly patients
2. Wellbeing technology for seniors (Headspace/Calm should totally push towards this area)
3. Personnel management for homecare nursing
The list could go on and on. None of this is especially "trendy", but there are clear demographic reasons to build startups in this area: there will be a natural onramp of capital, and solutions for how to care for a radically older population are going to be sorely needed.
Money is the biggest challenge. Elderly who can afford to pay for their care have plenty of options available for all of this. Many elderly (on Medicare or fixed income) simply don't have any money.
> 1. Technology that helps doctors/practices service more elderly patients
Telemedicine and home care services, like Honor, are addressing this. Telemedicine will improve as more technically adept people age.
> 2. Wellbeing technology for seniors (Headspace/Calm should totally push towards this area)
This is purely a monetary issue. The moment Medicare covers these services, they will be used extensively.
> 3. Personnel management for homecare nursing
This one is part money and part skill. There simply isn't enough money to pay people with homecare skills; the price most people are willing to pay typically falls in the range of a Home Health Aide. There are some great ones, but there are also plenty who lack the skills, ethics, and compassion for the role.
I don't blame them either. Why would you do home health when you can get similar pay working at a factory, warehouse, or even fast food?
My mother was a nurse for years in the aged care sector. The pay for a qualified nurse is actually pretty good (I would still argue it should be higher for the work, but it was well above average).
Homecare aides (or AINs/care assistants, as they were called here) are paid horribly and treated as expendable (even when the roles are difficult to fill, since no one wants to wipe asses and give 89-year-old dementia patients bed baths for AUD $18.60 per hour).
Most of the care assistants I met were checked out and only there because they had no other option. The combination of minimum wage and high turnover also means you end up with some really horrible people in these roles. I remember hearing multiple stories about care assistants stealing from homes and mistreating patients. If you paid a little better and treated the staff with respect, you wouldn't need to hire the dregs of society to look after the most vulnerable in society.
There's a little bit of user blaming in that statement (not being critical, just thinking out loud). Dealing with the elderly and folks who are less familiar with tech is an interesting design space.
Does anyone know design patterns that specifically seek to improve the experience for non-tech-savvy elderly users?
You have people aging into a technological world built by kids with hands and fingers that work well, eyes that work well, and little cognitive decline.
This industry is garbage for the disabled and old. Just garbage. Try running a WCAG compliance check on any arbitrary piece of software. For most of you, that means looking up WCAG first. For 99%+ of you, you will have trouble even finding such tools for the products you work on.
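For a sense of how low the bar is, here's a toy slice of what WCAG tooling flags: images with no alt text and inputs with no label. Real audits (axe, pa11y, WAVE) check far more than this sketch does:

```python
# Toy accessibility lint: two of the most basic WCAG failures.
# Real tooling covers keyboard traps, contrast, ARIA misuse, and much more.
from bs4 import BeautifulSoup

def quick_a11y_lint(html: str) -> list[str]:
    soup = BeautifulSoup(html, "html.parser")
    problems = []
    for img in soup.find_all("img"):
        if not img.get("alt"):
            problems.append(f"img missing alt text: {img.get('src', '?')}")
    labeled_ids = {lab.get("for") for lab in soup.find_all("label")}
    for inp in soup.find_all("input"):
        if inp.get("type") in (None, "text", "email", "password"):
            if inp.get("id") not in labeled_ids and not inp.get("aria-label"):
                problems.append(f"input with no accessible label: {inp}")
    return problems

for problem in quick_a11y_lint('<img src="chart.png"><input type="text" name="q">'):
    print(problem)
```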
Someone will make a lot of money with a much simpler cell phone with a lot of white glove handling of security and OS updates, and just a curated set of apps. Apps designed for these populations and their unique problems (but sensitive to their age related disabilities) will be big, too.
edit: for example: https://www.ncbi.nlm.nih.gov/pubmed/14748929 "Iris recognition as a biometric method after cataract surgery." and https://www.ncbi.nlm.nih.gov/pubmed/19604439 "The use of computer touch-screen technology for the collection of patient-reported outcome data in rheumatoid arthritis: comparison with standardized paper questionnaires"
There are huge problems to solve that add value. But some of these businesses are very old school.
You can't force someone to digitise if they don't want to.
There are a lot of barriers to entry, and there can be lengthy sales cycles.
Someone will definitely get there and be hugely successful; I'm just not sure if the time is now or if we'll need to wait years.
They can click on all the shady links they want, but I haven’t had to field tech support in a long time.
The biggest problem with iPads is that every time there's a software update, it asks you to set a passcode. They set one, thinking they're re-entering one they've already set, then can't remember it and get locked out every time.
We have been living in the golden era of the software industry, thanks to Moore's law. We were able to afford RISC (i.e. general-purpose CPU architectures), general-purpose operating systems, general-purpose languages, general-purpose databases, etc., all because the hardware was going to evolve and get faster anyway.
Now, with Moore's law showing signs of death, the future of better computing will be the domain-driven stack. As a quick thought experiment: cloud applications will be written in cloud-friendly languages, using cloud-friendly databases, on cloud-ready operating systems and processors architected for heavy cloud workloads. Much like how gaming has relied on a custom stack for performance (GPUs, PlayStation, Xbox, etc.).
The advent of Google's TPUs is a symptom of this pattern too. Of course, personal computers with general-purpose everything will keep existing, but businesses will shift toward domain-driven stacks slowly and steadily, for obvious reasons.
I see your point, but we are already using specialized algorithms to solve problems on generic hardware (CPU). You can move to different generic hardware (GPU/OpenCL/...) which might be better suited (depending on the problem), or use/rent more generic hardware on demand (cloud computing).
What you're implying is already happening: using/programming "generic" FPGAs as specialized accelerators seems to be slowly trending (e.g. Xilinx UltraScale), and where that works well, "larger" process nodes seem to be getting cheaper these days (e.g. >= 45nm ASICs). But as far as I'm aware, the tooling and ecosystem for all this is still pretty bad, especially compared to how far C/C++ compilers have come, JS's ease of accessibility, or Python's trove of libraries. (Disclaimer: I don't work in that field, so I might be outdated.)
So to refine your suggestion: improving the ecosystem around hardware synthesis could be a thing?
However, that doesn't seem to be what user richtapestry was thinking of(?).
Because if it's the latter, that doesn't sound like a domain driven stack to me.
Only Nintendo bothers with writing custom kernels, and historically Sony with the PS3's exotic Cell processor.
I'm not sure what that would look like. Mainframe-esque, perhaps?
Careful: if you invest your code base in a "cloud-friendly" language, clouds could then fall out of style. The same goes for the other components.
We'll have massive libraries of reusable "components" of interesting DNA sequences. People will slowly splice together more complicated features, and freely trade organic components with each other via post. At some point in the future, electronics will catch up to bioengineering, leading to better ways to make changes to DNA. We'll eventually be able to change DNA in something like a "biology IDE" and have usable components printed out the other end. After that point, our world probably won't look anything like it does today.
It won't be long before someone decides to give themselves glowing skin or super strong muscles (and people have already tried the latter!) I for one welcome our super-human overlords.
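For flavor, here's a toy of what the "component library" idea might look like, BioBricks-style. The part names and sequences below are abbreviated placeholders, not real functional parts:

```python
# Hypothetical parts library: promoter + RBS + coding sequence + terminator
# composed into one construct, 5' -> 3'. Sequences are placeholders.
PARTS = {
    "promoter/constitutive": "TTGACGGCTAGCTCAGTCCTAGG",
    "rbs/strong":            "AAAGAGGAGAAA",
    "cds/GFP":               "ATGCGTAAAGGAGAAGAACTT",  # truncated for the example
    "terminator/rrnB":       "CCAGGCATCAAATAAAACGAAAGG",
}

def assemble(*part_names: str) -> str:
    """Concatenate named parts into a single construct."""
    return "".join(PARTS[name] for name in part_names)

construct = assemble("promoter/constitutive", "rbs/strong", "cds/GFP", "terminator/rrnB")
print(f"{len(construct)} bp: {construct[:40]}...")
```

The hard part a real "biology IDE" would have to solve is everything this toy ignores: parts interact, context changes expression, and verification still means growing the thing and measuring it.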
I tend to agree. However, having spoken at length about this with a friend doing her PhD, there are a few major problems between now and then.
- Scope: if you thought Big Data (relating to human behavior) was a massive endeavor (still very much not solved, not by a long shot), try genetics. We're talking orders of magnitude Bigger Data (some back-of-envelope numbers after this list). We have the proof of concept, but finding practical solutions remains a hard problem as we speak (needle in a haystack).
- Money: Bio-sectors don't pay software engineers enough to compete with the tech sector (almost no one does), and bio-experts are generally not good enough at it. So there's a huge shortage of dual [SE skills + domain knowledge] experts for this category of problems. Research funding is massive in big (private) pharma and comparatively non-existent in basic research, and for now CRISPR-Cas9 is mostly the latter.
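Back-of-envelope on the "Bigger Data" claim (all round numbers, crude assumptions):

```python
# Crude data-volume arithmetic for whole-genome sequencing.
GENOME_BASES = 3.2e9   # haploid human genome
COVERAGE = 30          # typical sequencing depth
BYTES_PER_BASE = 1     # very rough, post-compression, incl. quality scores

per_genome_tb = GENOME_BASES * COVERAGE * BYTES_PER_BASE / 1e12
cohort_pb = per_genome_tb * 1_000_000 / 1_000  # a biobank-scale cohort
print(f"~{per_genome_tb:.2f} TB per 30x genome, ~{cohort_pb:.0f} PB for 1M genomes")
# ~0.1 TB each, ~100 PB for the cohort -- before any analysis happens.
```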
The first problem (resources) will probably solve itself as time goes by (assuming some continuation of Moore's law, however it's achieved). The second problem (domain/education politics? I don't know what to call it) could persist virtually forever: academia and big pharma aren't exactly known for being fast movers or innovators, let alone disruptors. Especially when CRISPR-Cas9 is a direct threat to well-established revenues in the trillions; curing a condition is much less profitable than selling drugs to ease its symptoms over a lifetime.
If this sounds like sumptuous irony, it's probably because it is.
One of my former lab mates did his thesis on some fluorescent cancer-screening stuff, somewhat similar. In his presentations, he'd use a slide explaining the order-of-magnitude issues in finding cancer cells this way. To illustrate, he'd explain it wasn't like finding a needle in a haystack, but more like trying to find a 20-gauge needle in a Walmart filled with 19-gauge needles.
>Bio-sectors don't pay software engineers enough to compete with the tech sector (almost no one does), and bio-experts are generally not good enough at it.
I've done a fair amount of programming (enough to be really dangerous), so other students would come to me for help every once in a while. One of my friends, getting his PhD in neuroscience, traded a case of beer for an afternoon of my help. He was doing vision research with gerbils and was trying to time neuron spikes against images on a screen. By the 32nd nested 'if' statement, I requested another case of beer.
Generally, research-grade programming and software is, at best, spaghetti. At worst, you get answers you think are right but are wildly off. You can really lie to yourself, and to the rest of the world, when you publish those errors as facts. Most grad students are learning programming by the seat of their threadbare pants, and it shows.
Computing and biology are opposite one another as disciplines. Computing has been built up by people from invented principles; but in biology we are working our way down from observations to try to infer the principles underlying them.
It’s easy to look at our global computing infrastructure and think we are good at understanding complexity. But we built that; we should understand it. Again: not true for biology. There’s no guarantee that we will ever fully understand the full scope of how life works.
And computing still has problems! Side effects are rampant in software; we usually call those bugs or vulnerabilities and there are a lot of them.
And electronic computing has fewer ethical concerns. It's generally not considered unethical to risk crashing a computer; it's a very different story if you are risking deformed foetuses.
I can't wait for all the pre-alpha version creatures :-P
Not affiliated with the company. Just amazed at how accessible that tech is becoming.
ML in movies and rendering. Deepfakes are just an amateur's tech demo. In a few years, I expect to see render rights actually become a thing: you don't act in the movie, but you give a company the rights to use a 3D render of you in it.
Real-time and full-time holograms / AR. With remote work becoming a thing, I see a huge market for full 3D renders of the person presenting, or even completely virtual AR workplaces where people check in to work. Maybe not for a decade or two, but whoever builds the flagship product will make a ton of money.
I'm kind of apprehensive about this advancement. There are many ways to mess it up, from intentional dataset poisoning (to encourage patients to get unnecessary procedures) to terrible errors caused by a mere shift in data-capture methodology. This paper has a lot of detail on how such systems can fail (https://arxiv.org/pdf/1804.05296.pdf).
Within the current system, you can already skip straight to encouraging patients to get unnecessary procedures. At worst, ML adds an extra step.
It's important to remember that innovation doesn't have to be perfect, it just has to be better than the system it replaces.
How do you figure that it never pans out? Healthcare providers use a lot more technology today than they did in 1979.
From medical devices to billing software to electronic health records, I think this is an area that will continue to grow.
It is 100% a field that needs disrupting... someone will figure it out and become very wealthy.
Yeah, that's kinda surface-level stuff, but these kinds of health metrics are already extremely commonplace and are helping open the door to further tech<->healthcare integration.
There's a reason actors earn money, and it's not a perfect face or body.
And no, I don't mean applying blockchain to everything, or internet money that just goes "up and to the right".
I mean the new applications of distributed systems research and cryptographic primitives that allow for highly composable, highly trustworthy, permissionless, autonomous machines.
MakerDAO, Arweave, the whole field of open finance (aka "DeFi"), and so many others collectively are very likely to change the fundamental assumptions we make when using software.
Here's a great talk regarding trustlessness: https://www.youtube.com/watch?v=G0rZcpfF5dU
source: https://www.dictionary.com/browse/trustless
https://web3.foundation/about/
I do agree with you, though. I recently had to work with Uniswap, and I'm blown away by how simple yet convenient and powerful it is; not possible pre-blockchain at all. I think the mistake people make is assuming blockchain is trying (and failing) to replace existing relationships, yet with these Ethereum projects it seems like completely novel systems are emerging that were not possible before blockchain.
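For anyone who hasn't looked: the core of Uniswap v1/v2 really is one invariant. Here's a toy sketch of the constant-product rule with the 0.3% fee, ignoring everything a real contract handles (LP shares, integer math, slippage limits):

```python
# Constant-product market maker: the pool holds reserves x and y and
# keeps x*y from decreasing on every trade; price emerges from the ratio.
FEE = 0.003  # 0.3% paid to liquidity providers

def swap_x_for_y(x_reserve: float, y_reserve: float, dx: float) -> float:
    """Amount of y received for depositing dx of x."""
    dx_after_fee = dx * (1 - FEE)
    new_y = (x_reserve * y_reserve) / (x_reserve + dx_after_fee)
    return y_reserve - new_y

# Pool of 100 ETH / 20,000 DAI (implied price 200 DAI per ETH):
print(swap_x_for_y(100.0, 20_000.0, 1.0))  # ~197.4 DAI: fee plus price impact
```

That one function replaces an order book, and anyone can be the counterparty by depositing reserves; that's the "novel system" part.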
I dunno, there are a lot of medical conspiracy theories, but the average person doesn't know crap about how their body works. A couple of years of biophysics has taught me that biology is waaaaaaay more complicated than the nastiest distributed computer system you've ever heard of.
When I hear civilians trying to talk about health or bio, or why cancer is a hoax ... it's like being a car mechanic who knows a thing or two about cars and having someone come up to you to tell you they need their banana wipers replaced because the windshields don't roll properly when they put their foot on the cigarette lighter. Like ... I have no idea what you're even saying beyond some medieval animist notions of good stuff and bad stuff. /rant
This is not really a fair assessment. The most cost-effective treatments can be self-administered, but if you're coming in to see me because of persistent knee pain, more often than not you've already tried Tylenol and over-the-counter NSAIDs (ibuprofen, etc.), and your pain has persisted. The next set of treatment options, aside from maybe some physical therapy, are more expensive, because they involve drugs or treatment measures that cost more.

As for PRP, it is not covered by insurance, since its benefits over a placebo are still not entirely clear. Many doctors offer it anyway, and to ensure accurate needle placement they usually perform it under ultrasound guidance, which is not something most people could do on their own, at least not yet. As more physicians offer it, and if it remains uncovered, market forces will drive the price down (like Lasik), although I'm doubtful PRP will be considered an effective treatment option in the future.
I don't disagree with the notion that healthcare remains quite expensive in general. I see telemedicine reducing costs for certain conditions, and maybe Direct Primary Care: bundling care into one package instead of serving it piecemeal could be a good way forward.
It seems likely to just be a matter of time until little storefronts pop up in strip malls with user-friendly diagnostic equipment that gives the remote doctor more data to treat more conditions (maybe this already exists?).
The biggest thing is that you seem extremely willing to take risks without understanding the gravity of what you're doing. Sure, you CAN give yourself PRP injections and will LIKELY have little to no side effects. But healthcare providers need extreme confidence that (a) a procedure will be effective and (b) the risks are appropriate.
I once saw a surgeon try out new combined cut & staple hardware designed for bowel surgery on a patient's liver, because he had been told it would work. It didn't, and there was a ridiculous amount of staples in the patient's abdomen afterward.
Doctors often become more risk-prone the more power they have, and the hierarchy of human medical doctoring contributes to this.
They kind of work, but it's still a lot of work, and it would mean consumers getting their hands on a bunch of controlled substances.
Being able to produce meth in a 'lab' that fits in your hand is not crazy... except you can clearly see how crazy that is.
For a reality check: I know someone who works in an environment where pharmaceuticals are manufactured. He's a steamfitter, and he can't even lift a ceiling tile without getting approval.
Maintaining FDA-level quality may not be possible in a DIY context. Manufacturing happens in clean rooms, and the up-front costs are high.
But it would certainly be incredible for people who pay $1,000 per pill for life-saving medication today if they could route around the pharmaceutical industry entirely.
Gartner's Top 10 Trends of 2020: https://www.gartner.com/smarterwithgartner/gartner-top-10-st...
The average person who reads the first 10 pages of search results for their disease definitely knows more about it than their doctor. Doctors are smart, but medicine is huge and they aren't polymaths.
The obvious obstacle is that you legally cannot prescribe medicines without a medical license, but hey, who said the doctor has to be physically present, or can't be a desperate Caribbean med-school graduate?
In terms of work being "promising", wouldn't I want to tackle a global problem, not one unique to a specific country?
One of the side effects of globalization and the internet is that it becomes more and more apparent over time when there are opportunities for asymmetric impact. And there will always be bad actors that look to take advantage of that.
This is partly because we as consumers get used to the abuses and start accepting them. Back in the '90s, the onset of email spam caused a lot of indignation and active outrage. Banner ads were a huge deal, too.
But a lot of it is just from the bad actors getting more sophisticated. Including state actors that are invested in causing the breakdown of democratic norms in other countries.
So I think there will continue to be opportunity in the decentralization realm. Tools for various forms of self-governance. That could mean publishing (activitypub), hosting (ipfs), or even actual governance (decision-making and voluntary policy compliance among groups).
https://www.amazon.com/Inviting-Disaster-Lessons-Edge-Techno...