kalimanzaro · 3 years ago
Link to the actual article by the Oxford Physicist.

https://archive.ph/0VB0K

Valuation over value.

cerol · 3 years ago
This is how I think about it.

It's insurance money. If you're a manager of a big company like IBM, Microsoft or Google, you have to align your current product portfolio and future portfolio in such a way that shows your investors that your company will keep growing, even if your current products are stagnant.

You can surely say quantum computing won't do much in the next 5 years. But what about 10 years? 20 years? 30 years? The farther you look into the future, the bigger the probability of a huge tech breakthrough that could give the company that has it a massive edge on the market.

Even if there's only a 1% chance of a transistor-style revolution coming out of QC, it becomes a race. If Google starts researching it, IBM will follow suit, and so will Microsoft. If in 30 years this turns out to be a big deal, no one will be 30 years behind.

bottled_poe · 3 years ago
Ah, the quantum ROI, where we affect the return by attempting to measure it.
viraj_shah · 3 years ago
This made me laugh, thank you!
sicp-enjoyer · 3 years ago
I think you are describing the company dynamics accurately, but I can't help thinking this is just a terrible way to invest. No party has a concrete plan or vision for how to use it; they just throw money at it because there is a consensus of good feeling around it. Those good feelings were probably created through academic or corporate marketing efforts in the first place.
cerol · 3 years ago
> but I can't help think this is just a terrible way to invest

Picking a company to invest in is only half the job; choosing how much to invest is the other half.

It's a terrible way to invest if you put all your money into it. QC "changing the world" is a tail-end event. You allocate according to the risk.

hericium · 3 years ago
> a terrible way to invest. No party has a concrete plan or vision for how to use it, they just throw money because there is a consensus of good feeling around it.

Almost as if I were reading about cryptocoins.

pyinstallwoes · 3 years ago
The economics change when you control the money (the military apparatus funds most of the research, does it not?).
acomjean · 3 years ago
I mean, these companies have Research and Development divisions. I was at IBM at the turn of the century, when they spent $6 billion on research. One of the pushes was to get research to focus on things they could market and make money with.

But the big research plays (Bell Labs, Xerox PARC) seem to get less and less funding, if they exist at all. A lot of the inventions of those places were monetized outside those companies. IBM had a chip fab in the research building… that business was long since spun off.

At the turn of the century IBM was researching quantum computing, but as I was leaving, selling services was IBM's big push.

ptero · 3 years ago
It is not the amount of funding, it is the allocation that doomed R&D at big companies.

Forty years ago they were the best game in town for applied research (defining applied as: on success, having a fast track to commercialization). Later it became similar to university research: on success, you write some articles and get some kudos within the company, but the business people have no idea where to stick it and half-heartedly throw a few applications at it to see if any stick. Most don't (e.g., Deep Blue, Watson).

At this point large-company R&D centers got passed (by a lot) by VC-funded applied research and saw a (IMO well-deserved) drop in funding.

graycat · 3 years ago
Supposedly one of the really big, important things you could do with a quantum computer (QC) is quickly solve to optimality instances of NP-complete optimization problems, e.g., problems in scheduling, resource allocation, logistics, etc., which can be formulated as linear programming problems (needing just knowledge of linear equations) where we want all the variables to have whole number values, that is, integer linear programming (ILP).

Okay, integer linear programming problems .... To get all excited about quantum computing (QC), you need to get excited by the big money to be saved by solving all those important, practical ILP problems.

Okay, I had a good background in pure/applied math and in computing and got into ILP for scheduling the fleet at FedEx. Since the promised stock was 1+ years late, I ran off and got a Ph.D., in one of the best programs, in more hopefully useful pure/applied math, and much of that work was in ILP.

Here is some blunt truth about the NP-complete problems and the cartoon at the beginning of the famous book by Garey and Johnson: The math guys were talking to their manager explaining that they couldn't solve the manager's problem but neither could some long line of other math guys.

Here the blunt part is the meaning of "solve" -- with a computer program that, running in time only polynomial in the size of the problem, gets an optimal solution to any instance of the problem, including the worst cases. And here optimal means down to the last penny to be saved. So, for some network deployment by AT&T that was to cost $1 billion, save down to the last penny, in polynomial time, including for the worst-case instance of the problem.

Yup, maybe the savings would be $51,937,228.21. And do want to save that last penny. But if the manager would settle for saving just the first $51,900,000.00 in reasonable computer time for all or nearly all the actual instances of the manager's real problem, then there would be little or no difficulty. And should be able to tell the manager that savings of more than $55 million, or some such, were impossible -- that is, have an upper bound.

So, much of the difficulty was saving the last $37,228.21, guaranteeing to do so, for all instances of the problem, including the worst cases.
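That lower-bound vs. upper-bound idea is easy to sketch on a toy 0/1 problem (purely illustrative numbers, nothing to do with the FedEx fleet): brute force gives the exact integer optimum, while the LP relaxation (here a fractional knapsack, which a greedy density ordering solves exactly) gives the guaranteed ceiling that tells the manager how much could possibly still be on the table.

```python
from itertools import combinations

def lp_relaxation_bound(items, capacity):
    """Upper bound from the LP relaxation: fractional knapsack,
    solved exactly by taking items in order of value density."""
    bound, room = 0.0, capacity
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        take = min(weight, room)
        bound += value * take / weight
        room -= take
        if room == 0:
            break
    return bound

def integer_optimum(items, capacity):
    """Exact 0/1 optimum by brute force -- fine for a toy instance,
    hopeless at 600,000 variables and 40,000 constraints."""
    best = 0
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            if sum(w for _, w in subset) <= capacity:
                best = max(best, sum(v for v, _ in subset))
    return best

items = [(60, 10), (100, 20), (120, 30)]  # (savings, resource used), made-up
capacity = 50

ub = lp_relaxation_bound(items, capacity)
opt = integer_optimum(items, capacity)
print(f"bound {ub}, optimum {opt}, gap {(ub - opt) / ub:.1%}")
```

On real instances you rarely get the brute-force optimum; you get a feasible solution plus a bound like this, and the hard, mostly unrewarded work is closing that last sliver of gap with a guarantee.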

Well, I can assure readers that had I insisted on a career saving, e.g., $51,900,000.00 where savings of $55 million were impossible, then I would have spent the last several decades homeless on the streets or dead from homelessness -- no joke.

Bluntly, there just is no significant demand for solving ILP problems in practice. The "managers" don't want to get involved.

Selling pizzas from the back of a truck? Sure -- might sell 100 pizzas a day. Selling solutions to ILP and other NP-complete problems -- f'get about it.

Uh, since there is no significant demand for saving $51,900,000.00 with a bound of $55 million, there stands to be not significantly more demand for saving $51,937,228.21.

Thus, there stands to be no significant value for QC for solving NP-complete ILP problems. Sorry 'bout that. If some people want to get the $51,900,000.00 savings, they've been able to do that for decades and have voted loud and clear "We don't care.".

E.g., in one of my attempts, a guy sent me an ILP problem, we talked, and two weeks later I had running code that in 900 seconds on a slow computer got a feasible solution guaranteed to be within 0.025% of optimality. The problem had 600,000 variables and 40,000 constraints. I had done the work for free. Still, then, suddenly he was not interested.

So be it.

There was another one: I was writing the code using the idea of a strongly feasible basis, and suddenly the customer was not interested and returned to some not very good heuristic code he had.

Better, a lot better, to sell something a lot of people actually want, e.g., a lot better to sell pizza.

And I am doing a startup that to me continues to look good, software running, but it has nothing to do with NP-complete or ILP and wouldn't be helped by QC.

So, to me, e.g., even if Google gets a good QC that can solve ILP problems, I don't believe they will have many customers or much of a business, and there will be no big reason for IBM or Microsoft to worry.

Since there is no significant demand for using ILP to save money now, I don't see a significant demand for using QC on ILP to save money in the future.

Their employees might be better off selling pizzas. Let's see: From some of my arithmetic about costs of pizza, can do well for $2-3 a pizza. From a pizza truck in a good location might be able to sell the pizzas for an average of $10 each, e.g., an extra $1 for anchovies! Might sell 100 pizzas a day for $1000 a day, maybe 20 days a month. Looks like a better career than QC research!
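The pizza arithmetic above, taken at face value (cost at the midpoint of the $2-3 estimate):

```python
cost_per_pizza = 2.50     # midpoint of the $2-3 cost estimate above
price_per_pizza = 10.00   # average sale price, anchovies included
pizzas_per_day = 100
days_per_month = 20

daily_revenue = price_per_pizza * pizzas_per_day
monthly_profit = (price_per_pizza - cost_per_pizza) * pizzas_per_day * days_per_month
print(f"revenue ${daily_revenue:,.0f}/day, profit ${monthly_profit:,.0f}/month")
```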

If there is no demand for pizzas, then there won't be much demand for pizzas with anchovies.

Uh, the Google QC researchers are well paid? Terrific -- park the pizza truck near the Google QC research building!!!!

For some parts of US national security, the situation for a good QC might be significantly different -- I doubt it, but maybe.

pyinstallwoes · 3 years ago
That's the positive, positive outlook, yeah.

The negative, positive outlook is that it is a disinformation campaign, so one may maintain the lead in a particular trajectory of technical dominance. Whilst doing so, as an extra game-theoretic safety precaution that also amplifies the disinformation campaign, one funds any research in the direction of the campaign as both a distraction and an 'impossibility canary.'

Quite... deliciously deceptive.

aqme28 · 3 years ago
I don't understand this argument at all. Of course it isn't making money yet-- that's because it's an early technology that is still being researched. Sure it might never mature, but it seems crazy to call it a "bubble" or to analyze it based on current sales figures.
sfpotter · 3 years ago
That wasn't the argument made in the article. There are reasons to believe that the technology is fundamentally unsound, and will never be able to scale or make money.
aqme28 · 3 years ago
It wasn't? The article leads with that argument.

"The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money."

I don't see any argument that the technology is fundamentally unsound or doesn't scale, even though that's an argument I'm pretty amenable to.

Cthulhu_ · 3 years ago
It's not actually early technology; it's been in development since the '80s. And if the underlying theories are unsound - if it doesn't even work in theory - then putting more money in won't make it magically viable.
krastanov · 3 years ago
1. Seems a bit unfair to say it has been developed since the '80s. In the '80s a couple of people (e.g. Feynman) noticed that if you have a quantum simulator, you can simulate chemistry in a way that a classical computer is seemingly incapable of. But the transmon (one of the first possibly viable implementations of a qubit) was not developed until the '00s, and complete control of some of these systems (e.g. a transmon coupled to an oscillator, in order to make a memory) was not demonstrated until the 2010s. Lifetimes of quantum memories have also been growing exponentially for more than a decade (a trend that started in the '00s).

2. It is worth mentioning that by the standards of your comment, the time between conceiving of a classical computer (Babbage) and a scalable electronic computer (ENIAC and family) was about a century.

3. While ultimately there might be a "quantum winter" in the next few years because we (I work in the field) overpromised, this would not be the first time a tech that ultimately works gets disregarded for a decade or two because of mismanaging expectations (e.g. Liquid Crystal displays or neural networks, which were both developed for many decades before being commercially viable).

EDIT: And yes, there are some startups with misleadingly general pitches.

dgudkov · 3 years ago
Doesn't the same apply to nuclear fusion power generation?
posterboy · 3 years ago
If the output side is saturated with workload but the input side keeps growing, without significantly changing the output or at worst affecting it negatively (quantum blockchain buzzword bingo), it may be fair to speak of a bubble.
meltyness · 3 years ago
I think his model of the situation is short-sighted, to say nothing of the callbacks to that management principle involving transistors.

If you're thinking that the whole purpose of QC will be quickly subsumed by wide algorithms with superpolynomial speedup, you might be missing the point. It's about how computers are built, not about stuffing one specific abstraction into another. If suddenly we discover we can build a machine that can generate random numbers a quadrillion times faster than any current hardware design, that's a new space in computation.

I mean consider how widely deployed the parallelism construct is now, and that Amdahl's law was elucidated in the 60's.

Parallelism was just one degree of freedom for us to climb the S-curve on, quantum computing seems to provide essentially a continuum of them.
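The Amdahl's-law ceiling mentioned above is a one-liner to sketch: the speedup from adding workers is capped by whatever fraction of the work stays serial.

```python
def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
    """Amdahl's law: overall speedup when only `parallel_fraction`
    of the work can be spread across n_workers."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# a 5% serial portion caps the speedup near 20x, no matter how many workers
print(amdahl_speedup(0.95, 10**9))
```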

sweezyjeezy · 3 years ago
I think most people, including the author, would agree QC should be funded for fundamental research reasons. But that is clearly not the way it is being pitched to VC. Right now there is no clear use-case, that's what I felt he was warning against. If nothing materialises soon, he's probably correct to say this is a bubble.
marcosdumay · 3 years ago
> Right now there is no clear use-case

Why do people keep repeating that? You mean that if somebody creates a machine that can simulate chemistry and materials science in polynomial time, nobody would use it? That's crazy.

meltyness · 3 years ago
I figure it's more like a standoff between these shops. Just consider that historically, no one would want to build ENIAC, everyone would want to design the solid-state transistor.
dr_dshiv · 3 years ago
I have a fair amount of experience in this space. It’s like, at a vacuum tube era, at best. There is a definite opportunity for advancement, but it is still extremely early.

We are building user interfaces that make it easier to “play around” with quantum computing phenomena—especially with music and art—with the idea that our aesthetic sensibilities may help drive discovery.

oldgradstudent · 3 years ago
> It’s like, at a vacuum tube era, at best.

How is it even remotely close?

Vacuum tubes were a thriving industry, producing many groundbreaking products and services.

krastanov · 3 years ago
I think the comparison is still reasonable: while we do not have scalable quantum computers, the technologies developed for them have actually seen a lot of use: squeezed light and non-classical light, color centers, Josephson junctions, nonlinear-optics at the single-photon level, to name a few "terms of art" that should be google-able, are crucial for precision sensing and telecom.
lamontcg · 3 years ago
Right now it is a pre-transistor in the 1920s and 1930s.

Vacuum tubes are analogous to the computers that we have now. In 1925 a patent for the concept of a FET was filed. It wasn't until 1948 that we had a working transistor.

That took 23 years to go from concept to useful invention. It isn't too surprising that a quantum computer is harder.

wikfwikf · 3 years ago
They are saying that quantum computing is at the stage that classical computing was at when it still relied on vacuum tubes.

Not that quantum computing is at the same stage as the vacuum tube industry was at some unspecified time.

bbarnett · 3 years ago
Yet before that stage, they were a new tech that no one even had ideas of how to use.

So far, quantum computing is used in labs, not for any real useful purpose.

wfn · 3 years ago
Could you share some links to your work / research re: interfaces with focus on art? I'd be very interested to check it out.

https://quantumdelta.nl/ is some kind of hub but landing pages offer too much hype and too little content :)

thank you.

dr_dshiv · 3 years ago
I will have material to share in about a month!
belter · 3 years ago
"The Quantum Computing Bubble" - https://news.ycombinator.com/item?id=32630815

"Separating Quantum Hype From Quantum Reality" - https://news.ycombinator.com/item?id=32691220

photochemsyn · 3 years ago
This is something of a low-effort article, with a short-sighted focus on immediate profitability. There are many scientific programs that didn't really become private-free-market revenue generators for decades at least (the US space program, for example).

An article with a little more depth might examine the future of trapped-ion quantum computing, for example:

https://en.wikipedia.org/wiki/Trapped_ion_quantum_computer

As far as the 'make money off new drugs' mentality goes, that's not really where QM chemical simulation in molecular dynamics seems all that promising - it's more about things like the design of new catalysts to improve the efficiency of various industrial processes.

If QM computation is eventually developed, the devices will almost certainly be large and extremely expensive (kind of like the cutting-edge chip fab machines of today in scale). For most businesses, it's unlikely the benefit of owning one will justify the cost, so it'll probably be a national lab / research center type thing.

jillesvangurp · 3 years ago
The key consideration with investments is ROI. When the investor is a government, it can afford to take the long perspective. For most companies and institutional investors, this works less well.

The key mechanism to protect inventions is patents. Patents have a limited shelf life. If you file a lot of patents today and it takes 30 years before you can apply them, they will have expired by then and others are free to take your inventions and build on that. So, if quantum computing requires another three decades to start making money, most of the companies that are currently being invested in will have failed and their patent portfolios and investment will be worthless. Their patents will have expired, their founding scientists will have moved on or retired, etc. At best those companies may be in a position to file more patents. So, any investors investing right now are making bets on how long it will take before there's a meaningful market to get an ROI and which companies are positioned best to take a chunk out of that market. The further that is out, the higher the risk of losing their investment.
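The patent-timing argument above is simple arithmetic (assuming the standard 20-year term from filing, and the hypothetical 30-year commercialization horizon discussed in this thread):

```python
FILING_YEAR = 2022
PATENT_TERM = 20       # US utility patents: 20 years from the filing date
YEARS_TO_MARKET = 30   # hypothetical timeline from the discussion above

expiry_year = FILING_YEAR + PATENT_TERM
market_year = FILING_YEAR + YEARS_TO_MARKET
verdict = "still protected" if expiry_year >= market_year else "expired before it mattered"
print(f"patent expires {expiry_year}, market arrives {market_year}: {verdict}")
```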

There are billions flowing into quantum computing, and the article is simply making the point that in terms of revenue potential there seems to be a lot of uncertainty about the practicality of current approaches, the lack of any real revenue (beyond consulting people on how awesome it would be if we had working quantum computing, etc.), and the lack of perspective on when all this will change. Very valid points. There are a few big companies investing in this stuff, but none of them is betting the company on it. It's a sideshow at MS, Google, IBM, etc.

A long shot that might create some viable business decades from now, but if it all fails, their stocks will be fine. There's enough substance there for them to want a finger in the pie if it does take off, but none of these companies seems to be counting on that happening any time soon.

drewbeck · 3 years ago
There’s also billions going into commercial fusion reactors, which haven’t turned net positive yet. The goal of the investment is to build that capability tho, same (I think?) as with quantum computing. Weird critique imo.
robertlagrant · 3 years ago
It's just saying that quantum startups are overvalued. Which may be true. It's certainly phrased in the article as an opinion.
mattnewport · 3 years ago
Thing is, fusion is known to be possible - the sun and hydrogen bombs. Quantum computing lacks equivalent existence proofs. There's a lot of abstract theory but no real indication that it is possible to scale up in the real physical world to useful problem sizes, and reasons to doubt that it is.
dilyevsky · 3 years ago
Neither of those is contained fusion. I think GP is implying that it might be impossible to have net energy gain for contained fusion (which currently seems to be the case even with the most advanced designs).
pyb · 3 years ago
As the tech is still unproven, it's research, rather than building capability, that they're spending money on. I hope all these investors understand this...
sgt101 · 3 years ago
The physics underpinning fusion was proven in the 1950s. Since then it's been an engineering problem.

The physics underpinning QC was arguably proven in the 2020s. It's not quite done (in the way that fusion was not quite done in the '50s), but there is a fairly clear set of demonstrations that QCs with error correction are possible. However, the engineering barriers are fierce and there is still a possibility that they are insurmountable. In addition, there are concerns that, while QC will work, the class of problems that is in NP and also in BQP may be very small. Even if a problem is in that group, it may be that the algorithms we have are not superquadratic, or even quadratic - meaning that the improvement they offer over classical algorithms may be marginal.

Worse, there are often very good heuristic approaches to some of these problems, which means that although a superquadratic QC approach would be an amazing breakthrough of computer science (genuinely amazing, worthy of accolades and prizes, and fundamentally important for our understanding of the universe) it might offer only marginal economic value. Now, this is not true of some problems where there are exponential explosions and no good heuristics... but there is an even worse catch, which is that quantum algorithms offer computer scientists fresh insight into what's tripping up the classical approaches. In this scenario, an amazing breakthrough happens in QC, and someone uses it to get an insight that pushes the classical approach close enough to the QC approach as to render the QC approach marginal.
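The "quadratic speedup may be marginal" point can be made concrete with back-of-the-envelope query counts: Grover-style search needs ~sqrt(N) queries versus N classically, but if each quantum query carries a large constant overhead (error correction, slow gate times - the 10^6 factor below is purely an illustrative assumption, not a measured figure), the crossover problem size becomes astronomical.

```python
import math

def classical_queries(n: int) -> float:
    """Unstructured search, worst case: ~N queries."""
    return float(n)

def quantum_queries(n: int, overhead: float = 1e6) -> float:
    """Grover-style ~sqrt(N) queries, each costing `overhead`
    classical-step equivalents (illustrative assumption)."""
    return overhead * math.sqrt(n)

def crossover(overhead: float = 1e6) -> float:
    """Smallest N where overhead * sqrt(N) < N, i.e. N > overhead**2."""
    return overhead ** 2

print(f"quantum wins only past N ~ {crossover():.0e}")
```

With these (made-up) numbers the quadratic speedup only pays off beyond N of about 10^12 - exactly the regime where a good classical heuristic, if one exists, can erase the advantage.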

The theoretical picture is moving very fast though - so we will have to see.

On the other hand, the practical side is moving more slowly. We see announcements that make one think a Moore's-law type of scaling is happening, but hidden in the small print there are often (always, as far as I can decode) catches that mean that while the results look great, they are still very much mired in problems. For example: are all the qubits on a QC usable at once? Can they be used to form an actual algorithm? How long does the machine run for? How long does it take to start? Some of the answers are jarring - often only a small subset of a machine can be used in an actual problem-solving episode; sometimes the machines run for a few steps only; sometimes the machines take 24 hours or longer to start.

It has taken 70 years to nearly build fusion reactors, it took 70 years to create mRNA vaccines. It may well take 70 years (from now) to build practical, valuable quantum computers. And something could go wrong on that path that just renders them moot.

sgt101 · 3 years ago
I forgot to mention something else - the exact results of QC algorithms are read probabilistically from the instruments that read out the state of the machine. The confidence intervals for these results are important. I do not think that the results for every demonstration are read to 7 sigma... Be careful about this, because if you are seeing a result of an exact algorithm that's read to 3 sigma, then it's probably best to rate it as equivalent to a heuristic that gives a result within 0.01% 99.99% of the time (I am being generous). It's all in the small print in the annexes...
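For calibration, the sigma levels above translate into two-sided Gaussian tail probabilities like so (a standard normal assumption, nothing specific to any QC experiment):

```python
import math

def two_sided_tail(z_sigma: float) -> float:
    """P(|deviation| >= z sigma) under a Gaussian assumption."""
    return math.erfc(z_sigma / math.sqrt(2))

# 3 sigma is wrong roughly 0.3% of the time; 7 sigma is below one in 10^11
for z in (3, 5, 7):
    print(z, two_sided_tail(z))
```

So a 3-sigma readout really is in the same league as a pretty-good heuristic, while particle-physics-style 7-sigma confidence is many orders of magnitude stronger.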