There seems to be a fundamental issue with this model. If it's economically viable for a user to use this service, there's no reason why the company wouldn't just do it themselves. The only exception is the cost of the hardware, but over the long term this is a relatively small factor compared to the cost of electricity and bandwidth. Especially considering that the company could use much more efficient hardware than the typical home or gaming computer.
I understand the 'sharing economy' desire to make use of underutilized resources, but this doesn't seem like an economically feasible way of doing so. The model works for Uber/Lyft because cars are a relatively high upfront cost compared to the cost of gas, but computer hardware is often less expensive upfront than the electricity cost of running it for a year. Additionally, much of the economic value in a service like Uber or Lyft is provided by the driver, not just the use of the car. In this service, the user doesn't provide any value; in fact, they're using up cycles/space that could otherwise be monetized.
It's viable because it allows you to sell electricity that other people are paying for, in return for money that you get to keep.
This is harder for the company itself to do, because if they just hire people to go into libraries, universities etc. to install mining bots they might be criminally liable. "Uber for CPU cycles" seems like a less felonious enterprise than installing malware on public-use hardware.
>The only exception is the cost of the hardware, but over the long term this is a relatively small factor compared to the cost of electricity and bandwidth.
My electricity rate is 15.7 cents per kWh. During typical usage (MS Office, web browser, programming), my Intel 6-core desktop (with the LCD monitor off) draws about 150 watts.
For back-of-the-napkin estimates, let's round the kWh cost up to 16 cents and the wattage up to 300 watts (to cover the scenario where some of the CPU cores are pegged at 100%). The electricity cost of running 24x7 for one year would be ~$413.
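For anyone who wants to rerun that estimate with their own numbers, here is a minimal sketch of the arithmetic; the 15.7-cent rate and the wattages are the figures from this comment, not universal values:

    # Rough annual electricity cost for a machine running 24x7.
    # Rate and wattages are this comment's figures, not universal values.
    def annual_electricity_cost_usd(watts, cents_per_kwh):
        kwh_per_year = watts / 1000.0 * 24 * 365      # 300 W -> ~2,628 kWh
        return kwh_per_year * cents_per_kwh / 100.0

    print(annual_electricity_cost_usd(300, 15.7))     # ~412.6, i.e. the ~$413 above
    print(annual_electricity_cost_usd(150, 15.7))     # ~206 for the 150 W "typical usage" draw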
What remains would be bandwidth costs -- if any. I, like many others, have Verizon FiOS, and even for those on Comcast or AT&T, there are no obvious residential bandwidth costs I can think of to calculate. Maybe... if the homeowner wants to upgrade the speed from 75GB@$99/month to 150@$199/month because he wants to download the datasets faster. That extra $100 wouldn't have been spent on plain web browsing, so conceivably that would be $1,200 per year. What we don't know is how big the datasets that must be downloaded are. I assume the upload size would be minimal because the compute tasks appear to be variations on "y_output = computecombinationsmontecarlobruteforce(x_input)." The y_output answer would usually be order(s) of magnitude smaller than x_input.
Assuming there are no extra bandwidth costs, it would be hard for a company to buy computer hardware for less than a homeowner's $413/year electricity cost.
Perhaps suchflex's particular business model is financially unsound. In general terms, though, it does seem possible to find a monetizing sweet spot of computing tasks that takes advantage of the idle, wasted resources of existing home computers. If the homeowner has to buy extra hardware dedicated only to suchflex, however, that's probably where the economics stop making sense.
>During typical usage (MS Office, web browser, programming)...
That's the issue, though: this wouldn't be similar to your typical usage. If they're using your GPU to train neural networks, it'll be running close to or at full capacity.
I realize that you rounded the costs up, but let's just look at the cost of a GPU often used for machine learning, the Nvidia GTX 980 Ti. According to Nvidia, it draws 250 W under load, which by your figures would result in a yearly electricity cost of roughly $344. And that's just for the reference card; a typical card that a consumer would purchase would draw even more. You can buy a 980 Ti for a little more than $400. That doesn't even begin to look at hardware actually designed for commercial and research applications.
I think it's possible to find a way of monetizing computer resources; however, I think it has more to do with arbitraging differences in electricity costs. Suchflex's model certainly wouldn't work where I live (electricity costs in NYC are roughly 20 cents per kWh), but parts of the US are under 10 cents. I could see a company attempting to profit from these differences by setting up hardware in a cheap state and negotiating a favorable electricity rate. Heavy computation could then be done on that hardware for significantly less than it could in New York or California.
In summary, the value of a consumer's unused computer has more to do with their electricity rate than their hardware.
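To put rough numbers on that arbitrage point, here's a small sketch using the 250 W GPU figure and the 20-cent / 10-cent rates mentioned above (everything else is an assumption):

    # Same 250 W load, different electricity markets; rates are the NYC and cheap-state figures above.
    def annual_cost_usd(watts, cents_per_kwh):
        return watts / 1000.0 * 24 * 365 * cents_per_kwh / 100.0

    nyc_cost   = annual_cost_usd(250, 20)    # ~438 USD/year for a 250 W card
    cheap_cost = annual_cost_usd(250, 10)    # ~219 USD/year
    print(nyc_cost - cheap_cost)             # ~219 USD/year saved per card by moving the load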
The whole thing is based on Gridcoin and BOINC anyway, so the good thing that comes out of it is that BOINC research projects like Rosetta@home get more computing power for free.
For this the suchflex guys earn Gridcoins, which they can sell directly on the market and convert to money. But a user could cut out the middleman altogether and just mine Gridcoins themselves (or another cryptocurrency, but for that you need ASICs).
Exactly that. People have been running BOINC projects (which make up a great deal of the projects listed on that site, such as Asteroids, SETI, Mind Modeling, etc.), and they are all volunteer efforts where people give their computing power away for free. I just don't see where they're coming from, offering me $30/month for something people have been doing for free for years...
We will be adding an FAQ section, a power costs calculator, a security section, and a TOS section to our website soon. Lots of this information is currently communicated to Beta Users through emails, but we are working on consolidating and improving information flow. Thank you for your feedback.
The disk space seems especially weird. Others have mentioned this, but even services like Google Drive or Dropbox can give you storage at ~$8/TB, so who's on the other end of this transaction, willing to pay premium prices for low-end storage?
Might make sense if you want 100 or 1000 GPUs and don't want to run them 24/7. Cloud computing prices in general tend to be pretty high if you compare them to just renting or purchasing hardware.
You can order a backup dedicated server for 155 USD/month that contains 5 x 6 TB SAS drives.
It seems the price here is about 6-7 USD per TB,
so I should get 5 x 6 x 6 USD = 180 USD per month for it.
Another concern is that the price for storage isn't linear: multiple small packages are worth more than a single big one.
So I could split my 1 TB into 2 x 500 GB and get 10 USD instead of 8 USD.
Combining that with the server above, I should be able to get 10 USD per TB, so 5 x 6 x 10 = 300 USD per month - 155 USD for the server = 145 USD profit monthly?
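A rough sketch of that arithmetic, taking the comment's own numbers at face value (the 10 USD/TB payout is its assumption, and bandwidth, power, and payout reliability are ignored):

    # All figures come from the comment above; the 10 USD/TB payout is its assumption.
    drives, tb_per_drive = 5, 6
    server_cost_per_month = 155          # USD/month for the dedicated server
    payout_per_tb = 10                   # USD/month, assuming 2 x 500 GB packages per TB

    revenue = drives * tb_per_drive * payout_per_tb    # 300 USD/month
    print(revenue - server_cost_per_month)             # 145 USD/month, before other costs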
What I always wonder about these kinds of projects is why you never find any specs for the software they use. Not only is it closed source (which I expect), but they don't give any information about how big it is or what you would need to run it.
You have no control over what is executed on your PC (possibly as root?). In my eyes that completely destroys the integrity and trustworthiness of any computer it is installed on.
I also wonder how they calculate the money you get just for having certain hardware. It says a 6-core 3.2 GHz CPU pays out $28/month, but what if I have an average of 70% CPU usage the whole time (unrealistic, but still, what if)? Do they still pay me $28 even though they only get 30% of what they could get, do they reduce the payment, or, even worse, does the software just take the whole CPU for itself and make the rest of the computer unusable?
I've seen some similar projects (I don't remember the names), but none of them actually answered even one of these questions.
I don't think they pay you just based on hardware specs. It's a factor of how long you allow the program to run and how much of the hardware it actually gets to use. Their estimates are probably some middle ground of, say, 50% of the day and 70% usage, with the $28+ meaning you could get more for longer or heavier usage. It's unlikely that the average working person, with only a couple of hours in the evening with their machine turned on (while also actively using it), would earn anywhere near that amount in the limited time available. Especially gamers: they have significantly better hardware than the average home PC but likely use it much more heavily themselves. I doubt my PC with a GTX 970 would earn close to the $32 it mentions, as I only use it for a couple of hours an evening, and in that time the card is working really hard playing modern games at 4K resolution.
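To make the pro-rating point concrete, here is a purely hypothetical sketch; suchflex has not published its payout formula, and the 50%/70% fractions are just the guesses from the comment above:

    # Purely hypothetical pro-rating -- suchflex's real payout formula is not public.
    advertised_monthly_usd = 28.0   # advertised payout for a 6-core 3.2 GHz CPU
    uptime_fraction = 0.5           # machine available to the service half the day (guess)
    utilization = 0.7               # share of the hardware the service actually gets (guess)

    print(advertised_monthly_usd * uptime_fraction * utilization)   # ~9.8 USD/month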
This consists of two different parts. The first is just mining cryptocurrencies, but for that you need ASICs in order to be competitive.
The second part is based on Gridcoin, a cryptocurrency that rewards BOINC computations in a decentralized way. So you run all kinds of BOINC projects like SETI@home, Rosetta@home (folding proteins), etc., and the coins newly generated by inflation get distributed according to the research done by each user. Again, the important part is that there is no central authority handing out the coins: each client queries the BOINC projects individually to find out how much research each user has done, and the clients then come to a consensus about this, which is stored in the blockchain and becomes the basis for the reward each client gets the next time it stakes.
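The proportional-distribution idea can be sketched roughly like this; it is a simplification for illustration only, not Gridcoin's actual reward formula or consensus mechanism, and the names and numbers are made up:

    # Simplified sketch of proportional reward distribution -- not Gridcoin's actual formula
    # or consensus mechanism; names and numbers are made up for illustration.
    rac = {"alice": 120000.0, "bob": 30000.0, "carol": 50000.0}   # hypothetical BOINC recent average credit
    newly_minted = 1000.0                                          # coins created by inflation this period

    total = sum(rac.values())
    rewards = {user: newly_minted * credit / total for user, credit in rac.items()}
    print(rewards)   # each user's share is proportional to the research they contributed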
Suchflex is basically just a platform built on top of Gridcoin for these "earn money by doing research" projects on the one hand, and simple cryptocurrency mining on the other.
If that's the case, their estimated profit is completely bogus. I've been running BOINC pretty much 24/7 on my 2-GPU, 4-core gaming rig for the past 2 months and made about $2.50 so far.
Here's my !stats from the #gridcoin IRC channel:
<fediverse> your next staked block will get you: 17 GRC (research owed)
<fediverse> per day, average: 5.7 GRC
<fediverse> (BTC (per day): 0.00005062 / $ (per day): $0.03 / $ (per 30 days): $0.88 / RAC (estimated): 136287.5 / magnitude (estimated): 28.60)
There's NO way that you're going to make $40/month with a 980 crunching BOINC projects, at least not with Gridcoin. Gridcoin is $0.50 - $0.75, depending on where you look, and even the best rigs won't make more than 100 a day at best.
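For scale, here is the straight conversion of the bot's own per-day figures into monthly terms (nothing beyond the numbers quoted above; exchange rates obviously fluctuate):

    # Converting the IRC bot's own per-day figures above into monthly terms.
    grc_per_day = 5.7
    usd_per_day = 0.03              # the bot's USD estimate at the time
    print(grc_per_day * 30)         # ~171 GRC/month
    print(usd_per_day * 30)         # ~0.90 USD/month, in line with the bot's $0.88 figure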
It looks like (although it's not obvious) you get paid for running this?
If so, I'm pretty certain that it won't be economical due to the power usage. Great for teenagers with gaming machines that their parents are paying the electricity bill on, and that's about it.
I guess you wouldn't want to perform any mission-critical computation on this type of system unless you can verify that the results you obtain from it are accurate, as I can imagine it's fairly easy for people to tamper with your data. But I imagine there are a number of types of tasks it would be really good for.
I'm curious how the technique presented in "Reusable Garbled Circuits and Succinct Functional Encryption" (https://eprint.iacr.org/2012/733.pdf) could potentially be utilised in such a system. I'm assuming it currently isn't possible due to computation overhead?
Distributed computing is most useful for stuff that you can, well, distribute. A lot of BOINC projects do this by dividing work up into workunits. They send out a packet of data to crunch, the clients crunch it, and return the results.
Many of these workunits are optimization problems without a clear solution that rely on algorithms using randomness; protein folding and bitcoin mining are examples of this. The nice thing about those is that you can send them out to as many clients as you want and just pick the best answer.
I think the general use case for this is the same as for most crowdsourced computing projects - problems that are hard to solve but whose solutions are easy to verify.
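A toy illustration of that "hard to solve, easy to verify" shape, using a proof-of-work-style search; this is purely illustrative and not how suchflex or BOINC actually dispatch or check work:

    import hashlib

    def solve(data, difficulty=4):
        # Expensive part: search for a nonce whose SHA-256 hash starts with `difficulty` zero hex digits.
        nonce = 0
        while not hashlib.sha256(f"{data}{nonce}".encode()).hexdigest().startswith("0" * difficulty):
            nonce += 1
        return nonce

    def verify(data, nonce, difficulty=4):
        # Cheap part: a single hash is enough to check an untrusted client's answer.
        return hashlib.sha256(f"{data}{nonce}".encode()).hexdigest().startswith("0" * difficulty)

    nonce = solve("workunit-42")           # may take tens of thousands of hashes
    print(verify("workunit-42", nonce))    # True, checked with one hash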
$480/year for a $400 video card? $240/year for 3TB of disk space?
No discussion of bandwidth.
No discussion of security.
No discussion of liability.
PS: the site is hosted on Heroku
PS2: you can find an extra page in the Google cache: http://webcache.googleusercontent.com/search?q=cache:PpTl0nt...
Pure PoS cryptocurrencies cannot be mined by suchflex.
I don't get how they arrived at such a high price for that.