uploaderwin · 5 years ago
Yeah, this is asking for trouble. We only had a small demo on our homepage where users could upload media files that were deleted after 24 hours, and still some people managed to abuse it and nearly got our site killed: the domain was blacklisted by Google with the big red screen of death.

I don't want to spam any links here but if you are interested please do look at my last post about the dangers of doing this and lessons I learned from my mistake.

Please do not keep the files for 10 days. Even 24 hours is a deal-breaker. From what I've learned, anything more than 30 minutes can get you into trouble.

dheera · 5 years ago
I once had a location-based file sharing service that also got blacklisted by Google with no recourse. I hate Google trying to police the internet with no timely appeals process.

I wonder though if you could simply block the Google crawler and bypass it. Or use JavaScript to auto-POST something before the file gets sent for download. The Google crawler doesn't issue POST requests as far as I know.
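For the crawler-blocking half of that idea, the simplest variant would be a robots.txt rule (a sketch; note that Googlebot honors this for crawling, but Safe Browsing flags can also come from user reports, so it's not a complete bypass):

```
User-agent: Googlebot
Disallow: /
```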

gowld · 5 years ago
By "police", do you mean "warn people about dangers" ?
mwambua · 5 years ago
Can you remedy this problem by making it so that anyone can delete the file? That way anyone can take it down if they have a problem with it? It's supposed to be ephemeral storage anyway... people might not mind having files disappear.
dspillett · 5 years ago
Two problems there:

1. Many people are more likely to go to a lot of effort to complain loudly and widely rather than hit a simple "delete this" link.

2. Such a feature is basically a self-DoS. If someone takes a disliking to the app, or to a user of it, they can script up a "delete everything" and fire it off.

Naac · 5 years ago
Similar sites like http://ix.io/ have been up for many years with no issues. I assume spam can be a problem, but these sites must have figured something out.
Natsu · 5 years ago
I suspect spam is on the nicer side of things people might upload... :/
the_arun · 5 years ago
Uploading files without an auth layer is asking for trouble IMHO. Changes without an audit trail will encourage wrongdoers. But I get the idea: this is an example of uploading a file in a simple way using curl or other tools.
ju-st · 5 years ago
The website is already blocked because of "Malicious Sources/Malnets" in the firewall of the company I work at.
raverbashing · 5 years ago
"This is why we can't have nice things..." sigh
southerntofu · 5 years ago
> Uploading files without auth layer - is asking for trouble IMHO.

If you make it super user-friendly and advertise it as the next Megaupload, sure. But if you keep a small audience of good-faith users it's not asking for problems.

If you can teach me to make my file upload as hacker-friendly as this service while implementing auth, i'd be glad. Here the entire point is you don't need further configuration/credentials for example to upload log/config from a server.

codegeek · 5 years ago
But it is not a matter of "if". It is a matter of "when". Bad actor(s) will find out at some point if it is even remotely popular and then it's game over.
the_arun · 5 years ago
1. How are you blocking someone from uploading millions of files programmatically?

2. How do you know someone is part of a small audience of good-faith users?

3. What if a file has a virus and corrupts all the files on your end?

If you don't need auth, there are a few measures you can take on your end:

1. TTL - Make the files temporary; they get erased after x hours, e.g. x=1

2. Throttle - Limit the number of uploads from a given IP/machine, or cap uploads per second

3. Maybe add a malware scanner?
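Measure 1 (the TTL) can be a one-line cron job. A sketch, demonstrated here against a temporary directory standing in for a hypothetical upload directory, and assuming GNU coreutils/findutils:

```shell
#!/usr/bin/env bash
# Demo dir standing in for the real upload directory
UPLOAD_DIR=$(mktemp -d)
touch "$UPLOAD_DIR/fresh.bin"
touch -d '2 hours ago' "$UPLOAD_DIR/stale.bin"  # simulate an old upload (GNU touch)

# The actual cron-able line: delete uploads older than 60 minutes (x=1)
find "$UPLOAD_DIR" -type f -mmin +60 -delete
```

Put the `find` line in a crontab entry pointing at the real upload directory and the TTL takes care of itself.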

ericbarrett · 5 years ago
This is a one-way road, though. The minute a bad actor finds out about your ungated image-hosting service, it's over, and you'll have a hell of a mess to clean up. If you're lucky it'll just be somebody trying to sell penis pills. If you're not, you'll have federal investigators knocking.
derefr · 5 years ago
This is incompatible with using curl as your client, but one “hacker-friendly way to do auth” is to use Github’s public SSH keys API.

You can stand up an (SCP/SFTP-subprotocol-only) SSH server, and tell the user to log in with their GitHub username + GitHub SSH key. Then configure your SSH server to call[1] a check on GitHub’s API to map the provided username to the GitHub user’s set of public SSH keys. From there, the server treats that list exactly as if it were the user’s ~/.ssh/authorized_keys file.

[1] As it happens, I wrote an OpenSSHD plugin for exactly this: https://github.com/tsutsu/github-auth3

Following that, you can configure PAM to continue the auth process however you like, policy-wise: let any GitHub user in; only let GitHub users in from a specific GitHub org; keep an LDAP directory of GitHub usernames such that you can attach metadata to them like “is banned” or “has used up their upload credits for the day” or “is on plan tier X”; etc.

Then, to actually handle the uploads, you can 1. set up automatic local user instantiation per remote user; 2. populate /etc/skel with just the right set of limited files to allow the user to upload into one “spool” directory; 3. have an inotify-like daemon that watches for files to be closed in that directory and handles them from there (e.g. uploading them to an S3 bucket, etc.)
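A sketch of the key-lookup step, using OpenSSH's AuthorizedKeysCommand hook and GitHub's public per-user keys endpoint (the script path and AuthorizedKeysCommandUser are placeholders):

```
# /etc/ssh/sshd_config
AuthorizedKeysCommand /usr/local/bin/github-keys %u
AuthorizedKeysCommandUser nobody
```

```shell
#!/bin/sh
# /usr/local/bin/github-keys
# Print the named GitHub user's public keys, one per line.
# GitHub serves them at https://github.com/<user>.keys, already
# in the authorized_keys format sshd expects.
exec curl -fsS "https://github.com/$1.keys"
```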

—————

Or, alternately, you can avoid building this on top of OpenSSH, since you’re really fighting against the current by trying to virtualize everything, when OpenSSH expects to be relying on, and providing access to, a traditional POSIX environment.

Instead, you can have your own SSH server daemon that provides access to a pretend environment inside the SSH-server process, and handles SCP/SFTP upload streams through a custom in-process handler, the same way a web framework handles PUT requests.

I don’t know how common this is in other runtimes, but Erlang has an SSH server framework that you can use to implement exactly this. (As it happens, I’ve also written a high-level service that uses this SSH server framework to implement an alternative carrier for Erlang remote shell, where you can just SSH into the Erlang node to get a shell on it: https://github.com/tsutsu/exuvia. This app is also, AFAIK, the only public/FOSS demonstration of how to use Erlang’s SSH server library—which is kind of sad. People should play with things like this more! Make MUDs and such!)

opan · 5 years ago
chunk.io is a similar thing, but with auth.

https://chunk.io/

pokoleo · 5 years ago
Try using basic auth:

curl -u username:password https://

banana_giraffe · 5 years ago
Agreed.

I use a little python script that creates a curl command to upload to S3 for cases where I don't have the AWS toolchain on a remote box.

Not as easy as a single command, but at least I'm less likely to be sending files off to some random site for everyone to see.

naturalpb · 5 years ago
One can upload a file to their Dropbox via a cURL post, provided they have created an app and have an access token, which just takes a few minutes to set up.

    curl -X POST https://content.dropboxapi.com/2/files/upload \
      --header "Authorization: Bearer ACCESSTOKEN" \
      --header "Dropbox-API-Arg: {\"path\": \"/DROPBOXFILEPATH/DROPBOXFILENAME\"}" \
      --header "Content-Type: application/octet-stream" \
      --data-binary @/LOCALFILEPATH/LOCALFILENAME

umvi · 5 years ago
These are always nice little sites to have around, but they can't really grow much in popularity before users start abusing them to distribute illegal things at which point the site has to start doing more and more content moderation or be shut down.
geek_at · 5 years ago
Can confirm. I had a public demo of my open source image hosting solution [1] (where you can resize images and videos by just entering a different URL) up for years without problems, until idiots started uploading CSAM (child sexual abuse material).

Luckily I found out before law enforcement did [2], so I proactively talked to my federal bureau for months, generating Excel sheets of IPs, access times, devices, and countries. I didn't see many of the images myself; I basically looked at one upload per IP (about three in total) and forwarded all uploads from those IPs to the police. But man... what the hell is wrong with people. A four-digit number of CSAM uploads.

[1] https://github.com/HaschekSolutions/pictshare [2] https://blog.haschek.at/2018/fight-child-pornography-with-ra...

ivan888 · 5 years ago
The process of properly reporting and working with the authorities seems daunting. (Anecdotally,) it sounds too easy to implicate yourself in a technical violation of the law by (even unknowingly) hosting this content, or by accidentally transferring it to one of your personal devices. Worse still, following the advice of your local police to print out the images would be completely illegal! On the other hand, if the process were too lenient on reporters, hosting a file sharing service that "gets abused" with illegal content might become the ultimate scapegoat for illegal content users/creators/brokers.

Nice job going through the reporting process and I'm glad you blogged about it to share with others

jtokoph · 5 years ago
Came to say this. OP: If this is your site it will be used for piracy, underage porn and phishing within hours.
philshem · 5 years ago
Which is what makes the p2p file transfer websites so special: https://file.pizza/ and https://webwormhole.io/

(*based on https://github.com/magic-wormhole/magic-wormhole)

bityard · 5 years ago
Not OP, but this site has been around for months at least, maybe a year. It would be sad if it had to be taken down right after landing on the front page of HN.
faeyanpiraat · 5 years ago
It might be a honeypot for exactly that purpose as well.
tobylane · 5 years ago
Agreed. I was using it* for TravisCI, and having moved to Github Actions I'm glad they have uploads stored per run.

*this one and another few before it.

anderspitman · 5 years ago
Sad but true. One reason why easy self hosting is important for these types of projects.

Deleted Comment

apayan · 5 years ago
If you like the convenience of transferring a file temporarily into the cloud to download it elsewhere (great for getting stuff out of a Rancher environment), check out patchbay[0]. It uses what it calls 'HTTP channels': if you start a POST request to a patchbay URL, it blocks until a corresponding GET is made to the same endpoint, which then receives the data from your POST. The operation works in reverse as well, with the GET blocking until the POST begins.

[0] https://patchbay.pub/
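The usage would look something like this (the channel name is a made-up example). The blocking rendezvous has the same semantics as a local named pipe, which the runnable part below demonstrates:

```shell
#!/usr/bin/env bash
# Sender blocks until a receiver connects (hypothetical channel name):
#   curl -X POST --data-binary @notes.txt https://patchbay.pub/my-secret-channel
# Receiver unblocks the sender and gets the pending POST body:
#   curl https://patchbay.pub/my-secret-channel > notes.txt

# Same rendezvous semantics, locally, with a FIFO:
dir=$(mktemp -d)
mkfifo "$dir/chan"
echo "hello" > "$dir/chan" &      # writer blocks until a reader opens the pipe
received=$(cat "$dir/chan")       # reader unblocks the writer and gets the data
rm -r "$dir"
```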

rogual · 5 years ago
This is brilliant, thank you!
exikyut · 5 years ago
Huh, WebRTC for the Web 2.0 2010 era. Nice.
40four · 5 years ago
I like the simplicity of it. One PHP file, throw it on a server with Apache and rock and roll.

Other comments are right to point out that this site is setting itself up to be abused. My feeling is that this is intended to be a demo. I doubt the creator is trying to provide a real service here. And they might be in for a rude awakening if it gains traction.

But, it looks like they intend this to be open source. Anyone can clone the repo and run this on their own server! Unfortunately, the repo does not have a license file, which makes me a little uneasy.

Edit: I didn’t say that very well. With no license file, technically we cannot actually use this code since it defaults to ‘All rights reserved’. I think the author might not realize that though. It seems they intend it to be ‘open’ based on line 334.

Also, it is not particularly good PHP code, a little rough around the edges. But hey, it's a cool demonstration of a very straightforward way to upload & share files! Could be a good starting point to develop further.

s_dev · 5 years ago
>the repo does not have a license file, which makes me a little uneasy.

Surely the author is bearing the liability of getting burned by not specifying a licence.

detaro · 5 years ago
How do they "get burned" by that? Not having a license means you don't get to use the code, and you're violating their copyright if you do (except possibly for what GitHub's ToS permits: viewing the code on GitHub).
gugagore · 5 years ago
No. If you find some code online, or on a thumb drive on the sidewalk, and it is unlicensed, it's incorrect to assume that it's equivalent to being permissively licensed for you to do whatever you want with it.
cyberbanjo · 5 years ago
Why? I thought this type of situation would default to non-permissive licensing in lieu of an explicitly permissive one.
40four · 5 years ago
Yeah, I edited my comment as I think I misspoke. Technically, GitHub as a platform will allow us to fork or clone this code. But with no license file, from a legal point of view, we cannot use it, or do whatever else an open source license would allow.
frabert · 5 years ago
I don't think you bear much liability by not specifying a license, since it defaults to "all rights reserved", thus preventing anyone else from using the code.
_joel · 5 years ago
Alternatively https://transfer.sh/
jonathantf2 · 5 years ago
Link is dead for me.
_joel · 5 years ago
Which one? https://downforeveryoneorjustme.com/transfer.sh says it's up, and it's fine for me. If you're using curl, don't include the last character that the URL has in the response; that's effectively a carriage return/end-of-data marker.
smartbit · 5 years ago
It’s back! I thought it stopped Nov 30, 2018? +1
_joel · 5 years ago
Indeed, they had a bit of a hiatus but the service was brought back.
southerntofu · 5 years ago
Another alternative: the famous nullpointer https://github.com/mia-0/0x0

A small script i use very regularly:

    #!/usr/bin/env bash
    # Quote "$1" so filenames with spaces don't break the check
    if [ ! -f "$1" ]; then echo "MISSING: $1"; exit 1; fi
    torify curl -F "file=@$1" https://YOURSERVER || echo "UPLOAD FAILED (code: $?)"