
Common problems with large file uploads

72 points by ananddass almost 13 years ago

13 comments

zimbatm almost 13 years ago
Given the title I was expecting the article to provide a solution.

From personal experience, the bigger the file, the more likely you are to experience a connection cut in the middle of the upload. That is why the most important thing is to support resumable uploads.

At the moment there is no clear consensus on how to handle that. Amazon S3 has one protocol[1]; Google uses two revisions of a different protocol, one on YouTube[2] and another on Google Cloud Storage[3]. Both work by first creating a session that you refer to when uploading the chunks. There is also the Nginx upload module[4], which delegates the session ID to the client for some reason.

And there is no browser client available, to my knowledge.

That's all I know, folks.

[1]: http://docs.amazonwebservices.com/AmazonS3/latest/API/mpUploadInitiate.html
[2]: https://developers.google.com/youtube/2.0/developers_guide_protocol_resumable_uploads
[3]: https://developers.google.com/storage/docs/developer-guide
[4]: http://www.grid.net.ru/nginx/resumable_uploads.en.html
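The session-then-chunks pattern described above can be sketched in a few lines. This is a hypothetical client helper (the field names and chunk size are illustrative, not any real provider's API): each chunk is labelled with a Content-Range header so that, after a dropped connection, the client can resume from the last byte the server acknowledged instead of restarting.

```python
import io

def iter_chunks(stream, total_size, chunk_size=5 * 1024 * 1024):
    """Yield (headers, chunk) pairs for a session-based resumable upload.

    Each chunk carries a Content-Range header; after a dropped connection
    the client asks the server for the last byte it stored and resumes
    from the next chunk instead of starting over.
    """
    offset = 0
    while offset < total_size:
        chunk = stream.read(min(chunk_size, total_size - offset))
        if not chunk:
            break
        end = offset + len(chunk) - 1
        headers = {"Content-Range": f"bytes {offset}-{end}/{total_size}"}
        yield headers, chunk
        offset = end + 1

# Example: a 10-byte payload split into 4-byte chunks.
data = b"0123456789"
parts = list(iter_chunks(io.BytesIO(data), len(data), chunk_size=4))
```

A real client would PUT each chunk to the session URL and, on failure, query the server for its current offset before continuing.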
ars almost 13 years ago
For the HTTP/2.0 discussion that was here earlier: a way to continue an interrupted file upload.

Because POST variables are sent in order, if you put the file first and the other variables after, the server never sees them if the upload was interrupted. So when I code a form I always put the hidden fields first, so at least I can give a useful error message (since I know what the user was trying to do).

It would be better to decouple them and upload the files and the rest of the variables separately.
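The ordering trick above is easy to see by encoding a multipart body by hand. This sketch (all field names are hypothetical) emits the small metadata parts before the file part, which is the order in which a server parsing the stream will see them on the wire:

```python
def encode_multipart(fields, file_field, file_name, file_bytes,
                     boundary="----exampleboundary"):
    """Encode a multipart/form-data body, metadata fields first.

    Parts are emitted in list order, so putting the small metadata
    fields before the file means a server parsing the incoming stream
    still sees them even if the file part is cut off mid-upload.
    """
    lines = []
    for name, value in fields:  # metadata parts go first
        lines += [f"--{boundary}",
                  f'Content-Disposition: form-data; name="{name}"',
                  "", value]
    lines += [f"--{boundary}",
              f'Content-Disposition: form-data; name="{file_field}"; '
              f'filename="{file_name}"',
              "Content-Type: application/octet-stream", ""]
    body = "\r\n".join(lines).encode() + b"\r\n" + file_bytes
    return body + f"\r\n--{boundary}--\r\n".encode()

body = encode_multipart([("user_id", "42")], "payload", "big.bin", b"x" * 8)
```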
tagx almost 13 years ago
I'd really like to be able to use Dropbox as a magic upload handler for any file on my local HD, not just those in my Dropbox folder. They already handle the logic of getting all my files into the cloud. Why can't I point a website at my Dropbox and say: here, this handles the file upload?
ChrisNorstrom almost 13 years ago
8GB+ files? I found a way, but you have to use a Java FTP applet. I tested these two: http://jupload.sourceforge.net/ and http://www.jfileupload.com/

Dragged and dropped an 8GB+ file and left it running for 5 hours. Worked perfectly. No timeouts, no errors, and I'm on a shared hosting account at 1and1.

My problem with them is that it wasn't possible to hide the FTP username and password; they were always in JavaScript files. I whined, I complained, I bitched, and there was nothing they could do about it. :( So you basically had to password-protect the whole directory with .htaccess and be very careful with whom you shared the credentials.

If you don't want people to download and install software, just stick with Java FTP applets.
rajbot almost 13 years ago
I've been dealing with browser-based large file uploads, which means dealing with lots of browser-specific issues.

Fortunately, things are getting better, especially for the WebKit-based browsers. Firefox still has some issues, and I check https://bugzilla.mozilla.org/show_bug.cgi?id=678648 pretty regularly. Just today this bug, which was filed in 2003, changed from Status = NEW to Status = ASSIGNED.

Today is a good day.
t4nkd almost 13 years ago
I've experienced this issue before when building a publisher backend for a D2D PC game business. It seems to be basically impossible without a Java applet of some kind, and even then it's wonky at best and just fails at worst. The real fix seemed to be simply providing an FTP connection and letting people connect through the native client of their choosing.

That really seems to be the key to this problem: develop a simple native app capable of FTP uploads that makes it easy for users to deliver files to your app within the context of their use. Most browsers can open native applications via a custom protocol handler, so you could enrich the process by having the native app be a part of (or try to blend seamlessly with) major browsers.
jasomill almost 13 years ago
Since plenty of file transfer protocols, clients, and servers support resumable transfers (FTP, SFTP, rsync, proprietary browser-based tools, etc., or even basic HTTP if you arrange for the file to be pulled rather than pushed and your "client's server" has byte-range support), perhaps this should be titled "Why you shouldn't use a single HTTP POST request from a browser to upload a large file". The general reason seems to be that this is not a use case the feature was designed for or tested against.
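The pull-based alternative mentioned above rests on ordinary HTTP byte ranges. A minimal sketch of the resume logic (file names are placeholders): check how much of the file is already on disk and request only the remainder with an open-ended `Range` header.

```python
import os
import tempfile

def resume_range_header(partial_path):
    """Return a Range header asking only for the bytes we don't have yet.

    If `partial_path` already holds N bytes from an earlier, interrupted
    pull, 'Range: bytes=N-' asks a byte-range-capable HTTP server for
    just the remainder; an empty dict means start from byte zero.
    """
    have = os.path.getsize(partial_path) if os.path.exists(partial_path) else 0
    return {"Range": f"bytes={have}-"} if have else {}

# Example: 100 bytes already downloaded -> ask for byte 100 onward.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\0" * 100)
header = resume_range_header(f.name)
os.unlink(f.name)
```

A client would append a 206 Partial Content body to the partial file; a plain 200 response means the server ignored the range and the transfer must restart from scratch.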
abemassry almost 13 years ago
I ran into this problem with https://truefriender.com/. The solution I used was nginx instead of Apache: nginx streams the file to disk, and then I can handle it with PHP. I still have the 2GB problem, but I've tested Perl and it can go past it; now I just have to implement that.
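A sketch of the kind of nginx setup this implies (paths, limits, and the socket are placeholders, not the commenter's actual config): raise the default body-size limit and let nginx spill large request bodies to a temp file on disk before handing the request to PHP-FPM.

```nginx
# Hypothetical server block: nginx buffers the upload to disk,
# so PHP only sees the request once the body has fully arrived.
server {
    listen 80;
    client_max_body_size 4g;        # raise the default 1m POST limit
    client_body_buffer_size 128k;   # bodies larger than this spill to disk
    client_body_temp_path /var/nginx/uploads;

    location /upload {
        fastcgi_pass unix:/var/run/php-fpm.sock;
        include fastcgi_params;
    }
}
```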
kookster almost 13 years ago
It may not work for ginormous files, but I've used a Flash SWF object to upload to S3, released as part of a Rails gem. The latest version is here: https://github.com/nathancolgate/s3-swf-upload-plugin
severin almost 13 years ago
Hi everyone. We developed a solution for just that! Please feel free to look at http://forgetbox.com and give us feedback.

Our users send 130GB files, directly from Gmail...
zampano almost 13 years ago
Excuse me if this is a stupid question, but why would timeout issues on large files affect something like Heroku more often than other types of hosting services?
graup almost 13 years ago
I use node.js with this plugin: https://github.com/felixge/node-formidable/

Works like a charm!
frytaz almost 13 years ago
Split them into rar/zip parts with checksums on the client side, then upload...
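The split-and-checksum idea can be sketched without the rar/zip step, using plain byte slices and SHA-256 (names are illustrative): the server re-hashes each part on arrival and re-requests only the parts that failed, instead of failing the whole multi-gigabyte transfer.

```python
import hashlib

def split_with_checksums(data, part_size):
    """Split `data` into parts and pair each with its SHA-256 hex digest.

    After uploading the parts one by one, the receiver verifies each
    digest independently and asks only for the corrupted or missing
    parts to be re-sent.
    """
    parts = []
    for start in range(0, len(data), part_size):
        chunk = data[start:start + part_size]
        parts.append((hashlib.sha256(chunk).hexdigest(), chunk))
    return parts

# Example: 600 bytes split into 128-byte parts -> 5 parts.
payload = b"abcdef" * 100
manifest = split_with_checksums(payload, 128)
```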