Introduction
File uploads are an integral part of almost any web application. Files of various types and sizes are uploaded to a server, and large files can take substantial time to transfer. A welcome addition, then, is to offer users resumable uploads: a user who expects an upload to take a while can pause it and resume it later, operating at their own discretion rather than being limited by the application's inflexibility.
HTML5 File API
The HTML5 File API has brought considerable changes to the way files are processed and sent to the server. It gives the client a lot of power to process selected files and the leeway to decide which format it needs. The API will not solve our problem by itself, but it gives us the necessary tools to do so. Before we delve into the problem at hand, a brief introduction to the API is in order.
The API provides a couple of interfaces to access files from the local file system, notably File, FileReader, and Blob.
For our sample problem of uploading a single file and making the upload resumable, the above-mentioned interfaces are sufficient.
File support in browser
We could check for the support:
if(window.File && window.Blob && window.FileReader){
    // good to go. File API is supported
}
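Since window and the File API globals only exist in the browser, the check can also be wrapped in a small helper that takes the global object as a parameter, which makes it easy to exercise outside a page. This is a sketch, not part of the original article, and the helper name supportsFileAPI is made up:

```javascript
// Hypothetical helper: receives the global object so the check can be
// unit-tested outside the browser. In a page you would call
// supportsFileAPI(window).
function supportsFileAPI(global) {
  return Boolean(global.File && global.Blob && global.FileReader);
}

// Simulated globals for illustration:
console.log(supportsFileAPI({ File: function(){}, Blob: function(){}, FileReader: function(){} })); // true
console.log(supportsFileAPI({}));                                                                   // false
```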
Now we need to select a file, read it asynchronously, and then upload it to a server. The file object we get through the File interface is a reference to the actual file on the file system. The FileReader object reads the contents of that file, and once it is done reading, the reader's onload event fires, giving us the contents to upload. The FileReader interface offers several methods for reading a file asynchronously: readAsText(), readAsArrayBuffer(), readAsBinaryString(), and readAsDataURL(). The last of these produces a data URL, a scheme that embeds a resource's content directly in the URL itself. For example, a background image is normally referenced externally:
.myDiv{
    background: url(url_to_external_image);
}
The same could be done, without the extra request, using a data URL:
.myDiv{
    background: url(data:image/gif;base64,nhlkjdsljfsfiruuuuuuRYEEDDHAODSALKDNWE987794574987598930293KHKANHLKWKL$$SF34);
}
The value provided inside url() is the data URL. This saves a considerable number of HTTP requests for external resources and hence improves performance. Once the read operation is complete, the onload event on the reader object fires, giving access to the data read.
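To make the format concrete, here is a small Node.js sketch (not part of the original article) that builds a data URL by hand and then decodes it again, mirroring the round trip that the browser's readAsDataURL() and our server-side code will perform:

```javascript
// Build a data URL by hand to illustrate the format readAsDataURL() produces.
const payload = Buffer.from("hello").toString("base64"); // "aGVsbG8="
const dataURL = "data:text/plain;base64," + payload;
console.log(dataURL); // data:text/plain;base64,aGVsbG8=

// The reverse operation, as the server will do later: split on "base64,"
// and decode the remainder.
const decoded = Buffer.from(dataURL.split("base64,")[1], "base64").toString();
console.log(decoded); // hello
```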
The read can be aborted at any moment by calling the reader's abort() method, which stops the ongoing read operation. Either readAsDataURL() or readAsBinaryString() would work for this article, but readAsDataURL() has the advantage that the resulting data URL can be used directly, e.g. as the src attribute for thumbnails of uploaded images, or for direct downloads. So we will stick with readAsDataURL(). Further, we will use the localStorage object to remember which file is currently being uploaded; this serves as context during the resume step (as will become evident as we move along). We will also use the HTML5 FormData API to send data to the server.
Our HTML and corresponding JavaScript could be set up as follows:
HTML:
<input type='file' name='myFile' id='uploadFile' />
<table>
    <tr>
        <td><button id='pause'>pause</button></td>
        <td><button id='resume'>resume</button></td>
    </tr>
</table>
JavaScript:
/** helper function to create an Ajax request; returns the XMLHttpRequest
 *  so the caller can keep a reference to it (needed later for abort()) */
function initiateXHR(object, method, url, mode, fileDataURL, fileName, headersObject){
    // create the request object
    object = new XMLHttpRequest();
    object.open(method, url, true);
    // append the data
    var formData = new FormData();
    formData.append('file', fileDataURL);
    formData.append('mode', mode);
    // add event listeners to the xhr object
    addListener(object);
    // add request headers, if any (must be set after open() and before send())
    for(var header in headersObject){
        object.setRequestHeader(header, headersObject[header]);
    }
    object.send(formData);
    /* note: reassigning the 'object' parameter does not update the caller's
     * variable, so we return the request and let the caller store it */
    return object;
}
/** helper function to add listeners to the Ajax request */
function addListener(object){
    object.onload = function(){
        if(object.readyState == 4 && object.status == 200){
            if(object.response === "FILE_UPLOAD_SUCCESSFUL" || object.response === "FILE_APPEND_SUCCESSFUL"){
                /** once the upload completes, make sure to remove
                 *  the localStorage entry
                 */
                localStorage.removeItem('currentFile');
            }
        }
    };
    object.onerror = function(){
        // handle xhr errors here
    };
}
var xhr = null, _xhr2 = null, _reader = null, _file = null; // _file keeps the selected File object for the resume step
/** file input change handler */
document.getElementById('uploadFile').addEventListener('change', function(e){
    // some file was selected. Let's get the file and read it.
    var file = this.files[0]; // reference to the actual file on the filesystem
    var fileSize = file.size;
    var fileName = file.name;
    _file = file; // keep the File object itself around for the resume step;
                  // File objects cannot be serialized into localStorage
    _reader = new FileReader();
    _reader.onload = function(event){
        var data = event.target.result; // this contains the read content
        // store the file's metadata in localStorage for future use
        localStorage.setItem('currentFile', JSON.stringify({
            'fileSize' : fileSize,
            'fileName' : fileName
        }));
        /** initiate an XHR request to upload to the server */
        xhr = initiateXHR(xhr, "POST", "uploadFileToServer.php", "upload", data, fileName, { 'X-FileName' : fileName });
    };
    _reader.readAsDataURL(file);
}, false);
The basic setup above reads the selected file and uploads it to the server. Note that this is not the usual way of doing file uploads: direct uploads are normally done via a form submit using the multipart/form-data encoding, whereas here we use Ajax to send the file's content as a base64 data URL in a form field.
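One practical consequence of this choice is worth noting: base64 encodes every 3 bytes of input as 4 output characters, so sending the file as a data URL inflates the payload by roughly a third compared to a raw multipart upload. A quick sketch of the arithmetic:

```javascript
// Illustration (not from the article): base64 inflates the payload by ~4/3.
const raw = Buffer.alloc(30, 0xff);     // 30 raw bytes
const encoded = raw.toString("base64"); // ceil(30 / 3) * 4 = 40 characters
console.log(raw.length, encoded.length); // 30 40
```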
Now we need to add the pause/resume functionality. The flow is roughly this: pause aborts the in-flight request; resume asks the server how many bytes it has received, slices off the part of the file that was already uploaded, and sends the rest. Let's begin with the pause part, which is easy.
/** pause button click event handler */
document.getElementById('pause').addEventListener('click', function(e){
    // abort the in-flight upload request
    xhr.abort();
});
Now comes the fun part. Implementing resume is harder because we need to know how many bytes have actually reached the server. There are several (unreliable) ways to find out: for example, we could attach progress event handlers to the xhr object to monitor the upload. But that value can be wrong, since some bytes may have been sent out yet never reached the server before the abort was called. The safest way to learn the number of bytes transferred is to ask the server itself. So we issue another XHR request asking how many bytes of this particular file have been uploaded, then slice the file and upload the remainder.
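The slicing step itself can be sketched with a plain Blob (available in browsers and, as a global, in Node.js 18+). The bytesUploaded value below is a made-up stand-in for what the server would report:

```javascript
// Resuming means uploading only the un-received tail of the blob.
const file = new Blob(["0123456789"]); // stands in for the selected File
const bytesUploaded = 4;               // hypothetical value reported by the server
const remainder = file.slice(bytesUploaded, file.size);
console.log(remainder.size); // 6
```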
/** resume button click event handler */
document.getElementById('resume').addEventListener('click', function(e){
    // retrieve the stored metadata for the current file
    var myFileObject = JSON.parse(localStorage.getItem('currentFile'));
    var fileSize = myFileObject.fileSize;
    var fileName = myFileObject.fileName;
    // ask the server how many bytes it has received; for a GET request the
    // parameter goes in the query string, not the request body
    _xhr2 = new XMLHttpRequest();
    _xhr2.open("GET", "getBytesFromServer.php?fileName=" + encodeURIComponent(fileName), true);
    _xhr2.onload = function(){
        if(_xhr2.readyState == 4 && _xhr2.status == 200){
            var response = JSON.parse(_xhr2.response);
            /**
             * Let's assume that our response from the server is an object:
             * {
             *     "fileName" : "myAwesomeFile",
             *     "bytesUploaded" : 2440
             * }
             */
            var bytesSent = response.bytesUploaded;
            // slice off the part already uploaded -- the crux of this article
            var fileBlob = _file.slice(bytesSent, fileSize);
            _reader = new FileReader();
            _reader.onload = function(){
                var remainingFile = _reader.result;
                /** upload the remaining part of the file to the server */
                xhr = initiateXHR(xhr, "POST", "uploadFileToServer.php", "resume", remainingFile, fileName, { 'X-FileName' : fileName });
            };
            _reader.readAsDataURL(fileBlob);
        }
    };
    _xhr2.send();
}, false);
Server Side
Sample implementation for uploadFileToServer.php:
<?php
$response = "";
if(isset($_POST['file'])){
    $allHeaders = apache_request_headers();
    // never trust a client-supplied file name: basename() strips any path components
    $fileName = basename($allHeaders['X-FileName']);
    $fileContent = $_POST['file'];
    // fileContent contains the data URL, which has the format:
    // data:<MIMETYPE>;base64,<base64_encoded_file_content>
    $out = explode("base64,", $fileContent);
    $fileData = base64_decode($out[1], TRUE);
    if(isset($_POST['mode'])){
        $mode = $_POST['mode'];
        if($mode == "resume"){
            // resume mode: append to the file instead of overwriting it
            if(file_exists($fileName)){
                file_put_contents($fileName, $fileData, FILE_APPEND);
                $response = "FILE_APPEND_SUCCESSFUL";
            }else{
                $response = "file not found !";
            }
        }
        else if($mode == "upload"){
            // normal, uninterrupted upload mode
            file_put_contents($fileName, $fileData);
            $response = "FILE_UPLOAD_SUCCESSFUL";
        }
    }
}
echo $response;
?>
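For readers not on PHP, the decode step this script performs (split on "base64,", then base64-decode) could be sketched like this in Node.js; decodeDataURL is a hypothetical helper, not part of the article's code:

```javascript
// Server-side decoding of an uploaded data URL, mirroring the PHP above.
function decodeDataURL(fileContent) {
  // fileContent looks like: data:<MIMETYPE>;base64,<base64_encoded_file_content>
  const parts = fileContent.split("base64,");
  if (parts.length !== 2) throw new Error("not a base64 data URL");
  return Buffer.from(parts[1], "base64");
}

const bytes = decodeDataURL("data:text/plain;base64,aGVsbG8=");
console.log(bytes.toString()); // hello
```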
Sample implementation for getBytesFromServer.php:
<?php
if(isset($_GET['fileName'])){
    // the client sends the file name as a query-string parameter
    $fileName = basename($_GET['fileName']);
    $fileSize = 0;
    if(file_exists($fileName)){
        // filesize() is a built-in function which returns the size of the file in bytes
        $fileSize = filesize($fileName);
    }
    $response = array(
        'fileName'      => $fileName,
        'bytesUploaded' => $fileSize
    );
    echo json_encode($response);
}
?>
Conclusion
That's it, we're done. Note that the XHR handling the remaining part of the file appends its contents to the partially uploaded file on the server rather than creating a new one. Further, I've used the localStorage object of the WebStorage API to store the current file's metadata; this is fine as long as the stored data is small. Refer to the WebStorage API documentation for further details. For storing really large data, it's better to switch to IndexedDB, which provides full-fledged database support for large amounts of data.