
What JavaScript API should I look for if I want to read a File (e.g. from an <input type="file" ...> or a drop area) in chunks and upload these chunks sequentially - but without loading the entire file (e.g. multiple gigabytes) into memory? I want only the current chunk (e.g. 1 MB) to reside in memory.

I see example codes use FileReader.readAsArrayBuffer and Blob.slice. But doesn't that involve loading the entire file into memory (readAsArrayBuffer) and then accessing it by increasing offset and fixed length sequentially?
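Roughly, the pattern in those examples looks like the sketch below (uploadChunk stands in for my own Ajax call and is just a placeholder):

// Rough sketch of the pattern from the examples I found; `uploadChunk`
// stands in for my own Ajax call and is hypothetical.
const CHUNK_SIZE = 1024 * 1024; // 1 MB per chunk

async function uploadInChunks(
    file: File,
    uploadChunk: (data: ArrayBuffer, offset: number) => Promise<void>
): Promise<void> {
    for (let offset = 0; offset < file.size; offset += CHUNK_SIZE) {
        // Blob.slice describes a byte range of the file...
        const blobSlice = file.slice(offset, offset + CHUNK_SIZE);
        // ...which the examples then read with FileReader.readAsArrayBuffer.
        const data = await new Promise<ArrayBuffer>((resolve, reject) => {
            const reader = new FileReader();
            reader.onload = () => resolve(reader.result as ArrayBuffer);
            reader.onerror = () => reject(reader.error);
            reader.readAsArrayBuffer(blobSlice);
        });
        await uploadChunk(data, offset);
    }
}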

My question is only about the client side. Server side is no problem.

5 Comments
  • When you include a local file as a "value" in an <input>, the browser doesn't start streaming out the file contents until you actually launch the POST operation. Commented Dec 25, 2024 at 13:18
  • To put it another way, how did you become concerned that the browser is loading files entirely before starting an HTTP operation? Commented Dec 25, 2024 at 13:19
  • @Pointy You're right, that's not the problem. But I don't want to use a normal form-based upload; I want to upload the files chunk-wise via JavaScript/Ajax. So the question is: what is the way to get chunk after chunk as a Blob without loading the entire file first? Commented Dec 25, 2024 at 13:35
  • I want to use chunks for several reasons. For example, suppose you're uploading 2 GB over a not-so-fast connection or unstable WiFi. If, say, upload of chunk 700 of 2048 fails, I want my code to retry chunk 700 after waiting a few seconds. Commented Dec 25, 2024 at 13:47
  • The browser HTTP facility is pretty much all the same, not counting security rules. When you use the xhr or fetch facilities, you add File objects to the data to POST and the same behavior applies: the browser will stream the file out when the HTTP request is initiated. Commented Dec 25, 2024 at 13:57

1 Answer


I see example codes use FileReader.readAsArrayBuffer and Blob.slice. But doesn't that involve loading the entire file into memory (readAsArrayBuffer)

Yes, it will load the entire file into memory as an ArrayBuffer, and then slice part of it.


without loading the entire file (e.g. multiple gigabytes) into memory?

This is not possible. The JS code will see the ArrayBuffer as the entire file.

You can submit the form via an HTTP POST request - then the browser itself sends the file, entirely or in chunks (multipart form data), without the page's code being able to control this process.
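For example (a minimal sketch, assuming an <input id="fileItem" type="file"> and a hypothetical /upload endpoint):

// Minimal sketch: the browser builds and sends the multipart/form-data
// body itself; page code cannot control the chunking here.
// `/upload` is a hypothetical endpoint.
const input = <HTMLInputElement>document.getElementById('fileItem');

if (input && input.files && input.files.length > 0) {
    const formData = new FormData();
    formData.append('file', input.files[0]);

    fetch('/upload', { method: 'POST', body: formData });
}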

If you want to do a custom upload via an Ajax request, you need to get the entire file into memory (e.g. as an ArrayBuffer) and then do whatever you want with it; you can access the file selected by the user, and its contents, for as long as the file field exists.
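A minimal sketch of that approach, assuming a hypothetical /upload-chunk endpoint and a hypothetical X-Chunk-Offset header:

// Sketch of the "custom upload" approach: the whole file is read into an
// ArrayBuffer first, then parts of it are POSTed one by one.
// `/upload-chunk` and `X-Chunk-Offset` are placeholders, not a real protocol.
async function uploadByChunks(file: File, chunkSize = 1024 * 1024): Promise<void> {
    const buffer = await file.arrayBuffer(); // entire file in memory

    for (let offset = 0; offset < buffer.byteLength; offset += chunkSize) {
        const part = buffer.slice(offset, offset + chunkSize);

        await fetch('/upload-chunk', {
            method: 'POST',
            headers: { 'X-Chunk-Offset': String(offset) },
            body: part,
        });
    }
}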


Streaming can only be done for writing files to the user's file system, using the File System API:

https://developer.mozilla.org/en-US/docs/Web/API/File_System_API
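A minimal sketch of such a streaming write, assuming a Chromium-based browser where showSaveFilePicker() is available (it is not part of every browser, so the typing is loosened with a cast):

// Sketch: stream a fetched resource to the user's file system.
// showSaveFilePicker() is Chromium-only at the time of writing.
async function streamToDisk(url: string): Promise<void> {
    const handle = await (window as any).showSaveFilePicker(); // user picks the target file
    const writable = await handle.createWritable();            // FileSystemWritableFileStream

    const response = await fetch(url);
    if (!response.body) {
        throw new Error('Response has no body to stream');
    }

    // The response body is written to disk chunk by chunk.
    await response.body.pipeTo(writable);
}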


Response to question in comments

Of course, each selected file in input.files (a FileList) implements the File interface, which extends Blob, and Blob has its own stream method that can stream the file into a specified destination. However, the file will not be streamed directly from the user's file system to the destination: the Blob is backed by an underlying ArrayBuffer that is loaded fully into browser memory, and the data is streamed from that memory. That's why it is not really a "stream" with low memory consumption. A real stream can be obtained, for example, from a fetched resource.

https://developer.mozilla.org/en-US/docs/Web/API/Blob/stream

https://developer.mozilla.org/en-US/docs/Web/API/Blob/arrayBuffer

The arrayBuffer() method of the Blob interface returns a Promise that resolves with the contents of the blob as binary data contained in an ArrayBuffer.

// Example of the `stream` method usage; `destination` is assumed to be a
// WritableStream provided by the page's own code.
const fileInput = <HTMLInputElement>document.getElementById('fileItem');

if (fileInput && fileInput.files && fileInput.files.length > 0) {
    const file = fileInput.files[0];

    // Pipes the file's ReadableStream into the destination WritableStream.
    file.stream().pipeTo(destination);
}

One more example of the internal details:

https://developer.mozilla.org/en-US/docs/Web/API/FileReader/readAsArrayBuffer

When the read operation is finished, the readyState property becomes DONE, and the loadend event is triggered. At that time, the result property contains an ArrayBuffer representing the file's data.

So the file will be present entirely in browser memory. What to do with that data - send it via an HTTP form or in parts using Ajax - is up to the developer.
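For illustration, a minimal sketch of that reading step with FileReader (what happens with the resulting ArrayBuffer afterwards is left to the developer):

// Minimal sketch of FileReader.readAsArrayBuffer: once the read finishes,
// readyState is DONE and `result` holds the whole file in memory.
function readWholeFile(file: File): Promise<ArrayBuffer> {
    return new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.onload = () => resolve(reader.result as ArrayBuffer);
        reader.onerror = () => reject(reader.error);
        reader.readAsArrayBuffer(file);
    });
}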

P.S. Of course, the browser itself may optimize how it loads the file into memory internally, but this is not noted in the specification, and I wouldn't rely on it (it is browser dependent).


2 Comments

Maybe using Streams API?? w3c.github.io/FileAPI/#dom-blob-stream
@MrSnrub I added an extended response to the post regarding your question
