How to Handle Large File Uploads (Without Losing Your Mind)

Why Consider Optimizing Large File Uploads

In frontend development, we often encounter scenarios that require file uploads, such as images, videos, and audio files. When the files are too large, uploads can run into several problems:

  • Long upload times and a poor user experience.
  • Excessive server load and resource consumption.
  • Upload failures on unstable networks, forcing re-uploads that waste time and bandwidth.
  • High browser memory usage, which hurts performance and stability.

To address these problems, we need to optimize large file uploads.

In modern internet applications, users have increasingly high demands for file uploads, such as:

  • Sharing high-resolution images and videos on social platforms.
  • Submitting assignments and course materials on educational platforms.
  • Uploading project documents and reports on enterprise platforms.

In these scenarios, the files users need to upload are often quite large, sometimes reaching hundreds of megabytes or even several gigabytes. If we use traditional file upload methods, we will face the aforementioned issues.

The traditional file upload method involves sending the entire file as a single request body to the server. This approach has several drawbacks:

  • Long upload times: Because the file is so large, the transfer takes a long time, and users must wait a long while before they see the result.
  • High server load: Large files require the server to receive and process large amounts of data at once, potentially consuming excessive server memory, CPU, bandwidth, and other resources.
  • Prone to failure in unstable networks: Large file transfers are susceptible to network issues such as disconnections, timeouts, or packet loss, leading to upload failures and forcing users to re-upload the entire file.
  • High browser memory usage: Large files require the browser to read the entire file into memory and maintain the connection, which may result in high memory usage and impact the performance of other pages.

To address these problems, optimizing large file uploads is essential.

Design Approach

The primary approaches to optimizing large file uploads include:

1. Chunking

Divide a large file into smaller chunks, with each chunk sent as a separate request to the server. This reduces the amount of data per request, shortens upload times, decreases server load, and enables resumable uploads.

function sliceFile(file, chunkSize) {
  const fileSize = file.size;
  const chunks = Math.ceil(fileSize / chunkSize);
  const slices = Array.from({ length: chunks }, (_, index) => {
    const start = index * chunkSize;
    const end = start + chunkSize;
    return file.slice(start, end);
  });
  return slices;
}
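
For example, a 5 MB chunk size is a common starting point; the exact value depends on your network conditions and server limits. A quick usage sketch (the file input selector here is hypothetical):

// Split the file chosen in an <input type="file"> into 5 MB chunks.
const file = document.querySelector('input[type="file"]').files[0];
const fileChunks = sliceFile(file, 5 * 1024 * 1024);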

2. Concurrency

Send multiple chunk requests to the server at the same time to make full use of network bandwidth and server resources, shortening the total upload time and improving the user experience.

async function uploadChunks(fileChunks) {
  const uploadPromises = fileChunks.map((chunk) =>
    fetch('/upload', { method: 'POST', body: chunk })
  );
  const responses = await Promise.all(uploadPromises);
  return responses;
}
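
In practice, firing every chunk at once can overwhelm the browser's connection limit and the server, so you will usually want to cap how many chunks are in flight. Below is a minimal sketch of one way to do that; the limit of 4 and the /upload endpoint are arbitrary assumptions:

async function uploadChunksWithLimit(fileChunks, limit = 4) {
  const results = new Array(fileChunks.length);
  let nextIndex = 0;

  // Each worker repeatedly claims the next chunk index and uploads it.
  async function worker() {
    while (nextIndex < fileChunks.length) {
      const index = nextIndex++;
      results[index] = await fetch('/upload', {
        method: 'POST',
        body: fileChunks[index],
      });
    }
  }

  // Start `limit` workers and wait until all chunks are done.
  await Promise.all(Array.from({ length: limit }, () => worker()));
  return results;
}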

3. Compression

Compress each chunk before sending it to the server to further reduce data size and improve transfer efficiency.

async function compressChunk(chunk) {
  // pako is a third-party zlib library that must be loaded separately.
  const compressedChunk = await new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = (event) => {
      // Wrap the ArrayBuffer in a Uint8Array before deflating.
      const result = pako.deflate(new Uint8Array(event.target.result));
      resolve(result);
    };
    reader.onerror = () => reject(reader.error);
    reader.readAsArrayBuffer(chunk);
  });
  return compressedChunk;
}
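
The server also needs to know that a chunk is deflate-compressed so it can inflate it before reassembling the file. One possible convention, shown here with a made-up x-chunk-encoding header, is to flag compressed chunks explicitly:

async function uploadCompressedChunk(chunk) {
  const compressed = await compressChunk(chunk);
  // 'x-chunk-encoding' is a hypothetical header name; use whatever your backend expects.
  return fetch('/upload', {
    method: 'POST',
    headers: { 'x-chunk-encoding': 'deflate' },
    body: compressed,
  });
}

Also keep in mind that already-compressed formats such as JPEG, MP4, or ZIP gain little from another round of compression.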

4. Validation

Validate each chunk before or after transmission to ensure data integrity and correctness, avoiding redundant or erroneous data transfers.

async function verifyChunk(chunk) {
  const hash = await calculateHash(chunk);
  const response = await fetch(`/verify?hash=${hash}`);
  const result = await response.json();
  return result;
}
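
Both this snippet and the instant-upload check below rely on a calculateHash helper that is not shown above. A minimal browser-side sketch, assuming SHA-256 via the Web Crypto API, might look like this:

async function calculateHash(blobOrFile) {
  // Read the data into memory and hash it with SHA-256.
  const buffer = await blobOrFile.arrayBuffer();
  const digest = await crypto.subtle.digest('SHA-256', buffer);
  // Convert the digest to a hex string so it can go into a query parameter.
  return Array.from(new Uint8Array(digest))
    .map((byte) => byte.toString(16).padStart(2, '0'))
    .join('');
}

Note that hashing a multi-gigabyte file this way reads the whole file into memory; for very large files, a library such as spark-md5 can hash the file chunk by chunk instead.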

5. Resumable Uploads

If the network fails mid-upload, continue from the point of interruption instead of starting over, saving time and bandwidth.

async function resumeUpload(file, resumeByte) {
  const blob = file.slice(resumeByte);
  const formData = new FormData();
  formData.append('file', blob);
  const response = await fetch('/upload', { method: 'POST', body: formData });
  const result = await response.json();
  return result;
}
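
For this to work, the client has to know where to resume from. A common pattern is to ask the server how much of the file it has already received; the sketch below assumes a hypothetical /upload/progress endpoint that returns the byte offset stored so far:

async function getResumeByte(fileHash) {
  // Hypothetical endpoint: returns { uploadedBytes: number } for this file hash.
  const response = await fetch(`/upload/progress?hash=${fileHash}`);
  const { uploadedBytes } = await response.json();
  return uploadedBytes;
}

// e.g. const resumeByte = await getResumeByte(await calculateHash(file));
//      await resumeUpload(file, resumeByte);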

6. Instant Uploads

Before slicing the file for upload, calculate its hash and send it to the server. If the server finds an identical file, it can immediately return a success response, avoiding redundant uploads.

async function checkFileExists(file) {
  const hash = await calculateHash(file);
  const response = await fetch(`/check?hash=${hash}`);
  const result = await response.json();
  return result;
}
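
Putting the pieces together, a simplified client-side flow might look like the sketch below. The { exists } response shape, the 5 MB chunk size, the /merge endpoint, and the uploadChunksWithLimit helper from earlier are all assumptions rather than a fixed API:

async function uploadLargeFile(file) {
  // 1. Instant upload: skip the transfer entirely if the server already has this file.
  const { exists } = await checkFileExists(file);
  if (exists) return;

  // 2. Split the file into chunks and upload them with limited concurrency.
  const chunks = sliceFile(file, 5 * 1024 * 1024);
  await uploadChunksWithLimit(chunks, 4);

  // 3. Tell the server all chunks have arrived so it can merge them (hypothetical endpoint).
  await fetch('/merge', { method: 'POST', body: JSON.stringify({ name: file.name }) });
}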

Summary

This article explained why large file uploads need optimization and walked through the main strategies, with code examples showing how each one can be implemented, so that readers can understand and apply these techniques in their own projects.

We are Leapcell, your top choice for hosting backend projects — we support file uploads up to 100 MB!

Leapcell

Leapcell is the Next-Gen Serverless Platform for Web Hosting, Async Tasks, and Redis:

Multi-Language Support

  • Develop with Node.js, Python, Go, or Rust.

Deploy unlimited projects for free

  • Pay only for usage; no requests, no charges.

Unbeatable Cost Efficiency

  • Pay-as-you-go with no idle charges.
  • Example: $25 supports 6.94M requests at a 60ms average response time.

Streamlined Developer Experience

  • Intuitive UI for effortless setup.
  • Fully automated CI/CD pipelines and GitOps integration.
  • Real-time metrics and logging for actionable insights.

Effortless Scalability and High Performance

  • Auto-scaling to handle high concurrency with ease.
  • Zero operational overhead — just focus on building.

Explore more in the Documentation!

Try Leapcell

Follow us on X: @LeapcellHQ

Read on our blog