Handling Chunked and Resumable File Uploads in ASP.NET

Build scalable and user-friendly upload workflows for large files

Posted by Hüseyin Sekmenoğlu on March 19, 2022 · Backend Development

When users upload large files through your application, a single HTTP request may not be sufficient. Network interruptions, browser limitations, or memory constraints can all cause uploads to fail. Supporting chunked and resumable uploads improves reliability and the user experience. In this article, you will learn how to handle these cases with ASP.NET.


🧱 What Are Chunked and Resumable Uploads?

  • Chunked Upload: The file is split into smaller parts (chunks) and each chunk is sent separately.

  • Resumable Upload: Uploads can be paused or interrupted and later continued from where they left off.

This approach reduces the impact of failures and enables upload progress tracking.


🧰 Client-Side Implementation Overview

Libraries like Resumable.js, FineUploader or Uppy can handle chunking on the frontend.

Each chunk includes metadata such as:

  • Unique file identifier

  • Total number of chunks

  • Current chunk index

  • File name

These are posted to a backend endpoint as form data or headers.
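Since the article's server-side examples are C#, here is a minimal C# sketch of the request these libraries produce, using HttpClient in place of a JavaScript uploader. The route /upload/chunk, the ChunkUploadClient class, and the SendChunkAsync method are placeholder names; the form field names match the UploadChunk action shown in the next section.

using System.Net.Http;
using System.Threading.Tasks;

public static class ChunkUploadClient
{
    public static async Task SendChunkAsync(
        HttpClient client,          // assumed to have a BaseAddress configured
        byte[] chunkBytes,
        int chunkIndex,
        int totalChunks,
        string fileIdentifier,
        string fileName)
    {
        using var form = new MultipartFormDataContent
        {
            // The binary payload; the field name must match the IFormFile
            // parameter name on the server ("chunk").
            { new ByteArrayContent(chunkBytes), "chunk", fileName },
            { new StringContent(chunkIndex.ToString()), "chunkIndex" },
            { new StringContent(totalChunks.ToString()), "totalChunks" },
            { new StringContent(fileIdentifier), "fileIdentifier" },
            { new StringContent(fileName), "fileName" }
        };

        // "/upload/chunk" is a placeholder route for the UploadChunk action.
        var response = await client.PostAsync("/upload/chunk", form);
        response.EnsureSuccessStatusCode();
    }
}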


🧩 Design Your Upload Endpoint

Create a controller action that accepts individual chunks:

[HttpPost]
public async Task<IActionResult> UploadChunk(
    IFormFile chunk,
    [FromForm] int chunkIndex,
    [FromForm] int totalChunks,
    [FromForm] string fileIdentifier,
    [FromForm] string fileName)
{
    // Store each chunk in a temporary folder dedicated to this upload.
    var tempFolder = Path.Combine("TempUploads", fileIdentifier);
    Directory.CreateDirectory(tempFolder);

    // Name the chunk after its index so it can be found and reassembled later.
    var chunkPath = Path.Combine(tempFolder, $"chunk-{chunkIndex}");
    await using var stream = new FileStream(chunkPath, FileMode.Create);
    await chunk.CopyToAsync(stream);

    return Ok();
}

🧪 Verify Completion and Combine Chunks

Once all chunks are uploaded, the frontend sends a request to finalize the upload:

[HttpPost]
public IActionResult FinalizeUpload(
    [FromForm] string fileIdentifier,
    [FromForm] string fileName,
    [FromForm] int totalChunks)
{
    var tempFolder = Path.Combine("TempUploads", fileIdentifier);

    // Order chunks by their numeric index; a plain string sort would place
    // "chunk-10" before "chunk-2".
    var chunkFiles = Directory.GetFiles(tempFolder)
        .OrderBy(f => int.Parse(Path.GetFileName(f).Substring("chunk-".Length)))
        .ToList();

    // Verify that every chunk has arrived before assembling the file.
    if (chunkFiles.Count != totalChunks)
        return BadRequest("Upload is incomplete.");

    // Use only the file name portion to prevent path traversal via fileName.
    Directory.CreateDirectory("Uploads");
    var finalPath = Path.Combine("Uploads", Path.GetFileName(fileName));

    using var output = new FileStream(finalPath, FileMode.Create);

    foreach (var chunkFile in chunkFiles)
    {
        using var input = new FileStream(chunkFile, FileMode.Open);
        input.CopyTo(output);
    }

    Directory.Delete(tempFolder, true);
    return Ok();
}

This assembles the full file on the server.


🧯 Add Safety and Validation

To make your upload system secure:

  • Limit maximum chunk size

  • Use a whitelist for file types and extensions

  • Authenticate users before allowing uploads

  • Scan for malware using antivirus tools if needed

You can also limit simultaneous uploads per user.
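As a sketch of the first two points, the upload controller could reject oversized chunks and disallowed extensions before writing anything to disk. The 5 MB limit, the extension list, and the IsChunkValid helper are illustrative assumptions, not recommendations.

// Illustrative values only; tune the limit and the whitelist to your needs.
// These members sit inside the upload controller.
private const long MaxChunkSize = 5 * 1024 * 1024; // 5 MB per chunk

private static readonly HashSet<string> AllowedExtensions =
    new(StringComparer.OrdinalIgnoreCase) { ".pdf", ".png", ".jpg", ".zip" };

private static bool IsChunkValid(IFormFile chunk, string fileName, out string error)
{
    error = string.Empty;

    if (chunk is null || chunk.Length == 0)
    {
        error = "Empty chunk.";
        return false;
    }

    if (chunk.Length > MaxChunkSize)
    {
        error = "Chunk exceeds the maximum allowed size.";
        return false;
    }

    if (!AllowedExtensions.Contains(Path.GetExtension(fileName)))
    {
        error = "File type is not allowed.";
        return false;
    }

    return true;
}

UploadChunk can then call IsChunkValid at the start of the action and return BadRequest(error) when the check fails.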


๐Ÿ” Support for Resuming Uploads

To resume uploads:

  • Use a persistent file identifier

  • Check which chunks already exist on the server

  • Skip uploading existing chunks

Example check endpoint:

[HttpGet]
public IActionResult CheckChunk([FromQuery] string fileIdentifier, [FromQuery] int chunkIndex)
{
    var chunkPath = Path.Combine("TempUploads", fileIdentifier, $"chunk-{chunkIndex}");
    return System.IO.File.Exists(chunkPath) ? Ok() : NotFound();
}

This lets the frontend skip chunks that were already uploaded.
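As a variation on the per-chunk check (not part of the original endpoint), a status action could report every uploaded chunk index in one call, so the client resumes with a single round trip instead of probing chunk by chunk. The UploadStatus name is an assumption; the folder layout matches the UploadChunk action above.

[HttpGet]
public IActionResult UploadStatus([FromQuery] string fileIdentifier)
{
    var tempFolder = Path.Combine("TempUploads", fileIdentifier);

    // Nothing uploaded yet for this identifier.
    if (!Directory.Exists(tempFolder))
        return Ok(Array.Empty<int>());

    // Collect the indexes already stored on disk, e.g. [0, 1, 2, 5, 6].
    var uploadedIndexes = Directory.GetFiles(tempFolder)
        .Select(f => int.Parse(Path.GetFileName(f).Substring("chunk-".Length)))
        .OrderBy(i => i)
        .ToArray();

    return Ok(uploadedIndexes);
}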


📊 Monitor Progress and Status

Track chunk uploads with:

  • Real-time logs or SignalR updates

  • A simple DB table to store upload state

  • Expiration logic to clean up abandoned uploads

Monitoring helps identify problems and optimize server usage.
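For the expiration point, one option is a hosted background service that sweeps the temporary upload folder on a schedule. This is a minimal sketch under assumptions: it reuses the TempUploads layout from earlier, and the 24-hour threshold and hourly interval are example values.

using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public class AbandonedUploadCleanup : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            if (Directory.Exists("TempUploads"))
            {
                // Treat the folder's last write time as the last chunk activity.
                var cutoff = DateTime.UtcNow.AddHours(-24);

                foreach (var folder in Directory.GetDirectories("TempUploads"))
                {
                    if (Directory.GetLastWriteTimeUtc(folder) < cutoff)
                        Directory.Delete(folder, recursive: true);
                }
            }

            // Sweep once per hour.
            await Task.Delay(TimeSpan.FromHours(1), stoppingToken);
        }
    }
}

Register the service with builder.Services.AddHostedService<AbandonedUploadCleanup>() in Program.cs so it runs alongside the application.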


✅ Conclusion

Chunked and resumable uploads are essential for modern web apps dealing with large files. With ASP.NET you can build robust endpoints that support retries, progress tracking and safe assembly. Combine this with frontend libraries for a smooth user experience and reliable file handling.