Implementing Resumable Downloads and Throttling in ASP.NET

Deliver large files reliably by supporting partial downloads and controlling bandwidth

Posted by Hüseyin Sekmenoğlu on January 24, 2022 in Backend Development

Large file downloads can fail due to network issues or client-side interruptions. In such cases, it is wasteful and frustrating to require users to restart the entire download. Resumable downloads solve this by allowing clients to continue from where they left off. You can also add throttling to avoid overwhelming your server or client’s network.


📦 What Are Resumable Downloads?

Resumable downloads rely on HTTP range headers. A client can request a specific byte range of a file. If the server supports it, it sends only that part of the file. This way, a client can resume a broken download without starting from the beginning.
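
To see the mechanics from the client's side, here is a small sketch using HttpClient (the URL and byte range are placeholders) that requests bytes 1000–1999 of a file and checks that the server answered with 206 Partial Content and a Content-Range header:

using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;

using var client = new HttpClient();

var request = new HttpRequestMessage(HttpMethod.Get, "https://example.com/files/video.mp4");
request.Headers.Range = new RangeHeaderValue(1000, 1999); // sends "Range: bytes=1000-1999"

using var response = await client.SendAsync(request);

if (response.StatusCode == HttpStatusCode.PartialContent)
{
    // Something like "bytes 1000-1999/1048576" (start-end/total size)
    Console.WriteLine(response.Content.Headers.ContentRange);
}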


🧪 Detecting and Handling Range Requests in ASP.NET

To support partial content, your controller must read the Range header, send the requested slice of the file with HTTP status 206 Partial Content, and include a Content-Range header describing which bytes were returned. Sending Accept-Ranges: bytes also tells clients that resuming is supported.

Here is one way to do it. The example assumes a controller with an injected IWebHostEnvironment stored in _env and files served from a "files" folder under the content root:

public async Task<IActionResult> ResumeDownload(string fileName)
{
    var filePath = Path.Combine(_env.ContentRootPath, "files", fileName);
    var fileInfo = new FileInfo(filePath);

    if (!fileInfo.Exists)
        return NotFound();

    var request = HttpContext.Request;
    var response = HttpContext.Response;

    long totalSize = fileInfo.Length;
    long start = 0;
    long end = totalSize - 1;

    // Advertise range support so clients know they can resume.
    response.Headers["Accept-Ranges"] = "bytes";

    var hasRange = request.Headers.ContainsKey("Range");
    if (hasRange)
    {
        // Parse "bytes=start-end". Suffix ranges such as "bytes=-500"
        // are not handled in this simplified example.
        var range = request.Headers["Range"].ToString();
        var parts = range.Replace("bytes=", string.Empty).Split('-');

        if (long.TryParse(parts[0], out var parsedStart))
            start = parsedStart;

        if (parts.Length > 1 && long.TryParse(parts[1], out var parsedEnd))
            end = parsedEnd;

        // Clamp the end to the file size and reject unsatisfiable ranges.
        end = Math.Min(end, totalSize - 1);
        if (start > end || start >= totalSize)
        {
            response.Headers["Content-Range"] = $"bytes */{totalSize}";
            return StatusCode(StatusCodes.Status416RangeNotSatisfiable);
        }
    }

    var contentLength = end - start + 1;

    // Answer 206 only when the client actually asked for a range;
    // otherwise send the whole file with a normal 200.
    response.StatusCode = hasRange
        ? StatusCodes.Status206PartialContent
        : StatusCodes.Status200OK;

    if (hasRange)
        response.Headers["Content-Range"] = $"bytes {start}-{end}/{totalSize}";

    response.ContentType = "application/octet-stream";
    response.ContentLength = contentLength;

    using var stream = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read);
    stream.Seek(start, SeekOrigin.Begin);

    var buffer = new byte[64 * 1024];
    long remaining = contentLength;

    while (remaining > 0)
    {
        var read = await stream.ReadAsync(buffer, 0, (int)Math.Min(buffer.Length, remaining));
        if (read == 0)
            break;

        await response.Body.WriteAsync(buffer, 0, read);
        remaining -= read;
    }

    return new EmptyResult();
}
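
On the client side, resuming is then just a matter of asking for the bytes you do not have yet. A minimal sketch, assuming a placeholder URL (the exact route depends on your routing setup) and a placeholder local path:

using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;

// Ask the server only for the bytes that are missing locally, then append them.
var localPath = "downloads/big-file.zip";
long existing = File.Exists(localPath) ? new FileInfo(localPath).Length : 0;

using var client = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Get,
    "https://example.com/ResumeDownload?fileName=big-file.zip");

if (existing > 0)
    request.Headers.Range = new RangeHeaderValue(existing, null); // "Range: bytes=<existing>-"

using var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
response.EnsureSuccessStatusCode();

await using var output = new FileStream(localPath, FileMode.Append, FileAccess.Write);
await response.Content.CopyToAsync(output);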

🚦 Throttling Downloads to Limit Bandwidth

If you want to slow down file transfers to reduce load on your server or to simulate real-world bandwidth, you can add an artificial delay inside the read loop:

await response.Body.WriteAsync(buffer, 0, read);
await response.Body.FlushAsync(); // push the chunk to the client before pausing
await Task.Delay(100);            // wait 100 ms between chunks

This caps how fast the client can receive the file: with the 64 KB buffer used above and a 100 ms pause per chunk, throughput tops out at roughly 640 KB/s. Adjust the buffer size and delay to suit your needs.
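
A fixed delay ties the transfer rate to the buffer size. If you would rather target a specific rate, you can track how many bytes have gone out and pause only when the transfer is running ahead of schedule. A minimal sketch of the same read loop, assuming a hypothetical maxBytesPerSecond setting:

var throttle = System.Diagnostics.Stopwatch.StartNew();
long bytesSent = 0;
long maxBytesPerSecond = 256 * 1024; // hypothetical target: 256 KB/s

while (remaining > 0)
{
    var read = await stream.ReadAsync(buffer, 0, (int)Math.Min(buffer.Length, remaining));
    if (read == 0)
        break;

    await response.Body.WriteAsync(buffer, 0, read);
    await response.Body.FlushAsync();

    bytesSent += read;
    remaining -= read;

    // How long the transfer *should* have taken so far at the target rate,
    // versus how long it actually took; sleep off any surplus.
    var expectedMs = bytesSent * 1000 / maxBytesPerSecond;
    var aheadMs = expectedMs - throttle.ElapsedMilliseconds;
    if (aheadMs > 0)
        await Task.Delay((int)aheadMs);
}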


🔍 Important Considerations

  • Caching: Disable caching when using partial content to avoid conflicts

  • Mobile Networks: Resumable downloads help on unstable connections

  • Logging: Log range headers and responses for better debugging

  • Security: Validate file paths to prevent directory traversal (see the sketch after this list)
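
For the security point above, here is a minimal path-validation sketch (reusing the same "files" folder as the download action): resolve the requested name to an absolute path and refuse anything that escapes the base directory.

var baseDir = Path.GetFullPath(Path.Combine(_env.ContentRootPath, "files"));
var fullPath = Path.GetFullPath(Path.Combine(baseDir, fileName));

// Reject names like "..\..\appsettings.json" that resolve outside the files folder.
if (!fullPath.StartsWith(baseDir + Path.DirectorySeparatorChar, StringComparison.OrdinalIgnoreCase))
    return BadRequest("Invalid file name.");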


✅ Conclusion

Resumable downloads and bandwidth throttling improve the user experience for large file delivery in ASP.NET. Supporting Range headers allows interrupted downloads to continue where they left off and throttling keeps your infrastructure safe under high traffic. These techniques are valuable when building scalable, user-friendly file services.