File Upload to Azure Blob Storage Using Azure Functions

tamami.I · Web Developer
6 min read · Dec 7, 2024

1. Scope of This Article

Azure Functions enables the efficient development of event-driven applications through serverless architecture. This article explains how to upload files to Azure Blob Storage using Azure Functions with sample code.

2. Content

2–1. Implementation Overview

An Azure Function receives HTTP POST requests and uploads the submitted files to Azure Blob Storage.

// Sample code for an Azure Function (isolated worker model) that uploads files to Azure Blob Storage
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using MimeMapping;

namespace SampleFunctionApp
{
    public class UploadFileFunction
    {
        private readonly ILogger<UploadFileFunction> _logger;

        public UploadFileFunction(ILogger<UploadFileFunction> logger)
        {
            _logger = logger;
        }

        [Function("UploadFileFunction")]
        public async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = "files/upload")] HttpRequest req)
        {
            _logger.LogInformation("UploadFileFunction triggered");

            try
            {
                // Retrieve form data
                var form = await req.ReadFormAsync();
                var file = form.Files["file"];
                if (file == null || file.Length == 0)
                {
                    return new BadRequestObjectResult("No file uploaded or file is empty.");
                }

                // File size limit (100 MB)
                const long maxFileSize = 100 * 1024 * 1024;
                if (file.Length > maxFileSize)
                {
                    return new BadRequestObjectResult($"File size exceeds the limit of {maxFileSize / (1024 * 1024)}MB.");
                }

                // Get MIME type
                var contentType = MimeUtility.GetMimeMapping(file.FileName);

                // Configure storage connection and container
                var connectionString = Environment.GetEnvironmentVariable("AzureWebJobsStorage");
                var containerName = "your-container-name";

                // Initialize Blob client
                var blobServiceClient = new BlobServiceClient(connectionString);
                var containerClient = blobServiceClient.GetBlobContainerClient(containerName);
                await containerClient.CreateIfNotExistsAsync(PublicAccessType.None);

                // Generate Blob name with folder structure
                var folderPath = $"uploads/{DateTime.UtcNow:yyyyMMdd}";
                var blobName = $"{folderPath}/{Path.GetFileNameWithoutExtension(file.FileName)}_{DateTime.UtcNow:HHmmss}{Path.GetExtension(file.FileName)}";
                var blobClient = containerClient.GetBlobClient(blobName);

                using var stream = file.OpenReadStream();
                await blobClient.UploadAsync(stream, new BlobHttpHeaders { ContentType = contentType });

                // Return success response
                return new OkObjectResult(new
                {
                    Message = "File uploaded successfully",
                    BlobUrl = blobClient.Uri.ToString()
                });
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "Error in UploadFileFunction");
                return new ObjectResult(new { Error = "File upload failed", Details = ex.Message })
                {
                    StatusCode = StatusCodes.Status500InternalServerError
                };
            }
        }
    }
}

2–2. Detailed Code Explanation

2–2–1. File Upload Trigger

[Function("UploadFileFunction")]
public async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = "files/upload")] HttpRequest req)

In the isolated worker model, the logger is injected through the constructor (ILogger<UploadFileFunction>) rather than passed in as an ILogger method parameter.

HTTP Trigger:

  • The HTTP trigger is executed when Azure Functions receives an HTTP request.
  • Unlike other triggers (e.g., Blob Trigger, Queue Trigger), it allows direct access from clients.
  • Azure Functions prepends the default route prefix api, so this route is served at /api/files/upload. Writing Route = "api/files/upload" would produce /api/api/files/upload.
  • In this code, post is explicitly specified to allow only POST requests. Multiple HTTP methods can be permitted if needed:

[HttpTrigger(AuthorizationLevel.Function, "post", "put", Route = "files/upload")]

Authentication Level:

With AuthorizationLevel.Function, callers must present a function key (via the code query string parameter or the x-functions-key header), ensuring security while allowing straightforward access control. Other options include:

  • Anonymous: No authentication required (not recommended for sensitive operations).
  • Admin: Requires an admin key, providing the highest level of security and restricted access.

2–2–2. Retrieving Form Data and Basic Validation

var form = await req.ReadFormAsync();
var file = form.Files["file"];
if (file == null || file.Length == 0)
{
    return new BadRequestObjectResult("No file uploaded or file is empty.");
}

Retrieving Form Data:

Using ReadFormAsync, the uploaded file (IFormFile) is retrieved from the multipart form data. If no file was submitted or the file is empty, a 400 Bad Request response is returned.

2–2–3. File Size Limitation

const long maxFileSize = 100 * 1024 * 1024; // 100 MB
if (file.Length > maxFileSize)
{
    return new BadRequestObjectResult($"File size exceeds the limit of {maxFileSize / (1024 * 1024)}MB.");
}

In the Case of Azure Functions:

In Azure Functions, the HTTP trigger has a default request body size limit of 100 MB. This limit can be adjusted through the host.json file, which controls the overall behavior of Azure Functions. If you want to reduce the file size limit to less than 100 MB, follow these steps.

Add the following configuration to the host.json file and set maxRequestLength to an appropriate value (in bytes). This setting applies to the entire Function App but cannot exceed 100 MB. In the example below, the request body size limit is set to 50 MB.

{
  "version": "2.0",
  "extensions": {
    "http": {
      "maxRequestLength": 52428800 // 50MB
    }
  }
}
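The limits in these configuration files are specified in raw bytes. The conversion from megabytes is easy to verify with a throwaway sketch (the helper name is illustrative, not part of the function app):

```csharp
using System;

// Convert a size in megabytes to bytes (1 MB = 1024 * 1024 bytes),
// matching the values used for request-size limits above.
long MegabytesToBytes(int megabytes) => megabytes * 1024L * 1024L;

Console.WriteLine(MegabytesToBytes(50));  // 52428800, the 50 MB value used above
Console.WriteLine(MegabytesToBytes(100)); // 104857600, the 100 MB default limit
```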

In the Case of App Service Plan:

When hosted on an App Service Plan, the request body size is also constrained by IIS and ASP.NET: ASP.NET's maxRequestLength defaults to 4 MB, and IIS's maxAllowedContentLength defaults to 30,000,000 bytes (about 28.6 MB). The IIS limit can be raised by modifying the web.config file; the value is specified in bytes. For example, to set the request body size limit to 50 MB (52,428,800 bytes), use the following configuration.

<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <requestLimits maxAllowedContentLength="52428800" /> <!-- 50MB -->
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>

2–2–4. Determining MIME Type

var contentType = MimeUtility.GetMimeMapping(file.FileName);

Example Mappings:

  • example.txt → text/plain
  • image.jpg → image/jpeg
  • Unknown file extensions → defaults to application/octet-stream.

If using FileExtensionContentTypeProvider from Microsoft.AspNetCore.StaticFiles:

new FileExtensionContentTypeProvider().TryGetContentType(file.FileName, out string? contentType);
contentType ??= "application/octet-stream";

FileExtensionContentTypeProvider comes from the standalone Microsoft.AspNetCore.StaticFiles NuGet package, which has been deprecated (its functionality now ships as part of the ASP.NET Core shared framework). As a lightweight alternative, the MimeMapping package can be used to determine a file's MIME type.
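As a rough illustration of what such a library does internally — this is a hand-rolled sketch with a tiny mapping table, not the MimeMapping package's actual implementation — extension lookup with an application/octet-stream fallback can be as simple as:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Minimal extension-to-MIME lookup with a fallback for unknown types.
// Real packages ship a far larger, regularly updated mapping table.
var mimeMap = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
{
    [".txt"] = "text/plain",
    [".jpg"] = "image/jpeg",
    [".png"] = "image/png",
    [".pdf"] = "application/pdf",
};

string GetMimeType(string fileName) =>
    mimeMap.TryGetValue(Path.GetExtension(fileName), out var type)
        ? type
        : "application/octet-stream"; // fallback for unknown extensions

Console.WriteLine(GetMimeType("example.txt")); // text/plain
Console.WriteLine(GetMimeType("data.xyz"));    // application/octet-stream
```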

2–2–5. Azure Blob Storage Connection Configuration

var connectionString = Environment.GetEnvironmentVariable("AzureWebJobsStorage");

For testing purposes, the connection string is set in local.settings.json. However, in production or scenarios requiring enhanced security, Azure Key Vault should be used to store and retrieve the connection string securely.

// Determine if running locally (WEBSITE_INSTANCE_ID is only set when running in Azure)
var isLocal = string.IsNullOrEmpty(Environment.GetEnvironmentVariable("WEBSITE_INSTANCE_ID"));

// Retrieve the connection string from Key Vault in production
// (requires the Azure.Identity and Azure.Security.KeyVault.Secrets packages)
if (!isLocal)
{
    var keyVaultUrl = Environment.GetEnvironmentVariable("KeyVaultUrl");
    var secretClient = new SecretClient(new Uri(keyVaultUrl), new DefaultAzureCredential());
    var secret = await secretClient.GetSecretAsync("AzureWebJobsStorage");
    connectionString = secret.Value.Value;
}

var blobServiceClient = new BlobServiceClient(connectionString);

2–2–6. Creating the Blob Client

var blobServiceClient = new BlobServiceClient(connectionString);
var containerClient = blobServiceClient.GetBlobContainerClient(containerName);
await containerClient.CreateIfNotExistsAsync(PublicAccessType.None);

Blob Service Client:

  • Connect to the Azure Blob Storage service using the connection string.
  • BlobServiceClient is the client class that serves as the entry point for all operations within Blob Storage. In addition to connection strings, it also supports Azure AD authentication and SAS tokens.

Verifying and Creating the Container:

If the specified container does not exist, CreateIfNotExistsAsync is called to automatically create it. By specifying PublicAccessType.None, public access to the container is disabled, ensuring security.

Public Access Settings:

  1. PublicAccessType.None:
    Disables public access by default, requiring authentication (recommended).
  2. PublicAccessType.Blob:
    Allows anonymous access to individual blobs within the container.
  3. PublicAccessType.Container:
    Grants anonymous access to all blobs in the container (not recommended).

2–2–7. Generating Blob Names with Folder Structure

var folderPath = $"uploads/{DateTime.UtcNow:yyyyMMdd}";
var blobName = $"{folderPath}/{Path.GetFileNameWithoutExtension(file.FileName)}_{DateTime.UtcNow:HHmmss}{Path.GetExtension(file.FileName)}";

Folder Structure:

  • Blob Storage has a flat namespace and does not inherently support folders, but a virtual folder structure can be achieved by including path-like segments (separated by /) in Blob names, so data can be organized logically as though folders existed.

Blob Name:

  • Adding a timestamp in the HHmmss format to the file name ensures unique Blob names, even within the same day.
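This naming scheme can be exercised on its own. The following stand-alone sketch (with a fixed timestamp so the result is deterministic; the helper name is illustrative) mirrors the format used above:

```csharp
using System;
using System.IO;

// Build a blob name of the form uploads/yyyyMMdd/name_HHmmss.ext,
// mirroring the naming scheme used in the function.
string BuildBlobName(string fileName, DateTime utcNow)
{
    var folderPath = $"uploads/{utcNow:yyyyMMdd}";
    return $"{folderPath}/{Path.GetFileNameWithoutExtension(fileName)}_{utcNow:HHmmss}{Path.GetExtension(fileName)}";
}

// Fixed timestamp so the output is deterministic
var when = new DateTime(2024, 12, 7, 9, 30, 15, DateTimeKind.Utc);
Console.WriteLine(BuildBlobName("report.pdf", when)); // uploads/20241207/report_093015.pdf
```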

Preventing Overwrites of Existing Blobs:

In Azure Blob Storage, uploading a file with the same Blob name overwrites the existing Blob. If overwriting is acceptable, you can skip the existence check and proceed with the upload. In this case, the existing Blob will be replaced by the new one.

  • Generate Unique Names:
    Append a unique identifier, like a GUID, to ensure each Blob name is distinct:
var blobName = $"{folderPath}/{Path.GetFileNameWithoutExtension(file.FileName)}_{Guid.NewGuid()}{Path.GetExtension(file.FileName)}";
  • Check for Existing Blobs:
    Verify if the blob exists before uploading:
if (await blobClient.ExistsAsync())
{
    return new ConflictObjectResult(new
    {
        Message = "A file with the same name already exists.",
        ExistingBlobUrl = blobClient.Uri.ToString()
    });
}

2–2–8. File Upload

var blobClient = containerClient.GetBlobClient(blobName);

using var stream = file.OpenReadStream();
await blobClient.UploadAsync(stream, new BlobHttpHeaders { ContentType = contentType });

Blob Client:

Using containerClient.GetBlobClient(blobName), a client is generated based on the specified Blob name, enabling operations on that specific Blob.

File Upload:

The blobClient.UploadAsync method is used to upload the file stream to the Blob asynchronously. The Azure Blob Storage SDK is designed with asynchronous methods optimized for I/O operations. To maximize performance, always use asynchronous methods.

Additional Information on File Uploads:

Here are some tips and techniques to enhance the reliability and performance of file uploads to Azure Blob Storage:

  • Tracking Upload Progress:
    For large file uploads, track the progress to provide feedback to the user:
var progressHandler = new Progress<long>(bytesTransferred =>
{
    Console.WriteLine($"Uploaded {bytesTransferred} bytes.");
});

await blobClient.UploadAsync(stream, new BlobUploadOptions
{
    ProgressHandler = progressHandler
});
  • Retry Policy for Uploads:
    Retry settings can be configured to handle network errors or other transient issues. While the Azure SDK includes a default retry policy, custom configurations are also supported.
var options = new BlobClientOptions
{
    Retry =
    {
        MaxRetries = 5, // Maximum number of retries
        Delay = TimeSpan.FromSeconds(2), // Base delay between retries
        Mode = RetryMode.Exponential // Exponential backoff (RetryMode lives in Azure.Core)
    }
};

var blobClient = new BlobClient(connectionString, containerName, blobName, options);

The Azure Blob Storage SDK also supports an automatic chunked upload feature, which can efficiently handle large files by splitting them into smaller parts during the upload process.
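As a rough sketch of the idea behind chunking (the actual block size and thresholds are chosen by the SDK; the 4 MiB figure below is illustrative), the underlying block-count arithmetic is simple ceiling division:

```csharp
using System;

// Number of fixed-size blocks needed to cover a payload (ceiling division),
// the same arithmetic a chunked upload performs when splitting a file.
long BlockCount(long fileSizeBytes, long blockSizeBytes) =>
    (fileSizeBytes + blockSizeBytes - 1) / blockSizeBytes;

const long blockSize = 4 * 1024 * 1024; // 4 MiB blocks (illustrative, not the SDK's choice)
Console.WriteLine(BlockCount(100L * 1024 * 1024, blockSize)); // 25
Console.WriteLine(BlockCount(1, blockSize));                  // 1
```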

3. Conclusion

This article provided a detailed explanation of implementing file uploads to Azure Blob Storage using Azure Functions. Leveraging the HTTP trigger in Azure Functions maximizes the benefits of serverless architecture, enabling scalable and efficient design.

In production, considerations such as file size limits and security settings are important, but Azure Functions’ pay-as-you-go model ensures cost efficiency, charging only for actual usage. This combination is especially suitable for those exploring serverless solutions and Azure Blob Storage for the first time, offering a simple and effective approach.
