Azure Blob Storage, an integral component of Microsoft Azure’s services, stores a diverse range of unstructured data in the cloud. This data comprises text documents, images, multimedia files, and application logs.
Its standout benefits include the ability to reduce latency by storing data closer to access points and improve data retrieval times by using data tiering and indexing. Azure Blob Storage lets you choose hot, cool, cold, or archive tiers to fine-tune your storage costs.
This tutorial will guide you in managing Azure Blob Storage effectively using the .NET software development kit (SDK). The SDK offers several advantages over making direct calls to the Azure Blob Storage REST API, including ease of use since it bypasses direct call intricacies. Its uniform approach ensures consistency, making code maintenance easier. The .NET SDK has built-in error handling, logging, and other functions. Its official and community support also ensures you stay aligned with the latest features and best practices.
In this article, you’ll learn how to leverage the .NET SDK to manage Azure Blob Storage efficiently.
Before you begin, you’ll need an Azure account with an active subscription, Visual Studio with the Azure development workload installed, and basic familiarity with C# and .NET.
Consider a scenario where your organization generates a substantial volume of log files on a legacy server and needs to do the following:
* Regularly upload new log files to Azure Blob Storage
* Trigger further processing each time a new log file is uploaded
* Manage the lifecycle of stored logs by moving older files to cooler tiers and eventually deleting them
This process might look like the following:
Fig. 1: Flow of log files from a server to an uploader, blob container, manager, or processor for additional analytics

To lay the foundation for managing Azure Blob Storage, you first need to allocate and configure your Azure resources. Here's a table outlining the resources you’ll be provisioning and their specific usage:
| Resource | Usage |
|---|---|
| Storage Account | Serves as the foundation for storing blobs and is essential for the Function App operations |
| Blob Container | Acts as a specific storage location (akin to a folder) for organizing your blobs |
| Function App | A serverless compute service that lets you run event-driven code without managing infrastructure |
The Function App needs a storage account for smooth operation. You’ll reuse that account for your blob storage for simplicity.
First, open the Azure portal and click Create a resource > Storage Account. Fill in the details for Subscription, Resource Group, and Storage Account name (note this name). Click Review + create, then Create to provision your storage account.
Go to your storage account in the Azure portal. Click Access Keys in the left pane. Copy key 1 for later use.
Fig. 2: Details of the blob storage access keys

Before uploading any blobs, create a container in that storage account to hold them. Think of containers as folders that organize your blobs.
Go to your Azure Storage account in the Azure Portal. Click Containers in the left pane. Then, click + Container. Next, name the container (for example, “logfiles”). Set the access level to Private so the container can’t be accessed without logging into the Azure Portal. Finally, click Create.
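If you prefer to provision the container from code once you’ve added the Azure.Storage.Blobs package (covered later in this tutorial), the SDK can create it for you. The following is a minimal sketch, using the same account name and key placeholders as the rest of this tutorial and a hypothetical helper method name:

// Minimal sketch (not part of the tutorial's functions): creating the container
// from code instead of the portal. EnsureContainerExistsAsync is a hypothetical helper.
private static async Task EnsureContainerExistsAsync(string accountName, string accountKey)
{
    var blobServiceEndpoint = $"https://{accountName}.blob.core.windows.net";
    var credentials = new StorageSharedKeyCredential(accountName, accountKey);
    var blobServiceClient = new BlobServiceClient(new Uri(blobServiceEndpoint), credentials);
    var containerClient = blobServiceClient.GetBlobContainerClient("logfiles");

    // Creates the container with private access if it doesn't already exist
    await containerClient.CreateIfNotExistsAsync(PublicAccessType.None);
}

Either way, the portal steps above are all you need for this tutorial.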
Before diving into Azure Blob Storage, set up a new Function App. First, open the Azure portal and click Create a resource > Web > Function App. Fill in the necessary details like Subscription, Resource Group, and Function App name (this tutorial uses “blob-sdk-example-function”). For the Runtime stack, select .NET. In the Storage tab, choose the newly created account. Finally, click Review + create to deploy your Function App.
Fig. 3: Options to create a Function App

With all your resources provisioned in Azure, you can create a new Function App project. For simplicity, this tutorial uses Visual Studio.
You’ll create three Azure functions:
* LogsUploader, which uploads log files from a local folder to your blob container
* LogsProcessor, which downloads each newly uploaded blob for further processing
* LogsManager, which manages blob lifecycles by changing access tiers and deleting old blobs
Begin by creating the first function to upload files. Open Visual Studio and go to File > New > Project.
Fig. 4: Azure Functions Additional information screen

Search for and select Azure Functions. Name the project BlobSdkExampleFunction and click Create.
Next, choose the Timer trigger template. Set the Schedule to the CRON expression 0 */1 * * * * so it runs every minute for testing (Azure Functions uses a six-field NCRONTAB format: second, minute, hour, day, month, and day of week). Rename the function and the file containing it to LogsUploader and UploadLogs.cs, respectively.
Then, add the following log output to the file to test that it works:
public class LogsUploader
{
[FunctionName("LogsUploader")]
public void Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
log.LogInformation("Uploading logs…";)
}
}
Now, build and run the project. Log messages should appear in the console logs every minute.
To interact with Azure Blob Storage, add its SDK to your project. Go to the NuGet package manager, search for “Azure.Storage.Blobs,” and install this package. Alternatively, run dotnet add package Azure.Storage.Blobs from the project directory.
Azure Blob Storage supports multiple blob types. Block blobs are ideal for text and binary data, like files, documents, and media. Append blobs are useful for live logging data or situations requiring data appendage. Page blobs, which virtual machines (VMs) typically use, are best suited for frequent read and write operations.
This tutorial focuses on block blobs due to their prevalence in file storage. They also match the default behavior for SDK clients.
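If you ever need the other blob types, the SDK exposes specialized clients for them through the Azure.Storage.Blobs.Specialized namespace. Here’s a minimal sketch, assuming an existing BlobContainerClient and a hypothetical blob name, of appending a line to an append blob; the rest of this tutorial sticks with the default block blob behavior:

// Minimal sketch (not used later in this tutorial): appending data to an append blob.
// Requires: using Azure.Storage.Blobs.Specialized;
// "live.log" is a hypothetical blob name; containerClient is an existing BlobContainerClient.
AppendBlobClient appendBlob = containerClient.GetAppendBlobClient("live.log");
await appendBlob.CreateIfNotExistsAsync();

using var stream = new MemoryStream(Encoding.UTF8.GetBytes("New log line\n"));
await appendBlob.AppendBlockAsync(stream);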
The Azure Blob Storage SDK provides developers with specialized client interfaces, each designed to interact with specific components of the Blob Storage. These clients act as intermediaries, allowing you to integrate, manage, and manipulate your blob data seamlessly.
BlobServiceClient is the primary entry point for developers when working with the Azure Blob Storage account. With this client, you can:
* Retrieve and configure account-level properties
* List the containers in the account
* Create and delete containers
* Obtain BlobContainerClient instances for specific containers
Think of BlobContainerClient as your toolbox for a specific container within your Azure Blob Storage. This client offers functionality to:
* Create or delete the container
* List the blobs it contains
* Read and set container metadata and access policies
* Obtain BlobClient instances for individual blobs
As the name suggests, BlobClient is for managing individual blobs. With the BlobClient, you can:
* Upload and download blob content
* Read and set blob properties and metadata
* Change a blob’s access tier
* Delete the blob or generate SAS URIs for it
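These three clients form a simple hierarchy: the service client hands out container clients, which in turn hand out blob clients. Here’s a minimal sketch of that relationship; the environment variable name AZURE_STORAGE_CONNECTION_STRING is just an example of where a connection string might come from:

// Minimal sketch of the client hierarchy; the environment variable name is a placeholder.
var connectionString = Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING");

var blobServiceClient = new BlobServiceClient(connectionString);             // account level
var containerClient = blobServiceClient.GetBlobContainerClient("logfiles");  // container level
var blobClient = containerClient.GetBlobClient("example.log");               // blob level

In the functions below, you’ll build the same chain using a StorageSharedKeyCredential instead of a connection string.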
First, update your LogsUploader function by adding the following code. Include your storage account name and the access key you noted earlier:
[FunctionName("LogsUploader")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
const string accountName = "<Your storage account name>";
const string accountKey = "<Your Account Key Here>";
const string blobServiceEndpoint = $"https://{accountName}.blob.core.windows.net";
var credentials = new StorageSharedKeyCredential(accountName, accountKey);
var blobServiceClient = new BlobServiceClient(new Uri(blobServiceEndpoint), credentials);
var containerClient = blobServiceClient.GetBlobContainerClient("logfiles");
//…
}
Place files in a folder on your computer’s local file system to serve as your fictional log files. Copy this folder’s path into a new variable in your LogsUploader function. For example:
var localFolderPath = @"C:\temp\logs";
When uploading a blob, you can set a blob name different from the file you’re uploading. For this example, keep the same name as the local file.
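If you did want a different name, you would simply pass it when getting the blob client. As a quick, optional sketch, a date prefix creates virtual “folders” in the container (this isn’t used in the rest of the tutorial):

// Optional sketch: a date-prefixed blob name groups blobs into virtual "folders" in the portal.
// filePath refers to the local file being uploaded, as in the loop added in the next step.
var blobName = $"{DateTime.UtcNow:yyyy-MM-dd}/{Path.GetFileName(filePath)}";
var blobClient = containerClient.GetBlobClient(blobName);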
Next, update your function to iterate over all the files in your folder. Put it below the code you added for initializing the BlobServiceClient and BlobContainerClient. For each file, you will create a new BlobClient, upload the file to your container as a blob, and delete the file from the local disk.
[FunctionName("LogsUploader")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
//…
const string localFolderPath = @"<Your local log files path>";
foreach (var filePath in Directory.GetFiles(localFolderPath))
{
var blobName = Path.GetFileName(filePath);
var blobClient = containerClient.GetBlobClient(blobName);
await using (var uploadFileStream = new FileStream(filePath, FileMode.Open))
{
await blobClient.UploadAsync(uploadFileStream, true);
}
File.Delete(filePath);
log.LogInformation($"Uploaded and deleted local file: {filePath}");
}
}
Now, build and run your Function App. It will check for new files every minute, upload any it finds, and remove the local copies.
Inspect the container in the Azure portal to confirm the function worked as expected. Or, list the contents using the .NET SDK, which you’ll do next.
Azure Blob Storage provides multiple methods to list and retrieve blobs. Here, you’ll focus on listing all blobs in a container and retrieving a specific blob by name.
To list all blobs in a container, use the GetBlobsAsync method from the BlobContainerClient class. Update your function to output all the blobs to the logging process after the upload:
[FunctionName("LogsUploader")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
//…
await foreach (BlobItem blobItem in containerClient.GetBlobsAsync()) {
log.LogInformation($"Blob name: {blobItem.Name}");
}
}
When you run the app now, you should get a log output in the console, showing all files currently in your blob container.
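GetBlobsAsync can also filter on the server side by name prefix. As a minimal sketch, assuming you had used date-prefixed blob names like the optional example earlier, you could list just one day’s logs (the prefix value here is hypothetical):

// Minimal sketch: list only blobs whose names begin with a given prefix.
await foreach (BlobItem blobItem in containerClient.GetBlobsAsync(prefix: "2024-01-15/"))
{
    log.LogInformation($"Blob name: {blobItem.Name}");
}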
This tutorial’s fictional scenario requires triggering another function each time a new blob is uploaded, then downloading it for additional processing elsewhere. Use the GetBlobClient method from the BlobContainerClient class and specify the blob’s name.
You’ll need a new function to do this. Create a new file named LogsProcessor.cs in your project and paste the following code. Remember to add your storage account name and key:
public static class LogsProcessor
{
[FunctionName("LogsProcessor")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
ILogger log)
{
var requestBody = await new StreamReader(req.Body).ReadToEndAsync();
var data = JsonConvert.DeserializeObject<dynamic>(requestBody);
string blobName = data?.BlobName;
const string accountName = "<Your storage account name>";
const string accountKey = "<Your Account Key Here>";
const string blobServiceEndpoint = $"https://{accountName}.blob.core.windows.net";
var credentials = new StorageSharedKeyCredential(accountName, accountKey);
var blobServiceClient = new BlobServiceClient(new Uri(blobServiceEndpoint), credentials);
var containerClient = blobServiceClient.GetBlobContainerClient("logfiles");
var blobClient = containerClient.GetBlobClient(blobName);
const string localFolderPath = @"<Your local downloaded log files path>";
var downloadFilePath = Path.Combine(localFolderPath, blobName);
BlobDownloadInfo download = await blobClient.DownloadAsync();
await using FileStream fs = File.OpenWrite(downloadFilePath);
await download.Content.CopyToAsync(fs);
fs.Close();
log.LogInformation($"Downloaded blob to: {downloadFilePath}");
return new OkObjectResult($"Downloaded blob to: {downloadFilePath}");
}
}
In production, you’d likely ship these logs to some other analytics service. However, for demonstration purposes, you’ll download them locally here. Create a new folder to store downloaded logs and set the path for the variable localFolderPath.
Add code to the LogsUploader function to trigger this processor. In the Run function, add the following call inside the foreach loop, right after each file is uploaded:
[FunctionName("LogsUploader")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
//…
await TriggerProcessorFunction(blobName);
}
Then, add the new method to the LogsUploader class:
private async Task TriggerProcessorFunction(string blobName)
{
var client = new HttpClient();
var payload = new { BlobName = blobName };
var content = new StringContent(JsonConvert.SerializeObject(payload), Encoding.UTF8, "application/json");
// Check the port being used when running and adjust if necessary
await client.PostAsync("http://localhost:7071/api/LogsProcessor", content);
}
Important: Confirm the port where your LogsProcessor runs.
Build and rerun your Function App project, then place a new file in your original log folder. The LogsUploader function moves the file to your blob container as before, then triggers LogsProcessor to download a copy.
Now, create another function scheduled to run every minute. The setup will be the same as earlier, and you’ll need a BlobContainerClient again to interact with your container.
For this tutorial, change the access tier of blobs older than 3 minutes to “cool” using the SetAccessTierAsync method with the BlobClient. Next, delete any blobs older than 6 minutes using the DeleteIfExistsAsync method.
Bring it all together with the following code:
public class LogsManager
{
[FunctionName("LogsManager")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
const string accountName = "<Your storage account name>";
const string accountKey = "<Your Account Key Here>";
const string blobServiceEndpoint = $"https://{accountName}.blob.core.windows.net";
var credentials = new StorageSharedKeyCredential(accountName, accountKey);
var blobServiceClient = new BlobServiceClient(new Uri(blobServiceEndpoint), credentials);
var containerClient = blobServiceClient.GetBlobContainerClient("logfiles");
var currentTime = DateTimeOffset.UtcNow;
await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
{
BlobClient blobClient = containerClient.GetBlobClient(blobItem.Name);
BlobProperties properties = await blobClient.GetPropertiesAsync();
var ageInMinutes = (currentTime - properties.CreatedOn).TotalMinutes;
switch (ageInMinutes)
{
case >= 6:
await blobClient.DeleteIfExistsAsync();
log.LogInformation($"Deleted blob: {blobItem.Name}");
break;
case >= 3 when properties.AccessTier == AccessTier.Hot:
await blobClient.SetAccessTierAsync(AccessTier.Cool);
log.LogInformation($"Moved blob to cool storage: {blobItem.Name}");
break;
}
}
}
}
By following these steps, you can effectively manage your blobs’ lifecycles, optimizing storage costs and improving data retrieval times. The screenshot below shows some log files and their access tiers:
Fig. 5: Log files in a blob

Now, build and run the application. As you pass files through, you can check the access tier updates in the Azure portal.
When working with Azure Blob Storage, ensuring secure access to your data is paramount. Shared access signatures (SAS) offer one of the most flexible ways to secure your blobs.
A SAS Uniform Resource Identifier (URI) grants clients limited access to Azure Storage without giving out account keys. It’s great for temporary access or when third-party clients aren’t fully trustworthy.
Imagine that the LogsProcessor function represents a third-party service. It only needs brief access to copy each new blob, so you don’t want to give it full access to the storage account.
One solution is to have your LogsUploader function generate a SAS token scoped to each blob and limited to one hour. You can then build a complete download URI for the blob, including this token, and send it to LogsProcessor.
First, add the following method to your LogsUploader class:
private static Uri GenerateBlobSasToken(BlobClient blobClient)
{
var sasBuilder = new BlobSasBuilder
{
BlobContainerName = blobClient.BlobContainerName,
BlobName = blobClient.Name,
Resource = "b",
StartsOn = DateTimeOffset.UtcNow,
ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
};
sasBuilder.SetPermissions(BlobSasPermissions.Read);
return blobClient.GenerateSasUri(sasBuilder);
}
The function accepts a BlobClient instance representing the SAS token’s target blob. The Resource is set to b, indicating that the SAS applies to a single blob (a value of c would scope it to the whole container).
Since you’re now sending a complete URI to the processor function, modify your trigger method:
private async Task TriggerProcessorFunction(Uri blobUriWithSas)
{
var client = new HttpClient();
var payload = new { BlobUriWithSas = blobUriWithSas.ToString() };
var content = new StringContent(JsonConvert.SerializeObject(payload), Encoding.UTF8, "application/json");
// Check the port being used when running and adjust if necessary
await client.PostAsync("http://localhost:7071/api/LogsProcessor", content);
}
Next, update the code at the end of your LogsUploader Run function:
[FunctionName("LogsUploader")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
//…
var blobUriWithSas = GenerateBlobSasToken(blobClient);
await TriggerProcessorFunction(blobUriWithSas);
}
The LogsProcessor no longer needs to authenticate with a key for the whole storage account. When triggered, it receives a URI that points only to the triggering blob and includes a SAS token granting access to that blob alone. So, update your LogsProcessor code to use this URI:
[FunctionName("LogsProcessor")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
ILogger log)
{
var requestBody = await new StreamReader(req.Body).ReadToEndAsync();
var data = JsonConvert.DeserializeObject<dynamic>(requestBody);
string blobUriWithSas = data?.BlobUriWithSas;
const string localFolderPath = @"<Your local downloaded log files path";
var blobClient = new BlobClient(new Uri(blobUriWithSas));
var blobName = blobClient.Name;
var downloadFilePath = Path.Combine(localFolderPath, blobName);
BlobDownloadInfo download = await blobClient.DownloadAsync();
await using FileStream fs = File.OpenWrite(downloadFilePath);
await download.Content.CopyToAsync(fs);
fs.Close();
log.LogInformation($"Downloaded blob to: {downloadFilePath}");
return new OkObjectResult($"Downloaded blob to: {downloadFilePath}");
}
You’re no longer sharing full storage account access with your fictional third-party processing service.
This tutorial demonstrated managing Azure Blob Storage using the .NET SDK. From integrating the SDK into your applications to uploading, retrieving, modifying, and securing your blobs, you explored a broad spectrum of operations essential for any developer working with Azure Blob Storage.
The synergy of .NET and Azure Blob Storage provides a robust platform for building scalable and secure cloud storage solutions. Understanding and effectively managing your cloud storage resources is key to building efficient and secure applications.
Keep experimenting with the Azure Blob Storage SDK and exploring its myriad features to optimize your storage management strategies. And to ensure you’re using your Azure Blob Storage resources effectively, try Site24x7’s Azure Monitoring tool. Its capabilities, including tracking ingress/egress volumes, checking blob capacity, and counting containers, help you optimize your Azure Storage resource consumption, while its IT automation, reports, and alerts features help you stay in the know about your Azure Storage services.