Fixing performance issues in Azure Functions

Azure Functions is a popular service on the Azure public cloud platform. It provides a serverless, cloud-native, highly available, and scalable runtime. Users can write code in their preferred development language, including C# and other .NET languages, PowerShell, Java, Python, and others. Azure Functions then packages the system's logic into readily available code blocks.

These code blocks are called "functions." Different functions run in response to critical events, and each instance of a triggered function runs in a stand-alone runtime.
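As a minimal sketch, a function is just annotated code tied to a trigger. The following HTTP-triggered function uses the in-process .NET model; the function name, route, and logic are illustrative and not part of the sample solution:

```csharp
// Illustrative HTTP-triggered function using the in-process .NET model.
// The function name and logic are placeholders, not part of the sample solution.
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class MoveOrderToBasket
{
    // Runs each time an HTTP GET or POST reaches /api/MoveOrderToBasket.
    [FunctionName("MoveOrderToBasket")]
    public static Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Order moved to basket.");
        return Task.FromResult<IActionResult>(new OkObjectResult("order accepted"));
    }
}
```

The [FunctionName] attribute registers the function with the runtime, and the HttpTrigger binding defines the event that starts it.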

For example, in an e-commerce app, you would trigger a separate function for each step of the product-buying cycle, such as "move order to the basket," "run payment transaction," and "shipment." Multiple instances of each function scale separately. If customers are not placing orders, you incur no Azure compute costs because no functions are running.

However, you should not assume all functions run optimally. The complexity of functions may result in overlooking opportunities to improve performance. Additionally, depending on the Azure Functions pricing options you select, you might face delays due to the default warm-up time before a function is ready to use.

In this article, we walk you through the process of publishing an existing sample Azure Function that retrieves data from an Azure Cosmos DB database. If you want, you can download the sample code to follow along.

Optimizing Azure Functions

Although each Azure Function contains only the code for a specific task, functions do not always run at full performance.

Best practices to consider when developing functions include:

  • Use Durable Functions if a stateful scenario is vital to your workload.
  • Deploy the storage for your functions in the same Azure region as your app.
  • Match each function to the pricing plan that fits its workload.
  • Make all layers of the application reliable.
  • Integrate function chains (a sequence of functions) with a messaging or queueing service such as Azure Storage Queue, Service Bus, or others.
  • Do not load all results into memory when connecting to a database endpoint such as Cosmos DB or Storage Tables.
  • Use queries that filter data (for example, by id) so the function retrieves only what it needs.
  • Optimize the performance of the data endpoint to avoid possible bottlenecks.
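As a sketch of the function-chaining practice above, a queue-triggered function can consume a message from one queue and hand its result to the next stage through an output binding. This assumes the in-process WebJobs model; the queue names and connection setting are illustrative:

```csharp
// Sketch: chaining functions through Azure Storage Queues.
// Queue names ("orders", "payments") and the connection setting are illustrative.
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class OrderChain
{
    // Triggered whenever a message lands on the "orders" queue.
    [FunctionName("RunPaymentTransaction")]
    public static void Run(
        [QueueTrigger("orders", Connection = "AzureWebJobsStorage")] string orderMessage,
        // Output binding: drop the result on the next queue in the chain.
        [Queue("payments", Connection = "AzureWebJobsStorage")] out string paymentMessage,
        ILogger log)
    {
        log.LogInformation($"Processing order: {orderMessage}");
        paymentMessage = orderMessage; // hand off to the shipment stage
    }
}
```

Because each stage is decoupled by a queue, every function in the chain can scale and fail independently.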

The sample workload scenario

The scenario in this example is a typical serverless API architecture. The Azure Function connects to an Azure Cosmos DB instance, pulls information, and returns it as JSON. A front-end web application such as Blazor, React, or Vue, among others, can then present this data.

Prerequisites

For this demonstration, you need the following:

  • An Azure subscription with contributor permissions to deploy Azure Functions.
  • An Azure Cosmos DB.
  • Visual Studio 2022 (any edition works, including the free Community Edition) with Azure Functions tools enabled.
  • Basic knowledge of Azure services such as Azure Functions and Azure Cosmos DB.
  • Basic understanding of NuGet package integration in .NET applications.
  • Basic knowledge of how to run the dotnet command line.

Deploy Azure Cosmos DB

From the Azure Portal, select New Resource. Search for Azure Cosmos DB and provide the following deployment settings:

  • API: Azure Cosmos DB for NoSQL
  • Subscription: Select your Azure Subscription
  • Resource Group: Create a new Resource Group or select an existing one
  • Account Name: Unique name for your Cosmos DB account, for example, “FuncBenchDB”
  • Location: Azure Region of choice
  • Capacity Mode: Provisioned Throughput (the first 1000 RU/s are free)
Fig. 1: Deployment settings for an Azure Cosmos DB

Click Review + create to confirm the Azure Cosmos DB deployment. This should only take a few minutes.

Once it’s deployed, navigate to the Cosmos DB resource. Then, navigate to Data Explorer and select New Database. Enter FAQDB as the Database id. Accept the default value of 1000 RU/s, which reflects the database’s provisioned performance.

Fig. 2: Database ID information and throughput

Next, create a new container. Click Data Explorer and select New Container from the top menu:

Fig. 3: Process for creating a new container

Use the existing value — FAQDB — as the Database id. Enter “FAQContainer” as the Container id:

Fig. 4: Container ID and throughput

Finally, let’s create a new item in the database. Select the FAQContainer and click New Item on the menu. Paste the following sample JSON document:

{
  "id": "replace_with_new_document_id",
  "question": "how cool is Azure?",
  "answer": "way cool"
}
Fig. 5: New item in the database

In the sample application, the Cosmos DB holds Azure FAQs as question-and-answer pairs. While you can create a few additional questions yourself, the source code folder contains a FAQ-questions-sample.JSON file that you can use.

Note: Don’t copy the complete sample JSON file as a single database item. Copy each entry between braces (“{}”) to a separate item in the database. Since each database item in Cosmos DB is a stand-alone JSON item, copying the full file would result in an error when importing it.
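If you prefer to script the split instead of copying entries by hand, a small console program can print each array element as a stand-alone document. This is a sketch that assumes the sample file is a JSON array located in the working directory:

```csharp
// Sketch: print each element of the sample JSON array as a stand-alone
// document, ready to paste into Cosmos DB one item at a time.
// Assumes FAQ-questions-sample.JSON is a JSON array in the working directory.
using System;
using System.IO;
using System.Text.Json;

class SplitFaqItems
{
    static void Main()
    {
        string json = File.ReadAllText("FAQ-questions-sample.JSON");
        using JsonDocument doc = JsonDocument.Parse(json);

        // Cosmos DB expects one JSON object per item, so emit them individually.
        foreach (JsonElement item in doc.RootElement.EnumerateArray())
            Console.WriteLine(item.GetRawText());
    }
}
```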

This completes the setup of the Azure Cosmos DB.

Load the Visual Studio 2022 solution

This scenario provides two different Visual Studio 2022 solutions: Function-BE and Function-BE-Optimized.

In the Function-BE source code folder, open the FAQFunctionApp.sln file.

Fig. 6: Contents of the Function-BE source code folder

This opens the solution in Visual Studio 2022.

Fig. 7: Visual Studio 2022 Solution Explorer

Note: Before running the Azure Function, you must make minor updates to the local.settings.json file. First, update DBConnectionString, which points to the Cosmos DB instance created earlier.

Retrieve the connection string by navigating to the Azure Cosmos DB resource you deployed earlier and selecting Keys within the Settings section.

Copy the key from the PRIMARY CONNECTION STRING field and provide it as the value for the DBConnectionString parameter.

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "DBConnectionString": "PASTE_PRIMARY_CONNECTION_STRING_HERE"
  }
}

Next, in the AzureFaq.cs file, verify that the names of the Cosmos DB database and container match the names you used when creating them:

Fig. 8: Names of the Cosmos DB and container in the AzureFaq.cs file

Run the Azure Function

After making these changes, the Azure Function is ready to run. Start the Run/Debug (F5) process from within Visual Studio. This starts the Azure Function using the local Azure Functions Core Tools and opens the command console.

Fig. 9: Running the FAQFunctionApp

The Azure Function provides two different API routes:

  • /api/AzureFAQ, which retrieves Cosmos DB items.
  • /api/CreateAzureFAQ, which adds new items to the Cosmos DB. However, for this sample, it is easier to paste items from the provided sample JSON file into the Cosmos DB editor in the Azure Portal.

When you trigger the /api/AzureFAQ API request using the URL provided by the Azure Functions tools, the function connects to the Cosmos DB and reads out the different items in the database.

Fig. 10: Confirming the Azure Function can connect to the Cosmos DB

This confirms the Azure Function can connect to the Cosmos DB from your local machine.

Run the BenchmarkDotNet test

To test the performance of our Azure Function code, use the BenchmarkDotNet tool. The Visual Studio solution source code includes the necessary source files and BenchmarkDotNet config files.

Fig. 11: Solution Explorer showing the BenchmarkDotNet tool

The Benchmarktests.cs file contains the defined configuration settings of the benchmark test.

GlobalSetup

The [GlobalSetup] section contains the global parameters the test uses. The test creates an httpClient to connect to the running Azure Function; its BaseAddress points to the URL of the Azure Function.

You may have to change the port (7071) if the Azure Function uses a different port on your development workstation.

Fig. 12: GlobalSetup section of the Benchmarktests.cs file

Benchmark

The [Benchmark] section contains the details of what the test validates. In short, the for loop issues ten calls to the /api/AzureFAQ endpoint and stores each result in the response variable.

Fig. 13: Benchmark section of the Benchmarktests.cs file
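The shape of Benchmarktests.cs can be sketched as follows; the port, route, and member names mirror the description above but are otherwise illustrative rather than a copy of the sample file:

```csharp
// Sketch of the benchmark shape: [GlobalSetup] builds the HttpClient,
// [Benchmark] fires ten calls at the local function endpoint.
// Port 7071 and the /api/AzureFAQ route are the defaults used in the article.
using System;
using System.Net.Http;
using System.Threading.Tasks;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class BenchmarkTests
{
    private HttpClient _httpClient;

    [GlobalSetup]
    public void Setup() =>
        _httpClient = new HttpClient { BaseAddress = new Uri("http://localhost:7071") };

    [Benchmark]
    public async Task RunFunction()
    {
        for (int i = 0; i < 10; i++)
        {
            // Each iteration triggers the function and checks the response.
            HttpResponseMessage response = await _httpClient.GetAsync("/api/AzureFAQ");
            response.EnsureSuccessStatusCode();
        }
    }
}

public class Program
{
    public static void Main() => BenchmarkRunner.Run<BenchmarkTests>();
}
```

BenchmarkDotNet runs the [Benchmark] method many times and reports the mean, error, and standard deviation, which is what the summary table below shows.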

You can run the test from a terminal window in or outside of Visual Studio.

Go to the directory of the sample application:

C:\<samplecodefolder>\Function-BE\Benchmarking.

Fig. 14: Contents of the sample application’s directory

Run the following command to start the BenchmarkDotNet program:

dotnet run -c Release

The output of the test should look similar to the screenshot below:

Fig. 15: Output of the BenchmarkDotNet test

You can find the performance results near the end of the log, in the summary table. These values might differ on your machine.

 // * Summary *  

BenchmarkDotNet=v0.13.4, OS=Windows 11 (10.0.22621.1265)
11th Gen Intel Core i7-1185G7 3.00GHz, 1 CPU, 8 logical and 4 physical cores
.NET SDK=7.0.102
[Host] : .NET 6.0.14 (6.0.1423.7309), X64 RyuJIT AVX2
DefaultJob : .NET 6.0.14 (6.0.1423.7309), X64 RyuJIT AVX2


| Method | Mean | Error | StdDev |
|------------ |--------:|---------:|---------:|
| RunFunction | 3.807 s | 0.0514 s | 0.0415 s |

// * Legends *
Mean : Arithmetic mean of all measurements
Error : Half of 99.9% confidence interval
StdDev : Standard deviation of all measurements
1 s : 1 Second (1 sec)

// ***** BenchmarkRunner: End *****
Run time: 00:01:23 (83.84 sec), executed benchmarks: 1

Global total time: 00:01:31 (91.94 sec), executed benchmarks: 1
// * Artifacts cleanup *

Load the Optimized Visual Studio 2022 solution


In the Function-BE-Optimized source code folder, open the FAQFunctionApp.sln file.

Fig. 16: Contents of the Function-BE-Optimized source code folder

What was changed?

This example uses a database query filter to load only a subset of the data into memory.

This code change is noticeable in the OptimizedAzureFAQ source file:

Fig. 17: Code changes in the OptimizedAzureFAQ source file

The optimized code queries the Cosmos DB dataset with the SqlQuery parameter (1), applies a filter (2), and limits the results to a maximum of ten (3). Outside of this example, adjust these parameters to suit your application’s workload.
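A sketch of those three changes, assuming the Microsoft.Azure.Cosmos SDK; the exact filter field in the sample may differ, so the WHERE clause here is illustrative:

```csharp
// Sketch of the optimization: (1) a SQL query instead of reading the full
// container, (2) a filter clause, (3) a cap of ten results.
// Assumes the Microsoft.Azure.Cosmos NuGet package; the filter is illustrative.
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class OptimizedFaqQuery
{
    public static async Task<List<dynamic>> GetFilteredFaqsAsync(Container container)
    {
        // (1) SqlQuery + (2) filter + (3) TOP 10 result cap
        var query = new QueryDefinition(
            "SELECT TOP 10 * FROM c WHERE IS_DEFINED(c.question)");

        var items = new List<dynamic>();
        using FeedIterator<dynamic> feed = container.GetItemQueryIterator<dynamic>(query);
        while (feed.HasMoreResults)
        {
            // Pages are fetched lazily; only one page is held in memory at a time.
            foreach (var item in await feed.ReadNextAsync())
                items.Add(item);
        }
        return items;
    }
}
```

Pushing the filter and the result cap into the query means Cosmos DB does the reduction server-side, instead of the function loading every item into memory.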

You must make some changes to the Azure Function source files before the function runs successfully:

  • Cosmos DB settings for Container and Database Name (lines 19 and 20 in the OptimizedAzureFAQ.cs file)
  • Cosmos DB Connection String (in the local.settings.json)
  • BaseAddress URL in the benchmark test (line 20 in the Benchmarktests.cs file)

Once these files are updated, run and debug the OptimizedFunction.

Fig. 18: Running the OptimizedFunction

Run the optimized Benchmarktest

Run the Benchmarktest in the Function-BE-Optimized directory using the same steps as before.

Fig. 19: Running the optimized Benchmarktest

The output from the summary should look similar to this:

 // * Summary *  

BenchmarkDotNet=v0.13.4, OS=Windows 11 (10.0.22621.1265)
11th Gen Intel Core i7-1185G7 3.00GHz, 1 CPU, 8 logical and 4 physical cores
.NET SDK=7.0.102
[Host] : .NET 6.0.14 (6.0.1423.7309), X64 RyuJIT AVX2
DefaultJob : .NET 6.0.14 (6.0.1423.7309), X64 RyuJIT AVX2


| Method | Mean | Error | StdDev |
|------------ |--------:|---------:|---------:|
| RunFunction | 2.407 s | 0.0258 s | 0.0241 s |

// * Legends *
Mean : Arithmetic mean of all measurements
Error : Half of 99.9% confidence interval
StdDev : Standard deviation of all measurements
1 s : 1 Second (1 sec)

// ***** BenchmarkRunner: End *****
Run time: 00:01:03 (63.84 sec), executed benchmarks: 1

Interpretations

There’s a 25-35% difference between the two function apps. In this example, integrating the SqlQuery filter accounts for most of this optimization.

There’s also a 10% performance difference when running the function on a Linux platform instead of Windows.

Both these results confirm the best practices stated at the start of this article. Adding more data to the Cosmos DB, combined with increasing the (free) RU amount, may yield additional performance differences.

Conclusion

With Azure Functions, Microsoft enables serverless cloud architecture. Functions are optimized by design since they typically run a specific short-running task. More complex scenarios, where different functions run back-to-back or in parallel, are handled by Durable Functions. Apart from running function code and triggering API calls, Azure Functions is a perfect back-end solution for interacting with other Azure resources such as Storage Accounts, Cosmos DB, and Service Bus, among others.

Azure Functions’ performance isn’t always a given, as the architecture has so many moving parts. Microsoft provides extensive documentation on performance best practices when developing Azure Functions. The most significant benefit could be defining the correct pricing plan that fits your scenario. As demonstrated in this sample workload scenario, the subsequent optimizations are possible from within the Azure Functions code itself.

Developers are responsible for providing the most optimized version of the code and rely on the cloud operations team to configure the same for the actual runtimes. By adopting these best practices, your organization can benefit from the cloud’s serverless and microservices architectures while optimizing costs.
