Azure Functions is a popular service on the Azure public cloud platform. It provides a serverless, cloud-native, highly available, and scalable runtime. Developers write code in their preferred language, including C# (.NET), PowerShell, Java, Python, and others, and Azure Functions runs that logic as small, readily available code blocks.
These code blocks are called "functions." Different functions run in response to specific events, and each instance of a triggered function runs in its own stand-alone runtime.
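To make the trigger model concrete, here is a minimal sketch of an HTTP-triggered function, assuming the in-process .NET programming model used later in this article; the function name and response text are illustrative only.

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class HelloFunction
{
    // Runs whenever an HTTP GET request arrives at /api/HelloFunction.
    [FunctionName("HelloFunction")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("HelloFunction was triggered.");
        return new OkObjectResult("way cool");
    }
}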
For example, in an e-commerce app, each step in the product-buying cycle triggers its own function, such as "move order to the basket," "run payment transaction," and "ship order." Each function scales independently, and when customers are not placing orders, no functions run and you incur no Azure compute costs.
However, you should not assume every function runs optimally. Complex functions make it easy to overlook opportunities to improve performance. Additionally, depending on the Azure Functions pricing plan you select, you might face delays from the default warm-up time before a function is ready to serve requests.
In this article, we walk you through publishing an existing sample Azure Function that retrieves data from an Azure Cosmos DB database. If you want, you can download the sample code to follow along.
Although Azure Functions only contain the code for a specific task, functions do not always run at full performance.
Best practices to consider when developing functions include choosing the hosting plan and operating system that fit your workload, keeping each function focused on a single short-running task, and filtering data at the source rather than loading full datasets into memory.
The scenario in this example is a typical serverless API architecture: the Azure Function connects to an Azure Cosmos DB, pulls information, and returns it as JSON, which a front-end web application such as Blazor, React, or Vue can then present.
For this demonstration, you need an Azure subscription, Visual Studio 2022 with the Azure Functions Core Tools, and the downloadable sample code.
From the Azure Portal, select New Resource. Search for Azure Cosmos DB and provide the following deployment settings:
Click Review + create to confirm the Azure Cosmos DB deployment. This should only take a few minutes.
Once it’s deployed, navigate to the Cosmos DB resource, open Data Explorer, and select New Database. Enter FAQDB as the Database id (the name of the database) and accept the default throughput of 1000 RU/s; this request-unit value determines the database’s performance.
Fig. 2: Database ID information and throughput
Next, create a new container. Click Data Explorer and select New Container from the top menu:
Fig. 3: Process for creating a new container
Use the existing value, FAQDB, as the Database id. Enter “FAQContainer” as the Container id:
Fig. 4: Container ID and throughput
Finally, let’s create a new item in the database. Select the FAQContainer and click New Item on the menu. Paste the following sample JSON document:
{
"id": "replace_with_new_document_id",
"question": "how cool is Azure?",
"answer": "way cool"
}
In the sample application, the Cosmos DB holds Azure FAQs based on a question-and-answer field. While you can create a few additional questions yourself, the source code folder contains a FAQ-questions-sample.JSON file that you can use.
Note: Don’t copy the complete sample JSON file as a single database item. Copy each entry between braces (“{}”) to a separate item in the database. Since each database item in Cosmos DB is a stand-alone JSON item, copying the full file would result in an error when importing it.
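If you prefer to script the import instead of pasting entries by hand in Data Explorer, a small console sketch along these lines could add each entry as its own item. It assumes the Microsoft.Azure.Cosmos SDK, that the sample file is a JSON array, and that the container is partitioned on /id; the FaqItem record and the environment-variable lookup are hypothetical helpers.

using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using Microsoft.Azure.Cosmos;

// Hypothetical one-off importer: reads FAQ-questions-sample.JSON (assumed to be a
// JSON array) and writes each entry as a separate Cosmos DB item.
var connectionString = Environment.GetEnvironmentVariable("DBConnectionString");
using var client = new CosmosClient(connectionString);
Container container = client.GetContainer("FAQDB", "FAQContainer");

var json = await File.ReadAllTextAsync("FAQ-questions-sample.JSON");
var items = JsonSerializer.Deserialize<List<FaqItem>>(json);

foreach (var item in items)
{
    // Assumes the container is partitioned on /id.
    await container.CreateItemAsync(item, new PartitionKey(item.id));
}

record FaqItem(string id, string question, string answer);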
This completes the setup of the Azure Cosmos DB.
This scenario provides two different Visual Studio 2022 solutions: Function-BE and Function-BE-Optimized.
In the Function-BE source code folder, open the FAQFunctionApp.sln file.
Fig. 6: Contents of the Function-BE source code folder
This opens the solution in Visual Studio 2022.
Fig. 7: Visual Studio 2022 Solution Explorer
Note: Before running the Azure Function, you must make minor updates to the local.settings.json file. First, update DBConnectionString, which points to the Cosmos DB instance created earlier.
Retrieve the connection string from the Azure Cosmos DB resource you deployed earlier: navigate to the resource and select Keys under the Settings section.
Copy the key from the PRIMARY CONNECTION STRING field and provide it as the value for the DBConnectionString parameter.
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "DBConnectionString": "PASTE_PRIMARY_CONNECTION_STRING_HERE"
  }
}
Next, in the AzureFaq.cs file, verify that the database and container names correspond to the names you used when creating them in the Cosmos DB:
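The exact implementation ships with the sample download, but the relevant part of AzureFaq.cs typically looks something like the following sketch, assuming the in-process model and the 3.x Cosmos DB binding extension. The function body and parameter names are illustrative; FAQDB, FAQContainer, and DBConnectionString must match the values you configured.

using System.Collections.Generic;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class AzureFaq
{
    [FunctionName("AzureFAQ")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req,
        // Database and container names must match the resources created earlier;
        // DBConnectionString is the setting added to local.settings.json.
        [CosmosDB(databaseName: "FAQDB",
                  collectionName: "FAQContainer",
                  ConnectionStringSetting = "DBConnectionString")]
            IEnumerable<dynamic> faqItems)
    {
        // With no query or id specified, the binding reads the entire container
        // and the function returns every item as JSON.
        return new OkObjectResult(faqItems);
    }
}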
Fig. 8: Names of the Cosmos DB and container in the AzureFaq.cs file
After making these changes, the Azure Function is ready to run. Start the Run/Debug (F5) process from within Visual Studio. This starts the Azure Function using the local Azure Functions Core Tools and opens the command console.
Fig. 9: Running the FAQFunctionApp
The Azure Function provides two different API routes:
When you trigger the /api/AzureFAQ request using the URL provided by the Azure Functions Core Tools, the function connects to the Cosmos DB and reads the items from the database.
Fig. 10: Confirming the Azure Function can connect to the Cosmos DB
This confirms the Azure Function can connect to the Cosmos DB from your local machine.
To test the performance of the Azure Function code, use the BenchmarkDotNet tool. The Visual Studio solution already includes the necessary source files and BenchmarkDotNet configuration files.
Fig. 11: Solution Explorer showing the BenchmarkDotNet tool
The Benchmarktests.cs file contains the configuration settings of the benchmark test.
The [GlobalSetup] section contains the global parameters the test uses. The test creates an HttpClient to connect to the running Azure Function; its BaseAddress points to the function's URL. You may need to change the port from 7071 if the Azure Functions Core Tools use a different port on your development workstation.
Fig. 12: GlobalSetup section of the Benchmarktests.cs file
The [Benchmark] section contains the details of what the test validates. In short, a for loop triggers ten calls to the /api/azurefaq route and stores each result in the response variable.
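The real Benchmarktests.cs ships with the sample; the following sketch captures roughly what the figures describe: an HttpClient created in [GlobalSetup] and a [Benchmark] method that calls the API ten times. The EnsureSuccessStatusCode call and exact member names are assumptions.

using System;
using System.Net.Http;
using System.Threading.Tasks;
using BenchmarkDotNet.Attributes;

public class Benchmarktests
{
    private HttpClient httpClient;

    [GlobalSetup]
    public void Setup()
    {
        // BaseAddress points at the locally running Azure Function; change the port
        // if the Core Tools chose something other than 7071 on your machine.
        httpClient = new HttpClient { BaseAddress = new Uri("http://localhost:7071") };
    }

    [Benchmark]
    public async Task RunFunction()
    {
        // Calls the /api/azurefaq route ten times per benchmark invocation.
        for (var i = 0; i < 10; i++)
        {
            var response = await httpClient.GetAsync("/api/azurefaq");
            response.EnsureSuccessStatusCode();
        }
    }
}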
Fig. 13: Benchmark section of the Benchmarktests.cs file
You can run the test from a terminal window inside or outside of Visual Studio.
Go to the directory of the sample application:
C:\<samplecodefolder>\Function-BE\Benchmarking.
Run the following command to start the BenchmarkDotNet program:
dotnet run -c Release
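If you are curious what this command actually executes, the benchmark project's entry point is typically a single BenchmarkRunner call along these lines; this is a sketch, not the exact file from the sample.

using BenchmarkDotNet.Running;

public class Program
{
    public static void Main(string[] args)
    {
        // Discovers and runs the benchmarks defined in Benchmarktests.
        BenchmarkRunner.Run<Benchmarktests>();
    }
}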
The output of the test should look similar to the screenshot below:
Fig. 15: Output of the BenchmarkDotNet test
You can find the performance testing information near the end of the log, especially in the summary table. These values might differ on your machine.
// * Summary *
BenchmarkDotNet=v0.13.4, OS=Windows 11 (10.0.22621.1265)
11th Gen Intel Core i7-1185G7 3.00GHz, 1 CPU, 8 logical and 4 physical cores
.NET SDK=7.0.102
[Host] : .NET 6.0.14 (6.0.1423.7309), X64 RyuJIT AVX2
DefaultJob : .NET 6.0.14 (6.0.1423.7309), X64 RyuJIT AVX2
| Method | Mean | Error | StdDev |
|------------ |--------:|---------:|---------:|
| RunFunction | 3.807 s | 0.0514 s | 0.0415 s |
// * Legends *
Mean : Arithmetic mean of all measurements
Error : Half of 99.9% confidence interval
StdDev : Standard deviation of all measurements
1 s : 1 Second (1 sec)
// ***** BenchmarkRunner: End *****
Run time: 00:01:23 (83.84 sec), executed benchmarks: 1
Global total time: 00:01:31 (91.94 sec), executed benchmarks: 1
// * Artifacts cleanup *
The second of the two Visual Studio 2022 solutions, Function-BE-Optimized, contains an optimized version of the same Azure Function.
In the Function-BE-Optimized source code folder, open the FAQFunctionApp.sln file.
Fig. 16: Contents of the Function-BE-Optimized source code folder
This example uses a database query filter to load only a subset of data into memory.
This code change is noticeable in the OptimizedAzureFAQ source file:
Fig. 17: Code changes in the OptimizedAzureFAQ source file
The optimized version updates the Cosmos DB query with the SqlQuery parameter (1), applies a filter (2), and limits the results to a maximum of ten items (3). Outside of this example, adjust these parameters to suit your application’s workload.
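In code terms, the change amounts to adding a SqlQuery to the Cosmos DB input binding, roughly like this sketch; the function name, WHERE clause, and query text are illustrative, and the sample's actual filter may differ.

using System.Collections.Generic;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class OptimizedAzureFaq
{
    [FunctionName("OptimizedAzureFAQ")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req,
        // (1) SqlQuery moves the work into Cosmos DB, (2) the WHERE clause filters,
        // (3) TOP 10 caps the number of returned items.
        [CosmosDB(databaseName: "FAQDB",
                  collectionName: "FAQContainer",
                  ConnectionStringSetting = "DBConnectionString",
                  SqlQuery = "SELECT TOP 10 * FROM c WHERE IS_DEFINED(c.question)")]
            IEnumerable<dynamic> faqItems)
    {
        return new OkObjectResult(faqItems);
    }
}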
Before the optimized function runs successfully, you must make the same changes to its source files as before: update DBConnectionString in local.settings.json and verify the database and container names.
Once these files are updated, run and debug the OptimizedFunction.
Fig. 18: Running the OptimizedFunction
Run the Benchmarktest in the Function-BE-Optimized directory using the same steps as before.
Fig. 19: Running the optimized Benchmarktest
The output from the summary should look similar to this:
// * Summary *
BenchmarkDotNet=v0.13.4, OS=Windows 11 (10.0.22621.1265)
11th Gen Intel Core i7-1185G7 3.00GHz, 1 CPU, 8 logical and 4 physical cores
.NET SDK=7.0.102
[Host] : .NET 6.0.14 (6.0.1423.7309), X64 RyuJIT AVX2
DefaultJob : .NET 6.0.14 (6.0.1423.7309), X64 RyuJIT AVX2
| Method | Mean | Error | StdDev |
|------------ |--------:|---------:|---------:|
| RunFunction | 2.407 s | 0.0258 s | 0.0241 s |
// * Legends *
Mean : Arithmetic mean of all measurements
Error : Half of 99.9% confidence interval
StdDev : Standard deviation of all measurements
1 s : 1 Second (1 sec)
// ***** BenchmarkRunner: End *****
Run time: 00:01:03 (63.84 sec), executed benchmarks: 1
There’s a 25-35% difference between the two function apps; in this run, the mean execution time dropped from 3.807 s to 2.407 s. Integrating the SqlQuery filter is the biggest part of this optimization.
There’s also a 10% performance difference when running the function on a Linux platform instead of Windows.
Both of these results confirm the best practices stated at the start of this article. Adding more data to the Cosmos DB, combined with increasing the (free) RU throughput, may reveal additional performance differences.
With Azure Functions, Microsoft enables serverless cloud architecture. Functions are optimized by design since they typically run a specific short-running task. More complex scenarios, where different functions run back-to-back or in parallel, are handled by Durable Functions. Apart from running function code and responding to API calls, Azure Functions is a perfect back-end solution for interacting with other Azure resources such as Storage Accounts, Cosmos DB, and Service Bus, among others.
Azure Functions’ performance isn’t always a given, as the architecture has many moving parts. Microsoft provides extensive documentation on performance best practices for developing Azure Functions. The most significant gains often come from choosing the pricing plan that fits your scenario; as this sample workload demonstrates, further optimizations are possible from within the Azure Functions code itself.
Developers are responsible for providing the most optimized version of the code and rely on the cloud operations team to configure the runtimes accordingly. By adopting these best practices, your organization can benefit from the cloud’s serverless and microservices architectures while optimizing costs.