Fixing the top 4 .NET performance bottlenecks

.NET applications have many intricacies that are challenging to monitor, like memory stacks and heaps, a garbage collector, and multithreading. Improperly monitoring and optimizing these components can result in unnecessary consumption of resources, delayed processing, and application downtime, among other performance issues.

Understanding and resolving your .NET application’s issues helps improve performance. This hands-on article explores four of the most common problems causing .NET application performance bottlenecks and the best practices for fixing them.

How to fix the top 4 .NET performance bottlenecks

Prerequisites

To follow this tutorial, ensure you have the following:

  • Visual Studio 2022 with the .NET desktop development workload installed. The workload includes all the required components.
  • Some knowledge of C# and .NET development
  • A monitoring tool: either dotMemory or .NET Memory Profiler

1. Memory leaks

Memory leaks occur when a program improperly manages memory resources. If the program doesn’t release its allocated memory when it’s no longer needed, other processes can’t use that memory, potentially causing poor performance and outages.

.NET applications can still leak memory even though they rely on the .NET garbage collector. The garbage collector runs in the background and periodically frees memory by clearing out objects the application is no longer using. However, it doesn’t automatically clean up file streams, network connections, database connections, or other unmanaged resources. You must dispose of these resources yourself to free the memory they hold.

Profiling tools like dotMemory and .NET Memory Profiler help identify these memory leaks. For example, the following image shows dotMemory breaking down memory use on a small console application that reads a file from the hard drive.

Fig. 1: Screenshot of dotMemory while profiling

dotMemory’s main screen displays a timeline of the application’s memory usage. On the right, there’s a breakdown of the memory heaps. In this example, most of the application’s memory is unmanaged, indicating a possible memory leak.

Armed with this information, you can find ways to dispose of unmanaged resources. For example, you can handle them through the IDisposable interface and call its Dispose method to release them once they’re no longer needed.

The code snippet below shows the StreamReader class reading a file from your local drive without invoking the Dispose method.

 
StreamReader reader = new StreamReader(@"C:\Temp\Examples\ExampleFile.txt");

var textFromFile = reader.ReadToEnd();
Console.WriteLine(textFromFile);

Since the code doesn’t call the Dispose method, the file handle held by the StreamReader stays open, potentially causing memory leaks because the garbage collector doesn’t automatically clear out the allocated resource. To avoid this, use the Dispose method to close and dispose of the StreamReader resource. StreamReader implements the IDisposable interface, so you can call the Dispose method to close and release the resource using this code:

 
StreamReader reader = new StreamReader(@"C:\Temp\Examples\ExampleFile.txt");
var textFromFile = reader.ReadToEnd();

reader.Dispose();
Console.WriteLine(textFromFile);

An alternative way to release the resource is by wrapping it in the using block, like in the following code. Doing so automatically closes and disposes of the resource at the end of its scope.

 
var textFromFile = string.Empty;
using (StreamReader reader = new StreamReader(@"C:\Temp\Examples\ExampleFile.txt"))
{
    textFromFile = reader.ReadToEnd();
}

Add the necessary code inside the using block; the processing occurs between the using statement’s curly brackets. This approach ensures that unmanaged resources remain open only as long as needed.
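If your project’s language version supports C# 8 or later (an assumption, since the .NET Framework templates used later in this article may default to an older version), a using declaration achieves the same automatic disposal without the extra block nesting. A minimal sketch:

// Requires C# 8 or later. The using declaration disposes of the reader
// automatically when it goes out of scope at the end of the enclosing method or block.
using StreamReader reader = new StreamReader(@"C:\Temp\Examples\ExampleFile.txt");
var textFromFile = reader.ReadToEnd();
Console.WriteLine(textFromFile);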

Unfortunately, you don’t know when the garbage collector will make its next run and free the memory tied to your unmanaged resources. However, you can wrap your unmanaged resources in a custom class that implements the IDisposable interface, releasing them deterministically and letting the garbage collector know there’s nothing left for it to clean up:

 
public class DisposableClass : IDisposable
{
    bool disposed;
    private StreamReader _reader;
    string _fileText;

    public DisposableClass(string pathToFile)
    {
        _reader = new StreamReader(pathToFile);
    }

    public string ReadFile()
    {
        _fileText = _reader.ReadToEnd();
        return _fileText;
    }

    public void Dispose()
    {
        Dispose(disposing: true);
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposed)
            return;

        if (disposing)
        {
            // Clear managed resources
            _fileText = null;
        }

        // Clear unmanaged resources
        _reader.Close();
        _reader.Dispose();

        disposed = true;
    }

    ~DisposableClass()
    {
        Dispose(false);
    }
}

The code has two Dispose functions. The first, the public parameterless Dispose, calls the second to do the actual cleanup. You can’t override the public function, but derived classes can override the protected virtual one.

The second Dispose function clears the managed and unmanaged resources. It checks the class-level disposed flag so the cleanup only runs once, and it uses the disposing parameter to decide how much to clean up: when disposing is true, the call came from your code, so the method clears the managed resources as well as the unmanaged ones.

If the user forgets to call the Dispose function, the finalizer acts as a safety net. A finalizer is a special method, declared with the ~ClassName syntax, that the garbage collector calls before reclaiming an object, giving the object a chance to free resources and perform other cleanup tasks. On a later run, the garbage collector calls the finalizer automatically, and the finalizer calls your virtual Dispose function with the disposing parameter set to false. In that path, you only need to release unmanaged resources, since the garbage collector handles the managed ones.

If you call the public Dispose function yourself, the garbage collector no longer needs to run the finalizer. That’s why the public Dispose function calls GC.SuppressFinalize(this).

Next, use the custom class you made. You dispose of your resources by calling its Dispose function:

 
string text = string.Empty;
DisposableClass customDisposableClass = new DisposableClass(/*pathToFile*/);
text = customDisposableClass.ReadFile();
customDisposableClass.Dispose();

Implementing the finalizer clears your resources even if you forget to call the Dispose function. Otherwise, use the using statement, which automatically triggers the Dispose function at the end of its lifetime:

 
using (DisposableClass customDisposableClassUsing = new DisposableClass(/*pathToFile*/))
{
    text = customDisposableClassUsing.ReadFile();
}

You can reduce memory leaks and resource consumption in your application by disposing of your resources, implementing the IDisposable interface, and using monitoring tools.

2. CPU-bound code

CPU-bound code is code whose speed is limited by the processor: the CPU has to work through every instruction before the program can move on, so complex work runs slowly unless you have a more powerful CPU. In .NET applications, long-running operations such as processing large files read from external storage or updating real-time, network-dependent processes can be slow and tie up the main thread.

One way to resolve this slowness is to run code for such processes on a separate thread instead of the main one. Multithreading (when the CPU executes more than one thread concurrently) can do this.

Add multithreading to your .NET applications with the Task Parallel Library (TPL). Do that by referencing the System.Threading.Tasks namespace in your class and using async Task methods with the await operator.

To explore this method, create a simple new Console App (.NET Framework) in Visual Studio. Replace the autogenerated class code with this:

 
internal class Program
{
    static void Main(string[] args)
    {
        Console.Title = "Async Methods";

        SomeBackgroundTask();
        MainThreadTask();

        Console.ReadLine();
    }

    static void MainThreadTask()
    {
        Console.WriteLine("MAIN THREAD TASK");
    }

    static async Task SomeBackgroundTask()
    {
        Console.WriteLine("Starting background task");
        await Task.Run(() =>
        {
            Random rndObj = new Random(32);

            Console.WriteLine("Transferring backup data");

            for (int i = 1; i <= 500; ++i)
            {
                Console.WriteLine("*********************");
                Console.WriteLine($"Server ID: {i * rndObj.Next(1000000, 10000000)}");
                Console.WriteLine("*********************");

                for (int j = 1; j <= 1000; j++)
                {
                    Console.WriteLine($"Data Block: {j}");
                }
            }
        });

        Console.WriteLine("Finished background task");
    }
}

This code contains three methods. The Main method runs on your main thread when the program starts and calls the two other methods: MainThreadTask runs on the main thread, while SomeBackgroundTask offloads its work to a separate background thread.

The background method, SomeBackgroundTask(), is an async Task, so it runs asynchronously alongside the main thread. It also contains an await operator, which suspends the background method until the awaited Task.Run work finishes, without blocking the main thread.

In Visual Studio, click the Start button on the toolbar (below the Tools menu) to run the program. The console window displays output like the following.

Fig. 2: Screenshot of the application’s output

In the example above, the method SomeBackgroundTask started first and printed the words “Starting background task.” After printing the first line, it created a task that simulates a more extensive process, such as updating records in a database.

Depending on your dataset’s size, updating a database or reading and writing a file can take some time. To simulate this slow process, the code above uses a for loop with an inner loop that mimics transferring data for 500 servers.

Below is a perfmon capture of the CPU metrics before the program runs, while it’s running, and shortly after it finishes.

Fig. 3: A screenshot from perfmon showing CPU usage when the application is running

If this work ran synchronously on the main thread, the entire application would have to wait for it to finish. Because only the sample code’s background thread pauses, the main thread continues processing. Separating the work this way keeps the application responsive and improves performance.
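As a side note, and purely as a sketch rather than part of the original sample: if you prefer the program to wait for the background work before exiting instead of relying on Console.ReadLine(), you can capture the returned Task and await it from an async Main, which requires C# 7.1 or later.

// Sketch only: requires C# 7.1 or later, which allows an async Main method.
static async Task Main(string[] args)
{
    Console.Title = "Async Methods";

    // Start the background work without blocking the main thread.
    Task backgroundWork = SomeBackgroundTask();

    // The main thread keeps doing its own work in the meantime.
    MainThreadTask();

    // Wait for the background work to finish before the program exits.
    await backgroundWork;
}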

3. Large object heap (LOH) fragmentation

In .NET applications, when you create objects, the garbage collector assigns them to generation 0, 1, or 2. A newly created object goes into generation 0 unless it’s 85,000 bytes or larger; objects at or above that threshold go onto the large object heap (LOH) instead, because moving such large objects around would hurt the garbage collector’s performance.

Fig. 4: Memory heaps

When the garbage collector runs, it removes unused objects in generation 0 and releases the allocated memory. The surviving objects in that generation then move up a generation, and the process repeats until they reach generation 2. The garbage collector typically doesn’t collect the older, higher generations as often as the younger ones, and it only collects the LOH alongside generation 2, so large objects remain in memory until a generation 2 collection occurs.
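To see the 85,000-byte threshold in action, you can ask the runtime which generation an object was placed in; the array sizes below are arbitrary illustration values:

byte[] smallArray = new byte[1000];    // well under 85,000 bytes, allocated in generation 0
byte[] largeArray = new byte[100000];  // over the threshold, allocated on the LOH

// The LOH is collected with generation 2, so large objects report generation 2.
Console.WriteLine(GC.GetGeneration(smallArray)); // typically 0
Console.WriteLine(GC.GetGeneration(largeArray)); // 2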

A prevalent issue when frequently working with the LOH is memory fragmentation. When the garbage collector allocates and releases memory, it creates gaps of available space between the blocks of assigned memory. Those fragments of space can accept new data but are often too small to accommodate it, as the LOH only works with large objects. The garbage collector must request more memory from the operating system (OS) before committing the new data to memory. This lengthy process diminishes your application’s performance.

The best way to combat LOH fragmentation is to work as much as possible with small objects or custom value types — such as structs or enums. This approach keeps your objects in the generation heaps instead of the LOH. For example, only select the required columns when pulling data from a database, keeping the objects small and lightweight. The objects will be more manageable for the garbage collector.
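As an illustrative sketch of that idea, the following projects already-loaded records into a small struct that holds only the fields the code actually needs (Customer, customers, and CustomerSummary are hypothetical names, not tied to any particular data access library):

// Hypothetical example: 'customers' stands in for a collection of full entities
// you've already loaded. Projecting into a small struct keeps only the needed fields.
public struct CustomerSummary
{
    public int Id;
    public string Name;
}

// Requires using System.Linq; and using System.Collections.Generic;
List<CustomerSummary> summaries = customers
    .Select(c => new CustomerSummary { Id = c.Id, Name = c.Name })
    .ToList();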

Because enums are value types, the runtime typically stores them on the stack or inline within their containing objects rather than as separate objects on the managed heap. Your application can access them quickly, and the garbage collector has less to track. Use enums to replace repeating strings; doing so also makes your code more maintainable.
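For example, instead of scattering the same status strings throughout your code, you can define an enum once and compare against it (OrderStatus and order are hypothetical names used for illustration):

// Before: repeated string literals are easy to mistype and allocate on the heap.
if (order.Status == "Shipped") { /* ... */ }

// After: an enum is a lightweight value type and gives you compile-time checking.
public enum OrderStatus
{
    Pending,
    Shipped,
    Delivered
}

if (order.Status == OrderStatus.Shipped) { /* ... */ }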

You can also use performance counters to collect data on your code’s performance. These counters help analyze memory usage, processor latency, and data transfer over a network. Use the PerformanceCounter class to add one to your application.
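Here is a minimal sketch of reading a couple of counters with the PerformanceCounter class (available on Windows in System.Diagnostics; the exact categories and instances available depend on your machine):

// Requires using System.Diagnostics; and using System.Threading;
// The first NextValue() call returns 0, so sample twice with a short delay.
var cpuCounter = new PerformanceCounter("Processor", "% Processor Time", "_Total");
var gcHeapCounter = new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps",
    Process.GetCurrentProcess().ProcessName);

cpuCounter.NextValue();
Thread.Sleep(1000);

Console.WriteLine($"CPU: {cpuCounter.NextValue():0.0}%");
Console.WriteLine($"Managed heap bytes: {gcHeapCounter.NextValue():N0}");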

Finally, you can adjust the garbage collector’s latency mode. Latency here refers to how intrusive the garbage collector is, meaning how long it pauses your application while reclaiming memory. Adjust this setting by assigning a GCLatencyMode value to the GCSettings object in the System.Runtime namespace. Running the garbage collector more often clears memory more frequently, but it can also slow performance, so you must find a balance.
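For instance, you can temporarily request a lower-latency mode around a latency-sensitive section and restore the original mode afterward; a minimal sketch:

// Requires using System.Runtime;
// SustainedLowLatency asks the GC to avoid blocking generation 2 collections
// while the setting is in effect, at the cost of using more memory.
GCLatencyMode previousMode = GCSettings.LatencyMode;
try
{
    GCSettings.LatencyMode = GCLatencyMode.SustainedLowLatency;
    // ... latency-sensitive work ...
}
finally
{
    GCSettings.LatencyMode = previousMode;
}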

Working with smaller objects helps improve your .NET application’s performance. Adjusting the latency mode lets you clear out memory more or less often, while the performance counters help you decide which approach is best.

4. String concatenation

String concatenation in .NET can become expensive, performance-wise. When an application concatenates two strings, it creates a new string object. This can slow your application down and cause memory issues, especially when concatenation happens frequently or on a large scale.

The application creates a new instance of the string object because string is an immutable type. The application can’t change the states of immutable objects. So, operations that change a string object’s value will create a new string every time.

The most efficient way to concatenate strings is by using the StringBuilder class. Since the StringBuilder is a mutable object, you can modify it after instantiation.

You don’t have to replace all strings with the StringBuilder class, though. You only need to use StringBuilder when manipulating a string multiple times or looping through a large or unknown number of strings.

Say you want to concatenate an array of strings into a single string object. One way is to create a temporary string, loop through your array, and keep adding to that temporary string, like in the code below.

 
string[] stringArray = new string[] { "This", "is", "a", "test"};
string compositeString = string.Empty;

foreach (var str in stringArray)
{
    compositeString += $"{str} ";
}

Console.WriteLine(compositeString);

Fig. 5: The code sample’s output

This approach concatenates your strings, which is your goal. Unfortunately, because strings are immutable, each += assignment creates a new string instance every time the loop runs. These extra copies take up additional resources.

Implementing the StringBuilder class (found in the System.Text namespace) instead can improve your application’s efficiency and memory usage. First, replace your temporary string, compositeString, with an instance of StringBuilder. Then, in your loop, use StringBuilder’s Append function, which adds a copy of the string you wish to concatenate to your StringBuilder instance. The code looks like this:

 
string[] stringArray = new string[] { "This", "is", "a", "test"};
StringBuilder sb = new StringBuilder();

foreach (var str in stringArray)
{
    sb.Append($"{str} ");
}

Console.WriteLine(sb);

The code’s output looks like the screenshot below.

Fig. 6: Output of using the Append function

You now have the same output as the first method but with much better performance. The perfmon graph below compares the two programs: one without the StringBuilder class (annotated with 1) and one with it (annotated with 2). Both programs were running alongside other applications, such as browsers, but the takeaway remains: processor usage was higher when the StringBuilder class was not used.

Fig. 7: A screenshot from perfmon showing CPU usage when the application is running without and with the StringBuilder class

Conclusion

Bottlenecks can occur at multiple points in an application. It’s essential to keep these points in mind when designing, testing, and updating your application:

  • Implement using statements when working with unmanaged resources or when you implement the IDisposable interface. This approach disposes of the resources for you without your having to write the disposal code from scratch.
  • Use multithreading and parallelism within your application to keep your main thread free of expensive CPU-bound processes and keep your application running smoothly.
  • Keep your objects small. The garbage collector manages small objects more efficiently.
  • Use the StringBuilder class when manipulating a string frequently or working with a large set of strings. It performs better than repeatedly creating new instances of an immutable type.

Use performance-enhancing tools with features such as performance logging, code analyzers, and AI-enhanced performance recommendations, which can alert you when crashes or memory overloads occur. These tools will help you create highly performant applications.

Most importantly, each scenario may need a different performance-enhancing strategy. So, monitoring and testing are the best ways to optimize your .NET application’s performance.

Start tracking your application’s performance metrics in real time using Site24x7 APM.
