This is the technical blog of Keyvan Nayyeri, a 29-year-old software engineer at Match.Com, speaker, and author. Here you will find content about computer science, programming, and technology.
Those who have worked with Performance Counters in .NET and know the list of these counters will know that there is no built-in counter for the network utilization of a specific network interface. There are, of course, counters for various purposes, but none of them covers this important parameter.
I never had to calculate network utilization for my projects until today. Facing this problem forced me to search for an existing .NET implementation, but that attempt was unsuccessful. So I had to implement something myself, and I'm sharing it here with others.
My implementation consists of two stages: first, finding a formula for the utilization based on existing parameters, and then writing the code to apply this formula.
Utilization is a general concept in physics and mathematics. When I was at university we studied some chapters about utilization and learned many things about its concepts. The utilization of various systems was a common example in our Statistics, Physics and Numerical Analysis courses.
Utilization has a simplified formula: it is the mean arrival rate divided by the mean service rate (utilization = arrival rate / service rate).
The general formula has another parameter, M, which brings the number of servers into the calculation (utilization = arrival rate / (M * service rate)). Since we usually work with one server, this parameter equals one and can be dropped to get the simplified formula above.
For the system to be considered in a normal state, the utilization rate should be less than one. Note that a very small utilization rate means the system is nearly idle!
With this background, I had to find a formula for network utilization in .NET. Before writing my own, I did some searching and found a forum post containing an initial formula that I could use.
The author points to the following formula for network utilization:
%utilization = ((8 * (dataSent + dataReceived)) / (bandwidth * time_in_sec)) * 100
He also asks why his rate goes above 100% in some cases, while Windows Task Manager shows him the same value (greater than 100%). Now that I'm writing about this, let me answer him as well.
The reason you may get percentages greater than 100% is that you may have an unhealthy network. This utilization ratio isn't calculated from parameters constrained to a specific range, so sometimes the ratio can exceed 1 (or, as a percentage, exceed 100%). As I stated above, utilization greater than 1 means the server is facing more requests than it can service. Although network utilization stays below 0.5 (50%) in many cases and rarely goes above 1 (100%), values larger than 1 are not impossible. In such cases you should take care of the system and find out why it's behaving this way!

So the author of that forum post had a wrong idea of the utilization ratio; with the right understanding of the concept, you can see why it may take such unexpected values. In fact, we compute this ratio precisely to find problems with our systems, so getting such unexpected results is normal.
That said, the formula itself is correct and I used it, too. In my opinion, though, it's not a good idea to express utilization as a percentage in such cases, because it can end in confusions like the one in the forum post above.
Still, since Windows Task Manager expresses this ratio as a percentage, I kept the formula as it was and calculated the network utilization as a percentage.
Before jumping into the implementation, let me give a short description of the parameters in the formula:

- dataSent and dataReceived: the number of bytes sent and received over the interface during the measurement period
- bandwidth: the bandwidth of the network interface, in bits per second
- time_in_sec: the length of the measurement period, in seconds
You may wonder why we use bytes rather than bits in this formula. The reason is that the .NET performance counters in the Network Interface category report their values in bytes rather than bits. Multiplying the sum of the dataSent and dataReceived parameters by eight converts the byte counts to bits, so they match the bandwidth, which is measured in bits per second.
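To see the unit conversion in action, here is a quick check with hypothetical numbers (the link speed, byte counts, and sampling window below are made up purely for illustration):

```csharp
using System;

class UtilizationCheck
{
    static void Main()
    {
        // Hypothetical numbers: a 100 Mbps link sampled for 10 seconds.
        double bandwidth = 100_000_000; // bits per second ("Current Bandwidth")
        double timeInSec = 10;
        double dataSent = 500_000;      // bytes ("Bytes Sent/sec", summed)
        double dataReceived = 700_000;  // bytes ("Bytes Received/sec", summed)

        // 8 converts bytes to bits; bandwidth * time is the total number
        // of bits the link could have carried in that period.
        double utilization = (8 * (dataSent + dataReceived))
                             / (bandwidth * timeInSec) * 100;

        Console.WriteLine("{0}%", utilization); // utilization ≈ 0.96 (percent)
    }
}
```

So 1.2 million bytes in 10 seconds uses only about 1% of a 100 Mbps link.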
I think the rest of the formula is obvious. You can direct your questions to Phil, who should be better at this stuff than me!
Well, enough mathematical stuff! Let's implement the formula in .NET.
Out of the box, .NET comes with three performance counters for the parameters used in the network utilization formula. All of these counters are located in the Network Interface category and are named "Bytes Sent/sec", "Bytes Received/sec" and "Current Bandwidth". The only parameter that requires some extra effort to calculate is time_in_sec.
The "Bytes Sent/sec" and "Bytes Received/sec" counters calculate their values from individual samples, and in a single sample their values may be zero or very different from the real state of the network. The best way to get a more accurate value from these counters is therefore to sum their values in a loop. The time_in_sec parameter then falls out of the number of loop iterations: because the performance counters report their values per second, the overall time in seconds equals the number of iterations.
The code below shows a function that calculates the network utilization and returns it.
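A C# sketch of such a function, assuming a ten-second sampling window (the class name, method name, and window length are my own choices, not fixed requirements):

```csharp
using System.Diagnostics;
using System.Threading;

public static class NetworkUtilization
{
    // Returns the utilization of the given network interface as a percentage.
    // networkCard must match an instance name of the "Network Interface"
    // performance counter category.
    public static double GetNetworkUtilization(string networkCard)
    {
        const int numberOfIterations = 10; // sampling period, in seconds

        var bandwidthCounter = new PerformanceCounter(
            "Network Interface", "Current Bandwidth", networkCard);
        var dataSentCounter = new PerformanceCounter(
            "Network Interface", "Bytes Sent/sec", networkCard);
        var dataReceivedCounter = new PerformanceCounter(
            "Network Interface", "Bytes Received/sec", networkCard);

        float bandwidth = bandwidthCounter.NextValue(); // bits per second
        float sendSum = 0;
        float receiveSum = 0;

        // Sum one-second samples; a single sample can be zero or far
        // from the real state of the network.
        for (int index = 0; index < numberOfIterations; index++)
        {
            Thread.Sleep(1000); // rate counters need time between samples
            sendSum += dataSentCounter.NextValue();
            receiveSum += dataReceivedCounter.NextValue();
        }

        // 8 * bytes -> bits; bandwidth * iterations -> total bits the
        // link could have carried, since each iteration covers one second.
        return (8 * (sendSum + receiveSum))
               / (bandwidth * numberOfIterations) * 100;
    }
}
```

Note that this requires Windows and a real network interface, so run it in an environment where the Network Interface counters are available.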
As an example, I can use this implementation in a console application to find the utilization of one of my network cards. Don't forget that you need to pass the name of your network card to the .NET performance counters so that they can calculate their values.
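A console-application sketch along those lines; the NetworkUtilization.GetNetworkUtilization name stands in for the function above, and the interface name shown is purely hypothetical, so list the instance names on your own machine first:

```csharp
using System;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        // Print the instance names so you can see what your card is called;
        // the name must match what Performance Monitor shows, exactly.
        var category = new PerformanceCounterCategory("Network Interface");
        foreach (string name in category.GetInstanceNames())
            Console.WriteLine(name);

        // Hypothetical name, for illustration only — use one printed above.
        string networkCard = "Intel[R] PRO_1000 MT Network Connection";
        Console.WriteLine("Utilization: {0:F2}%",
            NetworkUtilization.GetNetworkUtilization(networkCard));
    }
}
```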
Finally, I should point out that this implementation carries a small error of its own, so the result is only approximately correct. But this error is very small (less than 0.1%) and was negligible for me, and most likely it's negligible for others as well. The reasons for this error are:
You can download the source code sample of this post from here.