Process.GetCurrentProcess().TotalProcessorTime returns strange values on macOS #29527

@gregkalapos

Description

I'd like to gather CPU usage information, and for this I use the System.Diagnostics.Process class. On macOS, the Process.TotalProcessorTime property seems to return values higher than I'd expect: e.g. in a 5-second window on a 12-core CPU I see values greater than 12 * 5 s.

Reproducer (this returns valid values both on Windows, verified against Task Manager, and on Linux, but seemingly invalid values on macOS):

using System;
using System.Timers;

namespace ProcessTimeRepro
{
    class Program
    {
        static void Main(string[] args)
        {
            var t = new Timer(5000);

            var process = System.Diagnostics.Process.GetCurrentProcess();

            Console.WriteLine($"Pid: {process.Id}");
            Console.WriteLine($"ProcessorCount: {Environment.ProcessorCount}");

            var b = false;
            var lastSample = DateTime.UtcNow;
            var lastTotalProcessorTime = new TimeSpan();
            
            t.Elapsed += (sender, eventArgs) =>
            {
                var currentDateTime = DateTime.UtcNow;
                var currentTotalProcessorTime = process.TotalProcessorTime;
                if (b)
                {
                    var timeDiff = (currentDateTime - lastSample).TotalMilliseconds;
                    var cpuUsage = (currentTotalProcessorTime - lastTotalProcessorTime).TotalMilliseconds;
                    
                    Console.WriteLine("=======timer elapsed=======");
                    Console.WriteLine($"total: {currentTotalProcessorTime}");
                    Console.WriteLine($"user: {process.UserProcessorTime}");
                    Console.WriteLine($"system: {process.PrivilegedProcessorTime}");
                    Console.WriteLine();
                    Console.WriteLine($"timeDiff: {timeDiff} ms");
                    Console.WriteLine($"cpuUsageDiff: {cpuUsage} ms");
                    Console.WriteLine($"Cpu usage: {cpuUsage / (timeDiff * Environment.ProcessorCount) * 100}%");
                    Console.WriteLine("=======timer elapsed end=======");
                    Console.WriteLine();
                }

                lastTotalProcessorTime = currentTotalProcessorTime;
                lastSample = currentDateTime;
                b = true;


            };

            t.Start();

            Console.WriteLine("Work starts");
            var rnd = new Random();
            for (int i = 0; i < int.MaxValue; i++)
            {
                if (rnd.Next() < 0) // never true, but the compiler can't prove it, so the loop isn't optimized away
                {
                    Console.WriteLine("bla");
                }
            }

            Console.WriteLine("Work done");

            Console.ReadLine();
        }
    }
}

In short: one thread does some work (generating random numbers) for a while, and a timer samples the CPU usage every 5 seconds.

Output on macOS (macOS Mojave 10.14.4, MacBook Pro 2018, 2.9 GHz Intel Core i9):

dotnet run -c Release
Pid: 2989
ProcessorCount: 12
Work starts
=======timer elapsed=======
total: 00:16:45.0497401
user: 00:16:34.7100833
system: 00:00:10.3697294

timeDiff: 4992.206 ms
cpuUsageDiff: 497341.0443 ms
Cpu usage: 830.195849790654%
=======timer elapsed end=======

=======timer elapsed=======
total: 00:25:04.1338311
user: 00:24:49.7922366
system: 00:00:14.3803124

timeDiff: 5004.88 ms
cpuUsageDiff: 499084.091 ms
Cpu usage: 830.995766465263%
=======timer elapsed end=======

Work done
=======timer elapsed=======
total: 00:26:53.6510655
user: 00:26:38.4277113
system: 00:00:15.2315355

timeDiff: 4997.881 ms
cpuUsageDiff: 109517.2344 ms
Cpu usage: 182.606112470465%
=======timer elapsed end=======

=======timer elapsed=======
total: 00:26:53.7347993
user: 00:26:38.4790252
system: 00:00:15.2648931

timeDiff: 4996.953 ms
cpuUsageDiff: 83.7338 ms
Cpu usage: 0.139641430821276%
=======timer elapsed end=======

E.g. when the timer fires for the second time, we get this output:

timeDiff: 5004.88 ms
cpuUsageDiff: 499084.091 ms

This means that in a 5004.88 ms window the app supposedly consumed 499084.091 ms of CPU time, which I think is impossible. As you can see, this is a 12-core machine, so even if all 12 cores worked on this process at 100%, the CPU time for that window could not exceed 12 * 5004.88 ms; if I'm not mistaken, there is no way to accumulate more CPU time than that in this time range. Moreover, in the reproducer the work is done on a single thread, so I'd expect roughly 100% usage of one core plus whatever (probably negligible) work the timer does. For a timeDiff of 5004.88 ms, the cpuUsageDiff should therefore also be close to 5 s, and that is exactly what I see on Linux and Windows.

So either I'm doing something wrong, or the value returned from Process.TotalProcessorTime is very inaccurate.

Also, when I compare the steadily increasing total value (the plain value of Process.TotalProcessorTime) with Activity Monitor (for Windows people: that's the Task Manager of macOS) or top, I see very different numbers: top and Activity Monitor report significantly lower ones.

Are we sure that rusage_info_v3 returns info.ri_system_time and info.ri_user_time reported in 100-nanosecond units here?
