When running the agent attached to a .NET 6 Preview application, the agent's log file shows messages like this:

`NewRelic FINEST: [pid: 24300, tid: .NET ThreadPool Worker]`

instead of the `tid` field being a numeric thread ID, as it is in all other .NET runtimes. We're assuming this is because the .NET thread pool was re-implemented in managed code in .NET 6.

More specifically, our Log4Net logging template uses the `%thread` formatting directive. Under the covers, Log4Net first tries to get the thread's name and falls back to the numeric thread ID only if no name is set. Apparently, pre-.NET 6 the thread-pool threads were not named by default, and in .NET 6 they are. See this SO discussion for more context and a possible solution: https://stackoverflow.com/questions/42675024/log4net-printing-threadid-and-thread-name-in-the-log/44112461
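The SO answer linked above suggests one possible workaround: a custom log4net `PatternLayoutConverter` that always emits the numeric managed thread ID, bypassing `%thread`'s name-first fallback. A minimal sketch, assuming a custom converter is acceptable here (the class name `ThreadIdPatternConverter` is illustrative, not from the agent's codebase):

```csharp
using System.IO;
using System.Threading;
using log4net.Core;
using log4net.Layout.Pattern;

// Always writes the numeric managed thread ID, regardless of whether
// the thread has a name (as .NET 6 ThreadPool worker threads now do).
public class ThreadIdPatternConverter : PatternLayoutConverter
{
    protected override void Convert(TextWriter writer, LoggingEvent loggingEvent)
    {
        // Caveat: this runs when the event is rendered; with an async or
        // buffering appender that may not be the thread that logged the event.
        writer.Write(Thread.CurrentThread.ManagedThreadId);
    }
}
```

It would then be wired up in the layout configuration via a `<converter>` element that maps a name such as `threadid` to this type, and the conversion pattern would use `%threadid` in place of `%thread`.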