I'm using SQL Server with Entity Framework Core in an ASP.NET Core app.
There is a hosted service (IHostedService) with a Thread.Timer that runs once a week. In the timer's DoWork callback I have 13 async functions that clone a lot of data into the same table (one function per table).
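A minimal sketch of this kind of setup, assuming a standard IHostedService with a System.Threading.Timer — the service name, the cloner method names, and the scheduling details are illustrative placeholders, not the actual project code (only ApplicationDbContext matches the context name reported in the verbose output below):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public class WeeklyCloneService : IHostedService, IDisposable   // illustrative name
{
    private readonly IServiceScopeFactory _scopeFactory;
    private Timer _timer;

    public WeeklyCloneService(IServiceScopeFactory scopeFactory)
        => _scopeFactory = scopeFactory;

    public Task StartAsync(CancellationToken cancellationToken)
    {
        // Fire DoWork once a week.
        _timer = new Timer(DoWork, null, TimeSpan.Zero, TimeSpan.FromDays(7));
        return Task.CompletedTask;
    }

    // Timer callbacks are synchronous, so the async work ends up in an async void method.
    private async void DoWork(object state)
    {
        // The DbContext must come from a scope because the hosted service is a singleton.
        using var scope = _scopeFactory.CreateScope();
        var db = scope.ServiceProvider.GetRequiredService<ApplicationDbContext>();

        // 13 cloner functions in total, one per table; two placeholders shown here.
        await CloneTable1Async(db);
        await CloneTable2Async(db);
        // ...
    }

    // Placeholders; a fuller sketch of one cloner is under "Include your code" below.
    private Task CloneTable1Async(ApplicationDbContext db) => Task.CompletedTask;
    private Task CloneTable2Async(ApplicationDbContext db) => Task.CompletedTask;

    public Task StopAsync(CancellationToken cancellationToken)
    {
        _timer?.Change(Timeout.Infinite, 0);
        return Task.CompletedTask;
    }

    public void Dispose() => _timer?.Dispose();
}
```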
CDC (Change Data Capture) is enabled globally on the SQL Server instance.
The issue is that when it tries to save the new data for Table2 (approx. 2k new records), Entity Framework never completes SaveChangesAsync: the process is blocked and it starts consuming all the available RAM.
I've created a question on Stack Overflow, and I was able to save the data by calculating all properties on the client side, but I also had to reduce the number of records. If any table tries to insert more than ~10k records, the process is blocked on the next table; that is, if Table1 inserts more than ~10k records, then when Table2 tries to insert its data (~2k records) the process is blocked.
If I disable CDC on the SQL Server side, everything works fine. But I can't disable it because it is used by another project.
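For reference, the "save fewer records at a time" workaround boils down to inserting in fixed-size chunks instead of one large SaveChangesAsync. A minimal sketch, assuming the cloned entities are already built in memory — the helper name and chunk size are illustrative, not the actual code:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public static class ChunkedSave
{
    // Illustrative helper: insert entities in small chunks so no single
    // SaveChangesAsync pushes tens of thousands of rows through a CDC-enabled table.
    public static async Task SaveInChunksAsync<TEntity>(
        ApplicationDbContext db, IReadOnlyList<TEntity> entities, int chunkSize = 1000)
        where TEntity : class
    {
        for (var i = 0; i < entities.Count; i += chunkSize)
        {
            db.Set<TEntity>().AddRange(entities.Skip(i).Take(chunkSize));
            await db.SaveChangesAsync();

            // Clear the change tracker so tracked entities (and memory) don't pile up.
            // ChangeTracker.Clear() needs EF Core 5+; on older versions detach entries manually.
            db.ChangeTracker.Clear();
        }
    }
}
```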
dotnet exec --depsfile $SOURCE\Company.DataWarehouse.Access\Company.Product.AdminCenter\bin\Debug\netcoreapp3.1\Company.Product.AdminCenter.deps.json --additionalprobingpath $HOME\.nuget\packages --additionalprobingpath "C:\Program Files\dotnet\sdk\NuGetFallbackFolder" --runtimeconfig $SOURCE\Company.DataWarehouse.Access\Company.Product.AdminCenter\bin\Debug\netcoreapp3.1\Company.Product.AdminCenter.runtimeconfig.json $HOME\.dotnet\tools\.store\dotnet-ef\5.0.4\dotnet-ef\5.0.4\tools\netcoreapp3.1\any\tools\netcoreapp2.0\any\ef.dll dbcontext list --assembly $SOURCE\Company.DataWarehouse.Access\Company.Product.AdminCenter\bin\Debug\netcoreapp3.1\Company.Product.AdminCenter.dll --startup-assembly $SOURCE\Company.DataWarehouse.Access\Company.Product.AdminCenter\bin\Debug\netcoreapp3.1\Company.Product.AdminCenter.dll --project-dir $SOURCE\Company.DataWarehouse.Access\Company.Product.AdminCenter\ --language C# --working-dir $SOURCE\Company.DataWarehouse.Access\Company.Product.AdminCenter --verbose --root-namespace Company.Product.AdminCenter
Using assembly 'Company.Product.AdminCenter'.
Using startup assembly 'Company.Product.AdminCenter'.
Using application base '$SOURCE\Company.DataWarehouse.Access\Company.Product.AdminCenter\bin\Debug\netcoreapp3.1'.
Using working directory '$SOURCE\Company.DataWarehouse.Access\Company.Product.AdminCenter'.
Using root namespace 'Company.Product.AdminCenter'.
Using project directory '$SOURCE\Company.DataWarehouse.Access\Company.Product.AdminCenter\'.
Remaining arguments: .
Finding DbContext classes...
Finding IDesignTimeDbContextFactory implementations...
Finding application service provider in assembly 'Company.Product.AdminCenter'...
Finding Microsoft.Extensions.Hosting service provider...
Using environment 'Development'.
Using application service provider from Microsoft.Extensions.Hosting.
Found DbContext 'ApplicationDbContext'.
Finding DbContext classes in the project...
@tecnologer As far as we can tell, this looks like slowness of inserting when using CDC. You might want to try disabling EF Core batching, since it is possible CDC doesn't handle large batches of inserts in one go. Beyond that, the performance here is likely to improve with the future implementation of #15059 and the associated #9118 and #10443.
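Disabling batching with the SQL Server provider can be done by capping the batch size where the DbContext is registered. A minimal sketch — the connection string name is an illustrative assumption:

```csharp
// In Startup.ConfigureServices (or wherever ApplicationDbContext is registered):
services.AddDbContext<ApplicationDbContext>(options =>
    options.UseSqlServer(
        Configuration.GetConnectionString("DefaultConnection"),  // illustrative connection string name
        sqlOptions => sqlOptions.MaxBatchSize(1)));               // 1 = no batching; every INSERT goes in its own round trip
```

Values between 1 and the default can also be worth trying, to find the largest batch size that CDC keeps up with.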
Ask a question
How should I manage this kind of task?
Include your code
I call the cloner functions like this:
Each cloner function is like this one; there are other functions that also require a loop to update FKs randomly:
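A hypothetical sketch of a cloner of this general shape — Table1, Table2, their properties, and the inlined random FK reassignment are invented for illustration and are not the actual project code (assumes the usual Microsoft.EntityFrameworkCore usings):

```csharp
// Hypothetical cloner: copies the existing rows of Table2 back into Table2,
// pointing each clone at a randomly chosen Table1 row.
private async Task CloneTable2Async(ApplicationDbContext db)
{
    var random = new Random();

    // Read the source rows without tracking so EF doesn't keep them in the change tracker.
    var sourceRows = await db.Table2.AsNoTracking().ToListAsync();

    // Candidate FK values for the random reassignment.
    var table1Ids = await db.Table1.Select(t => t.Id).ToListAsync();

    var clones = sourceRows.Select(row => new Table2
    {
        // Id left unset so SQL Server generates a new identity value.
        Name = row.Name,
        CreatedAt = DateTime.UtcNow,
        Table1Id = table1Ids[random.Next(table1Ids.Count)]  // random FK reassignment
    }).ToList();

    db.Table2.AddRange(clones);
    await db.SaveChangesAsync();  // the call that hangs once CDC is involved
}
```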
Include stack traces
There is no stack trace; I've never gotten an error.
Include verbose output
Include provider and version information
Local:
SQL Server remote: