OutOfMemoryException when deleting > 500,000 rows with Entity Framework 6
What I have:
I have a large list of addresses (IP addresses), more than a million entries.
What I am trying to do:
Remove 500k addresses via Entity Framework.
My problem:
Right now I am splitting the addresses into chunks of 10,000 and using RemoveRange(listOfAddresses):
if (addresses.Count() > 10000)
{
    var addressChunkList = extension.BreakIntoChunks<Address>(addresses.ToList(), 10000);
    foreach (var chunk in addressChunkList)
    {
        db.Address.RemoveRange(chunk);
    }
}
but I still get an OutOfMemoryException, which suggests the context does not release the tracked entities even though I split the addresses into separate lists.
What can I do to avoid getting an OutOfMemoryException and still remove a large number of addresses in a reasonable amount of time?
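For reference, the `BreakIntoChunks<T>` extension used above is not shown in the question; a minimal sketch of such a helper (the name and signature are assumptions based on the call site) could look like this:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class Extension
{
    // Splits a list into consecutive sub-lists of at most `chunkSize` items.
    public static List<List<T>> BreakIntoChunks<T>(List<T> items, int chunkSize)
    {
        if (chunkSize <= 0) throw new ArgumentOutOfRangeException(nameof(chunkSize));
        var chunks = new List<List<T>>();
        for (int i = 0; i < items.Count; i += chunkSize)
        {
            chunks.Add(items.GetRange(i, Math.Min(chunkSize, items.Count - i)));
        }
        return chunks;
    }
}
```

Note that chunking the list this way only partitions the in-memory collection; every entity passed to RemoveRange is still tracked by the same DbContext, which is why memory keeps growing.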
When I needed to do something like this, I moved to the following extension (with which I am not affiliated):
https://github.com/loresoft/EntityFramework.Extended
It allows bulk deletes to be performed through Entity Framework without having to select and load the entities into memory first, which is of course far more efficient.
Example from the website:
context.Users.Delete(u => u.FirstName == "firstname");
Where did you get the idea that EF is an ETL / bulk data management tool?
It is not. Half a million deletes in a single transaction will be slow (the rows are deleted one by one), and EF simply was not built for that, as you have found out.
There is nothing else you can do here: either use EF within its design parameters, or choose an alternative approach for these bulk operations. There are cases where an ORM makes little sense.
Several suggestions.
- Use a stored procedure or plain SQL.
- Move your DbContext to a narrower scope:
for (int i = 0; i < 500000; i += 1000)
{
    using (var db = new DbContext()) // your derived context type
    {
        // Skip(i) advances through the list; Take(1000) alone would
        // delete the same first chunk on every iteration.
        var chunk = largeListOfAddress.Skip(i).Take(1000)
                                      .Select(a => new Address { Id = a.Id })
                                      .ToList();
        foreach (var stub in chunk)
        {
            // Attach each stub as Deleted; EntityState is in System.Data.Entity.
            db.Entry(stub).State = EntityState.Deleted;
        }
        db.SaveChanges();
    }
}
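The plain-SQL suggestion can be sketched as follows. This is only an illustration: the table name `Addresses`, the column `Id`, and the helper `BulkDeleteSql` are assumptions, and EF6's `Database.ExecuteSqlCommand` uses `{0}`-style placeholders that it turns into DbParameters.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class BulkDeleteSql
{
    // Builds one parameterized DELETE statement per chunk of ids,
    // e.g. "DELETE FROM Addresses WHERE Id IN ({0},{1},...)".
    public static IEnumerable<string> BuildDeleteStatements(IReadOnlyList<int> ids, int chunkSize)
    {
        for (int i = 0; i < ids.Count; i += chunkSize)
        {
            int count = Math.Min(chunkSize, ids.Count - i);
            string placeholders = string.Join(",", Enumerable.Range(0, count).Select(n => "{" + n + "}"));
            yield return "DELETE FROM Addresses WHERE Id IN (" + placeholders + ")";
        }
    }
}

// Each statement would then be executed against the matching id chunk, e.g.:
// db.Database.ExecuteSqlCommand(sql, chunkOfIds.Cast<object>().ToArray());
```

This deletes rows in bounded batches entirely on the database side, so no Address entities are ever loaded into the change tracker.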