Out of memory when processing 4 million records

I have the following code that tries to process 4,000,000 records in a database:

private void intenseProcess4()
{
    // Requires: using System.Data; using System.Data.SqlClient; using System.IO;
    using (var connection1 = new SqlConnection("connection string goes here"))
    using (var command1 = new SqlCommand(@"stored procedure name goes here", connection1))
    {
        command1.CommandType = CommandType.StoredProcedure;

        try
        {
            connection1.Open();

            using (var reader1 = command1.ExecuteReader())
            {
                // First result set: one directory path per row.
                int prjNameIndex1 = reader1.GetOrdinal("full_path");
                while (reader1.Read())
                {
                    Directory.CreateDirectory(reader1.GetString(prjNameIndex1));
                }

                // Second result set: source/destination path pairs to copy
                // (the column names here are placeholders).
                if (reader1.NextResult())
                {
                    int sourceIndex = reader1.GetOrdinal("source");
                    int destinationIndex = reader1.GetOrdinal("destination");
                    while (reader1.Read())
                    {
                        File.Copy(reader1.GetString(sourceIndex), reader1.GetString(destinationIndex), true);
                    }
                }
            }
        }
        catch (SqlException ex)
        {
            File.AppendAllText(@"h:\X\log\error.log", ex + " SqlException caught." + Environment.NewLine);
        }
    }
}


Once started, it works fine for about an hour, but then it gives the following error message:

Problem signature:
    Problem Event Name:       CLR20r3
    Problem Signature 01:     devenv.exe
    Problem Signature 02:     12.0.21005.1
    Problem Signature 03:     524fcb34
    Problem Signature 04:     mscorlib
    Problem Signature 05:     4.0.30319.34209
    Problem Signature 06:     534894cc
    Problem Signature 07:     226e
    Problem Signature 08:     6
    Problem Signature 09:     System.OutOfMemoryException
    OS Version:               6.1.7601.2.1.0.256.49
    Locale ID:                2057
    Additional Information 1: 0a9e
    Additional Information 2: 0a9e372d3b4ad19135b953a78882e789
    Additional Information 3: 0a9e
    Additional Information 4: 0a9e372d3b4ad19135b953a78882e789


By that point, it has processed only about 35,000 records.


2 answers


Use a terminal command instead of a GUI to create an archive.





Try CommandBehavior.SequentialAccess. From the SqlDataReader documentation:

    When you specify SequentialAccess, you are required to read from the columns in the order they are returned, although you are not required to read each column. Once you have read past a location in the returned stream of data, data at or before that location can no longer be read from the DataReader.
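
A minimal sketch of how that would look in your code, assuming the second result set returns the source path in column 0 and the destination path in column 1 (with SequentialAccess the columns must be read strictly left to right):

// Stream rows instead of buffering each one in full.
using (var reader1 = command1.ExecuteReader(CommandBehavior.SequentialAccess))
{
    while (reader1.Read())
    {
        // Columns must be read in select-list order: 0 first, then 1.
        string source = reader1.GetString(0);
        string destination = reader1.GetString(1);
        File.Copy(source, destination, true);
    }
}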



But the wisdom of trying to process 4 million records in one pass is questionable. You will never manage to copy 4 million files while holding a single result set open the whole time; the job is bound to fail partway through and start over, again and again. Use batches instead: fetch a small set of files to process, copy them, record your progress, then fetch another batch. After a failure, resume from the last recorded position, as in the sketch below.
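
Here is one way that loop could look. Everything named here (the dbo.GetNextCopyBatch procedure, its parameters, the column order, the progress file) is a hypothetical illustration, not something from the original post:

// Assumed contract: dbo.GetNextCopyBatch(@LastId, @BatchSize) returns up to
// @BatchSize rows of (id, source, destination) with id greater than @LastId.
const int batchSize = 1000;
string progressFile = "progress.txt";
long lastId = File.Exists(progressFile) ? long.Parse(File.ReadAllText(progressFile)) : 0;

while (true)
{
    var batch = new List<Tuple<long, string, string>>();

    using (var connection = new SqlConnection("connection string goes here"))
    using (var command = new SqlCommand("dbo.GetNextCopyBatch", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@LastId", lastId);
        command.Parameters.AddWithValue("@BatchSize", batchSize);

        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
                batch.Add(Tuple.Create(reader.GetInt64(0), reader.GetString(1), reader.GetString(2)));
        }
    } // reader and connection are closed before the slow file work begins

    if (batch.Count == 0)
        break; // nothing left to process

    foreach (var row in batch)
    {
        File.Copy(row.Item2, row.Item3, true);
        lastId = row.Item1;
    }

    // Persist progress so a crash resumes from the last finished batch.
    File.WriteAllText(progressFile, lastId.ToString());
}

Because File.Copy is called with overwrite set to true, replaying a half-finished batch after a crash is harmless.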

You should also consider running multiple copies in parallel (using async IO, not threads!).
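
One possible shape for that, again just a sketch: throttled async copies built on FileStream.CopyToAsync (available from .NET 4.5), with a SemaphoreSlim capping how many are in flight at once.

// Requires: using System; using System.Collections.Generic;
// using System.IO; using System.Threading; using System.Threading.Tasks;
static async Task CopyBatchAsync(IEnumerable<Tuple<string, string>> pairs)
{
    var throttle = new SemaphoreSlim(8); // at most 8 copies in flight
    var tasks = new List<Task>();

    foreach (var pair in pairs)
    {
        await throttle.WaitAsync();
        tasks.Add(CopyOneAsync(pair.Item1, pair.Item2, throttle));
    }

    await Task.WhenAll(tasks);
}

static async Task CopyOneAsync(string source, string destination, SemaphoreSlim throttle)
{
    try
    {
        // FileOptions.Asynchronous requests true overlapped I/O on Windows.
        using (var src = new FileStream(source, FileMode.Open, FileAccess.Read,
                                        FileShare.Read, 81920, FileOptions.Asynchronous))
        using (var dst = new FileStream(destination, FileMode.Create, FileAccess.Write,
                                        FileShare.None, 81920, FileOptions.Asynchronous))
        {
            await src.CopyToAsync(dst);
        }
    }
    finally
    {
        throttle.Release();
    }
}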
