Using a SQL Agent Job to Call Procedures in a Loop

I am putting together a job in SQL Server 2000 Enterprise Manager to copy and delete records in a pair of database tables. Right now we run a stored procedure that does the copy and delete as one mass operation, but it can touch millions of rows and so it bogs down the server. I would like to run the work in batches of about 100 rows at a time so the server doesn't stall (this is a live web database). I want this to run once a day, so I am putting it in a SQL Agent job. Is there a way to script the calls to the stored procedures that actually do the copy and delete, and then "sleep" between each call to give the server time to catch up? I know there is a WAITFOR command, but I'm not sure whether it holds on to the CPU or lets other requests run in the meantime.

Thanks!

+2




3 answers


Chunkifying your deletes is the preferred way to remove excess data without bloating your transaction log files. The post by BradC is a good example of this.

These loops are best managed inside a single stored procedure, and I would keep it that way even when spreading the work out over time. Putting a WAITFOR inside the loop gives you a "pause" between each batch of deletes if you feel you need to deal with possible concurrency issues. Use a SQL Agent job to control when the procedure starts, and if you also need to make sure it stops after a specific amount of time, handle that inside the loop as well (there is a scheduling sketch after the procedure below).

My take on the code would be something like this:



--  NOTE: This is a code sample, I have not tested it
CREATE PROCEDURE ArchiveData

    @StopBy DateTime
    --  Pass in a cutoff time.  If it runs this long, the procedure will stop.
AS

DECLARE @LastBatch  int

SET @LastBatch = 1
--  Initialized to make sure the loop runs at least once


WHILE @LastBatch > 0
 BEGIN

    WAITFOR DELAY '00:00:02'
    --  Set this to your desired delay factor

    DELETE TOP (1000)  --  Or however many rows per pass are desired
      FROM SourceTable
    --  Be sure to add a WHERE clause if you don't want to delete everything!
    --  (On SQL Server 2000, TOP is not allowed in DELETE; use SET ROWCOUNT instead)

    SET @LastBatch = @@rowcount

    IF getdate() > @StopBy
        SET @LastBatch = 0

 END

RETURN 0
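
For the SQL Agent side, a job that runs the procedure once a day can be created through Enterprise Manager's UI, or with the msdb job procedures directly. A rough sketch, with the job name, database name, schedule time, and cutoff window all placeholders:


--  NOTE: Sketch only; job name, database name, and times are placeholders
USE msdb
GO

EXEC dbo.sp_add_job
    @job_name = N'Nightly archive'

EXEC dbo.sp_add_jobstep
    @job_name      = N'Nightly archive',
    @step_name     = N'Run ArchiveData',
    @subsystem     = N'TSQL',
    @database_name = N'YourDatabase',
    --  Give the procedure a two-hour window; it stops itself via @StopBy
    @command       = N'DECLARE @StopBy datetime
                       SET @StopBy = DATEADD(hour, 2, GETDATE())
                       EXEC dbo.ArchiveData @StopBy = @StopBy'

EXEC dbo.sp_add_jobschedule
    @job_name          = N'Nightly archive',
    @name              = N'Daily at 2 AM',
    @freq_type         = 4,         --  Daily
    @freq_interval     = 1,         --  Every 1 day
    @active_start_time = 020000     --  HHMMSS

EXEC dbo.sp_add_jobserver
    @job_name    = N'Nightly archive',
    @server_name = N'(local)'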

      

Hmm, rereading the question: you need to copy the data somewhere before deleting it. To do that I would set up a temp table, and inside the loop, first truncate the temp table, then copy the primary keys of the TOP N rows into it, insert into the "archive" table by joining to the temp table, and finally delete from the source table, also by joining to the temp table. (A bit more complicated than a straight delete, isn't it?)
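
A rough sketch of that copy-then-delete variant (untested, like the procedure above; ArchiveTable, the Id key column, and the Col1/Col2 column list are placeholders for your real names):


--  NOTE: Untested sketch; table, key, and column names are placeholders
CREATE PROCEDURE ArchiveAndPurgeData
    @StopBy DateTime
    --  Cutoff time, as before
AS

CREATE TABLE #Batch (Id int PRIMARY KEY)

DECLARE @LastBatch int
SET @LastBatch = 1

WHILE @LastBatch > 0
 BEGIN

    WAITFOR DELAY '00:00:02'

    TRUNCATE TABLE #Batch

    --  Grab the keys for this pass
    INSERT INTO #Batch (Id)
    SELECT TOP 1000 Id
      FROM SourceTable
    --  Add a WHERE clause here to pick only the rows you want archived

    SET @LastBatch = @@rowcount

    BEGIN TRAN

        --  Copy the batch into the archive table...
        INSERT INTO ArchiveTable (Id, Col1, Col2)
        SELECT s.Id, s.Col1, s.Col2
          FROM SourceTable s
          JOIN #Batch b ON b.Id = s.Id

        --  ...then remove the same rows from the source
        DELETE s
          FROM SourceTable s
          JOIN #Batch b ON b.Id = s.Id

    COMMIT

    IF getdate() > @StopBy
        SET @LastBatch = 0

 END

RETURN 0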

+2




Don't worry about waiting between loops; SQL Server itself will handle the contention between your maintenance job and normal server activity.

What really causes problems in situations like this is doing the entire delete at once, in a single transaction. That blows up the database's transaction log and can cause exactly the kind of problems you are describing.

Use a loop like this to delete in manageable chunks:



DECLARE @i INT
SET @i = 1

SET ROWCOUNT 10000   -- Limit each DELETE to 10,000 rows per pass (works on SQL Server 2000)

WHILE @i > 0
BEGIN
    BEGIN TRAN
        DELETE FROM dbo.SuperBigTable
        WHERE RowDate < '2009-01-01'

        SELECT @i = @@ROWCOUNT   -- Capture the count before COMMIT resets @@ROWCOUNT
    COMMIT
END

SET ROWCOUNT 0   -- Remove the limit when finished

      

You can use similar logic for your copy.

+1




WAITFOR will let other processes "go". I have used this technique to keep a large DELETE from blocking the machine: create a WHILE loop, delete a block of rows, and then WAITFOR a few seconds (or less, whatever is appropriate) before the next pass.

0

