Foreach with do/while to build an insert query

I have this PowerShell script:

$CSV = Import-Csv "records.csv"
foreach ($Row in $CSV) {
    $Q = "INSERT INTO database..table([id], [FirstName], [LastName]) VALUES ('" + ($Row.'id') + "','" + ($Row.'First Name') + "','" + ($Row.'Last Name') + "')"
    Invoke-QsSqlQuery -query $Q -SQLServer <servername> -database <databaseName>
}


Note: Invoke-QsSqlQuery is my own function.

My problem is that I am calling the SQL command for every row, and this creates a performance issue.

I want to build $Q so that it holds 1000 rows and is then sent to the SQL server in a single call. Importing the file directly (a bulk insert) is not an option, because a large file would have to be copied locally onto the server, and that is not allowed.

With a do/while loop I can count to 1000, which is not difficult, but what if fewer than 1000 records remain at the end?

How do I create a query that will insert multiple records at once?
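
Here is roughly the shape I have in mind: buffer the statements, flush every 1000, and do a final flush after the loop as my guess at handling the leftover rows. The batch size and the $SQLServer/$SQLDatabase placeholders are illustrative:

$CSV = Import-Csv "records.csv"

$SQLServer   = "<servername>"    # placeholder for the real server
$SQLDatabase = "<databaseName>"  # placeholder for the real database

# Buffer INSERT statements and flush them in batches of 1000
$Batch = New-Object System.Collections.Generic.List[string]

foreach ($Row in $CSV) {
    $Batch.Add("INSERT INTO database..table([id], [FirstName], [LastName]) VALUES ('$($Row.id)','$($Row.'First Name')','$($Row.'Last Name')')")

    if ($Batch.Count -eq 1000) {
        Invoke-QsSqlQuery -query ($Batch -join "`r`n") -SQLServer $SQLServer -database $SQLDatabase
        $Batch.Clear()
    }
}

# Flush whatever is left over (fewer than 1000 statements)
if ($Batch.Count -gt 0) {
    Invoke-QsSqlQuery -query ($Batch -join "`r`n") -SQLServer $SQLServer -database $SQLDatabase
}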



2 answers


It should be pretty simple:

  • Assign all statements to an array of strings (your foreach loop is entirely appropriate here)
  • Split the string array into multiple arrays of 1000 or fewer statements
  • Concatenate each group into a single query
  • Execute each query



$CSV = Import-Csv "records.csv"

$SQLServer   = "dbserver.corp.company.tld"
$SQLDatabase = "database"

# Set up a string format template
$InsertTemplate = "INSERT INTO database..table([id], [FirstName], [LastName]) VALUES ('{0}','{1}','{2}')"

# Generate all insert statements and store them in a string array
# (@() guarantees an array even if the CSV has a single row)
$AllInserts = @(foreach($Row in $CSV){
    $InsertTemplate -f $Row.id,$Row.'First Name',$Row.'Last Name'
})

# Split array into an array of 1000 (or fewer) string arrays
$RowArrays = for($i=0; $i -lt $AllInserts.Length; $i+=1000){
    ,@($AllInserts[$i..($i+999)])
}

# Foreach array of 1000 (or fewer) insert statements, concatenate them with newlines and invoke the result
foreach($RowArray in $RowArrays){
    $Query = $RowArray -join [System.Environment]::NewLine
    Invoke-QsSqlQuery -query $Query -SQLServer $SQLServer -database $SQLDatabase
}
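
A variation worth noting: if the target is SQL Server 2008 or later, each batch can be emitted as a single INSERT with a multi-row VALUES list (a table value constructor). The constructor is capped at 1000 rows per statement, which happens to match the batch size used here. A sketch of that variant, reusing $CSV, $SQLServer and $SQLDatabase from the script above:

# Build one "(v1,v2,v3)" tuple per CSV row
$AllValues = @(foreach($Row in $CSV){
    "('{0}','{1}','{2}')" -f $Row.id,$Row.'First Name',$Row.'Last Name'
})

# One INSERT per batch; the table value constructor allows at most 1000 rows
for($i=0; $i -lt $AllValues.Length; $i+=1000){
    $Batch = @($AllValues[$i..($i+999)])
    $Query = "INSERT INTO database..table([id], [FirstName], [LastName]) VALUES " + ($Batch -join ',')
    Invoke-QsSqlQuery -query $Query -SQLServer $SQLServer -database $SQLDatabase
}

Keep in mind that each multi-row INSERT is a single atomic statement, so a bad value anywhere in a batch rolls back that whole batch.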




Depending on some factors, you may be able to use a bulk insert instead of multiple INSERT statements.

There are two requirements to satisfy:
- the file to be imported must be on the SQL Server machine
- the CSV file must meet the format requirements specified on MSDN for imported files



If you can meet the above requirements, you can import the whole file with a single statement:

BULK INSERT database..table
FROM 'C:\FileToImport.csv';
GO
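
Note that a bare BULK INSERT assumes the default field terminator (a tab), so a comma-separated file with a header row typically needs explicit format options. A sketch of how that could be sent through the same helper function; the terminators, the FIRSTROW value, and the file path are assumptions about your file:

$SQLServer   = "<servername>"    # placeholder for the real server
$SQLDatabase = "<databaseName>"  # placeholder for the real database

# BULK INSERT with explicit CSV options; the path is resolved on the
# SQL Server machine and must be readable by the service account
$BulkQuery = @"
BULK INSERT database..table
FROM 'C:\FileToImport.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2  -- skip the CSV header row
);
"@

Invoke-QsSqlQuery -query $BulkQuery -SQLServer $SQLServer -database $SQLDatabase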








