Long strings truncated during insert

I am running a .NET 4.0 application that uses LINQ to SQL to communicate with SQL Server 2008 R2 Standard Edition. I have a table with a column of type nvarchar(max),

and the application populates that column as part of its job.

For rows larger than roughly 30 MB, we find that after the insert/update completes, part of the row has been truncated and what is stored on the server is not the complete value. The point is that the strings are truncated at variable positions (more than 30 MB of data still remains after truncation), so there is no fixed cut-off point that would suggest a size limitation (although that might still be the case).

I don't see any errors during the inserts, although I sometimes notice that during such long inserts/updates, SQL Server closes the connection. But shouldn't the operation be rolled back in that case?

Any ideas would be appreciated; I'm not sure how to proceed.

+3


2 answers


I suspect it is not SQL Server, or even the connection, that is truncating the data on insert. I wonder whether the data is being corrupted when the command parameter value is set, or even inside the application itself. I experimented with connection and command timeouts to induce them during an insert, and your assumption that the implicit transaction would be rolled back is correct. Here's an example:

using System;
using System.Data;
using System.Data.SqlClient;

// Use a short timeout to provoke a failure during the insert
using (var conn = new SqlConnection(@"Data Source=(localdb)\mssqllocaldb;Integrated Security=SSPI;Initial Catalog=tempdb;Connection Timeout=1"))
using (var cmd = conn.CreateCommand())
{
    cmd.CommandTimeout = 1;
    conn.Open();
    cmd.CommandText = @"if not exists(select * from sys.tables where name = 'LongStringTruncation')
                        begin
                            create table LongStringTruncation (Data nvarchar(max));
                        end";
    cmd.ExecuteNonQuery();

    cmd.CommandText = "insert LongStringTruncation values (@t)";
    var t = cmd.CreateParameter();
    t.DbType = DbType.String;
    t.ParameterName = "@t";
    t.Value = new String('A', 30000000); // 30,000,000 chars = 60 MB as nvarchar
    cmd.Parameters.Add(t);
    cmd.ExecuteNonQuery();
}


When it succeeds (i.e. the insert completes before the timeout), the following query shows that all 30,000,000 characters were transferred and inserted; when it fails with a timeout, the table is empty.



select len(Data) from LongStringTruncation;


Note that I was testing on SQL Server 2014, so there may be a bug in SQL Server 2008 R2 that this test would catch. My test also uses parameters; if you instead build your SQL with string concatenation, that could be another source of problems.

If none of these explain the problem, then the only conclusion that makes sense is that the application itself is truncating the data in some way before inserting it. Given the random truncation positions, my first suspects would be how you receive the string data (e.g. a buffered stream that isn't flushed before the result is read) or a multi-threaded race between receiving the value and signaling that it is complete.
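To illustrate the first suspect, here is a minimal sketch (my own example, unrelated to the asker's code) showing how a buffered writer that is not flushed silently drops the tail of a large string, at a position that depends on the buffer fill:

```csharp
using System;
using System.IO;
using System.Text;

class FlushDemo
{
    static void Main()
    {
        var ms = new MemoryStream();
        var writer = new StreamWriter(ms, Encoding.UTF8);

        // Write a large string; StreamWriter keeps the last partial
        // chunk in its internal char buffer until it is flushed.
        writer.Write(new String('A', 100000));

        long visibleBeforeFlush = ms.Length; // the tail still sits in the buffer

        writer.Flush(); // push the buffered remainder to the stream
        long visibleAfterFlush = ms.Length;

        Console.WriteLine("before flush: " + visibleBeforeFlush +
                          " bytes, after flush: " + visibleAfterFlush + " bytes");
        // Reading the stream before Flush() yields a truncated value,
        // cut at a position that depends on the buffer state -- much like
        // the variable truncation points described in the question.
    }
}
```

If the application assembles the nvarchar(max) value through a similar writer or stream pipeline, a missing `Flush()` (or a race with the thread that reads the result) would produce exactly this kind of variable-position truncation.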

+3


I had the same problem. I solved it by changing the data type of the input parameters in the stored procedure.

Double-check a few things: the data type of the stored procedure's input parameter must match the data type of the table column you are inserting into.

Also check your C# code: the data type and length (e.g. VARCHAR(50)) must be the same in the stored procedure, the table, and the code.
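As a sketch of the same point on the C# side (the helper and procedure names below are hypothetical): when you construct a SqlParameter yourself, pass a Size of -1 so the provider treats the value as nvarchar(max); a fixed positive Size silently truncates longer values on the client before they ever reach the server.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

static class ParamHelper
{
    // Hypothetical helper: a parameter sized for nvarchar(max).
    // Size = -1 maps to nvarchar(max); a positive Size (e.g. 50)
    // would silently truncate longer values when the command runs.
    public static SqlParameter NVarCharMax(string name, string value)
    {
        return new SqlParameter(name, SqlDbType.NVarChar, -1) { Value = value };
    }

    // Example usage against a hypothetical stored procedure:
    // cmd.CommandType = CommandType.StoredProcedure;
    // cmd.CommandText = "dbo.SaveLongString";          // hypothetical name
    // cmd.Parameters.Add(NVarCharMax("@data", longString));
}
```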

0

