MS SQL problem: field length limited below its declared size.

I have MS SQL DB with various tables and one field in particular is making me sad.

The datatype is set to varchar(100), yet the field behaves as if it were limited to 60 characters.

If I try to put any string longer than 60 characters into the field, I get the exception "String or binary data would be truncated." Even though the string fits within the explicitly declared data type, it still throws.

Could this be something in the DB setup? What could override an explicitly declared data type?

Edit:
There are triggers, but they don't copy the value into another table or otherwise use the data. (This turned out to be wrong.)

Strings less than 60 characters work fine.

All varchar(100) columns show the same problem; all other columns accept the correct values. A varchar(10) column works fine.

Updating any row in this table throws the exception if the new value is longer than 60 characters.

I am trying to insert data directly into a field using SQL Server Management Studio.

No padding is applied.

Answer:

There was a second table in which the column was defined as varchar(60). The update trigger calls a stored procedure that inserts data into that "Denormalised" table.
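A minimal sketch of this failure mode, with hypothetical table and trigger names (the question says the trigger goes through a stored procedure; the insert is inlined here for brevity):

```sql
CREATE TABLE dbo.Main         (Id int PRIMARY KEY, Val varchar(100));
CREATE TABLE dbo.Denormalised (Id int, Val varchar(60));   -- the hidden limit
GO
CREATE TRIGGER trg_Main_Upd ON dbo.Main AFTER UPDATE AS
    INSERT INTO dbo.Denormalised (Id, Val)
    SELECT Id, Val FROM inserted;   -- fails once Val exceeds 60 characters
GO
INSERT INTO dbo.Main VALUES (1, 'short');

-- Raises "String or binary data would be truncated",
-- even though dbo.Main.Val is varchar(100):
UPDATE dbo.Main SET Val = REPLICATE('x', 61) WHERE Id = 1;
```

The error points at the statement you ran, not at the trigger's target table, which is why the varchar(100) column looks like the culprit.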

Thanks for the help.

+1




2 answers


Is the size that the COL_LENGTH metadata function reports for this column what you expect?
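A quick check, with hypothetical table and column names; COL_LENGTH returns the defined size in bytes (100 for varchar(100), but 200 for nvarchar(100)):

```sql
SELECT COL_LENGTH('dbo.MyTable', 'MyColumn') AS defined_length_bytes;

-- INFORMATION_SCHEMA shows the declared character length as well:
SELECT CHARACTER_MAXIMUM_LENGTH
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_NAME = 'MyTable' AND COLUMN_NAME = 'MyColumn';
```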

Do you have a default constraint for this column?

Assuming, per the previous answers, that this is not a trigger issue or an nvarchar issue, and you believe the cutoff is 60: what happens if you update this single column with a SUBSTRING of length 60, and then of length 61? That should confirm or refute the theory.
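The experiment might look like this (names hypothetical; the point is to bracket the suspected 60-character limit):

```sql
DECLARE @s varchar(100) = REPLICATE('x', 61);

UPDATE dbo.MyTable
SET    MyColumn = SUBSTRING(@s, 1, 60)   -- should succeed if the cutoff is 60
WHERE  Id = 1;

UPDATE dbo.MyTable
SET    MyColumn = SUBSTRING(@s, 1, 61)   -- should fail if the cutoff is 60
WHERE  Id = 1;
```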



Alternatively, the database collation and/or encoding settings may have changed since the initial data was entered, which can lead to oddities. You say this is a copy of the original database. Is it sitting on another instance of SQL Server? If so, do both instances have the same collation and encoding settings?

EDIT: The ANSI_PADDING option you mention is deprecated, and the setting will be permanently ON in future versions of SQL Server. The fact that you are looking at it suggests the value you are inserting is padded in some way, possibly with trailing spaces. However, that is inconsistent with your SUBSTRING experiment, which shows a clean 60-character cutoff, so I doubt this setting matters, especially since it is always ON for nvarchar columns.

Does any 61-character string throw the exception on update? Also, while you have checked the table's own triggers, are there any cascading (nested) triggers that might be throwing it?
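One way to hunt for triggers, direct or nested (table name hypothetical); nested triggers fire when a trigger's own DML hits another table that also has triggers:

```sql
-- Triggers defined directly on the table:
SELECT t.name                    AS trigger_name,
       OBJECT_NAME(t.parent_id)  AS table_name,
       t.is_disabled
FROM   sys.triggers AS t
WHERE  t.parent_id = OBJECT_ID('dbo.MyTable');

-- Is nested trigger execution enabled on this server?
EXEC sp_configure 'nested triggers';
```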

+1




Perhaps you want NVARCHAR(100), or rather NVARCHAR(60).

A single NVARCHAR character takes twice the storage of a VARCHAR character. Use NVARCHAR if your input is Unicode.
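You can see the size difference with DATALENGTH, which returns storage in bytes rather than characters:

```sql
DECLARE @v varchar(10)  = 'abc';
DECLARE @n nvarchar(10) = N'abc';

SELECT DATALENGTH(@v) AS varchar_bytes,   -- 3
       DATALENGTH(@n) AS nvarchar_bytes;  -- 6
```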

EDIT:



Based on your comments, it seems nvarchar is not the solution here, and it is hard to guess at the problem from the information provided.

Could you script out your table with its constraints and triggers and post the code? That would definitely help find the source of the problem.

+2








