Extra bytes appended to file when stored in SQL Server
I am receiving a file via an SFTP session in a WCF service. The file is read into a Stream object. I successfully wrote this stream out to a file on disk and verified that the file transferred correctly. But when I convert the stream to a byte array in order to store it in a SQL Server table (with the column declared as varbinary(max)), it seems like 8 bytes have been added to the file.
As an example, I downloaded a .txt file using the WCF SFTP service and saved it to the database. Then I exported it back to disk using BCP. When the exported file is opened, the first line starts with an extra "U" followed by 7 blanks.
The same happens with Office documents. I followed the same procedure with an .xls file that is 49 KB to begin with, but after exporting it with BCP, the file is corrupt and 50 KB in size.
Oddly enough, .pdf files seem to be saved and exported correctly.
Several snippets of code that I am using:
To convert the stream to a byte array:

var stream = (MemoryStream)data;
stream.Seek(0, SeekOrigin.Begin);   // ToArray() ignores the position, but rewinding is harmless
byte[] m_Bytes = stream.ToArray();
To store the byte array in SQL Server (a length of -1 maps the parameter to varbinary(max)):

cmd.Parameters.Add("@File", System.Data.SqlDbType.VarBinary, -1).Value = file;
To export the file from SQL Server to disk using BCP:
bcp "SELECT [File] FROM SFTPUpload.dbo.Upload" queryout "C:\SFTP\Test.txt" -T -N -S Server_Name\Instance_Name
Could the problem be with the method I'm using to save the file to the database, or with how I retrieve the file using BCP? The latter seems more likely, since saving the stream directly to disk did not corrupt the file.
Any help or advice is greatly appreciated.
I had the same problem and here's how to fix it.
In my example, I had a SQL Server table called External_File, with a field ExtFile_Data containing the raw data of a file, typically an entire Excel file. What I wanted to do was use bcp to quickly export that raw data to a file on our database server.
This command worked almost perfectly ...
-- This does not work - don't use it!!
EXEC master..xp_cmdshell 'BCP "select ef.ExtFile_Data FROM [External_File] ef WHERE ExtFile_ID = 1005" queryout "G:\ImportData\TestFile.xlsm" -T -N'
... but it added 8 bytes at the beginning of each file, so none of them could be opened.
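As far as I can tell, those 8 bytes are the length prefix that bcp's native format (the -N switch) writes before a varbinary(max) value: an 8-byte little-endian integer. For an 85-byte (0x55) payload the prefix starts with the byte 0x55 ("U") followed by seven zero bytes, which lines up with the "U" plus padding the question describes. As a rough sketch (my own reconstruction of the symptom, not part of the original code), this Python snippet simulates the prefix and strips it from an already-exported file:

```python
import struct

def strip_native_prefix(data: bytes) -> bytes:
    """Strip the 8-byte length prefix that bcp -N appears to write
    before a varbinary(max) value (prefix layout assumed here)."""
    if len(data) < 8:
        raise ValueError("file too short to hold an 8-byte prefix")
    (length,) = struct.unpack("<q", data[:8])  # little-endian 64-bit length
    payload = data[8:]
    if length != len(payload):
        raise ValueError(f"prefix says {length} bytes, found {len(payload)}")
    return payload

# Simulate what -N would produce for an 85-byte (0x55) payload:
payload = b"x" * 0x55
exported = struct.pack("<q", len(payload)) + payload
print(exported[:8])  # b'U\x00\x00\x00\x00\x00\x00\x00'
assert strip_native_prefix(exported) == payload
```

If re-exporting with a format file is not an option, chopping the first 8 bytes off an already-exported file this way recovers the original content, assuming the prefix layout above.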
The solution was to have bcp use a format file which I named "BCPformat.fmt" ...
EXEC master..xp_cmdshell 'BCP "select ef.ExtFile_Data FROM [External_File] ef WHERE ExtFile_ID = 1005" queryout "G:\ImportData\TestFile.xlsm" -T -f "G:\ImportData\BCPformat.fmt" '
This is what my format file looked like:

10.0
1
1       SQLBINARY       0       0       ""      1       ExtFile_Data        ""
Please note that your format file must match the structure of the data you are trying to export.
In my case, I only exported one column, named ExtFile_Data, so that is the field name in my .fmt file. Your field names will be different, so your .fmt file will need to reflect that.
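For anyone adapting this, the columns of a non-XML bcp format file are, left to right: host-file field order, host-file data type, prefix length, host-file data length, field terminator, server column order, server column name, and column collation. The prefix length of 0 on the SQLBINARY field is what stops bcp from writing a length prefix in front of the data. Annotated, the one-column file above reads roughly like this (the annotations are mine; a real .fmt file cannot contain comments):

```
10.0                                            version of the format file
1                                               number of fields in the data file
1  SQLBINARY  0  0  ""  1  ExtFile_Data  ""     field 1: binary, 0-byte prefix,
                                                0 = variable data length, no
                                                terminator, server column 1
                                                (ExtFile_Data), no collation
```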
But with this .fmt file in place (and accessible to any SQL Server user running the script), I could successfully export the raw data to an Excel file without those extra 8 bytes at the start.
And this method is much faster at turning large chunks of raw SQL Server data into a real file than any other method I could find.
It's just ironic that bcp doesn't have a simple flag to dump a varbinary field into a file without adding something to the data.
Hope it helps.