Updating a record file and keeping backward compatibility

I have a file like this:

file of record
  Str: string[250];
  RecType: Cardinal;
end;


But after using this file for a while, my client found that Str never exceeds 100 characters, and that additional fields are needed.

In the new version, we have a file like this:

file of packed record
  Str: string[200];
  Reserved: array[1..47] of Byte;
  NewField: Cardinal;
  RecType: Cardinal;
end;


This record has the same size: in the previous (non-packed) record there was one unused alignment byte between Str and RecType, so with the default field alignment both layouts occupy 256 bytes.
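For reference, here is a minimal sketch of what I mean (the type names TOldRec and TNewRec are only for illustration), with the size assumption made explicit:

type
  TOldRec = record                     // old layout, default field alignment
    Str: string[250];                  // 251 bytes + 1 alignment byte
    RecType: Cardinal;                 // at offset 252
  end;

  TNewRec = packed record              // new layout
    Str: string[200];                  // 201 bytes
    Reserved: array[1..47] of Byte;    // 47 bytes, keeps the total size identical
    NewField: Cardinal;                // at offset 248
    RecType: Cardinal;                 // at offset 252, same as before
  end;

procedure CheckLayout;
begin
  Assert(SizeOf(TOldRec) = 256);
  Assert(SizeOf(TNewRec) = 256);
end;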

Question: what happens when this new file is read by the old code? It needs to stay backward compatible.

An example of how the old code reads the file:

var
  FS: TFileStream;
  Rec: record
         Str: string[250];
         RecType: Cardinal;
       end;
...
// reading record by record from file:
FS.Read(Rec, SizeOf(Rec));

1 answer


The old-school Pascal string (ShortString) uses the first byte of the string (index 0) to store the length of the string.
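A small illustration of that length byte (just a sketch):

var
  S: string[250];
begin
  S := 'ABCDEFGHIJ';
  // For a ShortString, index 0 holds the length as a character value
  Assert(Length(S) = 10);
  Assert(Ord(S[0]) = 10);
end;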

Look at the memory layout of such a record:

byte    0  1  2  3  4  5  6  7  8  9 10 11  12  13 ........ 248..251 252..255
value  10 65 66 67 68 69 70 71 72 73 74  0 200 130          NewField  RecType



From byte 11 up to byte 251, the memory can contain anything (the unused tail of Str, the Reserved bytes and the new NewField). The old program simply ignores it (it is never shown), because it takes the value 10 in byte 0 as the length of the string, so the string becomes 'ABCDEFGHIJ'.

This ensures that an old program reading a file written by the newest version never sees garbage at the end of its strings: what it displays is limited by the actual string length, and those extra bytes are simply ignored. RecType is still read correctly, because it sits at the same offset in both layouts.
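As a sketch (the file name 'data.rec' is only an example), this is what the old declaration sees when it reads a record written by the new program:

// uses Classes (TFileStream) and SysUtils (file mode constants)
var
  FS: TFileStream;
  OldRec: record
    Str: string[250];
    RecType: Cardinal;
  end;
begin
  FS := TFileStream.Create('data.rec', fmOpenRead or fmShareDenyWrite);
  try
    FS.ReadBuffer(OldRec, SizeOf(OldRec));
    // Only Length(OldRec.Str) characters are visible; the bytes that now hold
    // Reserved and NewField fall inside the ignored tail of Str.
    // RecType is read correctly because its offset did not change.
    Writeln(OldRec.Str, ' ', OldRec.RecType);
  finally
    FS.Free;
  end;
end;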

You do need to double-check what happens if the old program writes records back to the file, i.e. whether the values stored in the new fields survive the round trip. I think this is safe as well, but I am not sure and do not have Delphi at hand to test it.
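If you want to verify it, something along these lines could serve as a quick check (the file name 'roundtrip.rec' and the field values are just examples, and it assumes both layouts are exactly 256 bytes):

type
  TOldRec = record
    Str: string[250];
    RecType: Cardinal;
  end;
  TNewRec = packed record
    Str: string[200];
    Reserved: array[1..47] of Byte;
    NewField: Cardinal;
    RecType: Cardinal;
  end;
var
  FS: TFileStream;
  NewRec: TNewRec;
  OldRec: TOldRec;
begin
  // 1. the new program writes one record
  FillChar(NewRec, SizeOf(NewRec), 0);
  NewRec.Str := 'ABCDEFGHIJ';
  NewRec.NewField := 12345;
  NewRec.RecType := 7;
  FS := TFileStream.Create('roundtrip.rec', fmCreate);
  try
    FS.WriteBuffer(NewRec, SizeOf(NewRec));
  finally
    FS.Free;
  end;

  // 2. the "old" program reads it, changes the string and writes it back
  FS := TFileStream.Create('roundtrip.rec', fmOpenReadWrite);
  try
    FS.ReadBuffer(OldRec, SizeOf(OldRec));
    OldRec.Str := 'HELLO';   // should only touch the length byte and 5 chars
    FS.Position := 0;
    FS.WriteBuffer(OldRec, SizeOf(OldRec));
  finally
    FS.Free;
  end;

  // 3. the new program reads it again and checks the extra field
  FS := TFileStream.Create('roundtrip.rec', fmOpenRead);
  try
    FS.ReadBuffer(NewRec, SizeOf(NewRec));
    Assert(NewRec.NewField = 12345, 'NewField lost on round trip');
    Assert(NewRec.RecType = 7, 'RecType lost on round trip');
  finally
    FS.Free;
  end;
end;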
