
Thread: Bulk loading of records from a stream. (Chunk VS for-loop)

  1. #1

    I have two related questions here, so I figured the best thing to do is just ask them both.

    Here's a basic example:[pascal]type
      TEnum = (enum1, enum2, enum3, enum4, enum5);
      MyRecord = record
        s: string[10];
        n: word;
        e: TEnum;
      end;

    var
      MyRecords: array of MyRecord;
      L, I: longword;
      F: TFileStream;

    begin
      // TFileStream.Create needs an open mode; fmOpenRead is in SysUtils.
      F := TFileStream.Create('myfile.dat', fmOpenRead);
      try
        F.Read(L, SizeOf(L));   // the record count is stored first
        SetLength(MyRecords, L);
        for I := 0 to L - 1 do  // one Read call per record
          F.Read(MyRecords[I], SizeOf(MyRecords[I]));
      finally
        F.Free;
      end;
    end;[/pascal]

    Now, can I shortcut by skipping the for-loop? Would the enumerated type (which only needs 3 bits) trip me up, or would it just be padded out to whole bytes? I know the for-loop method works, but I don't know whether the following hypothetical bulk array load would work:
    [pascal]F.Read(L, SizeOf(L));
    SetLength(MyRecords, L);
    // Read takes an untyped var parameter, so pass MyRecords[0]
    // itself rather than @MyRecords[0]:
    F.Read(MyRecords[0], SizeOf(MyRecords[0]) * L);[/pascal]
    I'm concerned that it won't work right, but bulk loading would help things an awful lot.
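
    A minimal sketch of a check I could run first (assuming FPC or Delphi; MyPackedRecord is just an illustrative name for comparison):
    [pascal]type
      // Same fields, but "packed" removes alignment padding, so the size
      // is exactly 11 + 2 + SizeOf(TEnum); the enum itself occupies whole
      // bytes, never 3 bits (1 byte under Delphi's default {$Z1} setting).
      MyPackedRecord = packed record
        s: string[10];
        n: word;
        e: TEnum;
      end;

    begin
      // If these differ, the unpacked record has padding, and a file
      // written with one declaration won't bulk-load into the other.
      WriteLn(SizeOf(MyRecord), ' vs ', SizeOf(MyPackedRecord));
    end.[/pascal]
    The key point: as long as the file was written with the same record declaration and compiler settings, any padding matches on both sides, so a bulk read sees exactly what the write produced.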

  2. #2

    The only problem you might run into is if SizeOf(MyRecords[0]) * L is greater than the maximum count a single Read call accepts. In that case the value would wrap around to a much smaller number and you would not read the entire thing.

    If this is a major concern, check the type of Read's count parameter and compare your total against its upper bound before you read. If the total is greater, read it in a loop of maximum-size blocks.

    Sorry there isn't much detail here, but things like the compiler and its options can change how Stream.Read works and the maximum block size it can read. Checking the result of Stream.Read against the value you expect is also a good idea.
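
    Something like this sketch of the block loop, assuming a Classes.TStream (ReadAll and MaxChunk are just illustrative names, and 64 MB is an arbitrary chunk size):
    [pascal]uses Classes; // TStream, EStreamError

    const
      MaxChunk = 64 * 1024 * 1024; // arbitrary 64 MB upper bound per call

    // Reads Total bytes into Buffer in bounded chunks, verifying that
    // each Read call actually returned the number of bytes requested.
    procedure ReadAll(Stream: TStream; var Buffer; Total: Int64);
    var
      p: PByte;
      chunk: Longint;
    begin
      p := @Buffer;
      while Total > 0 do
      begin
        if Total > MaxChunk then
          chunk := MaxChunk
        else
          chunk := Longint(Total);
        if Stream.Read(p^, chunk) <> chunk then
          raise EStreamError.Create('short read');
        Inc(p, chunk);
        Dec(Total, chunk);
      end;
    end;[/pascal]
    Called as ReadAll(F, MyRecords[0], SizeOf(MyRecords[0]) * Int64(L)), it behaves like the single big Read but never passes an oversized count.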

  3. #3

    Nah, it's good. I'm not going to be handling 2 GB files or anything of the sort, and if I am, they'll be split into pieces. So it should work.

    Right now I'm working on a sort of virtual file system with compression support that's optimized for games, and I'm trying to squeeze a little extra performance out of the archive format.

  4. #4

    You can print out the pointers of MyRecords[0] and MyRecords[1] along with SizeOf(MyRecords[0]). If the difference between the two addresses equals the record size, a bulk read lines up exactly with the in-memory layout. (SizeOf(MyRecords) itself won't tell you much; on a dynamic array it just returns the size of a pointer.) That should clear things up for you, but I would guess it works anyway.

    For example:
    [pascal]// PtrUInt (NativeUInt in Delphi) keeps the casts safe on 64-bit.
    ShowMessage(Format('%d, %d, %d', [
      PtrUInt(@MyRecords[0]), PtrUInt(@MyRecords[1]),
      SizeOf(MyRecords[0])]));[/pascal]

  5. #5

    [pascal]
    Stream.Read(L, SizeOf(L));
    SetLength(MyRecords, L);
    Stream.Read(MyRecords[0], SizeOf(MyRecord) * L);
    [/pascal]
    This would be the way. I do it like that and it has never failed me.
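
    One variant worth mentioning: TStream.ReadBuffer raises an exception on a short read instead of returning a count to check, so a minimal sketch of the same pattern would be:
    [pascal]// ReadBuffer raises an exception if the stream ends early,
    // instead of returning a byte count you have to check yourself.
    Stream.ReadBuffer(L, SizeOf(L));
    SetLength(MyRecords, L);
    if L > 0 then // guard: MyRecords[0] is invalid on an empty array
      Stream.ReadBuffer(MyRecords[0], SizeOf(MyRecord) * L);[/pascal]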

  6. #6

    Awesome, glad to know that will work. Thanks for the confirmation.
