Not really related to the memory leak question, but...

I've written a couple of CSV parsers in the past, and I've always wondered how to make something like that really fast.

I wrote one parser that worked pretty much like xGTx's, one that loops over the entire string character by character (see below), and one that works roughly like the one below but uses pointers instead of string operations.
The pointer version was pretty fast, but if you've ever tried MS Excel's CSV import you'll know how fast that is, and mine wasn't nearly as quick.
Any idea how to write a REALLY fast CSV parser?

[pascal]
procedure Split(S: string; Delimiter: string; OutList: TStringList);
var
  I: Integer;
  Buf: string;      // sliding window of the last Length(Delimiter) characters
  Current: string;  // field currently being collected
begin
  Buf := '';
  Current := '';
  for I := 1 to Length(S) do
  begin
    // keep Buf no longer than the delimiter
    if Length(Buf) = Length(Delimiter) then
      Delete(Buf, 1, 1);
    Buf := Buf + S[I];
    Current := Current + S[I];
    if Buf = Delimiter then
    begin
      Buf := '';
      // strip the delimiter off the end of the collected field
      Delete(Current, Length(Current) - (Length(Delimiter) - 1), Length(Delimiter));
      OutList.Add(Current);
      Current := '';
    end;
  end;
  if Length(Current) > 0 then
    OutList.Add(Current);
end;
[/pascal]

The code above takes ~460 msecs for 10,000 elements.
xGTx's takes ~16,578 msecs for the same 10,000 elements (I guess because calling Pos and Delete over and over rescans the string every time, which makes it quadratic).
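For what it's worth, here's roughly what my pointer-based version looked like. This is just a sketch assuming a single-character delimiter (the name SplitFast and the variables are made up for illustration); the main wins are passing S as const (no copy on call) and building each field with a single SetString instead of character-by-character concatenation:

[pascal]
procedure SplitFast(const S: string; Delimiter: Char; OutList: TStringList);
var
  P, Start: PChar;
  Field: string;
begin
  P := PChar(S);
  Start := P;       // start of the field currently being scanned
  while P^ <> #0 do
  begin
    if P^ = Delimiter then
    begin
      // copy the whole field in one allocation
      SetString(Field, Start, P - Start);
      OutList.Add(Field);
      Start := P + 1;
    end;
    Inc(P);
  end;
  // flush the last field (like the version above, this drops a trailing empty field)
  if P > Start then
  begin
    SetString(Field, Start, P - Start);
    OutList.Add(Field);
  end;
end;
[/pascal]

Since it never copies or modifies the source string while scanning, each character is touched once, so it's linear instead of quadratic.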