I'm currently researching the feasibility of displaying my 2D texture in false color.

Some points about the app I have:
- a rectangle_ARB (GL_ARB_texture_rectangle) BMP, up to 4096x7000, but in this case much smaller: 0.5-1 Mpixel (a rough upload sketch follows this list)
- currently it is 8-bit; in the future also 12-bit (which will probably be stored as 16-bit)
- There is an overlay drawn on top of the rectangle texture. That matters mostly because it requires depth testing; otherwise it doesn't.
- Using Mike Lischke's dglOpenGL headers.
- Speed, speed, speed: a 10 ms time budget. The card is a GeForce 7x00 or better, or an 8x00 or even heavier in special cases.
- Older NVIDIA cards had a palette extension (EXT_paletted_texture), but that is gone on modern cards (and I need the more modern cards, 6x00/7x00 minimum, to upload such big BMPs at 10+ per second).
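To make that concrete, here is a rough, untested sketch of how I picture the upload side with dglOpenGL (it assumes a GL context is current and the extensions are loaded; the procedure name and parameters are placeholders of mine, not existing code):

```pascal
procedure UploadImageAndPalette(Pixels, Palette: Pointer; W, H: Integer;
  out ImgTex, PalTex: GLuint);
begin
  // Source image: single-channel 8-bit data in a rectangle texture
  glGenTextures(1, @ImgTex);
  glBindTexture(GL_TEXTURE_RECTANGLE_ARB, ImgTex);
  glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
  glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
  glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_LUMINANCE8, W, H, 0,
    GL_LUMINANCE, GL_UNSIGNED_BYTE, Pixels);

  // False-color lookup table: 256 RGB entries in a 1D texture
  // (12/16-bit data would mean 4096 entries here and GL_LUMINANCE16 above)
  glGenTextures(1, @PalTex);
  glBindTexture(GL_TEXTURE_1D, PalTex);
  glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
  glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
  glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB8, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, Palette);
end;
```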

So in short: for every pixel in the bitmap, a lookup should be done in a 256- or 4096-entry table (a 1D texture, I guess), probably in the fragment shader.
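Something along these lines is what I have in mind for that shader (untested; the uniform names are placeholders, and I'm assuming unnormalized rectangle-texture coordinates coming in from the fixed-function pipeline):

```glsl
#version 110
#extension GL_ARB_texture_rectangle : enable

uniform sampler2DRect image;   // 8-bit source image, uploaded as GL_LUMINANCE
uniform sampler1D     palette; // 256- (later 4096-) entry false-color table

void main()
{
    // With a luminance texture the intensity ends up in .r as a 0..1 value
    float index = texture2DRect(image, gl_TexCoord[0].st).r;

    // Use it as the coordinate into the 1D lookup texture.
    // For exact texel centers one might rescale:
    //   index = index * (255.0 / 256.0) + 0.5 / 256.0;
    gl_FragColor = texture1D(palette, index);
}
```

The two samplers would then be bound to their texture units with glUniform1i after the usual glCreateShader/glCompileShader/glLinkProgram steps via dglOpenGL.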

Does anybody know of a working example of something like this? Is the approach sane to begin with?