Terrible dithering on some computers



czar
07-02-2004, 11:24 PM
Hi all,

I am having some annoying difficulties with drawing imagelists on the screen. On my development computer the images appear exactly as they should. However, on two other machines these same images look like they have been dithered back to 256 colours. Has anyone noticed this before, and if so, is there a workaround / fix?

All graphics parameters are the same on all three machines: Windows 2000 or 98, with a 32-bit colour desktop.

It works on an ATI 9700 Pro, but is incorrect on an ATI 8500 and a TNT2.

On the two machines that play up, if I run in 16-bit (which switches to the Microsoft renderer) then the images look correct, but I only get about 3 FPS (see my other posts :) ).

I have played around with various options to no success.

Any ideas or help would be very much appreciated.

Andreaz
20-02-2004, 09:11 AM
Sorry for my late reply.

Must say that I don't really know what causes this; the only thing I can come up with is that the drivers are out of date.

czar
20-02-2004, 07:06 PM
;)

I had noticed that you hadn't been around. Holiday, perhaps?

I will try and update the drivers on the ATI 8500 and see what happens. I will let you know.

BTW I love your components and I hope that you keep improving and working on them. The step from DelphiX to GLXtreem has been very easy.


R

Andreaz
21-02-2004, 07:37 PM
Have had a tough period at school lately; I'm studying computer engineering at Halmstad University. =) I have made some changes in the component engine while coding the Snake game, but that's just about it.

Hopefully the workload will decrease somewhat over the next weeks so I can get some more work done on GLXtreem.

Have patience. :D

czar
29-02-2004, 03:52 AM
I have an update and a further question. I installed the latest drivers on my ATI 8500 and nothing changed; the bitmaps still looked terrible, as if they were dithered to 256 colours.

I went to the screen properties and played around with the OpenGL settings. When I switched the "Texture Preference" to "high quality" the dithering problem disappears; at a lower setting the problem returns.

I decided to try these settings on my ATI 9700 pro. However, the dithering problem did not occur even when I switched off all the quality settings.

So it seems to be related to the quality setting of the textures. If this is the case, is there a command I can use to force my app to use the highest texture quality setting, bypassing whatever has been chosen in the control panel?

Can anyone suggest a command or way to achieve this?

Alimonster
29-02-2004, 10:04 AM
Are you certain you're not thinking of banding from 24-bit bitmaps being converted to 16-bit, rather than down to 256 colours? ATI drivers did have a problem related to texture preference: if you created a texture without specifying the exact bit depths wanted, the driver sometimes chose 16 bit instead of 32 bit (e.g. GL_RGB versus GL_RGB8 -- can't remember the exact constants though).
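
In Delphi terms the difference is roughly this (a minimal, untested sketch: Width, Height and Pixels are placeholders for your own texture data, and the sized constant assumes a header unit such as dglOpenGL that declares it):

procedure UploadTexture(Width, Height: Integer; Pixels: Pointer);
begin
  // With an unsized internal format like GL_RGB, the driver picks the
  // storage depth itself, and some ATI drivers pick 16 bit when the
  // "Texture Preference" slider is below "high quality".
  // A sized format pins the texture to 8 bits per channel instead:
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, Width, Height, 0,
               GL_RGB, GL_UNSIGNED_BYTE, Pixels);
end;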

czar
01-03-2004, 07:21 PM
Hi Alimonster.

I am very used to the banding look you get going from 24-bit to 16-bit; most of the programs I do for work are in 16-bit colour (with 24-bit pictures). However, banding does not account for what I am seeing. I could make a screen dump to show everyone; however, uploading a picture does not appear to be a simple matter.

The effect is much worse than banding; the picture looks far too distorted. It is quite possible that the picture is not in fact being reduced to 256 colours, but that is what the end result looks like.

Any ideas as to OpenGL commands I could try to force the renderer to switch to higher-quality output? OpenGL is completely new to me.

czar
15-03-2004, 08:29 AM
Found some information about the problem :)

Found on Deja News or Google:

==============

> Hi, I have a windows openGL application that uses texture maps. On a
> system with an Nvidia geForce 2, all is well. However, on my ATI
> Radeon 9000 system, the textures look as if they are being stored in
> something like 4 bits per channel colour. (The images I send to the
> card are 24 bits per pixel).

Are you using GL_RGBA8 or GL_RGBA for your internal format in
glTexImage2D()? Using GL_RGBA8 will likely give better results on an ATI.

> If I change the Texture preference slider on the openGL tab in the
> display settings from "quality" to "high quality", this fixes things.
> My question is, is there a way of setting the card to "high quality"
> texture mode from software. Perhaps an opengl extension command?
>
> Many thanks for any help
>
> Matt Taylor

=============

and here

=============

http://www.flipcode.com/cworks/issue05.shtml
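
For anyone hitting the same thing from Delphi, the fix suggested in that post boils down to passing a sized internal format when the texture is created. A sketch under the same assumptions as above (CreateTexture32, TextureID and BitmapBits are placeholder names, and GL_RGBA8 again assumes a header unit that declares it):

procedure CreateTexture32(Width, Height: Integer; BitmapBits: Pointer);
var
  TextureID: GLuint;
begin
  glGenTextures(1, @TextureID);
  glBindTexture(GL_TEXTURE_2D, TextureID);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
  // GL_RGBA8 (rather than plain GL_RGBA) pins the internal storage to
  // 8 bits per channel, so the driver cannot silently drop to 16 bit
  // when the "Texture Preference" slider is below "high quality":
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, Width, Height, 0,
               GL_RGBA, GL_UNSIGNED_BYTE, BitmapBits);
end;

Note that 24-bit bitmap data would need to be expanded to RGBA before upload (or use GL_RGB8 with GL_RGB instead, as in the earlier sketch).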

Andreaz
15-03-2004, 03:50 PM
Nicely done; that will be changed in v2 of GLXtreem, which we are working on.