
Thread: 2006 PGD Annual: Stage 5 feedback

  1. #41

    2006 PGD Annual: Stage 5 feedback

    Thank you all for the feedback! It's obvious now that I have some critical problems with Radeon:
    - Loading time (creating display lists for animations) is ridiculously long.
    - Volumetric fog somehow gets applied to the upper rock parts.
    - 2D controls problems --- the sword should be drawn with alpha-test, the item slots' borders with blending. But on the Radeon screenshots they all look as if they were drawn "just like that", covering things underneath.
    - Switching tasks (Alt+Tab) corrupts graphics when switching back to the game (although this is known to happen on older Windows versions (98 and the like) with crappy graphic drivers, it definitely should not happen on XP with the latest ATI drivers).

    Traveler's screenshot from a GeForce 6800 GT also shows an artifact when rendering the life indicator --- it should be rendered with blending (as in Huehnerschaender's screenshot), while on Traveler's screenshot it shows a white background underneath.

    There's also the problem with wrong mouse sensitivity: it seems that the mouse sensitivity values get assigned completely wrong. Sometimes it gets 9.0, sometimes some ridiculously large value (though it's not MaxSingle). I understand now that this is also what Traveler meant by "Something else I noticed is that changed controls are not loaded properly again when restarting the game. I think you are converting values incorrectly.". This is very strange, because these values are changed only when reading them from the config file, setting them to defaults, or moving the slider in the menu. And the converting is done using the plain StrToFloat / FloatToStr routines. And, needless to say, I don't experience these problems --- neither on Linux nor on Windows (XP and 2000 Prof). So they may indicate some memory corruption.
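    A side note for anyone debugging this: one classic cause of exactly such symptoms is that StrToFloat / FloatToStr honour the system decimal separator, so a "9.0" written on a dot-locale machine can read back wrong (or throw) on a comma-locale one. This is only a guess, not something confirmed in the game's sources; a minimal sketch of a locale-independent round-trip, with made-up routine names:

        uses SysUtils;

        function FloatToConfigStr(const Value: Single): string;
        begin
          DecimalSeparator := '.'; { force locale-independent output }
          Result := FloatToStr(Value);
        end;

        function ConfigStrToFloat(const S: string): Single;
        begin
          DecimalSeparator := '.'; { accept the dot we wrote above }
          Result := StrToFloat(S);
        end;

    (Mutating the global DecimalSeparator is crude; newer compilers offer StrToFloat / FloatToStr overloads taking a TFormatSettings record instead.)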

    Sadly, none of these problems are reproducible on my two computers at home... Fixing this will have to wait until tomorrow or Thursday, when I may be able to test on different machines, some of them hopefully with a Radeon. Unless someone has time to actually debug and see what's going on. For starters, compile the game in debug mode (make DEBUG=t build-win32; FPC 2.0.2 is required), then you can add some Writelns to see what's going on. Or use gdb, or just Lazarus (I just added the appropriate files to the sources so the project can be loaded in the Lazarus IDE; this will be included in the next release --- if someone is interested in it now, let me know).
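    For example, a trace like the following (AnimationName and FrameCount are made-up identifiers here, not the actual variable names in the sources), dropped into the creature-loading code, would show how far loading gets before things go wrong:

        Writeln('Preparing animation ', AnimationName, ': ', FrameCount, ' frames');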

    Oh, and I just tested the game on Mesa (i.e. the standard Linux software OpenGL implementation, no NVidia OpenGL) and the game runs correctly (it's awfully, unacceptably slow, but it runs). I was hoping to maybe reproduce some Radeon problems with Mesa, but I guess it's not that easy.

    Quote Originally Posted by WILL
    I also got some funky popup from the program about not being able to go to 800x600, but it would try to continue anyway...? Very odd.
    The dialog box should say "Can't change display settings to 800x600. Will continue in windowed mode."

    This is valid if your laptop screen cannot be resized to 800x600. When the game starts it tries to go fullscreen at 800x600; if that fails, it runs in windowed mode. You can set "Allow screen resize on startup" to "No" to avoid this message on each startup. (I guess I should automatically set this to "No" after it fails the first time? Done.)
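    The startup logic is roughly the following (TryVideoChange, MessageOK and AllowScreenResize are illustrative names here, not necessarily the ones in the sources):

        if AllowScreenResize and not TryVideoChange(800, 600) then
        begin
          MessageOK('Can''t change display settings to 800x600. ' +
            'Will continue in windowed mode.');
          AllowScreenResize := false; { remember the failure, show this once }
        end;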

    Quote Originally Posted by WILL
    I also noticed one of my programs, 'Peer Guardian 2', complaining (22+ popups) about uncaught exceptions. I thought THAT was interesting because I've never seen PG2 act that way ever. Maybe that'll give you some clues.
    Uhm, I'm clueless? You mean the errors occur in 'Peer Guardian 2'? This may mean that my game somehow drained your system's resources, which shouldn't happen since you have a perfectly good CPU, RAM and graphics card... But seeing all the other problems occurring with Radeon, this *can* be a consequence of them...

  2. #42
    Co-Founder / PGD Elder WILL

    2006 PGD Annual: Stage 5 feedback

    Quote Originally Posted by michalis
    Quote Originally Posted by WILL
    I also got some funky popup from the program about not being able to go to 800x600, but it would try to continue anyway...? Very odd.
    The dialog box should say "Can't change display settings to 800x600. Will continue in windowed mode."

    This is valid if your laptop screen cannot be resized to 800x600. When the game starts it tries to go fullscreen at 800x600; if that fails, it runs in windowed mode. You can set "Allow screen resize on startup" to "No" to avoid this message on each startup. (I guess I should automatically set this to "No" after it fails the first time? Done.)
    That'll probably do it. The game ran in fullscreen, however (at least from what I saw), so maybe it just did not detect that my screen could go 800x600 somehow? Seems unlikely, but other games can go 800x600 fullscreen quite readily. My standard desktop resolution is 1280x800, but I play games at both 1024x768 and 800x600.

    Quote Originally Posted by michalis
    Quote Originally Posted by WILL
    I also noticed one of my programs, 'Peer Guardian 2', complaining (22+ popups) about uncaught exceptions. I thought THAT was interesting because I've never seen PG2 act that way ever. Maybe that'll give you some clues.
    Uhm, I'm clueless? You mean the errors occur in 'Peer Guardian 2'? This may mean that my game somehow drained your system's resources, which shouldn't happen since you have a perfectly good CPU, RAM and graphics card... But seeing all the other problems occurring with Radeon, this *can* be a consequence of them...
    'Peer Guardian 2' is a networking tool that blocks bad IPs from spyware, government, ad and similar systems when using peer-to-peer apps. Why this program would complain while trying to load a level is beyond me, but it does. :? Very odd indeed. Perhaps you are trying to directly access hardware interrupts that don't exist on a Radeon, and in doing so just bog down the system's resources handling all these interrupt calls? Just an idea... What kind of OpenGL headers or library are you using?
    Jason McMillen
    Pascal Game Development
    Co-Founder





  3. #43

    2006 PGD Annual: Stage 5 feedback

    Quote Originally Posted by WILL
    Perhaps you are trying to directly access hardware interrupts that don't exist on a Radeon, and in doing so just bog down the system's resources handling all these interrupt calls? Just an idea... What kind of OpenGL headers or library are you using?
    Gosh, no, I don't do any dirty hardware tricks! Especially since the game is portable to both Windows and Linux/FreeBSD. All the graphics are done in plain OpenGL, with some popular extensions that are not specific to any particular vendor (like EXT_fog_coord --- things that are supported on both Radeon and GeForce).
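    Extension use is guarded by the standard runtime check. A minimal sketch (the HasExtension helper is made up for illustration, but glGetString(GL_EXTENSIONS) is the standard OpenGL query):

        uses GL;

        function HasExtension(const Name: string): Boolean;
        begin
          { GL_EXTENSIONS is a space-separated list of names; padding with
            a trailing space avoids matching a mere prefix of a longer name }
          Result := Pos(Name + ' ',
            String(PChar(glGetString(GL_EXTENSIONS))) + ' ') > 0;
        end;

        { e.g. take the fog coordinate path only when it's really there: }
        FogCoordAvailable := HasExtension('GL_EXT_fog_coord');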

    My OpenGL header started from Mike Lischke's old OpenGL12 unit (it has been heavily modified since then, for FPC, Linux etc.). That's the same source that is used in e.g. GLScene. You can find my unit in castle's sources, in the file source/units/opengl/openglh.pas.

    Although most of the base code can also be compiled with USE_GL_GLU_UNITS, in which case the standard FPC gl, glu, glext headers are used --- my intention is to drop my OpenGLh unit and switch entirely to FPC's gl, glu, glext units at some point (but for now I stick with my OpenGLh unit, as it's tested). Initially I didn't use the OpenGL headers provided with FPC because they had problems, but that has long been fixed in FPC.
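    The switch is a plain compile-time define; the uses clause is roughly like this (the exact clause in the sources may differ):

        uses
          {$ifdef USE_GL_GLU_UNITS} GL, GLu, GLext {$else} OpenGLh {$endif};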

  4. #44

    2006 PGD Annual: Stage 5 feedback

    there is "Loading creatures" when creature's animations are prepared. This means loading VRML files, creating intermediate animation frames by interpolating between two models, and then creating a display list for each animation frame.
    You are crazy. The normal people interpolate the animation on the fly. ATI cards, probably, are unable to cope with using the display lists the way they were not designed to.

    How many polygons do your monsters have, and *what* are you trying to accelerate by using display lists?

  5. #45

    2006 PGD Annual: Stage 5 feedback

    Quote Originally Posted by Chebmaster
    there is "Loading creatures" when creature's animations are prepared. This means loading VRML files, creating intermediate animation frames by interpolating between two models, and then creating a display list for each animation frame.
    You are crazy. The normal people interpolate the animation on the fly. ATI cards, probably, are unable to cope with using the display lists the way they were not designed to.

    How many polygons do your monsters have, and *what* are you trying to accelerate by using display lists?
    Interpolating on the fly is good when you have really low-poly models constructed of a few constant parts that are simply transformed differently in each animation frame. In such a case I use only one display list, not one per animation frame, so I handle this case efficiently.

    However, when models have more triangles and animation means deforming the mesh (actually, it can mean deforming anything --- you can e.g. change things such as materials during the animation), interpolating on the fly is not doable --- it would take too long. So the vertices must be precalculated, and then each animation frame is stored in a separate display list. Display lists are needed so that every vertex (including its normal vector, possibly material settings, possibly texture coordinates etc.) is already stored in its precalculated state. I could replace display lists with other OpenGL buffers, but it would lead to the same situation anyway --- the vertices have to be stored along with their information.
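    A minimal sketch of that scheme (the Interpolate / Render callbacks stand in for the game's actual model interpolation and rendering code, which of course looks different in the real sources):

        uses GL;

        type
          TInterpolateProc = procedure (const T: Single);
          TRenderProc = procedure;

        { Compile one display list per precalculated animation frame;
          assumes FrameCount > 1. Frame I is played with
          glCallList(Result + I). }
        function BuildAnimationLists(FrameCount: Integer;
          Interpolate: TInterpolateProc; Render: TRenderProc): GLuint;
        var
          I: Integer;
        begin
          Result := glGenLists(FrameCount);
          for I := 0 to FrameCount - 1 do
          begin
            Interpolate(I / (FrameCount - 1)); { done once, at load time }
            glNewList(Result + GLuint(I), GL_COMPILE);
            Render; { vertices, normals, materials recorded into the list }
            glEndList;
          end;
        end;

    Playing the animation at runtime is then a single glCallList per frame, with no per-vertex work left to do.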

    How many vertices do my models have? As far as I remember, the largest one animated by deforming has 4200 vertices. Tests showed that it needs about 16 * 5 display lists, so that's 80 * 4200 vertices' worth of data in display lists. That's nothing an OpenGL implementation shouldn't be capable of handling, and NVidia shows this: all the animations of this model take about 20 MB of memory. That's pretty acceptable (and note that this is the largest creature; others have 2000 or 500 vertices). All creature animations together eat 130 MB of memory with NVidia (I'm speaking here about the latest version on my page, 0.6.3). So, as you see, these are not some terrible counts.
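    Back-of-envelope check: assuming roughly 32 bytes per vertex (position, normal and texture coordinate --- my assumption here, not a measured figure), 80 frames * 4200 vertices * 32 bytes is about 10.8 MB of raw data, so the ~20 MB NVidia actually uses for this model, including display list overhead, is in the expected ballpark.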

    IOW, it's true that my models use more vertices than in usual FPSes, and they are sometimes animated using a more complex method. I *know* that this is the cause of my problems. But the fact is that the memory (and loading time) overhead is perfectly acceptable on NVidia cards. And calculating how much memory the vertices should take, the overhead *should* be acceptable. So the only question is: what am I doing wrong that makes the overhead on Radeon so large? Possibly the Radeon driver is trying to immediately pull all this display list data into some resource-limited area on the card.
