
Thread: FBOs and OpenGL Extensions

  1. #11
    Quote Originally Posted by code_glitch View Post
    And I know this would only work on the first loop but it crashes before then - on glGenFrameBuffersEXT() again which means I must clearly be missing something when I initialize OpenGl or doing something so idiotic I'm overlooking it each time...
    This extension might not be supported by your driver, or its function pointer was not loaded properly by the headers. Make sure to call ReadExtensions or something similar.

    P.S. FBOs are not deprecated; in fact, they are adopted into core. See here.

  2. #12
    PGD Staff code_glitch's Avatar
    Join Date
    Oct 2009
    Location
    UK (England, the bigger bit)
    Posts
    933
    Blog Entries
    45
    I'm pretty sure that FBOs and most extensions are supported by my driver (latest Catalyst 12.6 for my 5750), or at least that's the case for my desktop. Just to check, I also hunted down an FBO demo, which itself runs fine. That should narrow it down to something going pear-shaped on the Pascal side, right?

    Also, out of curiosity: if an extension is adopted into core in, say, OpenGL 3, would that automatically mean all cards that support OpenGL 3 support those extensions, unlike OpenGL 2 where it's vendor-specific?

    Thanks
    I once tried to change the world. But they wouldn't give me the source code. Damned evil cunning.

  3. #13
    PGD Staff / News Reporter phibermon's Avatar
    Join Date
    Sep 2009
    Location
    England
    Posts
    524
    Quote Originally Posted by code_glitch View Post
    Also, out of curiosity: if an extension is adopted into core in, say, OpenGL 3, would that automatically mean all cards that support OpenGL 3 support those extensions, unlike OpenGL 2 where it's vendor-specific?
    Well, I can't confidently claim that any card/driver combination that supports OpenGL 3.0 will *always* expose the equivalent extensions when working with GL2.

    ARB – Extensions officially approved by the OpenGL Architecture Review Board
    EXT – Extensions agreed upon by multiple OpenGL vendors

    In a GL2-style framework, it's always advisable to check the GL_EXTENSIONS string first to see whether the extension is available.
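    A minimal sketch of that GL2-style check, assuming dglOpenGL-style bindings where glGetString is already loaded (the helper name HasExtension is illustrative, not from any library):

    ```pascal
    // GL2-style extension check against the space-separated GL_EXTENSIONS string.
    function HasExtension(const Name: AnsiString): Boolean;
    var
      Extensions: AnsiString;
    begin
      // On a GL2 context, glGetString(GL_EXTENSIONS) returns all extension
      // names in a single space-separated string.
      Extensions := AnsiString(PAnsiChar(glGetString(GL_EXTENSIONS)));
      // Pad both sides with spaces so a substring search cannot match a
      // prefix of a longer extension name.
      Result := Pos(' ' + Name + ' ', ' ' + Extensions + ' ') > 0;
    end;

    // Usage, before touching any FBO entry point:
    // if not HasExtension('GL_EXT_framebuffer_object') then
    //   WriteLn('FBOs not supported - fall back to another technique');
    ```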

    (A tip for anybody studying this topic:
    If you've got the time to write alternative fallback rendering code, one way to manage it is to create a list of records, each holding a callback to a render function, a preferred priority, and a list of required extensions.
    Then, when the engine/framework starts up, you simply go through the list in priority order and attach the render callback (or set a variable, etc.) for the first technique that matches the extension requirements. This is a good way to handle card/driver-specific quirks and different GL versions as well.)
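    The fallback scheme above could be sketched like this (all names here are illustrative; HasExtension is assumed to be a GL_EXTENSIONS string check as discussed earlier in the thread):

    ```pascal
    type
      TRenderProc = procedure;
      // One record per rendering technique, as described in the tip above.
      TRenderTechnique = record
        Name: string;
        Priority: Integer;                   // higher = preferred
        RequiredExtensions: array of string; // all must be present
        Render: TRenderProc;                 // callback used if requirements are met
      end;

    // Returns the render callback of the highest-priority technique whose
    // extension requirements are all satisfied, or nil if none match.
    function PickTechnique(const Techniques: array of TRenderTechnique): TRenderProc;
    var
      i, j, Best: Integer;
      Ok: Boolean;
    begin
      Result := nil;
      Best := Low(Integer);
      for i := 0 to High(Techniques) do
      begin
        Ok := True;
        for j := 0 to High(Techniques[i].RequiredExtensions) do
          if not HasExtension(Techniques[i].RequiredExtensions[j]) then
          begin
            Ok := False;
            Break;
          end;
        if Ok and (Techniques[i].Priority > Best) then
        begin
          Best := Techniques[i].Priority;
          Result := Techniques[i].Render;
        end;
      end;
    end;
    ```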
    When the moon hits your eye like a big pizza pie - that's an extinction level impact event.

  4. #14
    PGD Staff code_glitch's Avatar
    Well, I checked the compatibility of all my cards and it's all fine there. I also tested some demos and they have no issues, which means it's only my code that's crashing. Are there any specific calls I should be making to initialize OpenGL extensions before calling them which could cause this? All the errors occur at $0000000000000000, which should narrow the cause down to a null pointer somewhere. After doing some reading about GLEW in C/C++, this seems to be normal, since apparently glGenFramebuffersEXT is itself a function pointer and must be set O.o. Or at least that's my understanding of the C/C++ side of things. If this is correct, how would one get the correct pointer to the function and assign it?
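    That understanding is right, and it explains the crash address: the extension entry point is a procedure variable that starts out nil and must be fetched from the driver once a context exists. A rough Linux/GLX sketch, assuming the FPC GLX unit (on Windows you would use wglGetProcAddress instead; dglOpenGL's ReadExtensions does all of this for you):

    ```pascal
    var
      // Starts as nil - calling it before assignment crashes at address $0.
      glGenFramebuffersEXT: procedure(n: GLsizei; framebuffers: PGLuint);
        {$IFDEF WINDOWS}stdcall{$ELSE}cdecl{$ENDIF};
    begin
      // Must be called AFTER a rendering context has been created and made current.
      Pointer(glGenFramebuffersEXT) := glXGetProcAddress('glGenFramebuffersEXT');
      if not Assigned(glGenFramebuffersEXT) then
        WriteLn('glGenFramebuffersEXT not available - check GL_EXT_framebuffer_object');
    end;
    ```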

  5. #15
    Before initializing OpenGL extensions you need to create the window and its rendering context first. During context creation it is typical to request specific core functionality; if you don't, GL 2.x or lower is used by default. Note that some features may not be enabled until you request 3.0 core functionality explicitly, which may also disable some older, deprecated functionality.

    Therefore, unless you are explicitly using Core 3 functionality, you *should* check for extension support before use.
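    Requesting the 3.0 context explicitly might look roughly like this, assuming SDL2 Pascal bindings (your windowing toolkit will have its own equivalent; without these attributes many drivers hand back a GL 2.x context):

    ```pascal
    var
      Window: PSDL_Window;
      Context: TSDL_GLContext;
    begin
      // Ask for a 3.0 core profile context BEFORE creating the window.
      SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
      SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 0);
      SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

      Window := SDL_CreateWindow('demo', 100, 100, 640, 480, SDL_WINDOW_OPENGL);
      Context := SDL_GL_CreateContext(Window);
      // Only after this point is it safe to load extension function pointers.
    end;
    ```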

    Although it may not be your case, some video cards (mostly Intel) have bugs in their OpenGL implementation that can lead to crashes. For instance, on an Intel X3100 with the latest drivers, frequently creating and releasing textures causes an access violation.

    By the way, if you are using DGL headers, where do you call InitOpenGL(), ReadExtensions() and ReadImplementationProperties()?
    Last edited by LP; 25-07-2012 at 07:18 PM.

  6. #16
    PGD Staff code_glitch's Avatar
    Hmm... OpenGL 2 and lower is the default? Sounds like that's where my issue lies. At the moment I'm not using the dglOpenGL headers because I didn't find any noticeable difference when I played with them earlier on. The graphics cards I'm using are an ATI 5750 with the latest AMD CCC drivers, an Intel GMA 4500MHD with the generic Intel drivers shipped with Linux, and an NVIDIA 8200M with the latest NVIDIA drivers for Linux; at the moment all exhibit the same behaviour.

    Which calls should I make to request OpenGL 3 and suchlike functionality?

  7. #17
    Quote Originally Posted by Lifepower View Post
    By the way, if you are using DGL headers, where do you call InitOpenGL(), ReadExtensions() and ReadImplementationProperties()?
    - (Optional: create the window here. You can use a ready TForm, or with Lazarus you should use TOpenGLContext.)
    - InitOpenGL()
    - Create the window (if it wasn't created already)
    - Initialize the rendering context (with TOpenGLContext you don't need to do much; at least call MakeCurrent())
    - ReadExtensions()
    - ReadImplementationProperties()
    - Now you can initialize the projection matrix, load textures, etc.
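    The order above could be sketched with the dglOpenGL headers roughly as follows (InitOpenGL, ReadExtensions and ReadImplementationProperties are dglOpenGL routines; the context object name follows the posts in this thread, and the surrounding details depend on your toolkit):

    ```pascal
    uses
      dglOpenGL;

    procedure SetupGL(GLContext: TOpenGLContext);
    begin
      // 1. Load the OpenGL library and the core entry points.
      if not InitOpenGL then
        raise Exception.Create('Could not load the OpenGL library');

      // 2. The rendering context must exist and be current before reading
      //    extensions, otherwise the function pointers stay nil.
      GLContext.MakeCurrent;

      // 3. Fill in the extension function pointers (FBOs, VBOs, ...).
      ReadExtensions;

      // 4. Fill in version/extension flags such as GL_EXT_framebuffer_object.
      ReadImplementationProperties;

      // 5. Only now: projection matrix, textures, FBO creation, etc.
    end;
    ```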

  8. #18
    Quote Originally Posted by User137 View Post
    - (Optional: create the window here. You can use a ready TForm, or with Lazarus you should use TOpenGLContext.)
    - InitOpenGL()
    - Create the window (if it wasn't created already)
    - Initialize the rendering context (with TOpenGLContext you don't need to do much; at least call MakeCurrent())
    - ReadExtensions()
    - ReadImplementationProperties()
    - Now you can initialize the projection matrix, load textures, etc.
    My question was addressed to code_glitch (so I don't understand why you are quoting me), as the issue may be in context creation and/or when reading extensions, but the order of the calls that you described should work.

  9. #19
    I misread the topic flow when skimming quickly. Your message was actually giving guidance, not asking how it's done...

  10. #20
    Quote Originally Posted by code_glitch View Post
    At the moment, I'm not using the dglOpenGl headers because I didn't find any noticeable difference when I played with them earlier on.
    Even if you didn't notice any difference with the dglOpenGL headers, the best thing about them is that they work on Windows, Linux, and OSX.

