gbadev.org forum archive

This is a read-only mirror of the content originally found on forum.gbadev.org (now offline), salvaged from Wayback Machine copies. A new forum can be found here.

DS development > Problems with glEnable/glDisable

#66857 - ficedula - Fri Jan 13, 2006 10:33 pm

I seem to be having problems turning polygon outlining off ... although I hadn't noticed before, since I'd always rendered everything with polyid 1, hence no boundaries and no outlines ;)

Writing to GFX_CONTROL directly is fine, but using glEnable/glDisable seems to screw things up majorly. Unless I've totally misunderstood what it's supposed to do, the following code from libnds seems to be wrong...

Code:

//---------------------------------------------------------------------------------
void glEnable(int bits) {
//---------------------------------------------------------------------------------
   enable_bits |= bits | (GL_TEXTURE_2D|GL_TOON_HIGHLIGHT|GL_OUTLINE|GL_ANTIALIAS);
   GFX_CONTROL = enable_bits;
}


Have I missed something, or does this really enable everything whenever I call glEnable? I mean, it does enable what I asked for, but I don't want everything else too... ;)
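
To make the symptom concrete: with the OR in the code above, a single call sets the whole constant group as well as the requested flag. A rough sketch of what that looks like (the demo function below is made up for illustration; the behaviour in the comments follows from the glEnable shown above):

Code:

#include <nds.h>

// Hypothetical demo: with the glEnable() quoted above, asking for one
// feature drags the whole OR'd-in group along with it.
void demo_unwanted_bits(void)
{
   glEnable(GL_BLEND);   // intent: enable blending only

   // GFX_CONTROL now also has GL_TEXTURE_2D, GL_TOON_HIGHLIGHT,
   // GL_OUTLINE and GL_ANTIALIAS set, because that group is OR'd in
   // unconditionally instead of being used as a mask on 'bits'.
}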

#66912 - HyperHacker - Sat Jan 14, 2006 10:35 am

I haven't actually used it, but it looks like it should be "enable_bits |= bits & (GL_TEXTURE...".

#67136 - duencil - Sun Jan 15, 2006 8:35 pm

HyperHacker wrote:
I haven't actually used it, but it looks like it should be "enable_bits |= bits & (GL_TEXTURE...".

That sounds likely, though it depends on what the intention was. If it's limiting what you can set with glEnable, then you are correct, although some other flags need to be included (GL_ALPHA_TEST|GL_BLEND). If the intention is to prevent you from turning certain bits off (seeing GL_TEXTURE_2D there might suggest that), then it might be correct as it is, although some flags should be removed (GL_TOON_HIGHLIGHT|GL_OUTLINE|GL_ANTIALIAS) and others added ((1<<13) | (1<<14)).
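
Both readings can be sketched out. The mask names below (GL_USER_FLAGS, GL_ALWAYS_ON) are made up for illustration; the flag sets follow the ones named in the posts above, so treat this as a sketch rather than the libnds fix:

Code:

#include <nds.h>

// Reading 1: the constant group is a whitelist of flags glEnable() is
// allowed to touch, extended with the flags mentioned above.
#define GL_USER_FLAGS (GL_TEXTURE_2D | GL_TOON_HIGHLIGHT | GL_OUTLINE | \
                       GL_ANTIALIAS | GL_ALPHA_TEST | GL_BLEND)

static u16 enable_bits1 = 0;

void glEnableMasked(int bits)
{
   enable_bits1 |= bits & GL_USER_FLAGS;      // accept only known flags
   GFX_CONTROL = enable_bits1;
}

// Reading 2: the constant group is a set of bits that must never be
// cleared, so it is glDisable() that needs the mask instead.
#define GL_ALWAYS_ON (GL_TEXTURE_2D)          // per the post, maybe also (1<<13)|(1<<14)

static u16 enable_bits2 = GL_ALWAYS_ON;

void glDisableKeepAlive(int bits)
{
   enable_bits2 &= ~(bits & ~GL_ALWAYS_ON);   // never clear protected bits
   GFX_CONTROL = enable_bits2;
}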

#112104 - Goosey - Wed Dec 13, 2006 7:27 am

Sorry to dig up this old topic, but seeing as how this is still an issue I figured I would post my fix.

I have submitted a patch to devkitPro (download here).

For temporary use, I put these functions into my engine:
Code:

#include <nds.h>   // for GFX_CONTROL, GFX_STATUS, u16 and the GL_* flags

// Bits that should always remain set in GFX_CONTROL.
const u16 kConstBits   = GL_TEXTURE_2D;// | (1<<13) | (1<<14);
static u16 enable_bits   = kConstBits;

void glEnableFIX(int bits)
{
   // Set only the requested bits (plus the always-on ones).
   enable_bits |= bits | kConstBits;
   GFX_CONTROL = enable_bits;
}

void glDisableFIX(int bits)
{
   // Clear only the requested bits, but never the always-on ones.
   enable_bits = (enable_bits & ~bits) | kConstBits;
   GFX_CONTROL = enable_bits;
}

void glResetFIX(void)
{
   while (GFX_STATUS & (1<<27)); // wait till gfx engine is not busy

   // Clear the FIFO
   GFX_STATUS |= (1<<29);

   // Clear overflows for list memory
   GFX_CONTROL = enable_bits | ((1<<12) | (1<<13)) | GL_TEXTURE_2D;
   glResetMatrixStack();

   GFX_TEX_FORMAT = 0;
   GFX_POLY_FORMAT = 0;

   glMatrixMode(GL_PROJECTION);
   glIdentity();

   glMatrixMode(GL_MODELVIEW);
   glIdentity();
}



This fixes glEnable enabling things you didn't request, glDisable disabling things you didn't request, and glReset losing your settings when called.

I haven't tested this too thoroughly, so use it with caution, but it seems to work so far...
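
For context, a minimal sketch of how the replacement functions above would be used per frame, assuming a standard libnds setup; the surrounding frame code is illustrative only:

Code:

#include <nds.h>

// Hypothetical per-frame use of the replacements: features are toggled
// explicitly, and glResetFIX() keeps the tracked enable bits intact.
void draw_frame(void)
{
   glResetFIX();                 // reset the engine without losing settings

   glEnableFIX(GL_ANTIALIAS);    // turns on exactly this feature
   glEnableFIX(GL_BLEND);
   glDisableFIX(GL_OUTLINE);     // turns off exactly this one

   // ... set up matrices, submit geometry, then flush as usual ...
}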