gbadev.org forum archive

This is a read-only mirror of the content originally found on forum.gbadev.org (now offline), salvaged from Wayback Machine copies. A new forum can be found here.

DS development > [Help wanted] Font on 8-bit bitmap background

#166425 - ChronoDK - Sat Feb 07, 2009 11:06 pm

I'm copying font graphics to an off-screen part of a 256x256 sized bitmap background, and then writing them to the screen by copying from the off-screen to the on-screen part.

I had this working in 16-bit mode, but I'm trying to change it to work with a paletted bitmap instead. Can someone spot any mistakes below? I'm guessing the problem lies in the innermost loop, but I can't think of what it would be.

Code:

   std::string s = "TESTtest!";

   u16* screen = (u16*)bgGetGfxPtr(bg3);
   u16* font = (u16*)bgGetGfxPtr(bg3) + (256*192);

   for ( int k = 0; k < s.size(); k++ ) {
      
      //Glyph char code
      int num = (int)s.at(k) - 32;
      
      //Screen coordinates
      int sx = 0; int sy = 0;

      //Glyph size
      int w = 8; int h = 8;

      //Glyph coordinates
      int fx = (num * w) - ( (num / (256/w)) * (256/w) * w );
      int fy = (num / (256/h)) * h;
      
      //Copy font pixels from off-screen buffer to on-screen buffer
      for ( int i = 0; i < h; i++ ) {
         for ( int j = 0; j < w; j++ ) {
            screen[((sx + j + (k * w)) + (256 * (sy + i)))/2] = font[(fx + j) + (256 * (fy + i))/2];
         }
      }
   }
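[Editor's note: as an aside on the glyph-coordinate math in the post above, with `glyphsPerRow = 256 / w` the `fx` expression is a hand-rolled modulo and reduces to `(num % glyphsPerRow) * w`. A small sketch checking the equivalence (plain C++, no libnds needed; the helper names are illustrative, not from the post):]

```cpp
// Glyph x-coordinate on a 256-pixel-wide font sheet, written two ways:
// exactly as in the post, and as a plain modulo.
int fxOriginal(int num, int w)
{
    return (num * w) - ((num / (256 / w)) * (256 / w) * w);
}

int fxModulo(int num, int w)
{
    return (num % (256 / w)) * w;  // same value, easier to read
}
```

For w = 8 there are 32 glyphs per 256-pixel row, so glyph 33 lands at x = 8 on the second row, which both forms agree on.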

#166428 - Dwedit - Sun Feb 08, 2009 12:25 am

Division by two?
_________________
"We are merely sprites that dance at the beck and call of our button pressing overlord."

#166429 - ChronoDK - Sun Feb 08, 2009 12:34 am

Yeah, that is new compared to my 16-bit bitmap code. Call it a failed attempt to fix things. :)

#166473 - dovoto - Sun Feb 08, 2009 6:33 pm

You cannot write to VRAM in 8-bit units. You have to write at least 2 bytes at a time.
_________________
www.drunkencoders.com
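
[Editor's note: dovoto's point is that NDS VRAM ignores byte-sized writes, so to set a single 8-bit pixel through a `u16*` you have to read the halfword containing two pixels, modify one byte, and write the whole halfword back. A minimal sketch of that read-modify-write (plain C++ with `uint16_t` standing in for libnds's `u16`; `plotPixel8` is a hypothetical helper, not a libnds function):]

```cpp
#include <cstdint>

typedef uint16_t u16;
typedef uint8_t  u8;

// Write one 8-bit pixel at (x, y) into a 256-pixel-wide paletted bitmap
// that must be accessed in 16-bit units (as NDS VRAM requires).
static void plotPixel8(u16* vram, int x, int y, u8 colorIndex)
{
    int pixelIndex = y * 256 + x;         // linear 8-bit pixel index
    u16 halfword = vram[pixelIndex / 2];  // read the u16 holding two pixels
    if (pixelIndex & 1)
        halfword = (halfword & 0x00FF) | (u16)(colorIndex << 8); // odd pixel: high byte
    else
        halfword = (halfword & 0xFF00) | colorIndex;             // even pixel: low byte
    vram[pixelIndex / 2] = halfword;      // write back the full 16 bits
}
```

In the original inner loop, replacing the direct byte-indexed assignment with a read-modify-write like this (for both the source and destination pixels) avoids the lost byte writes; when the glyph width is even and the x-offsets are halfword-aligned, copying whole `u16`s (two pixels at a time) is simpler and faster still.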