#166425 - ChronoDK - Sat Feb 07, 2009 11:06 pm
I'm copying font graphics to an off-screen part of a 256x256 sized bitmap background, and then writing them to the screen by copying from the off-screen to the on-screen part.
I had this working in 16-bit mode, but I'm trying to change it to work with a paletted bitmap instead. Can someone spot any mistakes below? I'm guessing the problem lies in the innermost loop, but I can't figure out what it would be.
Code:
std::string s = "TESTtest!";
u16* screen = (u16*)bgGetGfxPtr(bg3);
u16* font = (u16*)bgGetGfxPtr(bg3) + (256*192);

for ( int k = 0; k < s.size(); k++ )
{
    //Glyph char code
    int num = (int)s.at(k) - 32;

    //Screen coordinates
    int sx = 0;
    int sy = 0;

    //Glyph size
    int w = 8;
    int h = 8;

    //Glyph coordinates
    int fx = (num * w) - ( (num / (256/w)) * (256/w) * w );
    int fy = (num / (256/h)) * h;

    //Copy font pixels from off-screen buffer to on-screen buffer
    for ( int i = 0; i < h; i++ )
    {
        for ( int j = 0; j < w; j++ )
        {
            screen[((sx + j + (k * w)) + (256 * (sy + i)))/2] = font[(fx + j) + (256 * (fy + i))/2];
        }
    }
}
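For context, here is a minimal, self-contained sketch (not from the original post) of the usual read-modify-write technique for placing one 8-bit palette index into memory that only accepts 16-bit stores, which is the general constraint on DS VRAM. The helper name put8bitPixel and the assumption of a fixed-pitch 8bpp buffer are mine, purely for illustration:

Code:
#include <stdint.h>

typedef uint16_t u16;

// Write one 8-bit palette index into a buffer that must be accessed in
// 16-bit units. 'pitch' is the bitmap width in pixels (e.g. 256).
// Hypothetical helper for illustration only.
static void put8bitPixel(u16* buf, int x, int y, int pitch, uint8_t colorIndex)
{
    int byteOffset = x + y * pitch;       // 8bpp: pixel offset == byte offset
    u16 halfword   = buf[byteOffset / 2]; // read the halfword holding this pixel
    if (byteOffset & 1)
        halfword = (halfword & 0x00FF) | (colorIndex << 8); // odd pixel -> high byte
    else
        halfword = (halfword & 0xFF00) | colorIndex;        // even pixel -> low byte
    buf[byteOffset / 2] = halfword;       // write back as a single 16-bit store
}

The key point is that each halfword holds two neighbouring pixels, so the untouched neighbour has to be preserved when writing; whether that matches the bug in the loop above is for the thread to confirm.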