gbadev.org forum archive

This is a read-only mirror of the content originally found on forum.gbadev.org (now offline), salvaged from Wayback Machine copies. A new forum can be found here.

DS development > Console Font on Sub Screen Issue with 16 bit Background

#148847 - jonezer4 - Fri Jan 11, 2008 4:16 am

Hey all, I have a problem getting the console font to work on my sub screen, displayed over a graphic background. It used to be a 256x256 8-bit background and my code worked fine, but I've had to change it to 16-bit (I couldn't get Grit to make 8bit backgrounds).

So anyway, my question is: how do I go about getting the default console font to work over the new 16bit background?

Here's my old code (the one that worked with 8 bit backgrounds):

Code:

void initializeVRAM()
{
    // Map VRAM banks
    vramSetMainBanks(VRAM_A_MAIN_BG_0x06000000, VRAM_B_MAIN_SPRITE_0x06400000,
                     VRAM_C_SUB_BG_0x06200000, VRAM_D_SUB_SPRITE);

    // Use the main screen (bottom) for the image
    videoSetMode(MODE_5_2D | DISPLAY_BG3_ACTIVE
                 | DISPLAY_SPR_ACTIVE          // enable sprites
                 | DISPLAY_SPR_1D              // 1D sprites, tile mode
                 | DISPLAY_SPR_1D_SIZE_256);   // 256-byte boundary per index, so
                                               // indexes come in multiples of 16,
                                               // and hence 1024/16 = 64 sprites

    BG3_CR = BG_BMP8_256x256 | BG_BMP_BASE(0) | BG_PRIORITY(3);

    // Use the sub screen (top) for output
    videoSetModeSub(MODE_5_2D
                    | DISPLAY_BG0_ACTIVE       // text
                    | DISPLAY_BG2_ACTIVE       // image
                    | DISPLAY_SPR_ACTIVE       // sprites
                    | DISPLAY_SPR_1D
                    | DISPLAY_SPR_1D_SIZE_256);

    //vramSetBankH(VRAM_H_SUB_BG);
    //vramSetBankI(VRAM_I_SUB_SPRITE);

    /////////////// set up our bitmap background ///////////////

    SUB_BG2_CR = BG_BMP8_256x256 | BG_BMP_RAM_SUB(1) | BG_PRIORITY(3);
    SUB_BG0_CR = BG_MAP_BASE(31) | BG_PRIORITY(1) | BG_COLOR_16; // use BG0 for the text
    //SUB_BG2_CR = BG_BMP8_256x256 | BG_BMP_BASE(1) | BG_PRIORITY(1);

    // These are rotation backgrounds, so the rotation attributes must be set.
    // They are fixed-point numbers with the low 8 bits as the fractional part;
    // this gives a 1:1 mapping in x and y, i.e. a nice flat bitmap.
    BG3_XDX = 1 << 8;
    BG3_XDY = 0;
    BG3_YDX = 0;
    BG3_YDY = 1 << 8;
    BG3_CX  = 0 << 8;
    BG3_CY  = 0 << 8;

    SUB_BG2_XDX = 1 << 8;
    SUB_BG2_XDY = 0;
    SUB_BG2_YDX = 0;
    SUB_BG2_YDY = 1 << 8;
    SUB_BG2_CY  = 32 << 8;  // keeps the console font from writing over the image

    dmaCopy(mainBG_bin, BG_GFX, 256 * 192);
    dmaCopy(mainBGPal_bin, BG_PALETTE, mainBGPal_bin_size);

    // these offsets leave room for the console font
    dmaCopy(mainBG_bin + 256*192, &BG_GFX_SUB[0x1000], 256 * 256);
    dmaCopy(mainBGPal_bin, &BG_PALETTE_SUB[0x1000], mainBGPal_bin_size);

    // consoleInit() is a lot more flexible, but this gets you up and running quickly
    consoleInitDefault((u16*)SCREEN_BASE_BLOCK_SUB(31), (u16*)CHAR_BASE_BLOCK_SUB(0), 16);

    BG_PALETTE_SUB[255] = RGB15(31,31,5);  // the font color must be set here
}

#148879 - PypeBros - Fri Jan 11, 2008 5:26 pm

I'd say you don't have much to do to switch from an 8bit to a 16bit BG and keep the console working, since it's on another plane ... well, as long as the 16-bit BG doesn't require so much VRAM that it suddenly overwrites the place where you stored your font and console map, of course.
_________________
SEDS: Sprite Edition on DS :: modplayer

#148881 - eKid - Fri Jan 11, 2008 5:31 pm

I don't think you can have a 16bit background and the console together, because a 256x256 16bit background takes up a whole 128KB of memory... and you only get 128KB for all sub-BG VRAM :)

#148882 - PypeBros - Fri Jan 11, 2008 5:39 pm

eKid wrote:
I don't think you can have a 16bit background and the console together, because a 256x256 16bit background takes up a whole 128KB of memory... and you only get 128KB for all sub-BG VRAM :)


That somewhat confirms the difficulty of getting it working, true.
Still, the screen is 256x192, which would leave you with 32KB "unused" at the end of the C bank, which you could safely use to store the tiles and map for the console, no?

Well, that'd only work if you have a still background, I admit.
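
To make that concrete, here's a rough, untested sketch built from the register names already used in this thread; BG_BMP16_256x256, the base offsets, and subBG_bin are my assumptions. One twist: the map base field of a BG control register can only address the first 62KB of BG VRAM, so the console data has to sit at the *start* of bank C, with the bitmap based 32KB in, rather than the console going at the end:

Code:

   // console in the first 32KB of bank C, 16bit bitmap in the last 96KB
   videoSetModeSub(MODE_5_2D | DISPLAY_BG0_ACTIVE | DISPLAY_BG2_ACTIVE);

   // BG2: 16bit bitmap based 32KB in (BG_BMP_BASE counts 16KB blocks)
   SUB_BG2_CR  = BG_BMP16_256x256 | BG_BMP_BASE(2) | BG_PRIORITY(3);
   SUB_BG2_XDX = 1 << 8;   SUB_BG2_XDY = 0;
   SUB_BG2_YDX = 0;        SUB_BG2_YDY = 1 << 8;
   SUB_BG2_CX  = 0;        SUB_BG2_CY  = 0;

   // copy only the 192 visible rows (256*192*2 = 96KB), which exactly
   // fill the 32KB..128KB range of bank C
   dmaCopy(subBG_bin, BG_GFX_SUB + 0x4000, 256 * 192 * 2);

   // BG0: console font tiles at char base 0, map at screen base 15 (30KB),
   // both clear of the bitmap
   SUB_BG0_CR = BG_MAP_BASE(15) | BG_PRIORITY(1) | BG_COLOR_16;
   consoleInitDefault((u16*)SCREEN_BASE_BLOCK_SUB(15), (u16*)CHAR_BASE_BLOCK_SUB(0), 16);
   BG_PALETTE_SUB[255] = RGB15(31,31,5);  // console font color

As long as you only ever copy 192 rows of bitmap data, nothing collides.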

(Too bad we can't map both bank C and one of the "alternate" banks H/I to get a few extra KB of VRAM on the sub BG.)
_________________
SEDS: Sprite Edition on DS :: modplayer

#148888 - jonezer4 - Fri Jan 11, 2008 6:50 pm

So after messing with this last night for a few hours, I guess the better question would be: "Is it possible to make 8-bit image files with Grit, and if so, how?"

Thanks everyone for all your help.

#148890 - eKid - Fri Jan 11, 2008 7:02 pm

from grit usage:
-gB{n} Gfx bit depth (1, 2, 4, 8, 16) [img bpp]

-gB8 on the command line should produce 8bit output; it defaults to the input image's bit depth.
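
For example (filename hypothetical):

Code:

   grit mainBG.png -gB8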

#148896 - jonezer4 - Fri Jan 11, 2008 7:29 pm

eKid wrote:
from grit usage:
-gB{n} Gfx bit depth (1, 2, 4, 8, 16) [img bpp]

-gB8 on the command line should produce 8bit output; it defaults to the input image's bit depth.


I should have mentioned I've tried that, but it won't compile like that in ndslib. I guess I would need a way to tell ndslib to use grit with a 256 color palette (specifying a palette in the grit file doesn't appear to work).

#148898 - eKid - Fri Jan 11, 2008 7:32 pm

What does your grit file look like? You can put -gB8 in the grit file too...
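
For instance, a minimal one might look like this (filename hypothetical; as far as I know a grit file is just one option per line, with # starting a comment):

Code:

   # mainBG.grit: bitmap output, 8bit graphics
   -gb
   -gB8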

#148904 - Cearn - Fri Jan 11, 2008 8:23 pm

jonezer4 wrote:
eKid wrote:
from grit usage:
-gB{n} Gfx bit depth (1, 2, 4, 8, 16) [img bpp]

-gB8 on the command line should produce 8bit output; it defaults to the input image's bit depth.


I should have mentioned I've tried that, but it won't compile like that in ndslib. I guess I would need a way to tell ndslib to use grit with a 256 color palette (specifying a palette in the grit file doesn't appear to work).

I think you're confusing a number of things here. libnds is a code library with functions and macros useful for NDS programming. grit is a tool that takes a bitmap file, converts it to bitmap formats that can be used for GBA/NDS graphics and exports it as raw binary data or source-code arrays. It is the makefile that uses grit to convert the images and your code that uses its output, not libnds.

It's possible that the makefile doesn't use the right rules (or uses them in the wrong order) for grit, that the grit options aren't correct, or that you're not using grit's output correctly. Please specify which options you're using for grit, and what error you get when building the project. As the others have said, you need `-gB8' for 8bit output. You might also need `-gb' if you want bitmap output instead of tiles. If you need more examples of how you can work with grit, consider looking at my grit demo.
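
For reference, the grit rule in the devkitPro template makefiles looks roughly like this; it turns foo.png plus foo.grit into foo.s and foo.h, which then get compiled in:

Code:

   %.s %.h : %.png %.grit
           grit $< -fts -o$*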

Also, the console system in libnds can only use tiled backgrounds in 4bit (16 color) or 8bit (256 color) modes, not 16bit color. The last parameter of consoleInitDefault() is actually the number of colors, not the bit depth.
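
So, reusing the call from the first post (you'd use one or the other, of course):

Code:

   // the last argument is a color count, not a bit depth:
   consoleInitDefault((u16*)SCREEN_BASE_BLOCK_SUB(31), (u16*)CHAR_BASE_BLOCK_SUB(0), 16);  // 4bit tiles
   consoleInitDefault((u16*)SCREEN_BASE_BLOCK_SUB(31), (u16*)CHAR_BASE_BLOCK_SUB(0), 256); // 8bit tiles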

#148968 - jonezer4 - Sat Jan 12, 2008 8:01 pm

Alright, I think I have it all figured out.

-gB16 will format your image as an untiled bitmap, but...

-gB8 will format your image as tiles

-gb8 will format your image as a bitmap.

I don't know if that's a bug or what, but apparently Grit doesn't care about the capitalization of the b for 16bit files, yet it will default to tiling if you capitalize it for 8bit files. In Grit's defense, the specs all say to use "-gb" or "-gt".

#148974 - Cearn - Sat Jan 12, 2008 11:21 pm

jonezer4 wrote:
Alright, I think I have it all figured out.

-gB16 will format your image as an untiled bitmap, but...

-gB8 will format your image as tiles

-gb8 will format your image as a bitmap.

I don't know if that's a bug or what, but apparently Grit doesn't care about the capitalization of the b for 16bit files, yet it will default to tiling if you capitalize it for 8bit files. In Grit's defense, the specs all say to use "-gb" or "-gt".

In principle, the type of image (bitmap or tiled) is determined by `-gb' and `-gt', and the output bit depth by `-gB<number>'. These are completely separate items. If you omit either of them, grit will try to figure out what you meant. If you omit the image type, tiled graphics are assumed, except for direct-color (16bit) images, because those have no tiled variant. These assumptions make sense for GBA graphics -- and for NDS as well, although to a lesser extent.

The reason `-gb8' worked is probably that the input image was 8bit as well; the 8 after `-gb' is actually ignored.
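
In other words (filenames hypothetical):

Code:

   grit source.png -gt -gB8    # 8bit tiles
   grit source.png -gb -gB8    # 8bit bitmap
   grit source.png -gB16       # bitmap assumed: 16bit has no tiled variant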