gbadev.org forum archive

This is a read-only mirror of the content originally found on forum.gbadev.org (now offline), salvaged from Wayback machine copies. A new forum can be found here.

C/C++ > Using char* messes up my palette.

#5274 - Benny Blanco - Thu Apr 24, 2003 12:24 pm

This I really don't understand. Every time I declare something like char* message, once it's compiled, both the background and sprite palettes go mental and just end up as hues of red.

I can't get my head around why this is happening and it's driving me insane. My deadline is in a week and this really isn't helping.

Anyone else come across this problem? Any help would be appreciated.

#5277 - funkeejeffou - Thu Apr 24, 2003 1:56 pm

Do you mean that your code runs fine as long as you do not declare any variables as char?
Are the char variables INDEPENDENT of those you use for the palette and sprites?
Please be more specific about where you use them and for what.
Also, I don't know what your code is supposed to do, but remember that:
- a signed char has values running from -128 to +127
- an unsigned char from 0 to +255
Maybe that explains your mess if you use them for screen coordinates, because a signed char cannot represent the full GBA screen (240*160).
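For example (made-up values, just to illustrate the point):

Code:
signed char badX  = 200;   /* outside -128..+127, wraps to a negative value on GCC */
unsigned char okX = 200;   /* fine, 0..255 covers the 240-pixel width */
u16 alsoOkX       = 200;   /* a 16-bit type avoids the question entirely */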

#5280 - niltsair - Thu Apr 24, 2003 2:23 pm

Also, when writing to video memory, you need to access it in 16 bits.

So when you transfer your data, just treat your u8 data as a u16 pointer and write half the number of elements you would have needed with an 8-bit pointer (since you write two elements at a time).
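Something like this, roughly (the source name and size here are made up):

Code:
#define RAW_LEN 512                       /* size of the data in bytes (example value) */
extern const u8 rawData[RAW_LEN];         /* your 8-bit source data */

/* 8-bit writes to palette/VRAM don't store single bytes the way you'd
   expect, so copy through a 16-bit pointer, two source bytes per write. */
u16 *dst = (u16 *)0x06000000;             /* destination in VRAM */
const u16 *src = (const u16 *)rawData;
for (u16 i = 0; i < RAW_LEN / 2; i++)     /* half as many iterations as with a u8 pointer */
    dst[i] = src[i];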

#5285 - peebrain - Thu Apr 24, 2003 4:14 pm

Why are you using chars for palette entries anyway? Each palette entry is 16 bits: XBBBBBGGGGGRRRRR.
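If you want to build entries by hand, a little macro helps (standard BGR555 packing, nothing project-specific):

Code:
/* pack 5-bit red, green, blue components into one 16-bit palette entry */
#define RGB15(r, g, b) ((u16)(((b) << 10) | ((g) << 5) | (r)))

u16 *palette = (u16 *)0x05000200;   /* sprite palette */
palette[1] = RGB15(31, 0, 0);       /* pure red */
palette[2] = RGB15(0, 0, 31);       /* pure blue */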

~Sean
_________________
http://www.pbwhere.com

#5288 - lordmetroid - Thu Apr 24, 2003 4:54 pm

That's probably why he gets all the hues of red he describes... Do as I do when coding for the GBA: think of everything as memory and bits...
So don't use char or whatever, but u8, u16, etc. instead...

That makes it much clearer what you are really doing, and the reader doesn't need to work out how many bits a type is, whether it's signed, and so on.
_________________
*Spam*
Open Solutions for an open mind, www.areta.org

Areta is an organization of coders coding mostly open source projects, but there are a lot of other sections, like GBA dev, language-learning communities, RPG communities, etc...

#5289 - Quirky - Thu Apr 24, 2003 4:55 pm

Do you mean that when you assign a "string", things go wrong? In that case I would guess you have borked your message handling and need to add a terminating 0 to your char arrays. But as you have been a bit vague, it could be anything.
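For reference (generic C, not your code), a string literal already includes the terminator, but a hand-built char array does not:

Code:
const char msgA[] = "SCORE";                      /* 6 bytes: 'S','C','O','R','E',0 */
const char msgB[] = { 'S','C','O','R','E', 0 };   /* the final 0 must be written explicitly */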

#5294 - tepples - Thu Apr 24, 2003 6:33 pm

Another possibility is that your palette is in a 'const char *' variable as opposed to a 'const u16 *' or 'const u32 *', and adding the strings has unaligned the palette data.
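One way to picture it (a sketch, not necessarily how your data is laid out): an odd-length string placed just before the palette data can leave a char array starting at an odd address, so casting it to u16* reads everything shifted by one byte. Forcing alignment with GCC's aligned attribute works around it, though declaring the data as u16 in the first place is cleaner:

Code:
const char *message = "hello";      /* 6 bytes of string data land somewhere in ROM */

/* a char array has no alignment requirement, so force one if you
   intend to cast it to a u16 pointer */
const unsigned char spritepal[] __attribute__((aligned(2))) = {
    0x1F, 0x7C, /* ... */
};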
_________________
-- Where is he?
-- Who?
-- You know, the human.
-- I think he moved to Tilwick.

#5348 - Benny Blanco - Fri Apr 25, 2003 7:46 pm

Hi, sorry for being vague.

Basically, I'm not trying to do anything with the palette using char.
My palettes are running fine; as you've all said, I'm using u16 variables to handle them.

I need to start putting text into my game, so I imported StaringMonkey's Text.h file. I put the #include "Text.h" in, compiled, and then the colours were all messed up.

After fiddling around, I tracked the problem down to the fact that the functions declare char* variables.

I removed the #include "Text.h" and the palettes went back to normal.
So I did another test and just had one line of code:

char *message="hello";

compiled and the palettes went bonkers again.

My palette code is as follows:

Code:
u16 *palette = (u16 *)0x05000200;
u16 *source = (u16 *)spritepal;
for (u16 i = 0; i < 256; i++)
{
    palette[i] = source[i];
}


And I cannot for the life of me understand why assigning a char* would affect the palette.

Again, any help would be appreciated.

#5363 - tepples - Sat Apr 26, 2003 3:06 am

Benny Blanco wrote:
My palette code is as follows:

Code:
u16 *palette = (u16 *)0x05000200;
u16 *source = (u16 *)spritepal;
for (u16 i = 0; i < 256; i++)
{
    palette[i] = source[i];
}

How is spritepal declared?
_________________
-- Where is he?
-- Who?
-- You know, the human.
-- I think he moved to Tilwick.

#5450 - Benny Blanco - Mon Apr 28, 2003 12:12 pm

tepples wrote:
Benny Blanco wrote:
My palette code is as follows:

Code:
u16 *palette = (u16 *)0x05000200;
u16 *source = (u16 *)spritepal;
for (u16 i = 0; i < 256; i++)
{
    palette[i] = source[i];
}

How is spritepal declared?


Hmmm... interesting, it's a const unsigned char. Think that might be the problem?

I used peebrain's method to create the palette C header file by loading the .pal file (created with gbapal) into bin2c.exe, as he does in his gbAmp examples.

What's the best way to sort this out? I've also noticed that the palette values are 8-bit values, e.g.:

Code:
const unsigned char spritepal[] = {
  0x1F,0x7C,0x00,0x54...

#5451 - niltsair - Mon Apr 28, 2003 1:42 pm

A palette entry should represent a 15-bit colour. If there are 512 8-bit entries in the palette you're fine: the colours are just split across two bytes each, and casting to (u16*) like you did should work. Otherwise, all of your palette colours are screwed up.

Unless your palette really just contains 128 colours and your sprites' tiles never use a colour above 128. In that case there's enough data in the palette; just run your loop 128 times instead of 256.
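If you're not sure how many entries the data really holds, you can let the compiler work it out (assuming spritepal is declared as an array with its initializer visible, so sizeof gives its real size):

Code:
u16 entries = sizeof(spritepal) / 2;    /* two bytes per 16-bit colour */
if (entries > 256)
    entries = 256;                      /* the sprite palette has 256 slots */

u16 *palette = (u16 *)0x05000200;
const u16 *source = (const u16 *)spritepal;   /* note: still assumes spritepal starts on an even address */
for (u16 i = 0; i < entries; i++)
    palette[i] = source[i];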

#5455 - tepples - Mon Apr 28, 2003 5:02 pm

Benny Blanco wrote:
Code:
const unsigned char spritepal[] = {
  0x1F,0x7C,0x00,0x54...

It looks like spritepal[] has become unaligned. Ideally, it should look like this:
Code:
const u16 spritepal[] = {
  0x7C1F, 0x5400...

If you're going to be DMAing something out of memory, have your binary-to-C-array converter output little-endian 16-bit or little-endian 32-bit data. Some binary-to-C-array converters can output this format, but I don't know which to recommend.
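If regenerating the data isn't convenient, another option (a sketch, assuming the bytes really are low byte first, as in your dump) is to assemble each 16-bit entry from its byte pair at copy time, which side-steps both the alignment and the pointer-type questions:

Code:
u16 *palette = (u16 *)0x05000200;
for (u16 i = 0; i < 256; i++)
{
    /* low byte first, then high byte (little-endian) */
    palette[i] = spritepal[2 * i] | (spritepal[2 * i + 1] << 8);
}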
_________________
-- Where is he?
-- Who?
-- You know, the human.
-- I think he moved to Tilwick.

#6648 - outRider - Sat May 31, 2003 8:51 pm

Hopefully this isn't too late to help you out, but I once had the same problem.

Declaring a char szString[] locally on the stack in a function caused palette and image corruption for me too. Declare your strings as globals (and const, if they don't change) and the problem should go away.
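Something along these lines (names made up, and DrawText just stands in for whatever text routine you use):

Code:
/* global and const: the string lives in ROM instead of on the stack */
static const char helloMsg[] = "hello";

void ShowGreeting(void)
{
    DrawText(helloMsg);
}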
_________________
outRider