gbadev.org forum archive

This is a read-only mirror of the content originally found on forum.gbadev.org (now offline), salvaged from Wayback Machine copies. A new forum can be found here.

DS development > [resolved] Problem with converting UV float to t16

#162088 - Vloz - Mon Aug 25, 2008 12:22 pm

EDIT:
The 64*64 glTexImage2D call in my texture loading function was just wrong, shame on me!

Sorry for disrupting! :$


Quote:
Hey everyone,
I am trying to optimize a level loader by converting the floating-point face UVs to t16...

So I did this for a 128*128 texture:

Code:
New_TextureCoord.x = floattot16(float_TextureCoord.x * 128);
New_TextureCoord.y = floattot16(float_TextureCoord.y * 128);


And everything worked fine! \o/

But I tried the same code with a 64*64 texture...
And I got a really strange display...

I tried changing the code to this:

Code:
New_TextureCoord.x = floattot16(float_TextureCoord.x * 64);
New_TextureCoord.y = floattot16(float_TextureCoord.y * 64);


But this didn't work, and I still get this strange display of my texture...

Here is an overview of what I am getting:
[Image not preserved in the archive]

So does anyone have an idea how to convert the floating-point coordinates of a 64*64 texture to t16? :?

Thanks for reading, and sorry for my bad English! =)

EDIT: I use glTexCoord2t16(New_TextureCoord.x, New_TextureCoord.y); to set the texture UV before drawing.