gbadev.org forum archive

This is a read-only mirror of the content originally found on forum.gbadev.org (now offline), salvaged from Wayback Machine copies. A new forum can be found here.

DS development > 3D Questions

#115549 - The_Perfection - Wed Jan 17, 2007 2:23 am

Hello all,

I have recently started devving for the NDS (not giving up on the GBA though) and have started playing around with the OpenGL-ish interface to get various 3-dimensional things moving on the screen. I have run into a few problems though and thought you guys might be able to help.

Lighting:
I realize that the light direction is given as a vector whose components each have one sign bit and nine fractional bits. So would (in x, y, z)

Code:
0x201, 0x201, 0x201

be the same as

Code:
0x3FF, 0x3FF, 0x3FF

Since it is a vector, that's what I'm thinking, but I'm not sure since we haven't dealt with vectors in my math class yet. (Although I did glean much information from TONC.)

Normals:
I know that normals are extremely useful little things when it comes to lighting, but I don't know how to calculate them. I know it has to be done in the order the points were created, but don't know the actual math behind the little critters.
My friend also thought that I might be getting weird shading because the normals have to be calculated every frame. Do they, or do they automatically adjust when the figure is rotated? (Like I think they do...)

Texture Mapping:
I actually haven't attempted to texture map anything yet, but the reason I haven't is because I don't fully understand the process. I looked at the NeHe tutorial and they use a function to add the textures from a pcx file. What are the steps if I wanted to do it from an image binary?


I'm pretty sure there's more than that, but I can't remember it right now.

Wii thank you in Advance.

#115585 - Lick - Wed Jan 17, 2007 1:39 pm

Vectors: these can display direction AND length. (5, 5, 5) != (6, 6, 6) because the lengths differ, while the directions match.

Normals: these only display direction, and the calculated length is always 1. Vectors (5, 5, 5) == (6, 6, 6) when converted to normals, because once normalized, the lengths are 'removed' and the directions match.
* normalization is the act of dividing a vector by its length. By doing that, you remove the length, leaving you with only a direction; a normal.

Texture Mapping: you simply copy the image-data into memory, but in the correct format. Then you tell the system what format the texture is in (make sure it matches with the actual data), so it can render that texture.
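As an example of "the correct format": my understanding is that the DS's direct-color texels are 16-bit values with 5 bits per channel and a transparency flag in bit 15 (treat the exact bit layout as an assumption and check it against the docs). A converter for one pixel might look like:

```c
#include <stdint.h>

/* Pack one 8-bit-per-channel RGB pixel into a 16-bit texel:
 * 5 bits each for R, G, B (low bits to high bits), with
 * bit 15 set to mark the texel as opaque. */
uint16_t rgb24_to_rgb15(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)((r >> 3) | ((g >> 3) << 5) | ((b >> 3) << 10) | (1u << 15));
}
```

You'd run every pixel of your source image through something like this before copying the buffer into texture VRAM and telling the system the texture's size and format.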

I probably made some mistakes while explaining.. Feel free to correct!
- Lick
_________________
http://licklick.wordpress.com

#115589 - hellfire - Wed Jan 17, 2007 3:22 pm

Quote:
I know that normals are extremely useful little things when it comes to lighting, but I don't know how to calculate them.
I know it has to be done in the order the points were created, but don't know the actual math behind the little critters.

the normal-vector is perpendicular to the surface.
you can calculate a perpendicular from two given directions using the cross-product.
since triangles consist of three points (v1 v2 v3), the two surface-directions are given: v2-v1 and v3-v1
- so you can easily compute the normal of a triangle.

as you want to specify normals per vertex (interpolated lighting, see gouraud shading), you can simply average the normals of all triangles sharing a single vertex, thus resulting in the normal vector at this particular point.
if you are loading your geometry from a standard file-format, per-vertex-normals are usually stored along.

Quote:
Normals: [...] the calculated length is always 1

the normal (in the mathematical sense) does not have to be normalized, but is usually assumed to be normalized to simplify lighting calculations.
_________________
"The three chief virtues of a programmer are: Laziness, Impatience and Hubris"

#115597 - silent_code - Wed Jan 17, 2007 6:52 pm

... and you don't need to calculate them every frame ;) that would be an awful lot of work... usually, if you use skinned meshes, you simply transform the normals along with the vertices, but don't recalculate them.
btw: make sure you transform a copy, not the original data.

happy coding

#115609 - The_Perfection - Wed Jan 17, 2007 8:52 pm

Thanks Lick, but you told me stuff I already knew. I knew about vectors and their direction and magnitude, but I thought that the lighting vectors were unit vectors, allowing what I had stated to be the same. It would have come out to (-1, -1, -1) and (-511, -511, -511), which would still point in the same direction, and since I do think it is a unit vector (technically values can't go above 1), what I had posted would still point in the same direction, would it not?
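That raw-value reading can be checked by sign-extending the 10-bit components (this assumes bit 9 is the sign bit of the v10 format, per the question above):

```c
#include <stdint.h>

/* Interpret a raw 10-bit value as a signed integer by
 * sign-extending bit 9 (the assumed sign bit of v10). */
int v10_to_int(uint16_t raw)
{
    raw &= 0x3FF;                         /* keep the low 10 bits */
    return (raw & 0x200) ? (int)raw - 0x400 : (int)raw;
}
```

So 0x201 reads as -511 and 0x3FF as -1; as raw integer vectors, (-511, -511, -511) and (-1, -1, -1) do point the same way, though their lengths differ.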

I also knew that normals were, at most, a value of one in any direction, but I was asking how to calculate a normal, as hellfire explained. So all I need to do is calculate the cross product of those two directions on a polygon and divide by its length to get the normal, yes or no?

Oh yes, that brings up another question, not necessarily related to 3D...
Pre-Calculations:
If I were to create an array that will not be modified and put in some conversion factors, (such as floattov10() or a division for example,) would they be pre-calculated and placed into the binary file as an array, or would it call the function for each position in the array as it was accessed?

Once again with the texture mapping, I knew it had to be in the proper format and in the right area; I was more asking what format it had to be in and where to put the data. (I already know you have to allocate memory with a VRAM bank.)


Apologies for not being specific enough...


And for the normals (silent_code), you say not to recalculate them, so if I were to call a rotation, the normals would automatically adjust like the points would? (Also, not working with skinned meshes yet, still getting the basics down. (Or trying to at least...)) And I do know to use a copy of the data if I want to modify something, but thanks anyway for the reminder.

#115675 - hellfire - Thu Jan 18, 2007 9:58 am

Quote:
So all I need to do is calculate the cross product of those two directions on a polygon and divide by it's length to get the normal, yes or no?

to get the normal of a polygon: yes.
everything else i could add here, would majorly interfere with this:
Quote:
we haven't delt with vectors in my math class yet

_________________
"The three chief virtues of a programmer are: Laziness, Impatience and Hubris"

#115694 - silent_code - Thu Jan 18, 2007 1:01 pm

yes, the mesh will be lit according to the light direction, even if you rotate it. that's the whole idea of hardware t&l anyway. ;) my advice is to google for some simple (especially game-specific) vector math. there are also a lot of good, easy to understand tutorials on normal calculations and how to get the job done fast.
i once had a lighting demo that calculated normals on load time, but the whole thing looked like poop, so i've written an exporter for my model viewer. now i just load the precalculated data and everything looks fine.

for textures you best check out the libnds examples. then start writing a better texture loader (my demo crashes on loadpcx() - or whatever it is called). i'd go for a custom file format and just convert any files (.pcx, .tga, .png, .bmp, .whatever) into your format using some magic little tool (you would write it yourself - there are some good libs for image file loading, like devil, or something [i don't use any libs, but opengl and openal, so i don't care about names] that eat a lot of standard formats).

#115721 - The_Perfection - Thu Jan 18, 2007 6:10 pm

I understand vectors easily enough... I just don't have the common sense behind them like most people who actually deal with this stuff all the time do. I already know several examples for video game vectors, such as:
Character movement
Projectile movement
Image compression (for circles, lines, squares and the like)
Path following
(More) Realistic gravity
and I think... Collision detection

As for the pre-calculations... have a look at this:
Code:
//In other.cpp
v10 arry[3] = {
    0x100, 0x060, 0x003,
};

v10 ar2[2] = {
    arry[0]/arry[2], floattov10(-2),
};

Would ar2 be calculated at compile time or not?

Oh yes, feel free to teach me something I don't know, especially if it pertains to math or science. It just makes it easier when I actually have to learn it.

Oh, and I had forgotten to look at those examples that weren't in NeHe. I'll make sure to go and check them out ASAP.

<NervousLaugh>Yeah... about writing my own tool... I'm pretty sure I could write one... but my PC C++ skills aren't all they really should be. (I wrote an RLE compressor/decompressor once. The compressor worked fine; the decompressor, not so much.) But I can handle files, and I should be able to work off of that.