#119178 - qw3rty - Tue Feb 20, 2007 2:19 pm
Hi,
I'm trying to implement environment mapping, but I can't get it to work!
The example provided with devkitPro is unfortunately very vague, and doesn't show the actual vertex submission (glNormal(), glVertex(), etc.).
The environment texture seems to depend on the order in which you send the points to the hardware - I get completely different results when I start the triangles at different points (e.g. top right vs. bottom left).
Could anybody who has experience with environment mapping give me some advice?
(An easy example of a triangle or pyramid with environment mapping on it, maybe.)
#119190 - ikaris - Tue Feb 20, 2007 5:45 pm
I'm doing spherical environment mapping in my engine...
I got the formula from this page:
http://www.ozone3d.net/tutorials/glsl_texturing_p04.php#part_41
Here's my source code:
Code: |
#include "fzEnvMap.h"
void fzEnvMap(fzMesh & mesh, fzVec3 cameraPos)
{
unsigned int numTexCoords = mesh.t.Size();
for (unsigned int i = 0; i < numTexCoords; i++)
{
fzVec3 vtx(mesh.v.Index().x, mesh.v.Index().y, mesh.v.Index().z);
// Normalized eye-to-vertex vector
fzVec3 u = vtx - cameraPos;
u.Normalize();
fzVec3 nrm(mesh.n.Index().x, mesh.n.Index().y, mesh.n.Index().z);
// Reflect u about the normal: r = u - 2*(n.u)*n
float dot = nrm.DotProduct(u);
fzVec3 r = u - (nrm * (2.0f * dot));
float m = 2.0f * sqrt( (r.x * r.x) + (r.y * r.y) + ( (r.z + 1.0f) * (r.z + 1.0f) ) );
mesh.t.Index().u = floattot16(r.x / m + 0.5f);
mesh.t.Index().v = floattot16(r.y / m + 0.5f);
mesh.v.Next();
mesh.n.Next();
mesh.t.Next();
}
}
|
Hope this helps you !
#119198 - qw3rty - Tue Feb 20, 2007 6:59 pm
Thanks - I also started implementing a software "fake" environment mapping.
That at least works as expected :)
But I wanted to use the 3D hardware for the environment mapping ("\examples\nds\Graphics\3D\Misc\Env_Mapping" in the devkitPro folder contains a demo, but I can't figure it out from that demo alone :( )
I'll continue working on my software environment mapping; it doesn't slow things down that much.
#119625 - M3d10n - Sat Feb 24, 2007 9:34 pm
You shouldn't need to do anything special with your geometry for proper environment mapping. Just make sure you're providing correct normals, since they are what the texture coordinates are generated from.
#119850 - qw3rty - Mon Feb 26, 2007 9:55 pm
I have a fixed-point sphere-mapping routine now that's fast enough for my needs, but I'm still curious how to do it with the hardware.
Could anybody give an easy example of an environment-mapped quad?
(Not a fancy teapot that's drawn by DMA from a binary file *sigh*)
#119867 - shash - Mon Feb 26, 2007 11:22 pm
You don't need the actual model from that example; the important parts are these:
1) First, set up a texture so that the texture coordinates used per vertex are calculated from its normal (reference):
Code: |
glTexImage2D( 0, 0, GL_RGB, TEXTURE_SIZE_128 , TEXTURE_SIZE_128, 0, GL_TEXTURE_WRAP_S|GL_TEXTURE_WRAP_T|TEXGEN_NORMAL, (u8*)cafe_bin ); |
2) Later, scale the normals from the normalized range into texture space by scaling the GL_TEXTURE matrix, and then rotate it to match the camera. To sum up, what this does, in conjunction with the normal-based texture coordinate generation, is create texture coordinates from the normals. Since you have a 128x128 texture and the normals are in the [-1,1] range, multiplying them by (64,64,1) puts them in the texture range. It's this part:
Code: |
glMatrixMode(GL_TEXTURE);
glIdentity();
GLvector tex_scale = { 64<<16, -64<<16, 1<<16 };
glScalev( &tex_scale );
glRotateXi(rotateX>>3);
glRotateYi(rotateY>>3); |
3) Send your mesh with proper normals; smooth normals will give you the sphere-mapped look in that example, while per-face normals will give you a disco-ball look. What's important is that if you want the texture coordinates to differ per vertex, the normals must differ, since the coordinates are generated from them.
_________________
http://shashemudev.blogspot.com
#120020 - qw3rty - Wed Feb 28, 2007 11:58 am
Hmmm... I just tried the hardware environment mapping again, and I think I've got it right now.
This is the code I execute instead of calling the teapot:
Code: |
glBindTexture( 0, cafe_texid );
//glCallList((u32*)teapot_bin);
glBegin(GL_TRIANGLES);
glNormal(NORMAL_PACK(-300,-300,300));
glVertex3v16(-1024, -1024, 1024);
glNormal(NORMAL_PACK(+300,+300,300));
glVertex3v16(1024, 1024, 1024);
glNormal(NORMAL_PACK(-300,+300,300));
glVertex3v16(-1024, +1024, 1024);
glNormal(NORMAL_PACK(300,300,300));
glVertex3v16(1024, 1024, 1024);
glNormal(NORMAL_PACK(-300,-300,300));
glVertex3v16(-1024, -1024, 1024);
glNormal(NORMAL_PACK(300,-300,300));
glVertex3v16(+1024, -1024, 1024);
glEnd();
glFlush();
|
What my code does is draw two triangles that form a square.
Using the "real" normal (0,0,511) for every corner results in a single-colored square (all vertices are mapped to the same texture coordinate).
Using normals that point away from the center of the square gives the expected results.
But that would mean I have to adjust my normals according to the camera-to-vertex vector, right?
(That would be halfway to software sphere mapping!)
#120048 - shash - Wed Feb 28, 2007 4:22 pm
qw3rty wrote: |
(...)
Using the "real" normal (0,0,511) for every corner results in a single-colored square (all vertices are mapped to the same texture coordinate).
Using normals that point away from the center of the square gives the expected results.
But that would mean I have to adjust my normals according to the camera-to-vertex vector, right?
(...) |
No: normals can be static. As I explained above, the texture coords are generated from the normals. Same normals means SAME texture coordinates.
For a "normal" cube you'll probably want per-vertex smoothed normals, or else your mapping will look a bit odd. In general, you'll probably want per-vertex normals anyway if you want smooth lighting.
_________________
http://shashemudev.blogspot.com
#120100 - qw3rty - Thu Mar 01, 2007 1:10 am
Normals can be static for static objects, sure.
But I have a dynamic water surface, so I can't have static normals.
That's why I'd have to change the vertex normals according to the camera-to-vertex vector - or am I missing something?
#120101 - shash - Thu Mar 01, 2007 1:13 am
Just recalculate your normals as you normally would for lighting, and it'll be fine. Usually it's as easy as averaging the face normals of the faces each vertex shares, and you're done.
_________________
http://shashemudev.blogspot.com
#120112 - qw3rty - Thu Mar 01, 2007 2:23 am
But once the waves stop, the surface is flat, so all the normals will point straight up.
You see the problem? When all normals point straight up, all vertices map to the same texture coordinate - so I need to change them somehow.
(And I still believe it's the camera-to-vertex vector that influences the normal - I suppose you have to use the camera-to-vertex vector reflected about the normal, which would eliminate the advantage of hardware support, because at that point you're two lines away from computing the sphere-map texture coordinates yourself.)
#120118 - shash - Thu Mar 01, 2007 3:32 am
Then you need to create normals that simulate your waves. Creating them with sin/cos should be fine; I've seen quite a few water "fakes" done with just sin/cos-generated normals.
_________________
http://shashemudev.blogspot.com
#120165 - qw3rty - Thu Mar 01, 2007 12:21 pm
You didn't understand what I meant:
On a flat water surface with no waves, every vertex's normal will point in the exact same direction.
But if I use those normals for environment mapping, it won't work.
I don't use sine/cosine for my water surface; it's a discretised "physical" model, and I don't see how that would help me create the "environment" normals.
#120183 - shash - Thu Mar 01, 2007 5:51 pm
qw3rty wrote: |
You didn't understand what I meant:
On a flat water surface with no waves, every vertex's normal will point in the exact same direction.
But if I use those normals for environment mapping, it won't work.
I don't use sine/cosine for my water surface; it's a discretised "physical" model, and I don't see how that would help me create the "environment" normals. |
I thought you were using a simple flat surface to simulate the water, so simple sin/cos normals would have been useful in that case.
_________________
http://shashemudev.blogspot.com