gbadev.org forum archive

This is a read-only mirror of the content originally found on forum.gbadev.org (now offline), salvaged from Wayback machine copies. A new forum can be found here.

DS development > Large array compilation times

#118846 - Rajveer - Sat Feb 17, 2007 1:27 pm

Hi guys, I've stored my level's polygons as an array of data structures. When this array is around 80 elements long, it takes a few seconds, say around 10, to compile the whole program. When I give the level more detail and get around 950 polygons, it takes literally hours to compile. 300 polygons takes around 20-30 minutes, so the time increases exponentially as I increase the polycount. Is this normal? I don't want to have to wait a whole night to test whether a level works, am I doing something wrong? If it helps, I'm storing this array in an external object file (.h file) which I include.

#118848 - simonjhall - Sat Feb 17, 2007 1:57 pm

No, that can't be right! Many times I've compiled models to source code (in headers) with tens of thousands of polygons and it takes as long to compile as regular source! (ie seconds)
Does the object file increase linearly in size as you add polygons?
Could you give an example of what this data structure looks like?
Couldn't you just use a regular filesystem? ;-)

EDIT: forgot to ask - which bit is slow, compiling or linking?
_________________
Big thanks to everyone who donated for Quake2


Last edited by simonjhall on Sat Feb 17, 2007 2:00 pm; edited 1 time in total

#118849 - Cearn - Sat Feb 17, 2007 2:00 pm

Rajveer wrote:
If it helps, I'm storing this array in an external object file (.h file) which I include.

When you #include something, it isn't external anymore and gets recompiled every time the file that includes it does. This is why people always argue against putting code/data into header files (see here for the most recent discussion of this problem).

The simplest solution would be to keep it as a binary and convert it to an object file with bin2o, or to export the data to assembly, which requires no compilation. If compilation of the data is indeed the problem, try it with bin/asm.

#118877 - DragonMinded - Sat Feb 17, 2007 7:26 pm

If you MUST keep the data in a code section, you can always move things from the .h to a .c/cpp file and include a .h with it that declares all the files extern.
_________________
Enter the mind of the dragon.

http://dragonminded.blogspot.com

Seriously guys, how hard is it to simply TRY something yourself?

#118882 - Rajveer - Sat Feb 17, 2007 8:00 pm

Cheers for all the replies guys!

Simonjhall: Yep the filesize increases linearly. The data structure I'm using is as follows

Code:
typedef struct
{
   f32 vertex0[3], vertex1[3], vertex2[3];
   t16 tex0[2], tex1[2], tex2[2];
   f32 normal[3];
   f32 edgeplane0[3];
   f32 edgeplane1[3];
   f32 edgeplane2[3];
} polydata;


I could use a filesystem, but I'm still new to programming and still unsure about how to go about it :S Basically I've created a MAXscript in 3d Studio Max to output a model as an array using the above data structure, so if I were to use a file system would I keep the model as, say, a *.max file? I'd also like to keep the whole compiled program as one file (neat freak!), so if using a filesystem, would I store the model files separately on the flashcart?

About the last question, I'm not quite sure actually (lol still a newb!) but I'll post what the compiler says and note where it takes ages (I think it's compiling main.c, which includes the file with the level data and therefore a copy of it):

Code:
C:\DSDevelopment\devkitPro\Raji\3Dtest\3Dtest34>make
back.bin
bottom.bin
front.bin
left.bin
right.bin
road2.bin
ship1texture.bin
ship2texture.bin
shipselectmenu.bin
top.bin
main.c

LOOOOOOOOOOOOONG WAIT

arm-eabi-g++ -g -mthumb-interwork -mno-fpu -L/c/DSDevelopment/devkitPro/PAlib/li
b -specs=ds_arm9.specs back.o bottom.o front.o left.o right.o road2.o ship1textu
re.o ship2texture.o shipselectmenu.o top.o main.o -LC:/DSDevelopment/devkitPro/P
Alib//lib -lpa9 -L/c/DSDevelopment/devkitPro/libnds/lib -lnds9 -o build.elf
Nintendo DS rom tool 1.30 - Jul 24 2006 06:51:35 by Rafael Vuijk (aka DarkFader)

built ... 3Dtest34.ds.gba
dsbuild 1.21 - Jul 24 2006
using default loader

C:\DSDevelopment\devkitPro\Raji\3Dtest\3Dtest34>pause
Press any key to continue . . .


Cearn: Cheers for the link, it was really useful in teaching me about linking files, the advantages, etc. So you'd suggest I compile the level separately and keep it as an object file, then link it in at the end? Would I do this by changing the makefile? At the moment I've been using the PALib make.exe and the PALib makefile; I didn't realise until I got into my programming that I could have made my own, and by then it was working fine so I stuck with it. Would linking files also be done using another makefile?

DragonMinded: It's not essential that I store my level in a code section, it's just that I created a MAXscript to export 3D geometry as an array of my data structure because I was none the wiser. Would you suggest I keep it separate?

QUESTION TO ALL: With all the different ways to incorporate objects (#include, linking, #include in a .h file, etc.), do they have any advantages and disadvantages at actual runtime (on the DS console)?

#118886 - simonjhall - Sat Feb 17, 2007 9:01 pm

Rajveer wrote:
QUESTION TO ALL: With all the different ways to incorporate objects (#include, linking, #include in a .h file, etc.), do they have any advantages and disadvantages at actual runtime (on the DS console)?

Yeah, there are lots of advantages and disadvantages :-)

Building data into object files - the easiest and most compatible way of getting data in. Should work with all DS hardware. Data is directly addressable by the program and no funky stuff is required. You need to be able to fit the data into memory all in one go though, along with your program. Plus you've gotta convert it to either text (which gets compiled, like you're doing) or just convert the data file to an object file which gets linked in. Fastest data access speed.
This would be my method of choice.

Using appended file systems - I think these live in ROM, so are only usable with certain cards (I don't think it works with my gbamp). I don't think any fancy runtime setup code is required. I don't think you can directly address the data though as you've gotta go via the fake-filesystem interface. You've also gotta build this filesystem before appending it to the program.
I've never done this before, so I'm sure I've made a few mistakes here :-)

Using a real filesystem - takes a little bit of effort to set up. Data is not directly addressable, but you can use huge files as long as you're only pulling in small parts at a time. There may be compatibility issues with different cards. Easy to add files to the filesystem though - just drag them onto your card! Probably the slowest data transfer speed though.

Again, I'd just put data into a source file, then use an extern to actually access the data. The externs then get fixed up at link time.
I think that's most of the details. Feel free to correct me where I've made mistakes!
_________________
Big thanks to everyone who donated for Quake2

#119017 - Rajveer - Mon Feb 19, 2007 3:39 am

Wow, that's a lot of information, it cleared a lot up. Thanks :) I'll compile my 3D levels once and link them in - do I do this in the makefile?

Just one thing, I'm cool with compiling it once and linking it, but why does it take ages to compile the large array in the first place? You said you've used tens of thousands of polygons before with no issues compiling, so maybe I'm doing something/compiling it wrong?