#51143 - Quirky - Tue Aug 16, 2005 7:49 pm
I'm getting a bit hacked off with an aspect of GBA coding; namely how to deal with the unique set of tools required and how to correctly build programs so that they Just Work (or at least give meaningful errors) when someone else tries to build them elsewhere. Especially with all their 'stuff' installed in different places.
My original approach to keeping track of multiple projects was to create a file that defines all the tools I need (gfx2gba, devkitARM etc.) by their full paths, together with typical flags (FOOBAR=/home/rich/gbadev/Tools/foobar FOOBARFLAGS=-n -q -x), and then include this in each game's Makefile. OK, it keeps configuration in one spot across multiple projects and makes my life easy - providing I never release the source code.
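To be concrete, the shared file looks something like this (the paths and flags are just examples from my setup, obviously not something anyone else's machine will have):

```makefile
# tools.mk - included by every project's Makefile.
# All paths below are specific to MY machine; that's exactly the problem.
PREFIX       := arm-elf-
CC           := $(PREFIX)gcc
OBJCOPY      := $(PREFIX)objcopy

GFX2GBA      := /home/rich/gbadev/Tools/gfx2gba
GFX2BAFLAGS  :=

CFLAGS       := -mthumb -mthumb-interwork -O2 -Wall
```

Each project then starts with `include ../tools.mk` and picks up the lot - great for me, useless for anyone else.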
I also have my own gba library, as I'm sure a lot of homebrewers have, made up of bits and bobs from all over the place; it works as well as it needs to and has sensible headers. I have all my gba dev code in ~/gbadev/Projects/ together with this library (*.[ach]) in ~/gbadev/Projects/gbalib/.
Right. So what is the problem?
Well, I find I'm duplicating a lot of stuff in my Makefiles. I need to hardcode the path between my gbalib and the current working directory (for -L and so forth), and as I add more libraries it gets harder to keep track. Makefiles are tedious to write and not very useful (all and clean and that's it). I'm toying with Lua and Tepples' GBFS, so that's 2 more -Ls that I need to hardcode. I have my own simple XM playing library that I need to hardcode when I use it. Ditto for the include paths to header files. If I use the GBFS tools, for example, they live in /some/path/to/gbfs/tools/, which I need to, again, add to the included tools makefile rules. It's all getting a bit complicated and not very good if I ever want to sensibly release anything.
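The end result is that every project Makefile grows a block like this (library names and relative paths are illustrative, but this is the shape of it):

```makefile
# Per-project Makefile fragment - every path is relative to THIS directory,
# so it gets copy-pasted and tweaked for each new project.
INCDIRS := -I../gbalib -I/some/path/to/lua/include -I/some/path/to/gbfs/include -I../xmlib
LIBDIRS := -L../gbalib -L/some/path/to/lua/lib -L/some/path/to/gbfs -L../xmlib

CFLAGS  += $(INCDIRS)
LDFLAGS += $(LIBDIRS) -lgbalib -llua -lgbfs -lxm
```

Move a project one directory deeper, or add one more library, and the whole block needs editing by hand.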
So, what is the solution? I've recently been trying autoconf and automake with a bit of success. They help with a lot of the problems: I can define required library paths with ./configure --host=arm-elf --with-gbalib=/path/to/lib for a particular project and then make 'remembers' the choice until you need to change something major. The Makefiles are also easier to write, just the bare minimum and the rest is autogenerated, with nicer rules (dist being especially useful). And it should be easier for other people to compile the code as the configure script gives them advice as to what they are missing and where. The downside is that this really requires arm-elf-gcc and friends to be in the PATH, together with any other special programs I need. I do have this now for GBA coding at least - but why is it generally frowned upon? Installing Insight adds arm-elf-insight to /usr/local/bin, for example, which is very similar.
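For anyone who hasn't tried it, a minimal configure.ac for this sort of project looks roughly like the following - the project name and the --with-gbalib handling are my own invention, but the macros are standard autoconf:

```m4
dnl configure.ac - sketch of the cross-compiling setup described above.
AC_INIT([mygame], [0.1])
AC_CANONICAL_HOST           dnl makes configure honour --host=arm-elf
AM_INIT_AUTOMAKE
AC_PROG_CC                  dnl finds arm-elf-gcc via $host - hence it must be in PATH
AC_CHECK_TOOL([OBJCOPY], [objcopy])  dnl finds arm-elf-objcopy the same way

dnl Let the user point at gbalib instead of hardcoding a relative path:
AC_ARG_WITH([gbalib],
  [AS_HELP_STRING([--with-gbalib=DIR], [location of gbalib headers and library])],
  [CPPFLAGS="$CPPFLAGS -I$withval"
   LDFLAGS="$LDFLAGS -L$withval"])

AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

Run once with ./configure --host=arm-elf --with-gbalib=/path/to/lib and the generated Makefiles remember the paths from then on.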
But it does have downsides: configure.ac becomes the new Makefile-like monster of copy-paste for any new projects - the rules to check for gbalib and the rules to check for required programs are identical across projects. The Makefile.am files require special rules for the GBA (the elf objcopy at the end) that you must copy-paste each time. Anything a bit non-standard, such as rules for gfx2gba, requires special attention and more copy-pasting. Dealing with multiple palette images requires extra scripting. There's no real way to scale it all nicely that I can see.
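The GBA-specific tail I mean is this sort of thing (assuming configure has set up OBJCOPY, e.g. via AC_CHECK_TOOL; the target and source names are placeholders):

```makefile
# Makefile.am - the non-standard part that travels by copy-paste.
bin_PROGRAMS = mygame
mygame_SOURCES = main.c graphics.c
mygame_LDADD = -lgbalib

# automake knows nothing about .gba files, so hook in via all-local:
all-local: mygame.gba

mygame.gba: mygame$(EXEEXT)
	$(OBJCOPY) -O binary $< $@
```

Every project needs these same few lines bolted on, and automake has no tidy way to share them.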
So what's the best solution to this mess? How is one supposed to go about cross compiling in an organized way? Normal programs automagically know where their libraries live (DLLs on Windows live in C:\Windows(?), and libXXX.so is found by the dynamic linker on GNU/Linux) but cross compiled ones can't use this wizardry. How do other people go about this in a way that scales as your number of homebrew projects grows?