#169844 - DiscoStew - Thu Aug 06, 2009 10:12 pm
When reading a directory, I've found that the folders and files come back all mixed up and not sorted alphabetically, so to get them organized I have to read the entire directory, store each folder/file into separate arrays, and compare each one against those already in the lists. When dealing with only 20 folders/files the process is pretty quick, but with over 500 folders/files I've had processing times of at least 8 seconds to sort through them all.
I'll post my current directory browser code when I can, but would anyone happen to know any tricks for dealing with this, such as particular functions or code snippets?
EDIT:
I'm using the DS and libfat.
_________________
DS - It's all about DiscoStew
Last edited by DiscoStew on Fri Aug 07, 2009 4:43 pm; edited 1 time in total
#169847 - elhobbs - Fri Aug 07, 2009 1:28 am
500 files/folders on the DS? That will never be fast. Most of the time is going to be spent reading the directory from disk.
you could try a simple hash - like a separate list based on the first character of the file name. then you would have shorter lists to sort.
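For illustration, a minimal sketch of that bucketing idea in C (MAX_ENTRIES, NAME_LEN and the helper names here are made up, not from libfat): keep one sorted index list per first character, so each insertion only compares against a short list instead of the whole directory.

#include <ctype.h>
#include <string.h>
#include <strings.h>

#define MAX_ENTRIES 512
#define NAME_LEN    64
#define BUCKETS     27     /* 'a'..'z', plus one bucket for everything else */

static char names[MAX_ENTRIES][NAME_LEN];
static int  bucket_idx[BUCKETS][MAX_ENTRIES];   /* sorted indices, per bucket */
static int  bucket_len[BUCKETS];
static int  name_count;

static int bucket_of(const char *s)
{
    int c = tolower((unsigned char)s[0]);
    return (c >= 'a' && c <= 'z') ? c - 'a' : BUCKETS - 1;
}

/* Insert one filename at its sorted position inside its bucket only. */
void add_name(const char *name)
{
    int b = bucket_of(name);
    int idx = name_count++;
    strncpy(names[idx], name, NAME_LEN - 1);
    names[idx][NAME_LEN - 1] = '\0';

    int i = bucket_len[b];
    while (i > 0 && strcasecmp(names[bucket_idx[b][i - 1]], name) > 0) {
        bucket_idx[b][i] = bucket_idx[b][i - 1];    /* shift larger names up */
        i--;
    }
    bucket_idx[b][i] = idx;
    bucket_len[b]++;
}

Walking the buckets in order at the end gives the full alphabetical list; the per-bucket arrays are oversized here for clarity, so in practice you'd size them down or use linked lists.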
#169849 - sverx - Fri Aug 07, 2009 9:15 am
DiscoStew wrote: |
When dealing with only 20 folders/files the process is pretty quick, but with over 500 folders/files I've had processing times of at least 8 seconds to sort through them all. |
You mean it takes you up to 8 seconds to read the content of a directory that has 500 items (files/dirs) in it???
I've got to try to test this too...
#169852 - Dwedit - Fri Aug 07, 2009 1:11 pm
You want the code I'm using for Pocketnes GBAMP?
What I did was take all the filenames and stuff them into an array of 32-byte fixed-length strings. Filenames got preprocessed: for instance, extensions were removed, and directory names were surrounded by [ ].
Then I made an array which maps sorted order to physical order.
Finally, I used merge sort to quickly sort the mapping (only duplicating data for the mapping, which is just a bunch of ints; there's no need to duplicate the filenames themselves).
Then when you select a file, say position #37 in the sorted list, you read the mapping array and it tells you that position #37 is actually physical file #103. You then iterate over the directory, counting until you reach file #103, and open that one.
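Roughly, the technique described looks like this in C (just a sketch, not the actual Pocketnes code; NAME_LEN, MAX_FILES and the function names are made up):

#include <string.h>

#define NAME_LEN  32
#define MAX_FILES 1024

static char names[MAX_FILES][NAME_LEN]; /* preprocessed, fixed-length names  */
static int  order[MAX_FILES];           /* order[sorted pos] = physical pos  */
static int  scratch[MAX_FILES];         /* merge buffer for indices only     */

/* Merge sort on the index array: the 32-byte names are never moved,
 * only the ints in order[] get shuffled around. */
static void merge_sort(int *idx, int n)
{
    if (n < 2) return;
    int mid = n / 2;
    merge_sort(idx, mid);
    merge_sort(idx + mid, n - mid);

    int i = 0, j = mid, k = 0;
    while (i < mid && j < n)
        scratch[k++] = (strcmp(names[idx[i]], names[idx[j]]) <= 0)
                       ? idx[i++] : idx[j++];
    while (i < mid) scratch[k++] = idx[i++];
    while (j < n)   scratch[k++] = idx[j++];
    memcpy(idx, scratch, n * sizeof(int));
}

void sort_listing(int count)
{
    for (int i = 0; i < count; i++)
        order[i] = i;                   /* start with physical order */
    merge_sort(order, count);
    /* Selecting sorted entry #37 then means: re-walk the directory and
     * open the order[37]-th file encountered. */
}

The point is that the name records never move; only the ints in order[] are duplicated and shuffled, which keeps the merge passes cheap.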
_________________
"We are merely sprites that dance at the beck and call of our button pressing overlord."
#169853 - sverx - Fri Aug 07, 2009 2:45 pm
Dwedit wrote: |
You want the code I'm using for Pocketnes GBAMP? |
Oops... I thought you meant on a DS...
Sorry!
#169857 - DiscoStew - Fri Aug 07, 2009 4:43 pm
sverx wrote: |
Dwedit wrote: | You want the code I'm using for Pocketnes GBAMP? |
Oops... I thought you meant on a DS...
Sorry! |
Actually, I meant on a DS, specifically using file-systems like libfat. Sorry for not specifying that.
_________________
DS - It's all about DiscoStew
#169859 - sverx - Fri Aug 07, 2009 5:01 pm
DiscoStew wrote: |
Actually, I meant on a DS, specifically using file-systems like libfat. Sorry for not specifying that. |
Oh, I'll run the same test with my xm7play player then; I've never tried to read a directory with so many items in it. I'm using libfat too, of course.
#169893 - FluBBa - Mon Aug 10, 2009 10:08 pm
Having the same problem on this end. Reading the directory entries goes slower and slower the further I get into the list (I had to put up an animated character to let people know that it's working; the first second it goes really fast, then it slows down almost to a crawl).
My guess is that libfat iterates from the beginning for every entry read.
_________________
I probably suck, my not is a programmer.
#169896 - elhobbs - Tue Aug 11, 2009 3:09 am
Not sure if you are using nested directories, but it looks like libfat may be happier with a breadth-first search.
If you provide a full path then libfat will search from the root; if you use a relative path then it should start from the current working directory.
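In other words, something like this (the paths are made up for illustration):

#include <stdio.h>
#include <unistd.h>

void slow_way(void)
{
    /* every open makes libfat resolve the whole path from the root */
    FILE *f = fopen("/data/music/albums/track01.xm", "rb");
    if (f) fclose(f);
}

void faster_way(void)
{
    /* resolve the directory once, then open entries relative to it */
    if (chdir("/data/music/albums") == 0) {
        FILE *f = fopen("track01.xm", "rb");
        if (f) fclose(f);
    }
}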
#169920 - sverx - Fri Aug 14, 2009 8:35 am
Ok, I ran a test yesterday evening. I created a new directory, wrote 500 files into it, then launched my xm7play application and opened that directory: it was done in a snap, way less than a second.
So I'm quite surprised now... maybe we're using different setups? I'm on a DS Lite using an R4 with a 2GB microSD. You could try running the same program on your setup to check whether it's as fast as it is on my system.
Because my program does much the same as you described: I read one item from the directory at a time, malloc() a little struct to hold the info, insert that item into an ordered list (alphabetically, case sensitive, directories on top, files after) and cycle to the next item. When it's done I start displaying the list on the lower screen... quite straightforward, huh?
I'm using diropen() and dirnext(), and chdir() when entering/leaving subdirs... I guess you're doing the same too btw...
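For reference, a bare-bones version of that loop might look like this (entry_t and insert_sorted() are hypothetical stand-ins, not the actual xm7play code; diropen()/dirnext()/dirclose() are the <sys/dir.h> functions libfat provided at the time):

#include <sys/dir.h>
#include <sys/stat.h>
#include <stdlib.h>
#include <string.h>

typedef struct entry {
    char          name[256];
    int           is_dir;
    struct entry *next;
} entry_t;

/* Insert alphabetically (case sensitive), directories before files. */
static void insert_sorted(entry_t **head, entry_t *e)
{
    while (*head && ((*head)->is_dir > e->is_dir ||
          ((*head)->is_dir == e->is_dir && strcmp((*head)->name, e->name) < 0)))
        head = &(*head)->next;
    e->next = *head;
    *head = e;
}

entry_t *read_directory(const char *path)
{
    entry_t    *list = NULL;
    char        name[256];
    struct stat st;

    DIR_ITER *dir = diropen(path);
    if (!dir) return NULL;

    while (dirnext(dir, name, &st) == 0) {     /* 0 until no entries remain */
        if (strcmp(name, ".") == 0 || strcmp(name, "..") == 0)
            continue;                          /* skip the dot entries      */
        entry_t *e = malloc(sizeof(entry_t));
        if (!e) break;
        strncpy(e->name, name, sizeof(e->name) - 1);
        e->name[sizeof(e->name) - 1] = '\0';
        e->is_dir = (st.st_mode & S_IFDIR) ? 1 : 0;
        insert_sorted(&list, e);
    }
    dirclose(dir);
    return list;
}

Note that dirnext() hands back the stat info along with the name, so there is no separate lookup per entry.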
Let me know, I'm really curious :)
#169930 - DiscoStew - Sat Aug 15, 2009 7:54 am
I just came back from vacation and am browsing the forums real quick before I hit the sack after a 10-hour road trip back. I'll get my stuff together in the morning.
_________________
DS - It's all about DiscoStew
#169932 - DiscoStew - Sat Aug 15, 2009 6:27 pm
Ok, I was checking my code this morning, and compared with what you were using, sverx, I had been using opendir, readdir, closedir, and stat to get the directory information. I don't know where I heard it, but I thought diropen, dirnext, and dirclose had some problems, which is why I didn't use them in the first place. I tried them in my code now, and the 8+ second load I had with about 500 files/folders is done in a split second. I'm very relieved that I don't have to wait so long each time I test on hardware. Thx.
Just to inform you, elhobbs, I was using chdir when dealing with nested directories.
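For comparison, the slower pattern looks roughly like this (the path handling is just illustrative); a likely suspect is the extra stat() per entry, which was needed to tell folders from files but makes libfat resolve a path all over again for every single entry:

#include <dirent.h>
#include <sys/stat.h>
#include <stdio.h>
#include <string.h>

void list_directory(const char *path)
{
    char full[512];
    struct stat st;
    struct dirent *ent;

    DIR *dir = opendir(path);
    if (!dir) return;

    while ((ent = readdir(dir)) != NULL) {
        /* one extra filesystem lookup per entry, on top of readdir() */
        snprintf(full, sizeof(full), "%s/%s", path, ent->d_name);
        if (stat(full, &st) == 0)
            printf("%s %s\n",
                   (st.st_mode & S_IFDIR) ? "<dir> " : "<file>",
                   ent->d_name);
    }
    closedir(dir);
}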
A question though. Why were the prior functions I had been using taking so long to process?
_________________
DS - It's all about DiscoStew
#169933 - elhobbs - Sun Aug 16, 2009 12:00 am
Ha ha. I am using opendir, readdir, and closedir. Maybe that is why I think things are so slow.
I had been using diropen etc., but it started causing stack corruption in one of the libnds releases; something to do with a structure size mismatch.
#169943 - sverx - Mon Aug 17, 2009 8:55 am
DiscoStew wrote: |
I don't know where I heard it, but I thought diropen, dirnext, and dirclose had some problems, which is why I didn't use them in the first place. |
Yes, I also remember reading about that... but it was related to a specific libnds release (at least that's how it seemed to me), so I simply ignored it and tried diropen() and dirnext() first... I was lucky, of course ;)