gbadev.org forum archive

This is a read-only mirror of the content originally found on forum.gbadev.org (now offline), salvaged from Wayback Machine copies. A new forum can be found here.

DS development > Audio double buffer problem

#170258 - Pate - Sun Sep 13, 2009 4:58 pm

Hi!

I am trying to code some audio stuff, and I followed the excellent guide by Deku at http://deku.gbadev.org/program/sound1.html where I found the table of suitable sample rates and buffer sizes for sound mixing.

I am using the timer value 64612 with buffer size 304 (18157Hz), but I am still experiencing sync problems.
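
For reference, here is the arithmetic behind that table entry; this is my own back-of-the-envelope check using the GBA's 16.777216 MHz timer clock and 280896-cycle frame, not something taken from the guide itself:

Code:

/* Why 64612 and 304 go together on the GBA: */
enum {
    TIMER_PERIOD = 65536 - 64612,            /* = 924 timer ticks per sample */
    SAMPLE_RATE  = 16777216 / TIMER_PERIOD,  /* = 18157 Hz (rounded down)    */
    PER_FRAME    = 280896 / TIMER_PERIOD     /* = 304 samples, exactly       */
};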

I initialize my audio like this:

Code:

#define ADLIB_CHANNEL_CR   0x04000420
#define   ADLIB_ENABLE       (127+(64<<16)+(1<<29)+(1<<27)+(1<<31))
#define   ADLIB_TIMER      64612
#define   ADLIB_BUFFER_SAMPLES   (304)
#define   ADLIB_BUFFER_SIZE   (2*ADLIB_BUFFER_SAMPLES)

adlib_buf:
   .space   (2*ADLIB_BUFFER_SIZE)

ldr   lr, =adlib_buf
ldr   r1, =ADLIB_TIMER      @ SOUND_FREQ(18157)
ldr   r2, =ADLIB_CHANNEL_CR
mov   r3, #0
mov   r4, #ADLIB_BUFFER_SAMPLES  @ (2 buffers of 2-byte samples)/4
@-------
@ Channel 0
@-------
str   lr, [r2, #0x04]      @ SCHANNEL_SOURCE(0) = (int)adlib_buf;
strh   r1, [r2, #0x08]      @ SCHANNEL_TIMER(0) = SOUND_FREQ(18157);
strh   r3, [r2, #0x0A]      @ SCHANNEL_REPEAT_POINT(0) = 0;
str   r4, [r2, #0x0C]      @ SCHANNEL_LENGTH(0) = length;
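
For anyone who prefers C, the same writes look roughly like this; it is just a transcription of the assembly above using raw register addresses (offsets +0x04 source, +0x08 timer, +0x0A repeat point, +0x0C length), not library macros:

Code:

#include <stdint.h>

#define ADLIB_CHANNEL_BASE    0x04000420   /* the hardware channel used here */
#define ADLIB_TIMER           64612        /* ~18157 Hz                      */
#define ADLIB_BUFFER_SAMPLES  304          /* 16-bit samples per half-buffer */

/* Two 304-sample halves, played back to back by one looping channel. */
int16_t adlib_buf[2 * ADLIB_BUFFER_SAMPLES];

void adlib_channel_init(void)
{
    *(volatile uint32_t *)(ADLIB_CHANNEL_BASE + 0x04) = (uint32_t)adlib_buf;  /* source       */
    *(volatile uint16_t *)(ADLIB_CHANNEL_BASE + 0x08) = ADLIB_TIMER;          /* timer        */
    *(volatile uint16_t *)(ADLIB_CHANNEL_BASE + 0x0A) = 0;                    /* repeat point */
    /* length is in 32-bit words: (2 halves * 304 samples * 2 bytes) / 4 = 304 */
    *(volatile uint32_t *)(ADLIB_CHANNEL_BASE + 0x0C) = ADLIB_BUFFER_SAMPLES;
}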


Then inside the VBlank interrupt I start the sound playing, using the two buffers back to back and looping back to the start at the end of the second buffer:

Code:

#define   ADLIB_ENABLE       (127+(64<<16)+(1<<29)+(1<<27)+(1<<31))

ldr   r1, =ADLIB_CHANNEL_CR
ldr   r2, =ADLIB_ENABLE
str   r2, [r1]      @ SCHANNEL_CR(0) = ADLIB_ENABLE;


Then during every VBlank I swap between the two 304-sample (608-byte, as I use 16-bit audio) buffers and fill the one that is up next with the sample values to play.
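
In C the idea is roughly this; a minimal sketch where fill_adlib_samples() stands in for the actual mixer (which is in assembly) and cur_half tracks which half I believe the hardware is not playing:

Code:

#include <stdint.h>

#define ADLIB_BUFFER_SAMPLES 304
extern int16_t adlib_buf[2 * ADLIB_BUFFER_SAMPLES];
extern void fill_adlib_samples(int16_t *dest, int count);

static int cur_half = 0;   /* which 304-sample half to fill next */

void adlib_vblank_handler(void)
{
    /* Fill the half the hardware should NOT be playing, then flip. */
    fill_adlib_samples(&adlib_buf[cur_half * ADLIB_BUFFER_SAMPLES],
                       ADLIB_BUFFER_SAMPLES);
    cur_half ^= 1;
}

The assumption here is that the playback position stays aligned with the VBlank count, which may be exactly where things go wrong.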

The problem I am having is that the system drifts out of sync within a couple of seconds, which causes crackling in the sound, and then after a few more seconds it is back in sync for a while, and so on.

So, my questions are: is this a correct method for audio double buffering at all? The guides do not seem to use a looping hardware sound channel; could that be my problem? I assumed the GBA audio values in Deku's guide are still valid on the DS; am I correct here?

Oh, in case you are wondering: yes, I am trying to make the ARM7 emulate an AdLib sound card. It still has many problems, but this buffer sync problem is something I have been fighting for several days now and cannot seem to figure out.

Any help appreciated!

Pate
_________________

#170260 - Ruben - Sun Sep 13, 2009 6:27 pm

The problem here is that the DS refresh rate has no 'good' frequencies available, so any sample rate will cause crackling if you only resync on V-Blank.
I personally would recommend using the hardware channels, unless you absolutely must use software rendering.

Also, you'd normally use 32768Hz mixing frequency on the DS, as this is the output frequency for the hardware.

I was going to make a longer post, but I had a brain fart, so if someone else can explain how that works... But if no-one does, I'll post how.

#170266 - Pate - Mon Sep 14, 2009 5:06 am

Ruben wrote:
The problem here is that the DS refresh rate has no 'good' frequencies available, so any sample rate will cause crackling if you only resync on V-Blank.


Ah, so the DS hardware differs from the GBA in this case? OK, that pretty much explains the problem I am having. It seems I need to sync using a timer instead of VBlank. That should fix the sync problem, am I right?

Quote:
I personally would recommend using the hardware channels, unless you absolutely must use software rendering.


I currently use 9 hardware sound channels, one for each AdLib channel. Each AdLib channel has two operators that can either output a sample each (in which case I need to mix these two samples) or operator 1 can be used as input for operator 2 to modify the sample that operator 2 generates.
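
In rough pseudo-C, one channel's per-sample work has this shape (the names here are made up for illustration; the real code is ARM assembly and uses table lookups and envelopes rather than anything this tidy):

Code:

#include <stdint.h>

typedef struct {
    uint32_t phase;   /* phase accumulator; upper bits index the wave table     */
    uint32_t step;    /* per-sample increment, derived from the AdLib F-number  */
} Operator;

/* Placeholder for the (masked) wave/envelope table lookups of one operator. */
extern int wave_lookup(uint32_t index);

int adlib_channel_sample(Operator *op1, Operator *op2, int fm_mode)
{
    int s1 = wave_lookup(op1->phase >> 16);
    op1->phase += op1->step;

    int s2;
    if (fm_mode) {
        /* FM: operator 1's output offsets operator 2's table index */
        s2 = wave_lookup((op2->phase >> 16) + (uint32_t)s1);
    } else {
        /* additive: both operators output a sample, mix the two */
        s2 = wave_lookup(op2->phase >> 16) + s1;
    }
    op2->phase += op2->step;

    return s2;
}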

Quote:
Also, you'd normally use 32768Hz mixing frequency on the DS, as this is the output frequency for the hardware.


I don't think I can afford the CPU cycles that rate would require. Currently my code runs 8 channels fine, and it did run 9 channels until I had to add an extra branch per channel (not even per sample), after which it started crashing. After some more optimizing it should be able to run 9 channels at 18 kHz. Sadly it is still missing some features. The problem is that each operator needs 2 table lookups per sample (so 2*2*9 = 36 table lookups per sample in total) in addition to the actual sample output to the buffer.
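
Just to put a number on it, assuming the ARM7 runs at about 33.514 MHz:

Code:

enum {
    CYCLES_PER_SAMPLE  = 33513982 / 18157,                        /* ~1845 cycles */
    LOOKUPS_PER_SAMPLE = 2 * 2 * 9,                               /* = 36         */
    CYCLES_PER_LOOKUP  = CYCLES_PER_SAMPLE / LOOKUPS_PER_SAMPLE   /* ~51 cycles   */
};

And that budget still has to cover the envelopes, the mixing and the buffer writes.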

Quote:
I was going to make a longer post, but I had a brain fart, so if someone else can explain how that works... But if no-one does, I'll post how.


No problem, many thanks for your reply, you cleared up the main problem I was having, so I think I'll just forget about VBlank syncing.

Thanks again!

Pate
_________________

#170270 - Ruben - Mon Sep 14, 2009 9:02 am

Yes, syncing to a timer should get rid of the crackling.
Also, if you're not going to mix at 32768Hz because of the speed, then I'd recommend 16384Hz.

Could you paste the main code here? There may be some minor optimizations I know of.

#170271 - sverx - Mon Sep 14, 2009 11:46 am

... btw I would choose to activate looping channels in hardware, so that you won't hear 'clicks' when a channel stops/restarts.

The DS's actual mix frequency is ~32,728.5 Hz according to GBATEK, so you could have your software mixing routines working at 16,364 Hz.

If you want to use VBlank as a heartbeat then you've got to generate 272.73 samples per VBlank. But I would choose another heartbeat, or another software mixing frequency, one that is a multiple of 60 Hz (273 * 60 = 16380).

Of course your buffer would then have to be twice as big and you'd have to keep track of odd/even VBlanks, but that's the easier part :)

#170275 - Dwedit - Mon Sep 14, 2009 1:35 pm

The frame rate is not 60. The frame rate is 59.8261.

In other words, forget about using Vblank as a heartbeat, and stick to timers.
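
Putting rough numbers on it:

Code:

/* VBlank-paced production vs. a channel set for 16380 Hz:           */
/*   produced: 273 samples * 59.8261 frames/s ~= 16333 samples/s     */
/*   consumed: ~16380 samples/s                                      */
/* The channel gains ~47 samples/s on the fill code, so it chews     */
/* through a 273-sample half in roughly 6 seconds and the crackling  */
/* comes back, no matter how the buffers are arranged.               */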
_________________
"We are merely sprites that dance at the beck and call of our button pressing overlord."

#170276 - sverx - Mon Sep 14, 2009 2:46 pm

Dwedit wrote:
The frame rate is not 60. The frame rate is 59.8261.


wow. one can't be sure about anything these days :|

#170277 - Pate - Mon Sep 14, 2009 3:48 pm

Thanks for all the replies!

I'm trying to sync using a timer, but I'm having problems with that one also. Could you please check if my math is correct?

I was thinking of shortening the buffers to 256 samples (a nice round number) and going to a 16384 Hz (or close to it) mixing rate. So the SCHANNEL_TIMER value would be 64512 (65536-1024), correct?

Next, as the actual timers run at double speed and I want a timer interrupt after every 256 samples, the timer should give an IRQ approximately 64 times a second (16384/256). To get this rate, the reload value should be 65024 (65536-512) with a DIV_1024 timer. Is this correct?

Sadly I still get heavy crackling with that approach.

I then tested what happens if I don't use an IRQ but instead spin loop on the timer counter, and that gives a clean tone for a little while, then starts crackling, and then again gives a clean tone for a little while.

Might be something else wrong with my code, but is my math above correct?

Thanks again!

PS. I'm too embarrassed to show the full source code as it has so many problems, but when I get it to work properly and just need optimizations, I'll surely show it to you gurus for tips!

Pate
_________________

#170278 - Ruben - Mon Sep 14, 2009 4:13 pm

If I am thinking correctly, the hardware timer value should be 0x10000-(16756991/16384) = 0xFC01 = 64513 or around there, so that's fine.

The next thing to do is to have CPU timer 0 overflow with the same reload value as the sound timer, and have timer 1 cascade off it with an overflow of your buffer size, i.e.:

Mixing frequency = 16384
Buffer size = 512

Channel[0].Timer = 0x10000-(16756991/Mixing frequency)
Channel[0].Mode = Left(127) + on
Channel[1].Timer = 0x10000-(16756991/Mixing frequency)
Channel[1].Mode = Right(127) + on
Timer[0].Value = 0x10000-(16756991/Mixing frequency)
Timer[0].Mode = Timer on
Timer[1].Value = 0x10000 - Buffer size
Timer[1].Mode = Timer on + timer cascade + timer IRQ

However, I can't be too sure, as I'm about to pass out (as usual)... but I'm pretty sure I did something similar to that on the GBA.
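
In C that might look roughly like this. One caveat I'd add: the DS CPU timers tick at 33.513982 MHz, twice the 16.756991 MHz the sound timers use, so timer 0 gets twice the channel's divisor to stay in lockstep (the register names below are the standard ARM7 timer registers, everything else is just a sketch):

Code:

#include <stdint.h>

#define REG_TM0CNT_L  (*(volatile uint16_t *)0x04000100)
#define REG_TM0CNT_H  (*(volatile uint16_t *)0x04000102)
#define REG_TM1CNT_L  (*(volatile uint16_t *)0x04000104)
#define REG_TM1CNT_H  (*(volatile uint16_t *)0x04000106)

#define TIMER_ENABLE   (1 << 7)
#define TIMER_IRQ      (1 << 6)
#define TIMER_CASCADE  (1 << 2)   /* count-up mode: tick on timer 0 overflow */

#define MIX_FREQ     16384
#define BUFFER_SIZE  512                        /* samples per IRQ            */
#define SND_DIVISOR  (16756991 / MIX_FREQ)      /* sound-channel timer period */

void setup_buffer_irq(void)
{
    /* Timer 0 overflows exactly once per output sample: twice the
       channel divisor, because the CPU timer runs at twice the rate. */
    REG_TM0CNT_L = (uint16_t)(0x10000 - 2 * SND_DIVISOR);
    REG_TM0CNT_H = TIMER_ENABLE;

    /* Timer 1 counts samples and raises an IRQ once per buffer; the
       handler then refills the half that has just been played.
       (Timer 1's interrupt must also be enabled in the ARM7 IE register.) */
    REG_TM1CNT_L = (uint16_t)(0x10000 - BUFFER_SIZE);
    REG_TM1CNT_H = TIMER_ENABLE | TIMER_CASCADE | TIMER_IRQ;
}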

#170279 - Pate - Mon Sep 14, 2009 5:41 pm

Ah, silly me, I restarted the timer for every buffer fill loop, so of course it slid out of sync.

I tried again with a constantly running timer, and now it seems (or rather sounds) like it is almost working. No$GBA still pops slightly about 3-4 times a second (which might just be the emulated Windows sound buffer looping rather than anything wrong with my code), but real hardware sounds completely different: the pure sine tone has turned into a distorted guitar sound, as if the volume overdrives the amplifier or something... Weird.

Anyways, the actual buffer sync problem seems to be solved; now I just need to improve my implementation of it and start working on fixing the problems in the emulation.

Thanks again for all your help!

Pate
_________________

#170280 - Ruben - Tue Sep 15, 2009 12:25 am

The 'popping' on no$ could be related to DMA transfers. I know it bugged me for a really long time before I finally figured out no$ doesn't like DMA transfers at all when playing sound.

#170294 - Pate - Wed Sep 16, 2009 6:21 am

Okay, the buffering seems to work rather fine now, thanks again for your help!

I posted a message on the ASM subforum requesting some optimization tips, as I can only manage 8 channels at 16384Hz with 256-sample buffers, not all 9 channels that I need.

If you have time to look at it and figure out any optimizations, please do! It is at http://forum.gbadev.org/viewtopic.php?t=16861

Thanks again!

Pate
_________________