
Confused with Root Counters

Posted: February 15th, 2018, 1:52 am
by Misscelan
Hi there,

I wanted to time something properly using the root counters but I'm having problems getting the results I want.
According to the documentation (Run-Time Library Overview):

-RCntCNT1 System clock
-RCntCNT2 System clock (8 cycles)

One tick is approximately equal to 0.03 microseconds when counting by the system clock. In the 8-cycle
mode, 1 tick equals 8 times .03 microseconds (approximately .24 microseconds).



Then I do this:

Code: Select all

#include <stdio.h>
#include <libetc.h>

int main(void)
{
	/* Set root counter 1 to a target of 65000 in interrupt mode, then start it */
	SetRCnt(RCntCNT1, 65000, RCntMdINTR);
	StartRCnt(RCntCNT1);

	while (1)
	{
		/* Poll the counter and report once it passes 15000 ticks */
		if (GetRCnt(RCntCNT1) >= 15000)
		{
			printf("Here is a second\n");
			ResetRCnt(RCntCNT1);
			StartRCnt(RCntCNT1);
		}
	}
}
And I get the printf in about a second. The thing is, 15000 ticks should take 0.00045 seconds (one tick is 3 * 10^-8 seconds).
To be clear, with 15000 the printf doesn't land on an exact second, but it's close to one. Also, I'm running this in NO$Cash, not sure if that's relevant.
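For reference, this is the back-of-the-envelope check behind that expectation (plain C, nothing PSX-specific; the 0.03 microsecond figure is taken from the documentation quoted above):

Code: Select all

#include <stdio.h>

int main(void)
{
	/* Expected duration of 15000 ticks at the documented system-clock rate */
	double us_per_tick = 0.03;                   /* microseconds per tick */
	double seconds = 15000 * us_per_tick / 1e6;  /* convert to seconds */
	printf("%.5f seconds\n", seconds);           /* prints 0.00045 */
	return 0;
}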

What am I doing wrong or what am I missing?

Thanks!

Re: Confused with Root Counters

Posted: February 15th, 2018, 9:14 pm
by Misscelan
OK, I think I got it...
Apparently RCntCNT1 is the HBLANK counter, which ticks every 63.56/64 microseconds (PAL/NTSC).
So a second is reached when GetRCnt(RCntCNT1) >= 15625 for NTSC or 15733 for PAL.

I find the documentation a bit confusing on this point, but maybe it's just me...
If anyone thinks I'm still not getting it right, please comment! :)
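
For what it's worth, here's a rough sketch of the loop I ended up with. It assumes the PsyQ libetc root counter API; picking the threshold with GetVideoMode()/MODE_NTSC, the free-running target of 0xFFFF and the polling mode RCntMdNOINTR are my own choices, not something taken from the documentation quote above:

Code: Select all

#include <stdio.h>
#include <libetc.h>

int main(void)
{
	/* HBlanks per second: 1s / 64us (NTSC) or 1s / 63.56us (PAL) */
	long ticks_per_second = (GetVideoMode() == MODE_NTSC) ? 15625 : 15733;

	/* Free-running counter, polled rather than interrupt-driven */
	SetRCnt(RCntCNT1, 0xFFFF, RCntMdNOINTR);
	StartRCnt(RCntCNT1);

	while (1)
	{
		if (GetRCnt(RCntCNT1) >= ticks_per_second)
		{
			printf("Here is a second\n");
			ResetRCnt(RCntCNT1);
			StartRCnt(RCntCNT1);
		}
	}
}

Polling is fine for a quick test like this; for anything accuracy-sensitive the counter's interrupt mode would probably be the better tool, but I haven't tried that here.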