
Allocating an Ordering Table with calloc3/malloc3

Posted: April 7th, 2020, 4:21 am
by Jaberwocky
Hey everyone,

I have a silly question again >o< Don't get mad at me!
For my "game" I prepared an OrderingTable for some elements by declaring globally:

Code: Select all

#define OT_LENGTH     10

GsOT     Wot[2];
GsOT_TAG zsorttable[2][1 << OT_LENGTH];
int      activeBuffer;
And setting them up with:

Code: Select all

// set OT resolution
Wot[0].length = OT_LENGTH;	
Wot[1].length = OT_LENGTH;

//set OT on OT handler(0)
Wot[0].org = zsorttable[0];	
Wot[1].org = zsorttable[1];
This all works fine! I'm happy x3 And the story could end there, buuuut! I wanted to be really cutting edge and allocate a dynamic OrderingTable using calloc3. So I prepared the heap with:

Code: Select all

extern u_char* const __heapbase;
extern unsigned long __heapsize;

inline void setupHeap()
{
	InitHeap3((unsigned long*)__heapbase, __heapsize);
}
And well... allocated a new GsOT and GsOT_TAG array - just like I did in global memory. Running my code, however, makes NO$PSX not happy at all! It replies with: "endless link-chain in gpu linked list dma" and the screen goes all fancy.

I played around with this a bit! Using the old calloc for allocation makes the code work. I figured out that calloc3 allocates near the end of the heap, while calloc allocates at the start of the heap. So I guess it has something to do with really high addresses? But since the GsOT and the GsOT_TAG array are both allocated via calloc3, they shouldn't be far apart from each other...

I feel like I'm doing something fundamentally wrong :/ Sorry again >o<

Re: Allocating an Ordering Table with calloc3/malloc3

Posted: April 7th, 2020, 2:54 pm
by Shadow
Make 'activeBuffer' volatile so compiler optimisation doesn't mess with it.
Your arguments to InitHeap3 are wrong because they are calculated incorrectly.

Do what I do to get the maximum size available for the heap.

Code: Select all

extern unsigned long _bss_objend;

void InitHeap()
{
	u_long stack = 0x801FFFF0;   // default stack top for this PS-EXE (as defined in SYSTEM.CNF)
	u_long _stacksize = 0x10000; // 64 KB (make this smaller for simple programs)
	u_long addr1, addr2;

	addr1 = stack - _stacksize;           // lowest address the stack may grow down to
	addr2 = addr1 - (u_long)&_bss_objend; // bytes free between the end of BSS and the stack

	//printf("Using the end BSS address %X for InitHeap3\n", (int)&_bss_objend);
	//printf("Reserving %d bytes for InitHeap3...\n", (int)addr2);

	InitHeap3(&_bss_objend, addr2);
}

Re: Allocating an Ordering Table with calloc3/malloc3

Posted: April 7th, 2020, 5:25 pm
by Jaberwocky
Sadly, your suggestions didn't fix the problem - I still have the same issue as before.

Also... using your code gave me the same base address for the heap as __heapbase did - the sizes were different, however...

Edit:
Please end me x.x"
The riddle is solved now - it was me all along! Before allocating the buffers for the GPU stuff, I also allocated a buffer to read a file from disk. Turns out that buffer was too small for the given file - so the file was happily read right across my OT buffer, overwriting the length field with a very high value...

I feel stupid xD