zzo38 wrote:
I invented another way (for converting signed 16-bit numbers into decimal), which however requires many more ROM tables and special pattern tables. I have not compared the speed.
However, in some cases you can just work in base 10 or base 100 for scoring anyway, and then you do not need to convert at all (although you can't use the 6502 decimal mode, so you have to implement it yourself instead). (It depends much on the program.)
I wound up doing it like this:
Code:
#include <stdio.h>

// Scores are stored as 7 ASCII digits, most significant first (no NUL).
static unsigned char score0[7] = "0000100";
static unsigned char score1[7] = "0000000";
static unsigned char i;
static unsigned char a;

/**
 * Add score0 to score1 (decimal, digit by digit, with carry).
 */
void add_points(void)
{
    a = 0; // clear carry
    // Convert both scores from ASCII digits to binary values.
    // Note: i starts at 7 so the ones digit at index 6 is included.
    for (i = 7; i-- > 0; )
    {
        score0[i] = score0[i] - '0';
        score1[i] = score1[i] - '0';
    }
    // Add each digit pair, least significant first, propagating the carry.
    for (i = 7; i-- > 0; )
    {
        score1[i] = score0[i] + score1[i] + a;
        a = (score1[i] > 9);
        if (a)
            score1[i] -= 10;
    }
    // Convert both scores back to ASCII.
    for (i = 7; i-- > 0; )
    {
        score0[i] = score0[i] + '0';
        score1[i] = score1[i] + '0';
    }
}

int main(void)
{
    for (int q = 0; q < 1000; ++q)
    {
        add_points();
        printf("%c%c%c%c%c%c%c\n",
               score1[0], score1[1], score1[2], score1[3],
               score1[4], score1[5], score1[6]);
    }
    return 0;
}
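zzo38's base-100 idea would pack two decimal digits into each byte, so the add loop only runs half as many times. Here is a rough, untested sketch of how I read it (the four-byte layout and the names are my own assumptions, not zzo38's code):
Code:
#include <stdio.h>

// Base-100 score: each byte holds 0-99, most significant first.
// Four bytes cover 00000000-99999999 (hypothetical layout).
static unsigned char b0[4] = {0, 0, 1, 0}; // 00 00 01 00 = 100 points
static unsigned char b1[4] = {0, 0, 0, 0};

// Add b0 to b1 in base 100, least significant byte first, with carry.
void add_points100(void)
{
    unsigned char i, c = 0;
    for (i = 4; i-- > 0; )
    {
        b1[i] = b0[i] + b1[i] + c;
        c = (b1[i] > 99);
        if (c)
            b1[i] -= 100;
    }
}

int main(void)
{
    add_points100();
    // Each base-100 "digit" prints as two decimal digits.
    printf("%02d%02d%02d%02d\n", b1[0], b1[1], b1[2], b1[3]);
    return 0;
}
The compare against 99 stands in for the decimal-mode carry, same as the compare against 9 in the base-10 version above, but only half the bytes need touching.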
Although, I am altering the ASCII conversions in add_points(), as my number tiles start at 0x01 now.
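In other words, only the offset in the two conversion loops changes; a tiny sketch (taking 0x01 as the tile index of the "0" glyph, which is just my assumption from the sentence above):
Code:
#include <stdio.h>

// Assumed tile layout: the "0" glyph is tile 0x01, so digits 0-9
// map to tiles 0x01-0x0A. Only the conversion offset changes.
#define TILE_ZERO 0x01

static unsigned char digit_from_tile(unsigned char t) { return t - TILE_ZERO; }
static unsigned char tile_from_digit(unsigned char d) { return d + TILE_ZERO; }

int main(void)
{
    unsigned char tile = tile_from_digit(7);
    printf("digit 7 -> tile 0x%02X -> digit %d\n",
           (unsigned)tile, digit_from_tile(tile));
    return 0;
}
So in add_points() the - '0' / + '0' pairs would become - TILE_ZERO / + TILE_ZERO.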
-Thom