Hi all, it's me again, Hundo!
So I find myself now on the advanced Nerdy Nights tutorial about horizontal scrolling, step 4 specifically, and there is a portion of the code that just doesn't make sense to me. I've worked through it by hand while following along in the FCEUX debugger, and I still can't see what is happening. Would anyone here care to shed some light on how this particular piece of code works? First I'll show the code, then I'll ask my questions about it.
LDA scroll ; calculate new column address using scroll register
LSR A
LSR A
LSR A ; shift right 3 times = divide by 8
STA columnLow ; $00 to $1F, screen is 32 tiles wide
LDA nametable ; calculate new column address using current nametable
EOR #$01 ; invert low bit, A = $00 or $01
ASL A ; shift up, A = $00 or $02
ASL A ; $00 or $04
ADC #$20 ; add high byte of nametable base address ($2000)
STA columnHigh ; now address = $20 or $24 for nametable 0 or 1
LDA columnNumber ; column number * 32 = column data offset
ASL A
ASL A
ASL A
ASL A
ASL A ; shift left 5 times = multiply by 32
STA sourceLow ; low byte of the column data offset
LDA columnNumber
AND #%11111000 ; keep the top 5 bits of the column number
LSR A
LSR A
LSR A ; shift right 3 times
STA sourceHigh ; high byte of the column data offset
LDA sourceLow ; column data start + offset = address to load column data from
Here we have the code that draws a new column of background data every time the scroll counter reaches a multiple of 8 (tiles are 8 pixels wide).
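For context, that routine only runs on those multiples of 8; the tutorial gates it with a check along these lines (I'm paraphrasing from memory, so the DrawNewColumn and SkipDraw names here are mine, not necessarily the tutorial's):

LDA scroll ; look at the current scroll value
AND #%00000111 ; keep only the low 3 bits
BNE SkipDraw ; any of them set means scroll is not a multiple of 8, so skip
JSR DrawNewColumn ; multiple of 8, draw the next off-screen column
SkipDraw: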
I'm pretty sure I understand the first part of the code, everything up through STA columnHigh. If I'm right, it calculates the PPU memory address where we will write the first byte of data for our new off-screen column. Is that correct? Pretty sure I am, but it's the next piece of code that I really don't understand.
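For example, just to check my own understanding: say scroll = 40 ($28) and nametable = 0 at the moment the draw happens. Then 40 / 8 = 5, so columnLow = $05, and 0 EOR $01 = $01, which the two ASL's turn into $04, plus $20 gives columnHigh = $24. So the column would start at PPU address $2405, tile column 5 of the other (off-screen) nametable. That's how I read it, anyway.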
It's the code that starts with "LDA columnNumber". Can anyone tell me exactly what this piece of code is doing? Honestly, the logic behind it is destroying my brain. In the code, Bunny loads a column number. Let's say the column number is 24, arbitrarily (it can be, since he's loading 128 columns of data in this example). He then multiplies that number by 32 with five ASL's and stores the result in the variable "sourceLow". But if you multiply 24 by 32, the decimal result is 768, and one byte can only hold values up to 255, so how can that value be placed into "sourceLow"? When I follow the sequence of ASL's in the FCEUX debugger, the results I'm expecting from the multiplication don't add up there either.
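To make it concrete, here's what the five ASL's do to 24 when I trace them:

24  = %00011000
ASL -> %00110000 (48)
ASL -> %01100000 (96)
ASL -> %11000000 (192)
ASL -> %10000000 (128, carry set, a bit just fell off the top)
ASL -> %00000000 (0, carry set again)

So sourceLow ends up as 0, nothing like the 768 I was expecting.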
Bunny also AND's the "columnNumber" value with #%11111000 on the way to "sourceHigh". If I understand the logic right here, any 8-bit number AND'ed with %11111000 can only come out as 24 or 00 (decimal). Why would he want one or the other of those two values as the basis for the high byte in "sourceHigh"?
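Using my example column again: 24 is %00011000, and %00011000 AND %11111000 = %00011000, which is still 24. So for column 24 the AND doesn't change anything at all, which only confuses me more about what it's there for.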
Any advice on this would be greatly appreciated. Thank you!