PostPosted: Sat Feb 16, 2013 6:29 am 

Joined: Tue Aug 07, 2012 12:27 pm
Posts: 59
Although Quietust has already done some exploration, I decided to do my own.

After a detailed study of the 2A03 circuit, the following results were obtained:
- No differences were found in the instruction decoder.
- The D flag works as expected: it can be set or cleared by the SED/CLD instructions, and it is handled in the normal way during interrupt processing (saved on the stack) and by the PHP/PLP and RTI instructions.
- The random logic responsible for generating the two control lines DAA (decimal addition adjust) and DSA (decimal subtraction adjust) works normally.

The difference is that the control lines DAA and DSA, which enable the decimal correction, are disconnected from the circuit by cutting five pieces of polysilicon (see picture; polysilicon is marked in purple, the missing pieces in cyan).

As a result, the decimal carry circuit and the decimal-correction adders do not work.
Therefore the 2A03's embedded processor always treats add/subtract operands as binary numbers, even if the D flag is set.
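
To show what this means for software, here is a rough C model of ADC in both cases. This is only a sketch of the visible behaviour, not of how the adder is actually wired; it assumes valid BCD operands and ignores the N/V/Z flag quirks of the NMOS part.

Code:
#include <stdint.h>
#include <stdio.h>

/* Simplified ADC: the binary add always happens; the decimal
   correction only happens when D is set AND the DAA line is
   still connected (i.e. on a stock 6502, not on a 2A03). */
static uint8_t adc(uint8_t a, uint8_t m, int c_in,
                   int d_flag, int daa_connected, int *c_out)
{
    unsigned t = a + m + c_in;                     /* binary add */

    if (d_flag && daa_connected) {
        if (((a & 0x0F) + (m & 0x0F) + c_in) > 0x09)
            t += 0x06;                             /* correct low BCD digit */
        if (t > 0x99)
            t += 0x60;                             /* correct high BCD digit */
    }
    *c_out = (t > 0xFF);                           /* carry out */
    return (uint8_t)t;
}

int main(void)
{
    int c;
    /* SED : CLC : LDA #$19 : ADC #$28 */
    printf("6502: $%02X\n", adc(0x19, 0x28, 0, 1, 1, &c));  /* prints $47 */
    printf("2A03: $%02X\n", adc(0x19, 0x28, 0, 1, 0, &c));  /* prints $41 */
    return 0;
}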

[Image: 2A03 die shot with the cut polysilicon pieces marked (clickable)]

PSD source : http://breaknes.com/files/APU/core.zip [155 MB]
Podcast (Russian) : http://youtu.be/Gmi1DgysGR0
6502 schematics : http://breaknes.com/files/6502/6502.jpg


PostPosted: Sat Feb 16, 2013 7:11 am 

Joined: Sat Feb 12, 2005 9:43 pm
Posts: 10899
Location: Rio de Janeiro - Brazil
Are you saying that the circuit is there, it's just not connected? That's pretty interesting... They simply had to make the CPU different (for legal reasons?) so they just "broke" a certain feature, instead of not implementing it at all... so weird!


PostPosted: Sat Feb 16, 2013 7:26 am 

Joined: Tue Aug 07, 2012 12:27 pm
Posts: 59
Yup, Nintendo "cracked" the 6502 to avoid patent payments.
Here is the patent: http://www.google.com/patents/US3991307
"Integrated circuit microprocessor with parallel binary adder having on-the-fly correction to provide decimal results"
So they only needed to cut out the decimal correction.


PostPosted: Sat Feb 16, 2013 7:36 am 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 20667
Location: NE Indiana, USA (NTSC)
And remember that at the time, only patents covered microprocessors. Copyright-like exclusive rights in integrated circuit topographies don't apply to ICs first sold before about 1990. Perhaps this is why the Super NES's CPU includes an authentic second-source 65816 core.


PostPosted: Tue Feb 19, 2013 4:22 am 

Joined: Fri Nov 12, 2004 2:49 pm
Posts: 7548
Location: Chexbres, VD, Switzerland
This actually makes complete sense. Placing transistors on a die is a difficult, complex, and painful job. Today it can be automated for digital circuits, but back in the '80s I'm not sure it could be. It makes sense that they would reuse a working die and just remove a few connections instead of having to redo a 6502 without the decimal mode.


PostPosted: Tue Feb 19, 2013 9:42 am 

Joined: Tue Feb 13, 2007 9:02 pm
Posts: 147
Location: Richmond, VA
Sneaky. So Commodore's engineers were correct:

Quote:
[Commodore 64 programmer] Robert Russell investigated the NES, along with one of the original 6502 engineers, Will Mathis. “I remember we had the chip designer of the 6502,” recalls Russell. “He scraped the [NES] chip down to the die and took pictures.”

The excavation amazed Russell. “The Nintendo core processor was a 6502 designed with the patented technology scraped off,” says Russell. “We actually skimmed off the top of the chip inside of it to see what it was, and it was exactly a 6502. We looked at where we had the patents and they had gone in and deleted the circuitry where our patents were.”


Quoted from Bagnall's On the Edge book.


PostPosted: Tue Feb 19, 2013 8:03 pm 

Joined: Fri Nov 19, 2004 7:35 pm
Posts: 4093
So how useful would decimal mode have really been?

_________________
Here come the fortune cookies! Here come the fortune cookies! They're wearing paper hats!


PostPosted: Tue Feb 19, 2013 8:41 pm 

Joined: Sat Feb 12, 2005 9:43 pm
Posts: 10899
Location: Rio de Janeiro - Brazil
Dwedit wrote:
So how useful would decimal mode have really been?

It could have made scores and other stats easier to manage... Can't think of anything else.

Since I learned assembly with the 2A03, I don't really miss the decimal mode. In games you might need an occasional BIN to DEC conversion, or addition and subtraction of decimal numbers, but those are things you can code routines for just once (or even use someone else's routines) and never think about it again.
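
For example, a write-once conversion routine is only a few lines. Here's a C sketch of the idea (on the NES you'd do the same thing in assembly with repeated subtraction or a small table):

Code:
#include <stdint.h>

/* Convert a 16-bit binary value (0-65535) into five decimal digits,
   most significant first, with leading zeros. */
static void bin_to_dec16(uint16_t value, uint8_t digits[5])
{
    static const uint16_t place[5] = { 10000, 1000, 100, 10, 1 };
    for (int i = 0; i < 5; i++) {
        uint8_t d = 0;
        while (value >= place[i]) {   /* repeated subtraction, same as the asm version */
            value -= place[i];
            d++;
        }
        digits[i] = d;                /* 0-9; add '0' or a tile offset for display */
    }
}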


PostPosted: Tue Feb 19, 2013 10:14 pm 

Joined: Thu Aug 12, 2010 3:43 am
Posts: 1589
Bregalad wrote:
This actually makes complete sense. Placing transistors on a die is a difficult, complex, and painful job. Today it can be automated for digital circuits, but back in the '80s I'm not sure it could be. It makes sense that they would reuse a working die and just remove a few connections instead of having to redo a 6502 without the decimal mode.

These days a lot is automated, but the result is far from optimal and still needs human intervention to fix the worst offenders; automation just avoids most of the work. It still takes a lot of effort to get done, especially with the complexity of current chips.


PostPosted: Wed Feb 20, 2013 8:24 am 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 20667
Location: NE Indiana, USA (NTSC)
tokumaru wrote:
In games you might need an occasional BIN to DEC conversion, or addition and subtraction of decimal numbers, but those are things you can code routines for just once (or even use someone else's routines) and never think about it again.

But they still have to be fast enough. ARMv4 (e.g. ARM7TDMI) doesn't have decimal mode or hardware divide. Someone on the gbadev board used to complain that the sprintf() call to convert binary numbers to decimal to draw the status bar every frame ate up a substantial portion of the available CPU time. And if you're storing both the binary version for calculation and the decimal version for display, why not just operate on the decimal version? That's what a lot of Atari 2600 game programmers tended to do, I'm told.


PostPosted: Wed Feb 20, 2013 9:23 am 
Formerly 65024U

Joined: Sat Mar 27, 2010 12:57 pm
Posts: 2263
IMO, that's just bad programming then. Keep an x-digit buffer in RAM, null terminated, and store the score in an array where every digit is a byte. It's not that hard to fix, even in C.
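
Something along these lines (just a sketch; the width and names are arbitrary, and I'm skipping the terminator and keeping raw digit values):

Code:
#include <stdint.h>

#define SCORE_DIGITS 6                 /* arbitrary width for this sketch */

/* One byte per digit, most significant first, each 0-9. */
static uint8_t score[SCORE_DIGITS];

/* Add points directly to the decimal digits: no binary copy,
   no conversion needed at display time. */
static void score_add(uint16_t points)
{
    for (int i = SCORE_DIGITS - 1; i >= 0 && points != 0; i--) {
        uint16_t d = score[i] + points % 10;
        points /= 10;
        if (d >= 10) {                 /* carry into the next digit */
            d -= 10;
            points++;
        }
        score[i] = (uint8_t)d;
    }
    /* overflow past the top digit is silently dropped */
}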


PostPosted: Wed Feb 20, 2013 11:32 am 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 20667
Location: NE Indiana, USA (NTSC)
3gengames wrote:
tepples wrote:
if you're storing both the binary version for calculation and the decimal version for display, why not just operate on the decimal version?

Keep an x-digit buffer in RAM, null terminated, and store the score in an array where every digit is a byte.

Fans of decimal mode might have called that a waste of memory.


PostPosted: Wed Feb 20, 2013 12:26 pm 

Joined: Sun Jan 22, 2012 12:03 pm
Posts: 6889
Location: Canada
I imagine the most effective use case for this is an accounting program where you are keeping track of a lot of numbers onscreen, and you want to keep the UI responsive.

Of course, there's the additional overhead when multiplying BCD, which might throw a wrench into that goal...


Anyhow, it's convenient to have. It's better than having to write extra software routines to do the same thing, but as has been pointed out, those aren't that hard to drop into your program anyway, so the benefit is pretty minimal. If the NES had it, it would have been used.


PostPosted: Wed Nov 23, 2016 11:48 am 

Joined: Mon Dec 29, 2014 1:46 pm
Posts: 817
Location: New York, NY
org wrote:
Yup, Nintendo "cracked" the 6502 to avoid patent payments.
Here is the patent: http://www.google.com/patents/US3991307
"Integrated circuit microprocessor with parallel binary adder having on-the-fly correction to provide decimal results"
So they only needed to cut out the decimal correction.


Even after excising the decimal mode circuitry, what about the rest of it? Why didn't they have to pay royalties for using the "integrated circuit microprocessor" modules?


PostPosted: Wed Nov 23, 2016 11:59 am 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 20667
Location: NE Indiana, USA (NTSC)
In 1983 there was no "mopyright" (mask work copyright).

The Famicom was made in the early 1980s, when copyright-like exclusive rights in mask works didn't exist yet. Until then, integrated circuit layouts were seen as too "utilitarian" to qualify for ordinary copyright. But by the release of the Super Famicom, the Treaty on Intellectual Property in Respect of Integrated Circuits (IPIC) of 1989 had been signed. So Nintendo licensed the 65816 from WDC.

