Is the GBA RGB output compatible with OSSC?

Discussion of programming and development for the original Game Boy and Game Boy Color.
Post Reply
alfredocalza
Posts: 67
Joined: Sat Jun 22, 2019 1:06 pm
Location: Madrid, Spain

Is the GBA RGB output compatible with OSSC?

Post by alfredocalza » Fri Nov 22, 2019 2:59 am

I want to find out if it is possible to get an RGB signal out of the GBA and connect it to my OSSC so that I can play it on my HDTV. I have read that the GBA resolution is 160 x 240 or 228 x 308 (I am not sure about this), which is not compatible with TVs, but my OSSC is capable of 2x, 3x, 4x and 5x line multiplication. Does anybody know if the OSSC would accept an RGB input from the GBA? If so, how can I get it out of the GBA? Is there also a sync signal that I would need to bring out?

Thank you!

lidnariq
Posts: 9510
Joined: Sun Apr 13, 2008 11:12 am
Location: Seattle

Re: Is the GBA RGB output compatible with OSSC?

Post by lidnariq » Fri Nov 22, 2019 11:34 am

alfredocalza wrote:
Fri Nov 22, 2019 2:59 am
I have read that the GBA resolution is 160 x 240 or 228 x 308 (I am not sure about this)
The GBA screen is 240 pixels by 160 scanlines.
The GBA scanline timing is 308 pixels (i.e. a new scanline of data is emitted after 308 pixels' time has passed) by 228 scanlines.
Does anybody know if the ossc would accept an RGB input from the GBA?
I have no idea. That's a question you should ask the OSSC developer: can it use a 13.6kHz hsync / 60Hz vsync (228 scanline) signal?

Certainly the hardware is capable of this, but that doesn't mean that the FPGA in it knows how to deal with it.
If so, how can I get it out of the GBA? is there also a sync signal which I would also need to get out of the GBA?
Buy a used TV de Advance. :p

But, yes, if you'd read the link I shared in the previous thread: the cable from the GBA CPU to the LCD carries 15 bits of digital parallel RGB555 video, plus some other signals. I don't see any that obviously correspond to hsync and vsync, but I'm confident they're there.
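For the curious, those rates fall straight out of the dot-clock arithmetic. A quick sketch (I'm assuming the usual 4.194 MHz dot clock, i.e. the GBA's 16.777 MHz CPU clock divided by 4):

```python
# Back-of-the-envelope check of the GBA's video timing.
# Assumption: dot clock = 16.777216 MHz CPU clock / 4.
CPU_HZ = 16_777_216
DOT_HZ = CPU_HZ / 4       # 4.194304 MHz pixel clock
H_TOTAL = 308             # pixel times per scanline, incl. blanking
V_TOTAL = 228             # scanlines per frame, incl. blanking

hsync_hz = DOT_HZ / H_TOTAL      # horizontal scan rate
vsync_hz = hsync_hz / V_TOTAL    # vertical refresh rate

print(f"hsync: {hsync_hz/1000:.2f} kHz")   # ~13.62 kHz
print(f"vsync: {vsync_hz:.2f} Hz")         # ~59.73 Hz
```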

alfredocalza
Posts: 67
Joined: Sat Jun 22, 2019 1:06 pm
Location: Madrid, Spain

Re: Is the GBA RGB output compatible with OSSC?

Post by alfredocalza » Sat Nov 23, 2019 10:41 am

Thanks a lot for the explanation! I still have many things to learn. I will try to ask about whether or not the OSSC will take the output from the GBA. I will ask the store I bought it from, since I can't find out anywhere who the developer is.

Thank you!

Alfredo

alfredocalza
Posts: 67
Joined: Sat Jun 22, 2019 1:06 pm
Location: Madrid, Spain

Re: Is the GBA RGB output compatible with OSSC?

Post by alfredocalza » Sun Nov 24, 2019 9:50 am

Hi, here is the reply from the OSSC retailer:

Hi Alfredo,

OSSC can accept such signals but being as it's a line multiplier it
can't do anything to improve their compatibility, so you will need a
display that can tolerate the off spec signal after line doubling.

Thanks,

Is there anything I can do to make the signal compatible with my HDTV?

lidnariq
Posts: 9510
Joined: Sun Apr 13, 2008 11:12 am
Location: Seattle

Re: Is the GBA RGB output compatible with OSSC?

Post by lidnariq » Sun Nov 24, 2019 11:00 am

alfredocalza wrote:
Sun Nov 24, 2019 9:50 am
Hi, here is the reply from the ossc retailer:
Hi Alfredo,

OSSC can accept such signals but being as it's a line multiplier it can't do anything to improve their compatibility, so you will need a display that can tolerate the off spec signal after line doubling.

Thanks,
Is there anything I can do to make the signal compatible with my HDTV?
If you can configure the OSSC to a 4x or 5x ratio, then you'll end up with timing that's close to:

240x160x4 in 308x228x4:
pretty close to standard 1440x900 scanline timing. Might work on a computer monitor, but I doubt HDTVs accept this mode.

240x160x5 in 308x228x5:
rather close to "CEA-861" 1920x1080 scanline timing. Best chance of working, but the GBA emits video at a vsync rate of 59.727Hz, and many HDTVs will reject video sources that aren't quite close to the standard 50, 59.94, or 60 Hz modes.
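To see why those two modes are the plausible ones, you can multiply the GBA's line rate out and compare against the standard modes. A rough sketch (arithmetic only; actual OSSC sampling settings will differ, and the target line rates are the well-known VESA DMT and CEA-861 figures):

```python
# Compare OSSC line-multiplied GBA timing against two standard modes.
gba_hsync_hz = 4_194_304 / 308   # ~13.618 kHz GBA source line rate

for mult, mode, target_hz in [
    (4, "VESA 1440x900@60", 55_935.0),  # DMT: 106.5 MHz / 1904 total pixels
    (5, "CEA-861 1080p60",  67_500.0),  # 148.5 MHz / 2200 total pixels
]:
    out_hz = gba_hsync_hz * mult
    print(f"{mult}x: {out_hz/1000:.2f} kHz vs {mode} ({target_hz/1000:.2f} kHz)")
```

4x lands around 54.47 kHz against 1440x900's 55.94 kHz; 5x lands around 68.09 kHz against 1080p's 67.5 kHz, which is why 5x is the best bet.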

alfredocalza
Posts: 67
Joined: Sat Jun 22, 2019 1:06 pm
Location: Madrid, Spain

Re: Is the GBA RGB output compatible with OSSC?

Post by alfredocalza » Sun Nov 24, 2019 3:52 pm

But that is weird, because I bought my HDTV in Europe and I can play both PAL and NTSC SNES consoles through my OSSC. (I have the SNES Mini, which I bought in the U.S., and a European 1CHIP-01 version which I bought in the UK and modded to play both NTSC and PAL cartridges.) These consoles should have different refresh rates as you said (50Hz and 60Hz), but it makes no difference when playing one or the other through my OSSC and my European TV. Does the refresh rate even matter with HDTVs? I thought HDTVs used a different technology? (Please excuse my ignorance!) Anyway, if I find out how to extract the RGB signal from my GBA I will give it a try to see if the TV will take it, but this next weekend I need to dedicate some time to figuring out the jail bar issue with one of my NES consoles!

lidnariq
Posts: 9510
Joined: Sun Apr 13, 2008 11:12 am
Location: Seattle

Re: Is the GBA RGB output compatible with OSSC?

Post by lidnariq » Sun Nov 24, 2019 5:36 pm

alfredocalza wrote:
Sun Nov 24, 2019 3:52 pm
These consoles should have different refresh rates as you said (50Hz and 60Hz), but it makes no difference when playing one or the other through my OSSC and my European TV.
Well... it's not clear that they do! A PAL SNES generates video at 50.007Hz, which is probably close enough to what the HDTV expects that it won't complain.

If your SNES had been modified to be switchable to 60Hz, that'd instead generate video at 59.55 Hz, and a real NTSC SNES would instead generate video at 60.09Hz, both of which some HDTVs complain about.
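Those figures fall out of the master-clock arithmetic. A sketch, assuming the usual 1364 master cycles per scanline and the nominal region clocks:

```python
# Where the SNES refresh-rate numbers come from.
# Assumptions: 1364 master cycles per scanline; nominal region master clocks.
NTSC_MASTER = 21_477_272.7   # ~21.477 MHz (NTSC SNES)
PAL_MASTER  = 21_281_370.0   # ~21.281 MHz (PAL SNES)
CYCLES_PER_LINE = 1364

ntsc  = NTSC_MASTER / (CYCLES_PER_LINE * 262)  # NTSC SNES: 262 scanlines
pal   = PAL_MASTER  / (CYCLES_PER_LINE * 312)  # PAL SNES: 312 scanlines
pal60 = PAL_MASTER  / (CYCLES_PER_LINE * 262)  # 60Hz-modded PAL: 262 scanlines

print(f"{ntsc:.2f} {pal:.3f} {pal60:.2f}")   # 60.10 50.007 59.55
```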
Does the refresh rate even matter with HDTVs? I thought HDTVs used a different technology?
Yes, unfortunately. Computer monitors have always had to deal with signals that differ from computer to computer, sometimes widely. Original VGA cabling relied on the assumption that the display didn't need to know exactly where the pixels were. When DVI was invented, it added a clock to mark where the pixels were, but the receiving device needs another clock running at ten times that rate to actually recover the pixel data. This pixel clock spans nearly a factor of 7 in speed (from 25 MHz all the way to 165 MHz for single-link DVI), and it's hard (more costly) to build a device that can accurately generate the faster clock across that whole range.

Many clock multipliers only support a range of about 1.5x from the slowest to the fastest supported output rate.

With HDTVs, the designers decided to get rid of that nonsense and support only a very limited set of pixel clocks. There's no need for a PLL that can range widely, only ones corresponding to a small set of standard rates, and by requiring that the source generate something that matches, they can manufacture simpler hardware.
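As a rough illustration of the clock arithmetic above (assuming single-link DVI's nominal 25-165 MHz pixel-clock range; TMDS encodes each 8-bit channel value into 10 bits, hence the 10x bit clock):

```python
# TMDS clock arithmetic for single-link DVI (assumed nominal range).
pix_min_mhz = 25.0
pix_max_mhz = 165.0           # single-link DVI upper limit

bit_min = pix_min_mhz * 10    # 250 MHz bit clock the receiver must recover
bit_max = pix_max_mhz * 10    # 1650 MHz
ratio = pix_max_mhz / pix_min_mhz   # ~6.6x range the receiver's PLL must cover

print(bit_min, bit_max, round(ratio, 1))
```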

HDTVs that did include a composite input often recorded the input into a framebuffer and played it back at a fixed rate the HDTV supported, which could produce stuttering, tearing, or dropped frames, depending on just how non-compliant the input video was.

Post Reply