It is currently Thu Apr 19, 2018 8:16 pm

All times are UTC - 7 hours





Post new topic Reply to topic  [ 24 posts ]  Go to page 1, 2  Next
Author Message
PostPosted: Wed Oct 26, 2016 6:33 am 

Joined: Mon Sep 15, 2014 4:35 pm
Posts: 3246
Location: Nacogdoches, Texas
I remember the discussion about creating an "old-style" custom video game console out of pre-existing parts, and how the discussion just about ended when it got to video hardware, because your options are severely limited. I've always thought about how older video game consoles would output analog video even though it originates as digital video. (If I'm not mistaken, this is how HDMI for the NES can work?) I looked at how the HDMI signal is encoded, and it appears it is really just 3 pins that carry 8-bit color values (RGB) in series. I then thought about how you could get 3 CPUs to handle each color channel, with their outputs converted to serial instead of parallel. They would have to be insanely fast though, because it appears the lowest resolution in the HDMI standard is 480p, and that's 640 x 480 x 60 = 18,432,000 pixels per second. So already, you'd need each CPU to be several times faster than 18MHz, and I've seldom seen an 8-bit CPU past 10. Yeah, this was a dumb idea. :lol:
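A quick check of the arithmetic above (note that 640 x 480 x 60 counts pixels per second; a single frame is only 640 x 480):

```python
# Back-of-the-envelope pixel-rate check for 480p over HDMI.
width, height, refresh_hz = 640, 480, 60

pixels_per_frame = width * height                   # pixels in one frame
pixels_per_second = pixels_per_frame * refresh_hz   # what a CPU must keep up with

print(pixels_per_frame)    # 307200
print(pixels_per_second)   # 18432000
```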


PostPosted: Wed Oct 26, 2016 7:19 am 

Joined: Wed Feb 13, 2008 9:10 am
Posts: 629
Location: Estonia, Rapla city (50 and 60Hz compatible :P)
Bitbanging the signal is never gonna work; it runs at hundreds of MHz to GHz speeds. There are dedicated serializer-deserializer chips that could potentially be used, though; then you can work at the pixel level rather than the signal level.

_________________
http://www.tmeeco.eu


PostPosted: Wed Oct 26, 2016 8:11 am 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 19919
Location: NE Indiana, USA (NTSC)
Even with a dedicated chip handling the DVI-D/HDMI physical interface, feeding the serializer fast enough to get 720p may prove difficult. You may need to put an upscaler on an FPGA the way kev did, and at that point, you might as well make the whole PPU and APU on the FPGA.


PostPosted: Wed Oct 26, 2016 10:22 am 

Joined: Sun Apr 13, 2008 11:12 am
Posts: 7001
Location: Seattle
The slowest that DVI (and HDMI) are allowed to send pixels is VGA resolution, with a pixel clock of 25.2MHz.

The TMDS that they both use spends 10 bit periods on every pixel ("8b10b"). This means that the slowest bit rate you'd need to send is 252Mbit/s per channel. I believe there's no microcontroller that could bit-bang this.

Lower resolutions are officially supposed to be constructed via pixel doubling. Some monitors (at least mine, anyway) also do the right thing when given a signal with extended blanking periods.

So ... Sure, I guess you could pull a Galaksija; use a parallel-digital-to-DVI sender, a 25MHz master clock and divide that by 4 to generate the CPU clock. Then the CPU "just" gets to bit bang a 160x480 image (guaranteed to work), or if the monitor cooperates maybe you could coax out 320x240.
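The link arithmetic above, worked through as a quick sketch (the divide-by-4 figure assumes, purely as an illustration, that the CPU emits at most one pixel value per cycle):

```python
# TMDS link arithmetic for the slowest legal DVI/HDMI mode (VGA timing).
pixel_clock_hz = 25_200_000       # 25.2 MHz pixel clock
bit_periods_per_pixel = 10        # TMDS: 10 bit periods per 8-bit value ("8b10b")

bit_clock_hz = pixel_clock_hz * bit_periods_per_pixel
print(bit_clock_hz)               # 252000000 -> 252 Mbit/s on each data channel

# Galaksija-style clocking: divide a 25 MHz master clock by 4 for the CPU,
# so each CPU cycle spans 4 pixel slots on a 640-pixel line.
cpu_clock_hz = 25_000_000 // 4
distinct_pixels_per_line = 640 // 4
print(cpu_clock_hz, distinct_pixels_per_line)   # 6250000 160
```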


PostPosted: Wed Oct 26, 2016 12:13 pm 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 19919
Location: NE Indiana, USA (NTSC)
lidnariq wrote:
Lower resolutions are officially supposed to be constructed via pixel doubling. Some monitors (at least mine, anyway) also do the right thing when given a signal with extended blanking periods.

Wouldn't they have to? VGA modes include 350- and 400-line text modes; CGA graphics modes 04h and 06h, EGA mode 0Dh, and MCGA mode 13h, all of which double 200 lines to 400; and 350-line EGA graphics modes 0Fh and 10h. The PC ordinarily boots into a 400-line text mode. Or what provision does DVI or HDMI make for 350- or 400-line modes?


PostPosted: Wed Oct 26, 2016 12:57 pm 

Joined: Sun Apr 13, 2008 11:12 am
Posts: 7001
Location: Seattle
The BIOS-based 320-pixel wide modes are pixel doubled in the graphics card, effectively emitting dots at 12.6MHz (but clocked twice). What I'm referring to is that I can emit 320x240-60 at 25MHz without pixel doubling, and my monitor will upscale it correctly. (As a result, the active time is a paltry 18%)
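For what it's worth, the 18% figure checks out (a quick sketch of the arithmetic):

```python
# Active-time fraction when 320x240@60 real pixels ride a 25.2 MHz link.
active_pixels_per_second = 320 * 240 * 60   # pixels actually drawn
link_pixel_slots_per_second = 25_200_000    # slots the 25.2 MHz pixel clock provides

active_fraction = active_pixels_per_second / link_pixel_slots_per_second
print(f"{active_fraction:.0%}")             # 18%
```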

I'm not certain whether graphics cards are required to support any legacy video modes any more. I know that high-resolution palettized modes are mostly gone.


PostPosted: Wed Oct 26, 2016 1:29 pm 

Joined: Sun May 27, 2012 8:43 pm
Posts: 1331
lidnariq wrote:
The slowest that DVI (and HDMI) are allowed to send pixels is VGA resolution, with a pixel clock of 25.2MHz.

The TMDS that they both use spends 10 bit periods on every pixel ("8b10b"). This means that the slowest bit rate you'd need to send is 252Mbit/s per channel. I believe there's no microcontroller that could bit-bang this.

Lower resolutions are officially supposed to be constructed via pixel doubling. Some monitors (at least mine, anyway) also do the right thing when given a signal with extended blanking periods.

So ... Sure, I guess you could pull a Galaksija; use a parallel-digital-to-DVI sender, a 25MHz master clock and divide that by 4 to generate the CPU clock. Then the CPU "just" gets to bit bang a 160x480 image (guaranteed to work), or if the monitor cooperates maybe you could coax out 320x240.


You might bit-bang it by abusing some level of timing tolerance on a DVI monitor, by using DMA to output chunks of 8b10 encoding, alongside a clock that stops sometimes. I doubt any monitor would be happy with this, though.


PostPosted: Wed Oct 26, 2016 3:39 pm 

Joined: Sun Apr 13, 2008 11:12 am
Posts: 7001
Location: Seattle
Given that TMDS requires a PLL to recover the bit clock from the pixel clock, I don't think there's any way to do that.

There are a few specific 8-bit brightness values that could be faked at half the bit clock (because of how TMDS works), but at least one of the control signals (idle, hsync, vsync, I forget) can't be sent in this way, so it's not really a useful shortcut.


PostPosted: Wed Oct 26, 2016 6:21 pm 

Joined: Mon Sep 15, 2014 4:35 pm
Posts: 3246
Location: Nacogdoches, Texas
mikejmoffitt wrote:
using DMA to output chunks of 8b10 encoding

I would have thought that you couldn't use DMA, because then you're just sending a chunk of data without modifying it in any way to actually create graphics. The plan would be for the CPU to never get bogged down enough that it misses writing a pixel. As you can tell, I have no idea what I'm talking about. :lol:

lidnariq wrote:
I can emit 320x240-60 at 25MHz without pixel doubling, and my monitor will upscale it correctly.

Is there a difference between a black pixel and a "blank" pixel? Or wait, does this have to do with one of the flags you were talking about? I'd have thought you'd get a 640-pixel-wide image with every other pixel black.

lidnariq wrote:
I guess you could pull a Galaksija; use a parallel-digital-to-DVI sender

Seems like what I had in mind. I'm surprised that this was ever done, especially with a 3MHz Z80. :shock:

lidnariq wrote:
a 25MHz master clock and divide that by 4 to generate the CPU clock.

Assuming you can output a pixel per cycle? :lol:

I found something interesting though, and that's the eZ80: https://en.wikipedia.org/wiki/Zilog_eZ8 ... ts_Used_In The reason I say it's interesting is that it has a 24-bit ALU, making it perfect for 24-bit color. It's also pretty damn fast (50MHz, which is perfect in this case; according to Wikipedia, it's also 4x as fast per cycle as the Z80), although actually getting it seems like a challenge, as it looks impossible to get it by itself: http://www.zilog.com/index.php?option=c ... &Itemid=57

lidnariq wrote:
a 160x480 image (guaranteed to work), or if the monitor cooperates maybe you could coax out 320x240.

What do you mean by "monitor cooperates"? It seems like it would never have a clue what to do here.


Last edited by Espozo on Wed Oct 26, 2016 8:23 pm, edited 1 time in total.

PostPosted: Wed Oct 26, 2016 7:29 pm 

Joined: Sun Apr 13, 2008 11:12 am
Posts: 7001
Location: Seattle
You can always get 160x480 (with pixel quadrupling), because that looks to the monitor like 640x480. (The CPU knows that it's lower resolution, but the monitor doesn't).

However, getting the underscanned mode to work instead requires the receiver to say "I'm only being given a little tiny picture; I should take that small handful of pixels and use them", given that it's not a standard resolution. And the CPU still has to manage sending data at 25Mpixel/second during those brief bursts.

It's plausible that this is generally supported because DVI and HDMI send a different signal (unlike VGA or NTSC) for "active region but black" vs "margins". But I only have my PC monitors as an HDMI receiver, so I can't test on anything else. And TVs are known to be much much pickier than monitors.


PostPosted: Wed Oct 26, 2016 8:20 pm 

Joined: Mon Sep 15, 2014 4:35 pm
Posts: 3246
Location: Nacogdoches, Texas
lidnariq wrote:
DVI and HDMI send a different signal (unlike VGA or NTSC) for "active region but black" vs "margins"

Oh. So what you'd do is set the resolution to 640x480, have a pixel drawn only every 4 pixels, and have the "next row" signal (or something like that) asserted only every other row of pixels?


PostPosted: Thu Oct 27, 2016 5:48 pm 

Joined: Mon Sep 15, 2014 4:35 pm
Posts: 3246
Location: Nacogdoches, Texas
I knew this was a dumb idea from the start, and I never really had any intention of doing such a thing in the first place (it would have been cool, though), but it made me think about how ill-suited a standard CPU is for taking on the role of a dedicated video chip. It's a lot more flexible, but also way slower, because cycles are wasted fetching instruction data and most of the instructions are ill-suited for this, forcing you to use more of them. It's actually easier to add 2 graphics layers together than it is to do standard overlapping, because for overlapping you first have to check whether the pixel being applied is transparent, and then decide whether or not it overwrites the pixel of the layer behind it. The other thing is that you would need the CPU to jump back for every pixel, unless you want the largest unrolled loop ever.
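A minimal sketch of the per-pixel work being described, assuming two indexed layers where value 0 means transparent (all names hypothetical, not from any real chip):

```python
def composite_priority(front, back):
    """Standard overlap: per pixel, test transparency, then pick a layer."""
    out = []
    for f, b in zip(front, back):
        out.append(b if f == 0 else f)  # a branch per pixel: costly on a CPU
    return out

def composite_additive(front, back, max_val=255):
    """Additive blend: no transparency test, just a clamped add per pixel."""
    return [min(f + b, max_val) for f, b in zip(front, back)]

print(composite_priority([0, 5, 0], [9, 9, 9]))       # [9, 5, 9]
print(composite_additive([10, 250, 0], [20, 20, 9]))  # [30, 255, 9]
```

The additive version avoids the per-pixel transparency branch entirely, which is the point being made: blending is cheaper than priority overlap on a general-purpose CPU.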


PostPosted: Thu Oct 27, 2016 7:13 pm 

Joined: Sun Apr 13, 2008 11:12 am
Posts: 7001
Location: Seattle
Espozo wrote:
Oh. So what you'd do is set the resolution to 640x480, have a pixel drawn only every 4 pixels, and have the "next row" signal (or something like that) asserted only every other row of pixels?
Effectively transmitting 1280x240? Theoretically that's what 480i-over-HDMI is supposed to look like.


PostPosted: Thu Oct 27, 2016 8:34 pm 

Joined: Mon Sep 15, 2014 4:35 pm
Posts: 3246
Location: Nacogdoches, Texas
Yeah, and 1280 / 4 = 320, so unless I'm mistaken, it's definitely not impossible to "bit bang" this. While there's still no way I'm actually making something like this (I mostly wanted to see if it could be done, which it seems it can, at least for the actual display part), I found out that the eZ80 actually only has an 8-bit data bus, so it couldn't even work in this case. If you could use any moderately old CPU in the world for this though, I found something called the Motorola 56000, which actually has both a 24-bit ALU and a 24-bit data bus. Additionally, it's a digital signal processor, whatever that really means (I'd hope it means better suited for this, considering you are creating a digital signal): https://en.wikipedia.org/wiki/Motorola_56000


PostPosted: Thu Oct 27, 2016 9:06 pm 

Joined: Sun Apr 13, 2008 11:12 am
Posts: 7001
Location: Seattle
Yeah, but that's kinda misleading. Once you have the parallel-digital-RGB888-to-HDMI converter IC, the rest is "just" the same as analog component video generation... albeit needing a master clock at 25MHz or higher that you then use for whatever else.

Those ICs aren't inexpensive. You're better off just using an iCE40 HX series FPGA, which is just barely fast enough to directly emit 252Mbit/sec TMDS, and can also contain more logic at the same time.


Powered by phpBB® Forum Software © phpBB Group