NewRisingSun wrote:
> Any idea about the console and TV combination resulting in what I have observed?

Not really... but if you can dig up a very long cable (at least 10 m) to run from the Famicoms to the TV, that would be informative. At that length, the spacing between reflections becomes comparable to a pixel width, so the reflections should show up as visible luma artifacts rather than as a disturbance to chrominance.
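To put a number on "comparable to a pixel width": a short sketch of the round-trip reflection delay on a 10 m cable versus the Famicom's pixel period. The 0.66 velocity factor is an assumption (typical for cheap coax; actual cables vary).

```python
# Sketch: how far apart are cable reflections at 10 m, in Famicom pixels?
C = 299_792_458.0          # speed of light, m/s
VELOCITY_FACTOR = 0.66     # assumed, typical for RG-59-style coax
CABLE_LENGTH_M = 10.0

# A reflection travels to the far end and back before re-arriving.
round_trip_s = 2 * CABLE_LENGTH_M / (C * VELOCITY_FACTOR)

# NES/Famicom pixel rate: 21.477272 MHz master clock divided by 4.
pixel_period_s = 4 / 21_477_272.0

print(f"round-trip delay: {round_trip_s * 1e9:.0f} ns")    # ~101 ns
print(f"pixel period:     {pixel_period_s * 1e9:.0f} ns")  # ~186 ns
print(f"reflection spacing ~ {round_trip_s / pixel_period_s:.2f} pixels")
```

So at 10 m the echo lands about half a pixel behind the original, right at the edge of visibility as a luma ghost; a longer cable pushes it past a full pixel.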
You could also try using a cable with a deliberately wrong impedance instead. CVBS almost always runs over 75 Ω cable, whereas the twisted pair used in Ethernet is about 100 Ω.
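A rough idea of how strong a reflection that mismatch produces, using the standard voltage reflection coefficient at an impedance boundary (the 100 Ω / 75 Ω values are the ones discussed above):

```python
# Sketch: reflection coefficient where a line of impedance z_line
# meets a termination of impedance z_term.
def reflection_coefficient(z_line: float, z_term: float) -> float:
    """Gamma = (Z_term - Z_line) / (Z_term + Z_line)."""
    return (z_term - z_line) / (z_term + z_line)

# 100-ohm Ethernet-style twisted pair into a 75-ohm TV input:
gamma = reflection_coefficient(100.0, 75.0)
print(f"Gamma = {gamma:+.3f}")   # about -0.143: ~14% of the wave reflects
```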
NewRisingSun wrote:
> Reflections themselves may not cause differential phase distortion. But different reflections can cause differently-spaced comb filtering, which will cause different cancellations of original signal components, in turn causing different changes in amplitude, causing different phase distortion in any powered stage on the part of the TV set that displays the signal.

Sure. Reflections could tickle clipping behavior (and thus phase distortion) at this stage, but...
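The comb filtering mentioned in the quote can be sketched directly: a single reflection of relative amplitude a arriving tau seconds late gives the response |H(f)| = |1 + a·exp(-j·2π·f·tau)|, with nulls at odd multiples of 1/(2·tau). The 14% amplitude and 100 ns spacing below are hypothetical, chosen to match the mismatch figures discussed earlier.

```python
import cmath
import math

def comb_magnitude(f_hz: float, a: float, tau_s: float) -> float:
    """|H(f)| for direct signal plus one echo of amplitude a, delay tau."""
    return abs(1 + a * cmath.exp(-2j * math.pi * f_hz * tau_s))

a, tau = 0.14, 100e-9   # hypothetical: 14% reflection, 100 ns echo spacing
for f_mhz in (0.0, 2.5, 3.58, 5.0):   # 5 MHz is the first null for tau=100 ns
    print(f"{f_mhz:5.2f} MHz: |H| = {comb_magnitude(f_mhz * 1e6, a, tau):.3f}")
```

Different cable lengths shift the null positions, so the amplitude reaching the NTSC subcarrier (3.58 MHz) changes from setup to setup, which is exactly the mechanism for setup-dependent phase distortion in a later nonlinear stage.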
> I don't know how to measure the impedance of any input or output, but from what I have read, it's not as simple as holding an Ohmmeter to it.

For something like a TV, where we really do expect a properly impedance-matched input, we don't usually need to worry about reactances or nonlinear stages. The input is almost always a high-input-impedance amplifier in parallel with the proper termination, so measuring the input impedance really is just measuring a resistance.
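If you did want to measure an input without trusting an ohmmeter, the usual trick is a series-resistor divider: drive the input through a known resistor and compare the source voltage to the voltage at the input pin. A sketch, with purely hypothetical measurement values:

```python
# Sketch: input impedance from the divider V_in = V_src * Z_in / (Z_in + R_s).
def input_impedance(v_source: float, v_input: float, r_series: float) -> float:
    """Solve the divider equation for Z_in."""
    return r_series * v_input / (v_source - v_input)

# e.g. a 1.00 V source sags to 0.50 V at the pin through a 75-ohm
# series resistor (hypothetical numbers) -> Z_in = 75 ohms:
print(f"Z_in ~ {input_impedance(1.00, 0.50, 75.0):.1f} ohms")
```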
Measuring the output impedance is more complicated, especially since we're anticipating that the impedance changes as a function of voltage. I can't think of a good way to do that without an oscilloscope.
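With a scope, one common approach is to compare the open-circuit amplitude to the amplitude into a known load; the drop gives the source impedance. A sketch with hypothetical readings (repeating it at several signal levels is what would expose the voltage-dependent impedance suspected above):

```python
# Sketch: output impedance from loading, Z_out = R_load * (V_open/V_loaded - 1).
def output_impedance(v_open: float, v_loaded: float, r_load: float) -> float:
    """Source impedance inferred from how much a known load sags the output."""
    return r_load * (v_open / v_loaded - 1)

# e.g. 2.0 V unloaded drops to 1.0 V into 75 ohms (hypothetical readings):
print(f"Z_out ~ {output_impedance(2.0, 1.0, 75.0):.1f} ohms")
# Repeating at several DC levels would reveal impedance varying with voltage.
```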