Bob, since the video camera scans both fields from the same frame of
film, there is no "Difference in Time".
I realize now that it may not have been clear that I'd switched
contexts in the above, from film-originated material to interlacing
in general (and more specifically, the sort which originates AS
"video", i.e., what comes out of a standard TV camera).
As far as slight differences, the persistence of the human eye tends
to average out the minor differences.
Yes, but that's exactly where the "motion problem" in
interlaced material (again, that which originates that way,
from a camera) comes into play. You DO average out
successive fields, but since they're being presented in an
interlaced manner (at least in a properly-adjusted receiver
or monitor), the effect is a blurring of details. This is one
of the reasons that an interlaced system doesn't actually
deliver the resolution one would assume from the scan
format. For instance, the 525/60 scanning standard (as
used in North America) provides a bit over 480 "active"
lines per frame, but delivers only about 340 lines' worth
of effective vertical resolution. (Another way this is often
expressed is to say that the standard assumes a "Kell
factor" of 0.7; 0.7 times 484 lines is 338.8 lines.)
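The Kell-factor arithmetic above can be sketched as a quick calculation (the 0.7 factor and 484 active lines are the figures quoted above; the function name is just for illustration):

```python
# Effective vertical resolution of an interlaced system, per the
# Kell-factor rule of thumb: active lines discounted by ~0.7.

def effective_resolution(active_lines: int, kell_factor: float = 0.7) -> float:
    """Return the effective vertical resolution in TV lines."""
    return active_lines * kell_factor

# 525/60 (North America): about 484 active lines per frame
print(round(effective_resolution(484), 1))  # -> 338.8 (about 340 lines)
```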
Another contributing factor to this effect, at least for CRT-
based displays (which is all there was, of course, when the
standard was written) is that a CRT running interlaced
can't be focused to the point where the individual lines are
fully resolved (to do so would result in horrible line
"twitter" owing to the 30 Hz refresh rate for any single
line).
You may not like the "Frame" concept, but I find "Progressive Scan"
to be stupid. Progressive? They went back to the earliest video scan
method, and have the nerve to call it "Progressive"?
Well, "progressive" of course doesn't refer to it being a
more advanced method. It IS, if you have the bandwidth, a
better way to scan. Interlaced scanning really was adopted
in the first place only because it's a crude-but-effective form
of analog "compression," permitting a higher image resolution
than otherwise would be the case in the available bandwidth.
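To make the "compression" point concrete, here is a back-of-the-envelope comparison (the 525-line and 60 Hz figures are from the standard discussed above; the exact NTSC rates involve a 1.001 divisor, but whole numbers suffice here):

```python
# Interlacing halves the line rate needed for a given flicker rate:
# 60 fields/s, each carrying half the frame's lines, versus a
# progressive scan delivering all 525 lines 60 times per second.

LINES_PER_FRAME = 525
REFRESH_HZ = 60  # field rate (interlaced) or frame rate (progressive)

interlaced_line_rate = (LINES_PER_FRAME / 2) * REFRESH_HZ   # lines/s
progressive_line_rate = LINES_PER_FRAME * REFRESH_HZ        # lines/s

print(interlaced_line_rate)   # 15750.0 lines/s (the familiar 15.75 kHz)
print(progressive_line_rate)  # 31500 lines/s -- double the bandwidth
```

Roughly double the line rate means roughly double the video bandwidth for the same flicker rate and line count, which is exactly the saving interlacing was adopted to capture.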
It surely doesn't benefit the system in any other way (in fact, it is
somewhat harmful in terms of image quality), and you wouldn't go to all
the trouble involved in an interlaced system (the half-lines at the
end of the fields, the "equalization" pulses, the need to adjust the
relative field positions at the display end, etc., etc., etc.) unless
it bought you something.
Bob M.