Apple Pro Training Series: Optimizing Your Final Cut Pro System: A Technical Guide to Real-World Post-Production [Electronic resources]



Sean Cullen, Matthew Geller, Charles Roberts, Adam Wilt, Nancy Peterson



Color Television


Monochrome television involves a single signal: brightness, or luma, usually given the symbol Y. (Although more properly it's Y', a gamma-corrected value.) Color TV requires three times the information: red, green, and blue (RGB). Unfortunately, color TV was supposed to be broadcast using the same spectrum as monochrome, using the same bandwidth, so having three times the information was a bit of a problem.

Engineers at RCA developed a truly brilliant solution. They realized that the frequency spectrum of a television signal had gaps in it largely unoccupied by picture content. They encoded color information in a sine wave called a subcarrier and added it to the existing luma signal; the subcarrier's frequency fit within one of the empty gaps in monochrome's spectrum, so the luma information was largely unaffected by the new chroma information, and vice versa. The resulting signal remained compatible with existing monochrome receivers and fit in the same broadcast channels: they had managed to squeeze the quart of color information into the pint pot of existing monochrome broadcasting.

In this composite color system, the TV recovers color by comparing the amplitude and phase of the modulated subcarrier with an unmodulated reference signal. Differences in phase (the angle you can see on a vectorscope) determine the hue of the color; the amplitude (the distance of a color from the center of the vectorscope's display) indicates its saturation. The reference signal is transmitted on every scanline; it's the colorburst signal that you can see between the horizontal sync pulse and the left edge of the active picture.
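The vectorscope geometry described above can be sketched numerically. This is an illustrative sketch only (the function name and values are mine, not FCP's or the NTSC specification's): hue is the phase angle of the demodulated color-difference pair, and saturation is its amplitude.

```python
import math

def decode_chroma(u, v):
    """Recover hue and saturation from a demodulated color-difference
    pair, the way a vectorscope displays them: phase angle = hue,
    distance from the center of the display = saturation."""
    hue_deg = math.degrees(math.atan2(v, u)) % 360
    saturation = math.hypot(u, v)
    return hue_deg, saturation

# Equal U and V components plot at 45 degrees on the vectorscope:
hue, sat = decode_chroma(0.3, 0.3)
```

This is why timebase errors ruin composite color: any jitter that shifts the subcarrier's apparent phase rotates every color around the vectorscope's center.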

The picture monitor shows horizontal sync, blanking, and colorburst using its pulse-cross function; the waveform monitor shows the same information in the analog signal itself.

To fix saturation, use the Sat control in the Color Corrector 3-way (Video Filters > Color Correction > Color Corrector 3-way) for real-time correction. You can also use the Sat control in the one-way Color Corrector, the Chroma control in the Proc Amp, or the Desaturate filter (Video Filters > Image Control > Desaturate), which lets you increase or decrease saturation.

Additionally, the carriage of color on a subcarrier limits the detail you can resolve in the color; sharp transitions in color show up as dot crawl instead of a crisp edge. NTSC and PAL limit chroma resolution to under a quarter of luma resolution. Fortunately, the human eye is much less sensitive to color details than brightness details; NTSC and PAL were designed with these limitations in mind.

Finally, adding color to the earlier monochrome standard caused interference with the sound signal when sound and picture were modulated for broadcast. To avoid interference, the picture frequencies (frame rate as well as color subcarrier) were slowed by one part in a thousand. The resulting broadcasts could still be played on existing monochrome receivers with no problem, but that one part in a thousand slowed the field rate from 60 Hz to 59.94 Hz, the frame rate dropped to 29.97 Hz, and drop-frame timecode had to be used (when, later on, timecode was invented) to keep NTSC's times in sync with wall clocks, a very important consideration for broadcasters.
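The bookkeeping that drop-frame timecode performs can be sketched in a few lines. This is a hedged illustration of the standard NTSC rule (skip frame labels 00 and 01 at the start of every minute, except minutes divisible by ten); the function name is mine.

```python
def frames_to_dropframe(frame_number):
    """Convert a frame count at 29.97 fps into drop-frame timecode.
    Frame numbers 00 and 01 are skipped at the start of every minute,
    except for minutes divisible by ten (the standard NTSC rule)."""
    fps = 30                           # nominal rate; actual rate is 30/1.001
    per_min = 60 * fps - 2             # 1798 frames in a "short" minute
    per_10min = 9 * per_min + 60 * fps # 17982 frames per ten-minute block
    blocks, rem = divmod(frame_number, per_10min)
    if rem >= 60 * fps:                # past the first, full minute of the block
        extra = (rem - 60 * fps) // per_min + 1
    else:
        extra = 0
    adjusted = frame_number + 2 * (9 * blocks + extra)
    ff = adjusted % fps
    ss = (adjusted // fps) % 60
    mm = (adjusted // (fps * 60)) % 60
    hh = adjusted // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

# frames_to_dropframe(1800) -> "00:01:00;02": labels ;00 and ;01 were dropped
```

Note that no frames of video are ever discarded; only the labels skip ahead, so the running timecode tracks the wall clock.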


Color Recording


Broadcasting color is one thing. Recording it on tape, with all the instabilities and variability of electromechanical systems, is something else altogether. Many different approaches have been used, with differing benefits and side effects.


Direct Color

In the early days of color, the composite signal was recorded directly on tape, using 1-inch VTRs. Because the color is carried in the phase and amplitude of a high-frequency signal (3.58 MHz in most NTSC systems, 4.43 MHz in PAL), any timebase errors (minor variations in playback speed that change the apparent phase of the color signal) result in ruined color. Playing back direct color recordings requires timebase correction, using memory buffers to smooth out mechanically induced jitter. Timebase correctors (TBCs) for direct color playback were very expensive and were essentially specific to the format and type of machine being used.


Color-Under

Heterodyne or color-under recording developed as a way to record and play back color without the need for the expensive TBC. Inside the VTR, the chroma signal is "heterodyned down" to a much lower frequency and recorded "under" the luma signal. The magic of heterodyning is that when the playback signal is "heterodyned up" again, most of the timebase errors cancel themselves out, which reproduces usable color. Color-under is used in ¾-inch, VHS, Video8, S-VHS, and Hi8 formats, among others.
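The "magic" of heterodyning is ordinary trigonometry: multiplying two sine waves produces exactly the sum and difference frequencies, so the chroma can be slid down to a low color-under frequency and slid back up on playback. A minimal numerical sketch (the frequencies are NTSC-ish illustrations and the function names are mine):

```python
import math

def mix(f_signal, f_lo, t):
    """Heterodyning is just multiplication of two waveforms."""
    return math.sin(2 * math.pi * f_signal * t) * math.sin(2 * math.pi * f_lo * t)

def sum_and_difference(f_signal, f_lo, t):
    """The product identity: sin(a)*sin(b) = 0.5*cos(a-b) - 0.5*cos(a+b).
    Only the difference and sum frequencies are present in the mix."""
    return (0.5 * math.cos(2 * math.pi * (f_lo - f_signal) * t)
            - 0.5 * math.cos(2 * math.pi * (f_lo + f_signal) * t))

# 3.58 MHz NTSC chroma against a local oscillator; a low-pass filter then
# keeps only the low difference term for recording "under" the luma.
f_c = 3.58e6
f_lo = 3.58e6 + 629e3   # places the difference term around 629 kHz
```

On playback the low-frequency chroma is mixed with the same oscillator to regenerate 3.58 MHz; speed-induced phase errors enter the down- and up-conversions with opposite sign, which is why most timebase error cancels.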

TBC-free color comes at a price: the chroma resolution of the color-under signal is reduced, and the precise timebase of the color subcarrier becomes muddled. The muddled subcarrier makes it impossible to separate the luma and chroma information as accurately as with direct color, so color-under recordings are harder to process in downstream equipment such as proc amps, TBCs, and frame synchronizers, and almost invariably suffer from diminution of high-frequency details.


Component

In the mid-1980s, Sony's Betacam and Panasonic's MII formats (the latter descended from the Hawkeye/Recam/M format) introduced component recording. A major problem with composite video is that once the luma and chroma are mixed together, it's pretty much impossible to recover the separate luma and chroma signals completely intact. Although this is irrelevant for final broadcast, since analog broadcasting uses composite color, it makes manipulation of the image in post-production difficult, and dubbing composite color signals across multiple generations, even with TBCs, results in considerable quality loss.

Instead of subcarrier-modulating the signal, Betacam and MII record the luma component and two color-difference components separately, in different parts of the video track. Because the color signal is never modulated on a phase-critical subcarrier, it isn't subject to hue-distorting timebase errors or the resolution limitations imposed by subcarrier modulation.

Color-difference components fall under the general nomenclature of "YUV", although there are several variants (Y'UV, Y'/R-Y/B-Y, Y'PrPb, Y'CrCb) depending on the exact format or signal connection being used. There's nothing mysterious about YUV color; it's a simple transformation of RGB signals by matrix multiplication. YUV signals offer several advantages over RGB in the rough-and-tumble world of analog recording and transmission:
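That matrix transformation can be written out directly. The sketch below uses the Rec. 601 luma weights and the classic U/V scale factors; exact coefficients differ slightly between the named variants, so treat it as illustrative rather than definitive:

```python
def rgb_to_yuv(r, g, b):
    """Matrix transform from gamma-corrected R'G'B' to Y'UV.
    Luma is a weighted sum of the three channels; U and V are scaled
    B-Y and R-Y color differences."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # Rec. 601 luma weights
    u = 0.492 * (b - y)                     # scaled B-Y
    v = 0.877 * (r - y)                     # scaled R-Y
    return y, u, v

# A neutral grey carries no color-difference signal at all:
y, u, v = rgb_to_yuv(0.5, 0.5, 0.5)
```

Because R = G = B for any neutral, both differences collapse to zero, which is exactly why monochrome content rides entirely in the Y' channel.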

Gain imbalances between RGB signals show up as very noticeable color casts in the reproduced picture. The same degree of imbalance between YUV signals appears as a slight change in overall brightness or a change in saturation of some colors. To see what gain imbalances do to images, follow these steps:


1. Exit FCP if it's running.

2. Install AJW's filters from the DVD included with this book: Drag the folder AJW's Filters (Media > Lesson 07) into your Mac's Plugins folder (Macintosh HD > Library > Application Support > Final Cut Pro System Support > Plugins, where "Macintosh HD" is the name of your system disk).

If you're working on a shared system and don't have the permissions to drop the folder in the prescribed location, you can install it just for yourself in the Plugins folder in your home directory (YourHomeDirectory > Library > Preferences > Final Cut Pro User Data > Plugins). (FCP 1 through FCP 3 used different script locations; look in your User Manual for script installation instructions.)

If the filters don't appear when you run FCP, make sure the AJW's Filters folder and its contents are both readable and writable. Change the permissions as necessary, and the filters should show up inside FCP.

3. Start FCP and load the DVStressTest3.tif clip into the Viewer.

4. Put the clip into a 720 x 480-pixel Timeline and set the playhead so that the clip shows up in the Canvas. Double-click the clip in the Timeline to load it in the Viewer, then select the Viewer's Filters tab.

5. Apply the Effects > Video Filters > AJW's Filters > Channel Balance [ajw] filter to the clip.

6. Set the Green / Cb Gain slider to 70%, simulating an amplitude imbalance.

7. Change the Color Space setting from RGB to YCrCb (YUV) and back again while looking at the results in the Canvas.

8. Play with other Gain settings for the various channels and watch what happens in RGB mode and in YUV mode.
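What the exercise shows on screen can also be seen in the arithmetic. A hypothetical sketch (the sample color and function name are mine): a 70% gain error on green applied in RGB drags luma down noticeably and tints the picture, while the same error applied to a color-difference channel in YUV cannot touch luma at all, because Y' travels as a separate signal.

```python
def luma(r, g, b):
    """Rec. 601 luma from gamma-corrected RGB."""
    return 0.299 * r + 0.587 * g + 0.114 * b

r, g, b = 0.8, 0.4, 0.2            # an arbitrary warm mid-tone
y_original = luma(r, g, b)

# A 70% gain error on the green channel, applied in RGB, shifts luma
# noticeably (and tints the image toward magenta):
y_rgb_error = luma(r, 0.7 * g, b)

# The same 70% gain error applied to the U (B-Y) channel in YUV alters
# only the chroma; Y' is carried separately, so luma cannot move:
u = 0.492 * (b - y_original)
u_error = 0.7 * u                  # chroma changed, y_original untouched
```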


Timing differences between RGB signals appear as bright fringes of contrasting colors along the edges of objects, whereas in YUV the same amount of delay shows itself as laterally displaced colors with much milder edge effects. Follow these steps to see what timing differences do in RGB and YUV:


1. You already installed AJW's filters in the previous exercise, right?

2. Start FCP and load the DVStressTest3.tif clip.

3. Put the clip into a 720 x 480-pixel Timeline and set the playhead so that the clip shows up in the Canvas.

4. Double-click the clip in the Timeline to load it in the Viewer, then select the Viewer's Filters tab.

If you're lazy like I am, FCP still has the clip loaded from the previous exercise. That'll do fine. Turn off or delete any filters already applied to the clip.

5. Apply the Effects > Video Filters > AJW's Filters > Channel Offset [ajw] filter to the clip.

6. Set the Green / Cb Horizontal slider to 10, simulating a timing difference.

7. Change the Color Space setting from RGB to YCrCb (YUV) and back again while looking at the results in the Canvas.

8. Play with other Horizontal settings for the various channels and watch what happens in RGB mode and in YUV mode.
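The fringing this exercise demonstrates can be modeled on a single scanline. An illustrative sketch (the 8-pixel line and names are mine): delaying one RGB channel across a sharp edge creates pixels where the channels disagree, i.e. a saturated colored fringe, while delaying a chroma channel in YUV merely displaces the color sideways and leaves the luma edge crisp.

```python
width = 8
edge = [1.0] * 4 + [0.0] * 4        # one scanline: white-to-black edge

def delay(channel, n):
    """Shift a channel n pixels to the right (a timing error)."""
    return channel[:1] * n + channel[:-n] if n else list(channel)

# RGB: delay green by two pixels. Where R and B have gone dark but G has
# not, a pure saturated fringe appears along the edge:
r, g, b = edge, delay(edge, 2), edge
fringe = [(ri, gi, bi) for ri, gi, bi in zip(r, g, b) if not ri == gi == bi]

# YUV: the same delay on U leaves Y' (the sharp edge) untouched; the
# color transition just lands two pixels to the right, with no bright fringe:
y = edge
u = delay([0.3] * 4 + [0.0] * 4, 2)
```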


Finally, as mentioned before, the human eye is less sensitive to color resolution than to brightness resolution. By transcoding color into YUV components, it's possible to reduce the bandwidth taken by the color components considerably without markedly affecting picture quality.
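The bandwidth saving is easy to see in miniature. A hypothetical sketch of 4:2:2-style horizontal chroma subsampling (function names mine): the luma row keeps every sample, each chroma row keeps every other one, and playback repeats samples to fill back in, cutting the data from 3 samples per pixel (RGB) to 2.

```python
def subsample_422(chroma_row):
    """Keep every other chroma sample: half the chroma bandwidth."""
    return chroma_row[::2]

def upsample_422(half_row, width):
    """Sample-and-hold reconstruction; the eye barely notices the loss."""
    full = []
    for sample in half_row:
        full += [sample, sample]
    return full[:width]

u_row = [10, 10, 20, 20, 30, 30]
stored = subsample_422(u_row)               # [10, 20, 30]
restored = upsample_422(stored, len(u_row))
```

Real codecs use gentler filtering than sample-and-hold, but the principle (spend the bits on luma, not chroma) is the same one NTSC and PAL baked in decades earlier.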
