TV News Reporting and Production - MCM 516
LESSON 38
SPOKEN WORDS AND RELEVANT VISUALS
Under this topic we will further discuss the role and importance of audio in TV production, including audio mixing, analog and digital audio, audio control devices, audio recording, editing and playback, video switchers and special effects, chroma key, and lip-synching, i.e., synchronizing audio with the video.
Digital Audio
"There is very little about the details of analog audio technology that is useful in the digital world, this
means having to learn the basics all over again." Lon Neumann, Audio Engineer
The decade of the 80s saw the introduction of digital audio signal processing. This not only opened the door to a vast array of new audio techniques, but also represented a quantum leap in audio quality.
For example, the following technical problems have been a headache for audio recording
engineers for decades:
· Wow and flutter (tremble or flicker)
· Remnant high frequency response/self-erasure
· Modulation noise
· Bias noise
· Print-through
· Head alignment problems
· Stereo image shift
· Poor signal-to-noise ratio
· Generational loss
All of these problems, and even a few more, are eliminated with digital audio. This is possible because of the precise timing pulses associated with digital audio and the fact that the digital signal is composed of "0s" and "1s." These represent simple positive and negative voltages that are not close to each other in value.
As long as equipment can reproduce just these two states, there is an audio signal. However, with an analog signal there are an unlimited number of associated values, providing ample opportunity for things to get out of whack. Technically speaking, the background noise of a digital signal can be as bad as 20dB (which is a lot) and the digital signal will still survive. In the case of an analog signal, this would translate into intolerable noise.
Copying vs. Cloning
Each time you make a copy of an analog audio segment you introduce aberrations or abnormalities
because you are only creating a "likeness" of the original. With digital technology you are using the
original elements to create a "clone."
If we are using the original uncompressed digital data, we can fully expect to end up with an exact
clone of the original, even after 50 generations (50 copies of copies).
With analog data, copies of copies quickly result in poor audio quality. Before the advent of digital technology, such things as nonlinear editing were not possible.
If you have the option, you'll want to convert analog data into digital as soon as possible and leave it
that way until you are forced at some point to convert it back to analog.
Converting Analog to Digital
The same sampling and quantizing principles used for digital video apply to digital audio. For audio, the analog signal is typically sampled 48,000 times per second (48kHz).
That means that every 20 microseconds a "snapshot" is taken of the analog voltages. This
instantaneous snapshot is then converted first to a base-ten number and from there to a computer-type
binary ("0" and "1") form.
The number of data bits used to encode the analog data determines the resolution and dynamic range
possible.
A 16-bit encoding system has 65,536 voltage steps that can be encoded. Obviously, the higher the bit depth the better the quality -- and the more technical resources required to handle the signal.
Such high sampling rates demand a high degree of timing (synchronization) precision. Without it
things fall apart with stunning speed.
Just as in video, a synchronizing signal is used to keep things in lock step. This signal or
synchronizing (sync) pulse in digital audio is sent out every 0.00002 of a second.
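As a rough illustration of this arithmetic, here is a minimal Python sketch (the 48,000-samples-per-second rate and 16-bit depth are the values mentioned above; the variable names are only illustrative). It computes the time between "snapshots," the number of voltage steps, and the approximate dynamic range a given bit depth allows.

    import math

    # A minimal sketch of the sampling arithmetic described above.
    SAMPLE_RATE = 48_000        # samples per second
    BIT_DEPTH = 16              # bits used to encode each sample

    sample_period_us = 1_000_000 / SAMPLE_RATE         # time between "snapshots"
    voltage_steps = 2 ** BIT_DEPTH                      # discrete levels available
    dynamic_range_db = 20 * math.log10(voltage_steps)   # roughly 6 dB per bit

    print(f"One sample about every {sample_period_us:.1f} microseconds")
    print(f"{voltage_steps:,} voltage steps")
    print(f"About {dynamic_range_db:.0f} dB of dynamic range")

Run as written, this reports a sample roughly every 20.8 microseconds, 65,536 steps, and about 96dB of dynamic range for 16-bit audio.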
Quantizing Error
In audio production, signals must be converted back and forth from analog to digital and from digital to analog. Since we are dealing with "apples and oranges" types of data, something called a quantizing error can result.
In the analog-to-digital conversion process, a voltage midpoint is selected in the analog values to use as the digital equivalent. This midpoint is a close, but generally not perfect, reflection of the original analog signal. To keep this error from building up, the number of digital-to-analog (as well as analog-to-digital) conversions should be kept to a minimum.
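To make the idea of quantizing error concrete, here is a minimal sketch. It assumes a 16-bit converter and an analog range of -1.0 to +1.0 (both assumptions, chosen for illustration); it rounds an analog value to the nearest digital step and reports the small difference that results.

    # Minimal quantizing-error sketch: snap an analog voltage to the nearest
    # of 2**16 steps and measure the difference. Values are illustrative only.
    BITS = 16
    STEPS = 2 ** BITS                 # 65,536 possible levels
    FULL_SCALE = 1.0                  # assume analog values run from -1.0 to +1.0

    def quantize(voltage):
        """Snap an analog voltage to the nearest digital step."""
        step_size = (2 * FULL_SCALE) / STEPS
        level = round(voltage / step_size)
        return level * step_size

    analog = 0.333333                 # an arbitrary analog sample
    digital = quantize(analog)
    print(f"analog {analog:+.6f} -> digital {digital:+.6f}, error {analog - digital:+.2e}")

The error is tiny for a single conversion, which is why it is the repeated analog/digital round trips, not any one conversion, that cause audible trouble.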
Optimum Digital and Analog Audio Levels
The optimum audio levels for digital audio signals are different from those for analog signals. Whereas a 0dB peak setting is the Standard Operating Level (SOL) for analog systems, for digital equipment the SOL is typically set at -20dB, well below the digital maximum. With both analog and digital signals it comes down to something called headroom.
Headroom is the safe area beyond the SOL (standard operating level) point. With a SOL of -20dB, this leaves 20dB of headroom before the digital maximum. This is a bit technical, but just keep in mind that the maximum audio level for analog signals will generally be different from that for digital signals.
With digital signals, however, a digital meter or a peak program meter (PPM) is used. On a digital meter, when the signal touches the red area we've entered the headroom area. If a digital signal were to go to the very top of the scale, clipping would occur. Unlike analog audio, where exceeding the maximum level results in signal distortion, in digital audio you might not immediately notice the elimination of audio peaks.
Actually, an occasional full-scale digital sample (to the top of the red range) is considered inevitable;
but, a regular string of "top of the scale" occurrences means that the digital audio levels are too high
and you are losing audio information.
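The sketch below illustrates that last point, assuming samples stored as 16-bit integers and an arbitrary run-length threshold (both assumptions for the example): an occasional full-scale sample is tolerated, but a string of them is flagged as probable clipping.

    # Illustrative clipping check on 16-bit samples (range -32768 .. 32767).
    # A lone full-scale sample is acceptable; a run of them suggests clipping.
    FULL_SCALE = 32767
    RUN_LIMIT = 3                      # assumed threshold for "a regular string"

    def looks_clipped(samples):
        run = 0
        for s in samples:
            run = run + 1 if abs(s) >= FULL_SCALE else 0
            if run >= RUN_LIMIT:
                return True
        return False

    print(looks_clipped([1200, 32767, 900]))          # False: one isolated peak
    print(looks_clipped([32767, 32767, 32767, 500]))  # True: levels are too high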
VU meters respond in different ways to audio peaks. In the case of the standard VU meter the needle
tends to swing past peaks because of inertia. At the same time, this needle will not quickly respond to
short bursts of audio. Thus, this type of meter tends to average out audio levels.
Because of the limited headroom with digital audio signals, a faster-responding peak program meter (PPM) or digital meter is preferred. Before you can really get serious about maintaining correct audio levels throughout a production facility, you must see that the audio meters throughout the facility are accurately calibrated to a standard audio reference level.
Although facilities can adopt their own in-house standards, typically a 1,000Hz audio tone should register 0dB on analog equipment and -20dB on digital equipment. Whatever internal standards a production facility sets, they must remain consistent throughout the facility, and everyone must know what they are.
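As one hedged example of such a line-up tone, the sketch below generates a second of a 1,000Hz sine wave at -20dB relative to digital full scale, assuming a 48kHz sample rate (the rate and the one-second length are assumptions made for the example).

    import math

    # Generate one second of a 1 kHz line-up tone at -20 dB below full scale.
    SAMPLE_RATE = 48_000
    FREQ = 1_000                            # Hz
    LEVEL_DB = -20.0                        # digital reference level from the text
    amplitude = 10 ** (LEVEL_DB / 20)       # 0.1 of full scale

    tone = [amplitude * math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE)
            for n in range(SAMPLE_RATE)]

    peak_db = 20 * math.log10(max(abs(s) for s in tone))
    print(f"Peak level: {peak_db:.1f} dB below full scale")   # about -20.0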
Digital Standards
In 1985, the Audio Engineering Society and the European Broadcasting Union developed the first standard for digital audio. This is referred to as the AES/EBU standard, and it was amended in 1993. Before this standard was adopted, digital audio productions done in one facility could experience technical problems when moved to another production facility.
Digital Audio Time Code
Digital audio systems make use of a system for identifying exact points in a recording, similar to video time code. This is essential in the editing process in order to identify and find audio elements, as well as to keep audio and video synchronized. But, as we will see when we talk about video time code, in the process of converting among the 24, 30, and 29.97 frame rates (the different video standards), timing errors develop.
Unless the audio technicians are aware of these differences and take measures to compensate, after a
few minutes video and audio can get noticeably out of sync. (We've probably all seen movies, news
bulletins and shows where the lip-sync was out and the words we were hearing didn't exactly match
the lip movements of the actors.)
People working with digital audio should at least be aware of the potential problem, and before a video
project is started, consult an engineer about the possible problems that could arise in the conversion
process. It's much easier to head off these problems before a project starts than to try to fix them later.
Audio Control Devices
Boards, Consoles, and Mixers
Various sources of audio must be carefully controlled and blended during a production. If audio levels
are allowed to run at too high a level, distortion will result, and if levels are too low, noise can be
introduced when levels are later brought into the normal range. Beyond this, audio sources must be
carefully and even artistically blended to create the best possible effect.
The control of audio signals is normally done in a TV studio or production facility with an audio board
or audio console.
Audio boards and consoles are designed to do five things:
1. Amplify incoming signals
2. Allow for switching and volume level adjustments for a variety of audio sources
3. Allow for creatively mixing together and balancing multiple audio sources to achieve an optimum blend
4. Route the combined effect to a transmission or recording device
5. Sophisticated audio boards or consoles also allow you to manipulate specific characteristics of audio. These include the left-to-right "placement" of stereo sources, altering frequency characteristics of sounds, and adding reverberation.
For video field production smaller units called audio mixers provide the most basic controls over
audio. The input selector switches at the top of each fader can switch between such things as
microphones, CDs, video servers, and satellite feeds. The selector switch at the bottom of each fader
typically switches the output of the fader between cue, audition and program.
Cue is primarily used for finding the appropriate starting point in recorded music. A low-quality
speaker is intentionally used in many studios so cue audio is not confused with program audio.
Audition allows an audio source to pass through an auxiliary VU meter to high quality speakers so
levels can be set and audio quality evaluated. And, of course, program sends the audio through the
master gain control to be recorded or broadcast.
Even though audio boards, consoles, and mixers can control numerous audio sources, these sources
all break down into two main categories:
· Mic-level inputs
· Line-level inputs
Mic-level inputs handle the extremely low voltages associated with microphones, while line-level
inputs are associated with the outputs of amplified sources of audio, such as CD players. Once they are
inside an audio board, all audio sources become line-level and are handled the same way.
Using Multiple Microphones in the Studio
Most studio productions require several mics. Since the mics, themselves, may have only a 5 to 10
meter (15-30 foot) cord, mic extension cables may be needed to plug the microphone into the nearest
mic connector. Studio mics use cables with three-prong connectors.
Since things can get confusing with a half-dozen or more mics in use, the audio operator needs to make a note of which control on the audio board is associated with which mic. A black marker and easily removed masking tape can be used on the audio board channels to identify which mic is plugged into which channel. Mic numbers or talent names can be used for identification.
Because mics represent one of the most problem-plagued aspects of production, they should be
carefully checked before the production begins. Unless you do this, you can expect unpleasant
surprises when you switch on someone's mic, and there is either no audio at all, or you faintly hear the
person off in the distance through another mic.
There is another important reason that mics should be checked before a production: the strength of different people's voices varies greatly. During the mic check you can establish the level (audio volume) for each person by having them talk naturally, or count to 10, while you use a VU meter to set or make a note of the appropriate audio level.
Of course, even after you establish an initial mic level for each person, you will need to constantly
watch (and adjust) the levels of each mic once the production starts. During spirited discussions, for
example, people have a tendency to get louder. It is also good practice to have a spare mic on the set
ready for quick use in case one of the regular mics suddenly goes out.
Given the fragility of mics, cables, connectors, etc., this is not an unusual occurrence. As production
facilities move to digital audio, boards are taking on a different appearance. Like the new digital
switchers and lighting boards, the latest generation of audio boards makes use of an LCD video
display.
Using Multiple Mics in the Field
If only one mic is needed in the field, it can simply be plugged into one of the audio inputs of the
camera. (The use of the internal camera mic is not recommended except for capturing background
sound.) When several microphones are needed and their levels must be individually controlled and
mixed, a small portable audio mixer will be needed.
The use of an audio mixer generally requires a separate audio person to watch the VU meter and
maintain the proper level on each input. Portable AC (standard alternating current) or battery-powered
audio mixers are available that will accept several mic- or line-level inputs. The output of the portable mixer is then plugged into a high-level (line-level) audio input on the video recorder (as opposed to a low-level mic input).
Most portable mixers have from three to six input channels. Since each pot (fader or volume control)
can be switched between at least two inputs, the total number of possible audio sources ends up being
more than the number of faders. Of course, the number of sources that can be controlled at the same
time is limited to the number of pots on the mixer.
There is a master gain control -- generally on the right of the mixer -- that controls the levels of all
inputs simultaneously. Most mixers also include a fader for headphone volume. Although handheld
mics are often used for on-location news, for extended interviews it's better to equip both the
interviewer and the person being interviewed with personal mics.
Whereas a multi-input mixer will probably require a separate audio person to operate, a simple two-mic mixer can be operated by the cameraperson. The output from the unit is simply plugged into the camcorder.
Audio Mixer Controls
Audio mixers and consoles use two types of controls: selector switches and faders. As the name
suggests, selector switches simply allow you to select and direct audio sources into a specific audio
channel. Faders (volume controls) can be either linear or rotary in design. Faders are also referred to
as attenuators, gain controls, or pots (for potentiometers). Linear faders are also referred to as vertical
faders and slide faders.
"Riding Gain"
It's important to maintain optimum levels throughout a production. This is commonly referred to as
riding gain.
You will recall that, depending on the production facility, digital and analog audio signals typically
require different optimum levels -- and even those standards vary with different countries. However,
to reduce confusion we'll use the analog standard of 0dB to represent a maximum level.
Normal audio sources should reach 0dB on the VU or loudness meter, when the vertical fader or pot is
one-third to two-thirds of the way up (open).
Having to turn a fader up fully in order to bring the sound up to 0dB indicates that the original source
of audio is coming into the console at too low a level. In this case, the probability of system
background noise increases.
Conversely, if the source of audio is too high coming into the board, opening the fader very slightly
will cause the audio to immediately hit 0dB. The amount of fader control over the source will then be
limited, making smooth fades impossible.
To reflect the various states of attenuation (resistance), the numbers on some faders are the reverse of
what you might think. The numbers get higher (reflecting more resistance) as the fader is turned down.
Maximum resistance is designated with an infinity symbol, which looks like an "8" turned on its side.
When the fader is turned up all the way, the number on the pot or linear fader may indicate 0, for zero
resistance. Even so, just as you would assume, when the pot is turned clockwise or the fader control is
pushed up, volume is increased.
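The relationship between gain, attenuation, and those dB markings can be sketched as follows. This is a simplified model, not any particular console's fader law: a gain of 1.0 corresponds to the 0 mark (no resistance), smaller gains correspond to larger negative numbers, and a gain of 0.0 corresponds to the infinity mark.

    import math

    # Simplified fader model: convert a linear gain factor to the dB markings
    # discussed above. The fader law itself is an assumption for illustration.
    def gain_to_db(gain):
        if gain <= 0:
            return float("-inf")          # the sideways-8 infinity mark
        return 20 * math.log10(gain)

    for gain in (1.0, 0.5, 0.1, 0.0):
        print(f"gain {gain:4.2f} -> {gain_to_db(gain):>6.1f} dB")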
Level Control and Mixing
Audio mixing goes beyond just watching a VU meter. The total subjective effect as heard through the
speakers or earphones should be used to evaluate the final effect. For example, if an announcer's voice
and the background music are both set at 0dB, the music will interfere with the announcer's words.
Using your ear as a guide, you will probably want to let the music peak at around -15dB and the voice
peak at 0dB to provide the desired effect: dominant narration with supporting but non-interfering
background music. But, since both music and voices have different frequency characteristics (and
you'll recall that, unlike VU meters, our ears are not equally sensitive to all frequencies), you will need
to use your ear as a guide.
During long pauses in narration you will probably want to increase the level of the music somewhat,
and then bring it down just before narration starts again. In selecting music to go behind (under)
narration, instrumental music is always preferred. If the music has lyrics sung by a vocalist (definitely not recommended as background to narration), the level would have to be much lower so as not to compete with the narrator's words.
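As a rough sketch of that balance (assuming normalized sample values and using the 0dB/-15dB targets mentioned above; the sample lists are stand-ins), mixing simply means scaling each source by its fader gain and summing:

    # Illustrative two-source mix: narration held near 0 dB, music pulled down
    # to about -15 dB so it supports rather than competes with the voice.
    def db_to_gain(db):
        return 10 ** (db / 20)

    voice = [0.7, -0.5, 0.4, -0.2]          # stand-in narration samples
    music = [0.8, 0.8, -0.8, -0.8]          # stand-in music samples

    voice_gain = db_to_gain(0)              # dominant narration
    music_gain = db_to_gain(-15)            # supporting background

    mix = [v * voice_gain + m * music_gain for v, m in zip(voice, music)]
    print([round(s, 3) for s in mix])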
Using Audio from PA Systems
In covering musical concerts or stage productions a direct line from a professionally mixed PA (public
address) system will result in decidedly better audio than using a mic to pick up sound from a PA
speaker. An appropriate line-level output of a public address (PA) amplifier fed to a high-level input
of a mixer can be used. Be careful, however: feeding a high-level or speaker-level PA signal into a mic input can damage the equipment.
Audio Recording, Editing and Playback
Turntables and Reel-to-Reel Tape Machines
Records and reel-to-reel tape machines used to be the primary source of prerecorded material in TV
production.
Today, they have almost all been replaced by CDs (compact discs), DAT (digital audiotape) machines,
and computer-type hard drives.
"Vinyl" a term that refers mostly to LP (long playing) records, was the primary medium for
commercially recorded music for several decades.
Most vinyl records were either 45 or 33 1/3 rpm (revolutions per minute) and had music recorded on
both sides. Records had a number of disadvantages, primarily the tendency to get scratched and worn,
which quickly led to surface noise.
Unlike vinyl records, some of the newer media can be electronically cued, synchronized, and instantly
started -- things that are important in precise audio work.
Reel-to-reel analog 1/4-inch tape machines, which were relied upon for several decades in audio
production, have also almost all been replaced -- first by cart machines and then by DAT machines
and computer hard drives.
Cart Machines
Cart machines (cartridge machines), which are still used at some facilities, incorporate a continuous
loop of 1/4-inch (6.4mm) audiotape within a plastic cartridge.
Unlike an audio cassette that you have to rewind, in a cart the tape is in a continuous loop. This means that you don't have to rewind it; you simply wait until the beginning point recycles again. At that point the tape stops, cued up to the beginning.
Most carts record and play back 30- and 60-second segments (primarily used for commercials and public service announcements) or segments of about three minutes (for musical selections).
Audio carts are now well on their way to the Museum of Broadcasting along with other exhibits of
broadcast technology used in earlier years. Today, audio is primarily recorded and played back on hard
drives, CDs, and DAT recorders.
Compact Discs
Because of their superior audio quality, ease of control, and small size, CDs (compact discs) are a
preferred medium for prerecorded music and sound effects. (Radio stations typically transfer CD
selections to a computer disk for repeated use.)
Although the overall diameter of a typical audio CD is only about 4.7 inches (12 centimeters), a CD is able to hold more information than both sides of a 12-inch (30.5cm) LP phonograph record. Plus, the frequency response (the range of pitches from low to high that can be reproduced) and dynamic range (the audio range from loud to soft that can be reproduced) are significantly better.
Although CDs containing permanently recorded audio are most common, recordable discs are also used in production. These offer all of the advantages of CDs; CD-Rs can be recorded once, while CD-RWs can be erased and re-recorded multiple times.
Radio stations that must quickly handle dozens of CDs use Cart/Tray CD players.
For repeated use, CD audio tracks are commonly transferred to computer disks where they can be
better organized and quickly selected and played with a few strokes on a keyboard. A computer screen
displays the titles and artists, and the time remaining for a selection that's being played.
In mass producing CDs an image of the digital data is "stamped" into the surface of the CD in a
process similar to the way LP records (with their analog signals) are produced.
When a CD is played, a laser beam is used to illuminate the microscopic digital pattern encoded on the
surface. The reflected light, which is modified by the digital pattern, is read by a photoelectric cell.
The width of the track is 1/60th the size of the groove in an LP record, or 1/50th the size of a human
hair. If "unwound" this track would come out to be about 3.5 miles (5.7 km) long. Of course, DVDs
take this technology even further.
In 2004, MP3 CDs appeared that have the capacity of as many as 10 standard CDs.
CD Defects and Problems
If the surface of the CD is sufficiently warped because of a manufacturing problem or improper
handling or storage, the automatic focusing device in the CD player will not be able to adjust to the
variation. The result can be mis-tracking and loss of audio information.
Automatic Error Correction
Manufacturing problems and dust and dirt on the CD surface can cause a loss of digital data. CD
players attempt to compensate for the signal loss in three ways:
· Error correction
· Error concealment (interpolation)
· Muting
Error-correcting circuitry within the CD player can detect momentary losses in data (dropouts) and, based on the existing audio at the moment, supply missing data that's close enough to the original not to be readily noticed.
If the loss of data is more significant, error-correcting circuits can instantly generate data that more or
less blends in with the existing audio. If this type of error concealment has to be invoked repeatedly
within a short time span, you may hear a series of clicks or a ripping sound.
Finally, if things get really bad and a large block of data is missing or corrupted, the CD player will
simply mute (silence) the audio until good data again appears -- a solution that's clearly obvious to
listeners.
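A minimal sketch of the interpolation idea (not the actual circuitry in any CD player) might fill a short dropout by drawing a straight line between the last good sample and the next one. Here missing samples are marked with None, an assumption made for the example.

    # Toy error-concealment sketch: replace a run of missing samples (None)
    # with values interpolated between the neighboring good samples.
    def conceal(samples):
        out = list(samples)
        i = 0
        while i < len(out):
            if out[i] is None:
                j = i
                while j < len(out) and out[j] is None:
                    j += 1
                left = out[i - 1] if i > 0 else 0.0
                right = out[j] if j < len(out) else left
                gap = j - i + 1
                for k in range(i, j):                 # linear fill across the gap
                    out[k] = left + (right - left) * (k - i + 1) / gap
                i = j
            else:
                i += 1
        return out

    print(conceal([0.2, 0.4, None, None, 1.0]))   # [0.2, 0.4, 0.6, 0.8, 1.0]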
DAT
DATs (Digital Audio Tapes) are capable of audio quality that exceeds what's possible with CDs.
The 2-inch by 2-7/8 inch (5 x 7.6 cm) DAT cassette contains audiotape 3.81mm wide. The cassette is about two-thirds the size of a standard analog audiocassette. The two-hour capacity of a DAT cassette is 50 percent greater than that of a standard 80-minute CD.
RDAT (recordable digital audiotape) is designed for professional applications, as are the very high
quality ADAT machines (types I and II).
DAT systems use a head-wheel that spins at 2,000 rpm (revolutions per minute), similar to what's
found in a videocassette recorder.
Various types of data can be recorded with the audio. Examples are time code and the MIDI machine
control data used in sophisticated postproduction audio work.
DAT Time Code: The DAT time code system, referred to as the IEC Sub-code Format, also ensures that tapes recorded on one DAT machine can be played back without problems on any other machine. DAT time code is similar to SMPTE time code.
Computer Hard Drives
Today, computer hard drives are the choice for broadcast music, commercials, and general audio
tracks. Recording audio material on computer hard drives (generally with MPEG-2 or MPEG-4
compression) has several advantages.
First, the material can be indexed in an electronic "table of contents" display that makes it easy to find
what you need. This index can also list all of the relevant data about the "cuts" (selections) --
durations, artists, etc. Second, you have almost instant access to the selections.
Once recorded on a hard drive, there is no wear and tear on the recording medium as the audio tracks
are repeatedly played. Another advantage is that the selections can't be accidentally misfiled after use.
(If you've ever put a CD back in the wrong case, you know the problems this can represent.) And,
finally, unlike most CDs, hard drive space can easily be erased and re-used.
Data Compression
Both digital audio and video are routinely compressed by discarding data from the original signal that will not be missed by most listeners or viewers.
This makes it possible to record the data in much less space, and, thus, faster and more economically.
Data can be compressed to various degrees using different compression schemes.
Although hard drives are extremely reliable today, they do occasionally "crash," especially after
thousands of hours of use or a major jolt ends up damaging the delicate drive and head mechanism.
Unless anti-virus measures are instituted, the computer operating system can also be infected with
viruses, which can result in a complete loss of recorded material. With these things in mind, critical
files and information should always be "backed up" on other recording media.
IC and PC Card Recorders
Some audio production is now being done with PC card and IC recorders. Both use solid-state
memory cards, such as Compact Flash and ATA Cards.
These memory cards contain no moving parts and are impervious to shock and temperature changes.
The data in these memory modules can be transferred directly to a computer for editing.
These units typically give you the choice of two basic recording formats: MPEG-2, a compressed data
format, and PCM (pulse code modulation) which is an uncompressed digital format. The latter is used
with CD players, DAT recorders, and on computer editing programs that use wave (wav) files.
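As a small illustration of the uncompressed PCM format, the sketch below uses Python's standard wave module to write one second of a tone as 16-bit PCM samples in a wave (wav) file. The file name, tone frequency, and level are arbitrary choices made for the example.

    import math
    import struct
    import wave

    # Write one second of a 440 Hz tone as uncompressed 16-bit PCM (.wav).
    # The file name and tone frequency are arbitrary illustrative choices.
    SAMPLE_RATE = 48_000
    FREQ = 440
    AMPLITUDE = 0.5 * 32767                     # half of full scale

    with wave.open("tone.wav", "wb") as wav:
        wav.setnchannels(1)                     # mono
        wav.setsampwidth(2)                     # 2 bytes = 16-bit PCM
        wav.setframerate(SAMPLE_RATE)
        for n in range(SAMPLE_RATE):
            sample = int(AMPLITUDE * math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE))
            wav.writeframes(struct.pack("<h", sample))

Because PCM stores every sample value directly, the file size grows in direct proportion to the sample rate, the bit depth, and the number of channels, which is exactly what compressed formats such as MPEG audio are designed to reduce.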
RAM Audio Recorders
The new generation of recorders can be a fraction of the size of other types of recorders.
However, unlike recorders with removable media, the stored audio must generally be played back from the unit itself.
The iPod Era
When iPod-type devices and computers that could "rip" (copy) musical selections from CDs and Internet sources arrived on the scene, consumer audio recording and playback changed in a major way.
Users can assemble hours of their favorite music (up to 2,000 songs) on a computer and transfer it to a pocket-sized, solid-state listening device such as an iPod or to one of the new generation of cell phones.
"Podcasts" of broadcasts from TV networks can also be downloaded and listened to or viewed at the
user's convenience.
With the iPod nano you can watch up to 5 hours of TV shows, music videos, movies, and Podcasts.
Although Apple Computer initially popularized these devices, many manufacturers now produce their
own versions.
Audio Editing Systems
Audio editing used to require physically cutting and splicing audiotape -- an arduous process.
Today, there are numerous computer-based audio editing programs available. Many are shareware that
can be downloaded from the Internet.
Shareware can be downloaded and tested, generally for about a month, before the program quits
working and you need to pay for it.
Once you pay, you may be given an unlock code that will enable you to use the program for an
unlimited time.
Often, minor updates to the program are free; major updates will probably involve an update charge.
In addition to basic editing, audio editing programs offer audio filtering, manipulation, and an endless
range of special audio effects.
In an audio editor, a single channel of sound appears as a waveform along a horizontal line, with a vertical line indicating the cursor (selector) position. Much as a cursor is used to mark words in a word processing program so changes can be made, the cursor in an audio time line provides a point of reference for making audio changes.
Most programs use a computer mouse to drag-and-drop segments and special effects onto a time-line
(the longitudinal graphical representation of the audio along a time continuum).
Audio editing in television production is typically handled along with the video on a video editing
system.
The hard drives on computer-based audio editing systems can also store a wide range of sound effects
that can be pulled down to a time line to accompany narration and music.
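A toy model of such a time line is sketched below. It assumes each clip is simply a list of samples placed at a start time in seconds (the clip contents, lengths, and the render function are all illustrative, not how any editing program is built).

    # Toy audio time line: drop clips (lists of samples) at start times and
    # render them onto a single track by summing where they overlap.
    SAMPLE_RATE = 48_000

    def render(clips, length_seconds):
        track = [0.0] * int(length_seconds * SAMPLE_RATE)
        for start_seconds, samples in clips:
            offset = int(start_seconds * SAMPLE_RATE)
            for i, s in enumerate(samples):
                if offset + i < len(track):
                    track[offset + i] += s
        return track

    narration = [0.1] * SAMPLE_RATE          # stand-in one-second clip
    effect = [0.05] * (SAMPLE_RATE // 2)     # stand-in half-second sound effect

    timeline = [(0.0, narration), (0.5, effect)]
    track = render(timeline, length_seconds=2)
    print(len(track), max(track))            # 96000 samples; peak about 0.15 where they overlap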
Video Switchers and Special Effects
Although video switchers look impossibly complex, once you understand some basics they don't seem as intimidating. Each button represents a video source -- even "black," which includes the technical parts of the video signal necessary to produce stable black. The bottom row of buttons represents the program bus, or direct-take bus.
Any button pressed on this row sends that video source directly to line out, the final feed being
broadcast or recorded. The easiest way to instantly cut from one video source to another is simply to
select it ("punch it up") on the program bus. The program bus generally handles more than 90% of
video switching. But, what if you want to dissolve (fade) from one camera to another, or fade to
black?
For this you need to move to the top two rows of buttons, referred to as effects, or the mix/effects bus. From here, with the help of the fader bars, you can create rudimentary special effects. When the fader bars are in the top position, any video source punched up on the top row of buttons is sent to the effects button on the program bus.
Suppose, for example, that camera 3 is selected on the effects bus; that's the camera that will be sent down to the program bus. If the effects bus has been selected on the program bus, its signal will then be sent out and displayed on the line-out video monitor. Put another way, if the fader bars point toward the top row of buttons on the effects bus, and camera 3 has been selected on that bus, we will see camera 3 when the effects bus is selected on the program bus.
If we were to move the fader bars down to the lower position, the video source selected on the lower row of buttons (in this case camera 2) would be sent to the program bus. During the process of moving the fader bars from the top to the bottom, we see a dissolve (an overlapping transition) from camera 3 to camera 2.
If we stop the fader bars midway between the move from top to bottom, we would see both sources of
video at the same time -- we would be superimposing one camera over the other. Although this used
to be the way we displayed titles, credits, etc., on the screen, today we use an electronic keying
process.
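A hedged software model of what the fader bars are doing on the mix/effects bus (a simplified sketch, not how any real switcher is implemented) treats the bar position as a 0-to-1 blend between the two rows: either end gives a full take of one source, and the midway point is a superimposition.

    # Simplified mix/effects model: blend whatever is selected on the two rows
    # of the effects bus according to the fader-bar position (0.0 = bottom row,
    # 1.0 = top row, 0.5 = a superimposition of both).
    def mix_effects(top_pixel, bottom_pixel, fader_position):
        """Blend two video sources; pixels are simple brightness values here."""
        return fader_position * top_pixel + (1 - fader_position) * bottom_pixel

    camera_3 = 200          # stand-in brightness from the source on the top row
    camera_2 = 80           # stand-in brightness from the source on the bottom row

    print(mix_effects(camera_3, camera_2, 1.0))   # fader bars at top: camera 3 only
    print(mix_effects(camera_3, camera_2, 0.0))   # fader bars at bottom: camera 2 only
    print(mix_effects(camera_3, camera_2, 0.5))   # midway: both superimposed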
In a key, one image is electronically "cut out" of the other, while in a super the two images are visible at the same time. Compared to a key, the latter can look a bit jumbled. The fader bars can also be split, each one at the "0" (no video, or black) position. If we were to move fader bar "A" to the top position we would put camera 3 on the air; if we were to move fader bar "B" to the bottom position we would put camera 2 on the air.
What you don't want to do is split the bars so that each one sends out maximum video from its source. (Video engineers may get very upset with you!) Next, note the extra row of buttons marked "preview," just below the program bus.
With the preview bus we can set up and check an effect on a special preview monitor prior to
switching it up on the program bus. Without being able to preview and adjust video sources before
putting them on the air, we might end up with some unpleasant surprises. To see (preview) an effect,
we first punch up effects on the preview bus. When we get the effect we want on the effects bus, we
can cut directly to it by punching up effects on the program bus.
Some switchers have multiple effects banks.
If you moved the fader bars on Effects #2 to the up position, you would make a transition from black
to whatever was on Effects #1. In this case it would be Camera 2 superimposed over Camera 3.
Finally, let's add a few bells and whistles.
The top row of buttons on many switchers represents various types of wipes. On the buttons, yellow typically represents one video source and black the other. Additional patterns -- some switchers have hundreds -- can be selected by entering numbers on the keypad. If wipe is selected on the switcher, the button pushed shows the moving pattern (controlled by the fader bars) that would be involved in the transition from one video source to the other.
A border along the edges of the wipe pattern -- a transition border -- can be used and its hue,
brightness, sharpness, width, and color saturation selected.
The key clip knob controls the video level of the source you are going to key into background video.
This is adjusted visually on the preview monitor.
Downstream keyers, which are often used to key in such things as opening titles and closing credits, are external to (downstream from) the basic switcher. The advantage of a downstream keyer is that it doesn't require the use of a switcher's effects bank for keying. This means that the bank stays free to be used for other things.
Large production switchers incorporate versions of all of these features, plus a computer display that adds even more options.
Although switcher configurations differ, they all center on the same basic concepts.
Chroma Key
The type of key described above is referred to as a luminance key because the keying effect is based on the brightness or luminance of the video that you are keying in. But, as we saw when we discussed virtual reality sets, it's also possible to base keying on color (chroma). In chroma key, a particular color is selected for removal and another video source is substituted in its place.
This type of keying is commonly done during weathercasts, where a graphic is inserted behind the weather person. Although any color can be used in chroma key, royal blue and a saturated green are the most commonly used. Most of the special effects we see on television today are done with chroma key.
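A minimal chroma-key sketch is shown below. It operates on simple RGB tuples; the key color, the distance threshold, and the tiny "frames" are all assumptions made for illustration, not a broadcast-quality keyer.

    # Toy chroma key: any foreground pixel close enough to the key color is
    # replaced by the corresponding background pixel. Threshold is assumed.
    KEY_COLOR = (0, 255, 0)        # a saturated green, as mentioned above
    THRESHOLD = 100                # illustrative color-distance cutoff

    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def chroma_key(foreground, background):
        return [bg if distance(fg, KEY_COLOR) < THRESHOLD else fg
                for fg, bg in zip(foreground, background)]

    studio = [(0, 250, 5), (200, 180, 160), (10, 255, 0)]       # green screen + presenter
    weather_map = [(30, 60, 200), (30, 60, 200), (30, 60, 200)]
    print(chroma_key(studio, weather_map))

In this example the first and third pixels are close to the key green and are replaced by the weather-map pixels, while the presenter's skin-tone pixel is left untouched.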
Software-Based Switchers and Effects
Most software-based switchers mimic the layout of a hardware-based switcher, with the familiar fader bars and the various banks of buttons. In this case, instead of pushing buttons, you click on them with a mouse.
Software-based systems can be easily and regularly upgraded when new software is written -- an advantage you don't have to the same degree with hardware-based equipment.
With most software-based systems it's also possible to go far beyond basic switching and create such
things as 3-D illustrations and animated effects.