Lip sync
Lip-sync or lip-synch (short for lip synchronization) is a technical term for matching lip movements with voice. The term can refer to: a technique often used for performances in the production of film, video and television programs; the science of synchronization of visual and audio signals during post-production and transmission; the common practice of people, including singers, performing with recorded audio as a source of entertainment; and matching the lip movements of animated characters (including computer facial animation). In the case of live concert performances, lip-synching is generally considered controversial, although in many instances it is required from a production standpoint to ensure quality for broadcast, or a performer may be harmonizing with their own vocals.
Lip-synching in music
Though lip-synching can be used to make it appear as though actors have musical ability (e.g., "The Partridge Family") or to misattribute vocals (e.g. Milli Vanilli), it is more often used by recording artists to create a particular effect, to enable them to perform live dance numbers, or to cover for illness or other deficiencies during live performance. Television broadcasters sometimes require lip-synched performances for short guest appearances, as this demands less rehearsal time and greatly simplifies sound mixing.
Because the film track and music track are recorded separately during the creation of a music video, artists usually lip-synch to their songs and often imitate playing musical instruments as well. Artists also sometimes move their lips at a faster speed than the track, to create an unusual effect in the final clip.
Artists often lip-synch during strenuous dance numbers in both live and recorded performances, because singing and dancing at full intensity at the same time demands more breath than most performers can sustain. They may also lip-synch in situations in which their back-up bands and sound systems cannot be accommodated, such as the Macy's Thanksgiving Day Parade, which features popular singers lip-synching while riding floats. Some singers habitually lip-synch during live performance, both in concert and on television. Others sing the lead part over a complete recording or over just pre-recorded music and backing vocals. Sometimes when this is done, the live vocals are less audible than the backing track. Some groups lip-synch supporting vocal parts or shared parts in order to maintain vocal harmony or to ensure balance of volume among several singers.
Some artists switch between live singing and lip-synching during performance, particularly during songs which require them to hit particularly high or low notes. Lip-synching these notes ensures that they will not be out of tune and that the artist will not strain their voice too much during an arduous concert. Once the difficult portion of the song has passed, the artist may continue to lip-synch or may resume singing live. Some artists, such as Fergie, lip-synch choruses during songs but sing the main verses. Others, such as Shakira, use a wide range of variations: fully lip-synching to the original album track or a pre-recorded version; lip-synching only certain parts of certain songs (for example a chorus, an opening phrase, or a difficult part of the melody); having some of the instruments and backing vocals (including the singer's own) pre-recorded; or performing with all singing and music live.
Lip-synching is almost always used in musical films, such as "La Vie en Rose" and "High School Musical" ("The Rocky Horror Picture Show" being an exception). However, when songs appear in non-musical films where they are not large musical numbers, the actors sing live on set but later dub their voices in ADR using a "better" performance of the song.
Some artists may choose to lip-synch during live performance because of stage fright or perceptions of inadequacy. Unlike studio recording, live performance provides only one chance to sing each song correctly. An artist may worry that their voice is not strong enough, that it will sound noticeably different from recorded versions, or that they will hit a wrong note. Non-professionals often use lip-synching as a form of musical pantomime in which the performer mouths along to a recording made by someone else. This form of lip-synching is often performed by drag queens and, more recently, drag kings.
Other artists have chosen to lip-synch quite obviously for comedic value. During a short, pre-recorded performance, such as a guest appearance on a TV show, some artists purposely include jokes such as swapping instruments between band members or playing their instruments in obviously erroneous ways.
The practice of lip-synching also occurs in musical theater, for much the same reasons as for musicians. A production may include a mix of lip-synched and live musical numbers. In long-running shows, this may be done to help protect the performer's voice from strain and damage, as well as to maintain a high caliber of production. A notable example of lip-synching used as a special effect is in performances of "The Phantom of the Opera", where swing actors dressed in the same costume as the lead actors appear in their place, to give the illusion of the characters moving around the stage with an air of mystery.
Lip synching contests and game shows
In 1981, Wm. Randy Wood started lip sync contests at the Underground Nightclub in Seattle, Washington to attract customers. The contests were so popular that he took them nationwide. By 1984 he had contests running in over 20 cities and, after submitting a show proposal, went to work for Dick Clark Productions as consulting producer for the TV series "Puttin' on the Hits". The show received a 9.0 rating in its first season and was nominated twice for the Daytime Emmy Awards. In the United States, this hobby reached its peak during the 1980s, when several game shows, such as "Puttin' on the Hits" and "Lip Service", were created.
Lip-synching in films
In film production, lip-synching is often part of the post-production phase. Most films today contain scenes where the dialogue has been re-recorded afterwards; lip-synching is the technique used when animated characters speak; and lip-synching is essential when films are dubbed into other languages.
ADR
Automated dialogue replacement, also known as "ADR" or "looping", is a film sound technique involving the re-recording of dialogue after photography.
Animation
Another manifestation of lip-synching is the art of making a character appear to speak in a prerecorded track of dialogue. The technique involves working out the timings of the speech (the breakdown) as well as actually animating the lips and mouth to match the dialogue track. The earliest examples of lip-sync in animation were attempted by Max Fleischer in his 1926 short "My Old Kentucky Home". The technique continues to this day, with animated films and television shows such as "Shrek", "Lilo & Stitch", and "The Simpsons" using lip-synching to make their artificial characters talk. Lip-synching is also used in comedies such as "This Hour Has 22 Minutes" and in political satire, changing the original wording completely or only partially. It has been used in conjunction with translation of films from one language to another, for example "Spirited Away". Lip-synching can be a very difficult issue in translating foreign works for a domestic release, as a direct translation of the lines often runs noticeably longer or shorter than the on-screen mouth movements.
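The breakdown step described above can be illustrated with a minimal sketch: timed phonemes from a dialogue track are converted into mouth-shape (viseme) keyframes. This is not any particular studio's pipeline; the phoneme set, the viseme names and the 24 fps timing are assumptions made for the example.

```python
# Sketch of a lip-sync breakdown: timed phonemes -> per-frame mouth shapes.
# The phoneme/viseme tables and timings below are illustrative assumptions.

FPS = 24  # assumed animation frame rate

# Hypothetical phoneme -> viseme (mouth shape) table
PHONEME_TO_VISEME = {
    "AA": "open", "AE": "open", "AH": "open",
    "B": "closed", "M": "closed", "P": "closed",
    "F": "teeth_on_lip", "V": "teeth_on_lip",
    "OO": "rounded", "W": "rounded",
    "EE": "wide", "S": "wide",
}

def breakdown_to_keyframes(phoneme_timings):
    """Convert (phoneme, start_sec, end_sec) tuples into frame-ranged visemes."""
    keyframes = []
    for phoneme, start, end in phoneme_timings:
        viseme = PHONEME_TO_VISEME.get(phoneme, "rest")  # default mouth at rest
        keyframes.append((round(start * FPS), round(end * FPS), viseme))
    return keyframes

if __name__ == "__main__":
    # "My old..." spoken over roughly half a second (timings are made up)
    timings = [("M", 0.00, 0.08), ("AA", 0.08, 0.20),
               ("OO", 0.20, 0.35), ("EE", 0.35, 0.50)]
    for start_f, end_f, viseme in breakdown_to_keyframes(timings):
        print(f"frames {start_f:3d}-{end_f:3d}: {viseme}")
```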
Language dubbing
Quality film dubbing requires that the dialogue first be translated in such a way that the words used can match the lip movements of the actor. This is often impossible to achieve if the translation is to stay true to the original dialogue. Elaborately lip-synched dubbing is also a lengthy and expensive process.
In English-speaking countries, many foreign TV series, especially Japanese anime, are dubbed for television broadcast; cinematic releases of films, however, tend to be subtitled instead. The same is true of countries whose language is not spoken widely enough to make expensive dubbing commercially viable (in other words, the market for it is too small). Most non-English-speaking countries with a large enough population, however, dub all foreign films into their national language before releasing them to cinemas. In such countries, people are so accustomed to dubbed films that somewhat less than optimal matches between the lip movements and the speech generally go unnoticed. At the same time, they are unaccustomed to subtitles and tend to find them distracting, lacking the practice of following the on-screen action and the subtitles at the same time.
Lip-synching in video games
Early video games did not feature prominent use of voice, being mainly text-based. At most, games featured some generic jaw or mouth movement to convey a communication process alongside the text. However, as games have become more advanced, lip sync and voice acting have become a major focus of many games.
Role-playing games
Lip sync is a minor focus in role-playing games. Because of the sheer amount of information conveyed in these games, the majority of communication is done through scrolling text. Most RPGs rely solely on text, while some display static portraits to give a better sense of who is speaking. Some games make use of voice acting, such as Grandia II, but due to simple character models there is no mouth movement to simulate speech. RPGs are still largely based on text, with the rare use of lip sync and voice files being reserved for full motion video cutscenes. Some newer RPGs, however, use full voice-overs. These are typically games for computers or next-generation consoles, such as Star Wars: Knights of the Old Republic and The Elder Scrolls IV: Oblivion. In these fully voiced games, lip sync is crucial.
Strategy games
Unlike RPGs, strategy games make extensive use of sound files to create an immersive battle environment. Most games simply play a recorded audio track on cue, with some games providing static portraits to accompany the respective voice. "StarCraft" used full motion video character portraits with several generic speaking animations that did not synchronize with the lines spoken in the game. The game did, however, make extensive use of recorded speech to convey its plot, with the speaking animations giving a good sense of the flow of the conversation. "Warcraft III" used fully rendered 3D models to animate speech with generic mouth movements, both as character portraits and as in-game units. Like the FMV portraits, the 3D models did not synchronize with the actual spoken lines, while in-game models tended to simulate speech by moving their heads and arms rather than using actual lip synchronization. Similarly, the game Codename Panzers uses camera angles and hand movements to simulate speech, as the characters have no actual mouth movement.
First-person shooters
First-person shooters are a genre that generally places much more emphasis on graphical display, mainly because the camera is almost always very close to the character models. Because increasingly detailed character models require animation, FPS developers devote considerable resources to creating realistic lip synchronization for the many lines of speech used in most FPS games. Early 3D models used basic up-and-down jaw movements to simulate speech. As technology progressed, mouth movements began to closely resemble real human speech movements. Some developers dedicated teams to lip sync alone, producing the most accurate lip synchronization seen in games at the time. Since then, games such as "Half-Life 2" have made use of code that dynamically generates mouth movements matching the sounds as if they were spoken by a live person, resulting in strikingly life-like characters. Highly accurate lip synching was also displayed in a video showcasing the lip synching technology used in the team-based FPS "Team Fortress 2". Gamers who create their own videos using character models with no lip movements, such as the helmeted Master Chief from "Halo", simulate speech by moving the characters' arms and bodies and bobbing their heads (see "Red vs. Blue").
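The dynamic, audio-driven approach mentioned above can be illustrated with a minimal sketch in which the jaw opens in proportion to the short-term loudness of the dialogue waveform. This is not any engine's actual implementation; it assumes a mono, 16-bit PCM WAV file, and the frame rate, smoothing factor and gain are arbitrary example values.

```python
# Sketch of amplitude-driven mouth animation: one jaw-open value per frame,
# derived from the RMS energy of the dialogue audio. Real engines use far
# more sophisticated phoneme analysis; the parameters here are assumptions.

import math
import struct
import wave

def jaw_curve(wav_path, fps=30, smoothing=0.6):
    """Return one jaw-open value in [0, 1] per animation frame (mono 16-bit WAV)."""
    with wave.open(wav_path, "rb") as wav:
        rate = wav.getframerate()
        samples = struct.unpack("<" + "h" * wav.getnframes(),
                                wav.readframes(wav.getnframes()))
    window = max(1, rate // fps)                # audio samples per animation frame
    curve, jaw = [], 0.0
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk)) / 32768.0
        target = min(1.0, rms * 8.0)            # arbitrary gain into the 0..1 range
        jaw = smoothing * jaw + (1 - smoothing) * target  # smooth out jitter
        curve.append(jaw)
    return curve
```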
Television transmission synchronization
An example of a lip synchronization problem, also known as a lip sync error, is the case in which television video and audio signals are transported via different facilities (e.g., a geosynchronous satellite radio link and a landline) that have significantly different delay times. In such cases it is necessary to delay the earlier of the two signals electronically to allow for the difference in propagation times. See also audio video sync and audio synchronizer.
Lip sync issues have become a serious problem for the television industry worldwide. Lip sync problems are not only annoying, but can lead to subconscious viewer stress, which in turn leads to viewer dislike of the television program being watched. [ [http://www.pixelinstruments.tv/pdf/Articles/Effects%20of%20Audio-Video%20Asynchrony.PDF "Effects of Audio-Video Asynchrony on Viewer's Memory, Evaluation of Content and Detection Ability"] by Reeves and Voelker.] Television industry standards organizations have become involved in setting standards for lip sync errors. [ATSC Document IS-191 ( [http://www.atsc.org/standards/is_191.pdf] )]
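As a simple illustration of delaying the earlier signal, the sketch below buffers the audio by the difference between two assumed path delays. The satellite-hop and landline figures are rough illustrative numbers, not broadcast-standard values.

```python
# Sketch of delay compensation for a lip sync error: the signal that arrives
# earlier (here, audio over a landline) is buffered by the delay difference
# so that it lines up with the video carried over a satellite hop.

from collections import deque

SPEED_OF_LIGHT_KM_S = 299_792.458
SATELLITE_HOP_KM = 2 * 35_786                   # up and down to geostationary orbit
VIDEO_PATH_DELAY_S = SATELLITE_HOP_KM / SPEED_OF_LIGHT_KM_S   # roughly 0.24 s
AUDIO_PATH_DELAY_S = 0.005                      # assumed landline delay

def make_delay_line(delay_seconds, sample_rate):
    """Return a function that delays a sample stream by delay_seconds."""
    fifo = deque([0] * round(delay_seconds * sample_rate))
    def delay(sample):
        fifo.append(sample)
        return fifo.popleft()                   # silence until the buffer has filled
    return delay

# The audio arrives earlier, so it is the signal that gets delayed.
compensation = VIDEO_PATH_DELAY_S - AUDIO_PATH_DELAY_S
delay_audio_sample = make_delay_line(compensation, sample_rate=48_000)
```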
See also
* John Epperson, popularly known as Lypsinka on stage and in movies for his lip-synching abilities
* Lip dub
* Playback singing
* Presentation time stamp
References
External links
* [http://www.visagetechnologies.com/ Visage Technologies AB: Software tools and SDK for Character Animation including automatic real-time lip-sync]
* [http://www.annosoft.com/ Annosoft LLC: Automatic Lipsync SDKs for game developers]