Next came flatbed edit suites, pioneered by Steenbeck, a company with a history of building film editing, viewing and control tables since the early 1950s. 1953 saw the birth of the first Steenbeck flatbed, the ST200, a 16mm four-plate machine. It was seen as new media at the time, mainly because it included optical sound and playback of 16mm perforated magnetic tape. The ST100, a four-plate 35mm editor, soon followed in 1954. Soon after came the first editor with back projection onto a ground glass screen, the 35mm ST400, followed by the ST500, switchable between standard and CinemaScope formats.
Meanwhile, live television editing, which began in the 1940s, led to kinescope recordings. A kinescope was essentially a film camera recording a video monitor during a live broadcast; unfortunately this introduced many issues, such as banding and ghosting.
The next big step came in 1956, when Ampex released the first 2" quadruplex video tape. Editing it was similar to editing film: the tape had to be developed using iron filings suspended in a toxic, carcinogenic carbon tetrachloride solution so that the magnetic tracks could be seen through a microscope, which allowed for precision cutting. Unfortunately the audio and video could not be cut at the same point, as they were several inches apart in the machine; to work around this, the video was cut and the missing sound was added back in at the correct place.
With linear editing it has been argued that you spend more time waiting for tapes to cue than actually making edits. Linear editors are also inflexible and can make re-editing a piece a hassle: if you wish to go back and alter a previous edit, the new edit must be recorded over the top of the old one. This becomes a major problem when the new scene is longer or shorter than the old one, as it will either run into the next scene or leave some of the old shot still visible.
The next big step came in 1971 with the introduction of the CMX 600; even though only six were ever made, they paved the way for non-linear editing. The Avid/1 followed in 1989, and the main problem now was storage. It wasn't until 1993 that Avid increased the amount of storage available, releasing a 7 terabyte system. The success of this approach could be seen in 1997, when Walter Murch won the Oscar for Best Film Editing for The English Patient, which he edited on an Avid.
Non-linear systems require two main things: computer power and data storage. Non-linear editing is far less time consuming, and time is money. The ability to work with both video and audio in more complex ways can only be seen as a positive. Digital technology was also a huge help when it came to special effects: Forrest Gump (1994) made great use of it, using digitised images reworked frame by frame to remove Gary Sinise's legs from each shot.
Digital editing was the future: it was easier to copy, easier to sync and resistant to noise. In 1990 NewTek released the first Video Toaster for the Amiga. It had limited linear editing capabilities, but it brought video production to schools, production shops and some small television studios, offering numerous effects and even lightweight 3D. Five years later saw the birth of the DVD optical disc, which used a new type of compression, MPEG-2. A year later, the first US public HD broadcast was aired; the process involved running 35mm film through a telecine, which essentially scanned the film and made a digital copy. This could then be manipulated in a computer with special effects and compositing, and once complete, a film recorder would write the video images back onto film.
From here the editing world didn't look back. Chris Watts revolutionised the digital intermediate (DI) process with Pleasantville in 1998, the first time the majority of a new feature film was scanned, processed and recorded digitally. But it wasn't until four years later, with the release of Star Wars Episode II: Attack of the Clones, that we had the first motion picture to be shot purely on digital. By the late 2000s it had become possible to shoot purely on digital and edit online, using the original full-quality files.