I know I tell a lot of stories about the old days, but this doesn't mean I'm 100 years old.
Just wanted to get that cleared up for those of you keeping a tally of how many times I mention 1" tape and CMX edit controllers.
Now back to our regularly scheduled program, already in progress...
For those of you just joining us, welcome to the Mike Cohen Creative COW 100th Blogiversary.
"100 blog posts? So what," you might be saying.
Well I try to put a little bit of my personality and philosophy into every post. For me it's a big deal. It's a big deal not that I have composed and published 100 entries about my job and my life, but that in doing so I have gotten involved in other aspects of the Creative COW community. As a result of blog entries, I have had the opportunity to write magazine articles, to be interviewed in podcasts, to make friends and business associates and even to obtain potential clients for my company's services. Something that is good for the soul and good for business is, well, a good thing!
And from what I hear, the blogs in general are good for the Creative COW's business. Google searches often lead people to the forums. If I Google myself or certain keywords I have used in my blogs, these blog entries come up in results. Presumably I can't be the only one searching for "CMX edit controllers" or more likely "AVCHD editing in Premiere." If new first-time visitors to the COW get in via the forums, the blogs, the services or the video reels - that too is a good thing.
And speaking of good things, have you seen the wide selection and amazing displays of creativity in the video reels section? You could spend hours there getting free inspiration for your own projects. I've actually started taking notes as I browse the reels. Go ahead, click "VIDEO REELS" in the main menu...I'll save your seat.
So back to the 100 blog retrospective. The best thing to do is to browse back issues going back to 2007. It is educational for me to see what I was thinking at the time. So rather than regurgitating my favorite posts, I think I will regurgitate my favorite images as used in past posts. I get a kick out of grabbing a quick picture with my phone when inspiration strikes. I send the picture to myself with a note and then, often on a long airplane ride, fill in the gaps to try to tell my story.
This first one takes me back to my first position as a professional editor. The Ampex ACE 25 edit controller. For those of you who have only used digital nonlinear editing, lucky you. Back in the day, you had to have some engineering know-how in order to perform basic editing, assuming you were in a facility without in-house engineering expertise. For more on the subject check out this link:
This image brings back some memories. A surgeon I work with on a regular basis needed to do a live PowerPoint presentation to a medical conference. He was in Vegas, the conference was in Portugal. Thus, he was scheduled to go on at about 4am Vegas time on a weekend. At that time of day, we couldn't get a local video conferencing suite, so we had to think way outside the box. WebEx is advertised and used as a great tool for corporate meetings, but using it in multiple locations including in front of a live audience can be a bit dicey. So we came up with a stopgap solution. This picture depicts our audio transmission system which included VOIP and two telephones.
Speaking of medical conventions, back in early 2009 we managed a conference on obesity surgery. Our company arranged the venue, the audiovisual and catering, invited the faculty, reserved hotel rooms and managed registration for about 500 attendees. Think of it as a mini-NAB for surgery. One of my roles was managing a day of live surgery. We streamed 9 surgical cases from NY, San Francisco, Miami, Michigan, Brazil, Chile and two other locations. Some signals came down ISDN, others via the internet. Everything went through a skybridge, and there was audio and video from our location going back to each location. To be even more clever, I created roll-ins for each surgeon and location, run off DV tape. This acted not only as a nice transition but also as a place holder in case of technical difficulties. It was a fun fast day with lots of audience participation.
My other jobs at this meeting were to document the proceedings for posterity (i.e., transcription, publishing articles about the proceedings and possible future on-demand webcasting)...
And drinking a lot of coffee and tea.
2009 was the year I finally traded up to a smartphone. I went with the BlackBerry because most of the clients and doctors I work with use this device. It has made a huge difference in productivity while traveling and even while in the office. For example, if I have a hot and heavy editing session planned, I may not even boot up the laptop (e-mail computer) and just check the berry periodically. This can save an hour or more per day. You'll note around April 2009 the quality of my blog pictures improved significantly. Still underexposed and grainy, but certainly bigger!
Sometimes (a lot) I add pictures and anecdotes about food, restaurants and eating or cooking to my blogs. What the heck does this have to do with the multimedia business? Everything. If I am fed I have energy to do my job, or I have rewarded myself for a busy productive day.
Sometimes I take my pictures to the next level and make them into useful illustrations. Here for example I was talking about preparing for a trip. Charged batteries, extra tape stock and tightened wingnuts on your equipment make a big difference.
As mentioned, several posts talk exclusively about travel. I don't go to the ends of the earth or to exotic locations (with the possible exception of Cleveland) but I have been known to go to the ends of the airport terminal for a Mocha Chip Latte!
I also used the blog to follow our entry into high definition production. What better venue for HD imaging than surgery? Of course you can get plenty of discussion about formats, editing workflow and playback issues in dozens of forums, so I'll just wow you with some imagery:
Sorry if that was gross, but this is my business!
Just thought I'd take this opportunity to mention 1" tape, for those of you keeping track at home.
All that travel also affords the opportunity to snap some quality pictures with a real camera, and sometimes I like to share those images as well - and if you're lucky, a story to go along with them.
This was a unique venue for a meeting - Jackson Hole, WY - in August.
In 2008 I attended a convention in Toronto. Since my hotel was about a mile from the convention location, I got to see some of the sights morning and night.
This week I took the train down to Philadelphia for a meeting, took the train home, then two days later went back to Philly with the gang for a meeting. Sometimes conventions are in cities with things to see and a wealth of good places to eat.
Vegas is a weird town. The Strip is full of amazing sights and some shady characters - sort of an odd mix of themes. NAB and the Bellagio fountains are two of the highlights.
Post-Katrina, New Orleans remains a popular destination for meetings and the occasional video shoot. Just stay on the main roads.
Think I'll hang this one in my office.
How many times do you find yourself in Moline, IL with a few hours to kill? Those tractors are huge.
Another good reason to carry around a proper camera. And with that, we'll let the sun set on the first 100 blogs of my blogging activities.
I appreciate all the feedback and the readers. If this is your first time on the COW, welcome. For my old friends, thanks for coming back. I look forward to coming up with new stories, anecdotes, learning experiences, recollections and images in the next 100.
As always, thanks for reading.
Posted by: Mike Cohen on Jan 27, 2010 at 3:58:10 pm
If I am waiting for a plane or waiting for the distance between Connecticut and California to shrink, I sometimes recall past experiences. This was supposed to be the goal of this blog, but the present seems a lot more interesting. So we'll take a break from the hustle and bustle and recall a few unique experiences. One can always learn from the past. I am somewhat methodical in my, er, methods, so I'll go sequentially...
My first solo shoot. Having accompanied several shooters on about a dozen trips, both in the OR and other settings, it was time for me to be the lead guy on a trip. We were doing a video about the immune system. Driving on the New Jersey Turnpike was a new experience for me, but it is basically a very busy road which passes through some really polluted landscape. Interestingly, ballast from ships returning from WWII was dumped in the swamps near the Meadowlands.
Anyway, I arrived at my location in Southern New Jersey. Setup One was a hospital room. A child was about to receive an infusion of intravenous immunoglobulins - basically a treatment for immune diseases such as Kawasaki's Disease or Guillain-Barré Syndrome. As it turns out, this was rather uneventful - took about 30 seconds. Next setup was a laboratory, where I threw some orange light on the background and shot various angles of a lab technician mixing the IV solution. Slightly more interesting. Finally on the ride home I stopped for a tour of my dad's office in Edison. Most interesting of all!
Cincinnati. Growing up, the only thing most kids knew about Cincinnati was WKRP and Loni Anderson. As it turns out there is more to this town than meets the eye. My particular shoot was at, get this, a hospital! Seriously, it was the first Ob-Gyn surgery I had filmed and it was a doozy. People often ask me if I get queasy. Usually the answer is a resounding No. On this occasion, the answer was Yes. Since kids and moms read the blogs, I will just say that there was a lot of blood, and you the reader can use your imagination.
Around this time we started going to Philadelphia a lot to work with two different world-renowned surgeons. One was another Ob-Gyn surgeon specializing in reconstructive surgery. Again, without going into too much detail, he is a specialist in fixing problems with incontinence and other problems in that region. But the best part about this surgeon was his personality. He had some funny stories to tell about his patients. Knowing there are some comedians here on the COW, see what you can do to set up the punch line "I need you to water my lawn!" During one operation, this surgeon said, "This ovary is really calcified. Mike, you gotta feel this." So I dutifully put on some gloves and was handed the excised organ in question. It was indeed the most calcified ovary I had felt (up to that point anyway.)
The week after my honeymoon, I was off to South Bend, Indiana. You get off the plane and there in the airport is a shiny new Hummer - the big Ah-nold version. South Bend is home to AM General and of course the University of Notre Dame. This particular project was about Cryosurgery. Not quite as cool as it sounds. There is an opportunity for a Mr. Freeze joke here, but I have already mentioned the Governator once, so I will let it go. Basically, for two days we were set up in a local doctor's office with a camera. Each patient that came in had some form of wart to be removed. Technically, not all growths on one's face or neck are warts. There are plantar warts, moles, skin tags, and other exciting appendages. Amazingly, a liquid nitrogen spray freezes the little devil and after a few treatments it will fall off. Hasta la vista, baby. We also did a video about using the cryo spray gun, venting the excess liquid nitrogen, and learned how to make a fun fruit punch for Halloween parties!
LA. Although I visited LA once during college on a family vacation, this was my first trip sans-Griswalds. We were shooting hernia surgery. The surgeon asked that we shoot with two cameras. May I remind you this was 1997. Video cameras and operating room tripods weighed in at about 90 pounds each at the time. So my colleague Mike and I took some fly-by-night airline out to LAX and checked into the Riot Hyatt on Sunset Blvd - the hotel where Led Zep was known to tear it up back in the 70's. We hit Hollywood Blvd, went to the Ripley's museum and stuck our hands in the various imprints on the Walk of Fame. Thinking back, this is a good way to pick up an unwanted souvenir. Should be called the Walk of E. Coli. Anyway, the next day we arrived at the hospital and set up our two hulking towers of stainless steel and BNC cables: an HL35(?) tube camera with dockable MII deck on one side, an HL55 2/3" CCD camera with portable BetaSP deck on the other. Lock and load! This video remains a top seller, if hernia surgery is your cup of tea. It is one of the most common surgeries performed in the world, so it is the cup of tea of a lot of people.
Detroit. Cue KISS music. I visited the Motor City a few times this year, just the luck of the draw I guess. For the first trip, I stayed at the Omni Hotel, a glass structure reminiscent of the Bonaventure Hotel in LA - a hotel I always wanted to visit since seeing the 1980 Michael J. Fox classic "Midnight Madness." This futureworld was complete with a Ford display, monorail and a direct link to GreekTown for all the lamb chops and souvlaki you can eat. The next morning I arrived at the hospital for a planned C-Section. The doctors had determined that the full-term baby had gastroschisis - a disorder in which the intestines have herniated through the umbilicus during development and stayed there. Having never seen a baby being delivered, this was very exciting. What was not exciting was seeing the baby rushed to another OR, while I had to break down my gear and move it all to the other OR before they did the surgery without me. No worries, I think one of the surgeons helped carry the tripod (the big 70-pound one). One of these days I will post a picture of the beast. For now here is a cartoon.
Once in the OR, the surgeons reduced the bowel back into the baby's abdomen and used some mesh to reinforce the skin until it grew large enough to accommodate the contents. In other words, the baby grew inside the womb with some of the bowel outside of the abdomen, so there was not enough room to put it all back where it belonged.
The New Millennium. High tech was upon us. We had recently started using the Media 100 XR for most of our projects, but we kept two online edit bays up and running. Since LVD SCSI drives were very expensive ($3500 for 9 gigs) the long form projects remained on 1" tape.
One particular long project was the creation of a 25 tape video library. A surgeon had previously recorded about 50 DVCPRO tapes worth of live cases. So it seemed we were starting out with good material. However, the switcher feeding the DVCPRO deck was not synced to anything, so every time there was a cut there was a loss of picture sync. Oddly the audio continued. So first we had to dupe off the tapes to a new reel, in order to be able to use the source tapes for online editing. Once that was sorted out, we sat with the surgeon for a few days editing all the cases down to length, then a few weeks later recording hours of narration for final editing.
While we were in fact in the 21st Century, the world had not yet caught on to this fact. We went to Baltimore to document a conference, basically consisting of 2 days of slide lectures. In 2001 Bill Gates had not yet convinced the whole world to start using PowerPoint. Thus, as each speaker presented his slides, we had a guy in the back of the auditorium scanning the carousel of the previous speaker.
However, because it took 60 seconds per slide scanned, it became a bottleneck. Well, the conference ended, we managed to return all slides to their rightful owners, and we journeyed back home to begin editing the slides into the video. Each lecture became a 400x300 Sorenson Quicktime file to be integrated into an Authorware CD-ROM. Seemed so high tech at the time!
End of BCE - Before Computerized Era
2002 - The Computerized Era - in other words, the time by which everything was digital, and the 1" machines died. It was a long time coming. With that I will sign off for now. The fact that I can recall such details from BCE is pretty incredible since I don't recall what I had for breakfast today. I will pick up this trail in a future post tentatively entitled "Tales of a Fourth Grade Editor."
Thanks for reading. Sorry about the graphic story....Ah yes, it was eggs!
Posted by: Mike Cohen on May 19, 2009 at 6:06:45 pm
The usual edit bay (we used to call them edit bays back when an online editing suite resembled the bridge of the Enterprise. Today an edit bay resembles a computer desk - results may vary - consult your pharmacist)…The usual edit bay may or may not have a window. Very often the only light is dimmable track lighting, perhaps a lava lamp and the soothing red glow of the mouse.
My particular office is just that - an office - in which we happen to do editing.
Half of my Office
Half of my Edit Bay
Lately my colleagues have been using my editing station for their HDV projects, since I seem to be caught up in non-editing work and the computer seems pretty stable. To clarify, much of my non-editing work is planning for future projects in which I may or may not do the editing, as well as all the other stuff that goes into a multimedia project besides the actual production work - this is called Project Management and is in fact my primary job function. Thus an office with a window, task lighting, an overhead fluorescent that is never used, a potted plant or two and a generous drawer full of snack products makes for a more productive work environment.
As also described in almost every post, one thing my job does include is not being in my office very much! While my days away, excluding excessive amounts of travel time, are scheduled pretty tightly, the travel time itself, hotel time and time spent in a fuel tank with wings suspended seemingly by magic 7 miles up afford the opportunity to actually be productive…maybe.
The ability to use a laptop for anything more than watching a movie depends upon several factors:
1. Leg room - this may sound trivial, but the ability to extend one’s legs fully makes the experience much more comfortable.
2. Seat reclinability - along the same lines as leg room, the more you can recline your seat, the better. Even an extra inch or two frees up your elbow joints so that your hands rest in the proper position on the keyboard. This is where keyboard shortcuts in your editing app are really important.
3. Tray table extendability - some seatback tray tables extend away from the seat on rails, some don’t. Although your elbows need to be crammed into your neighbor’s kidney, the extending tray allows you to extend the laptop screen to a viewable angle. Given the high reflectivity of my Dell’s screen, viewability is key.
4. Timing with cabin service. You must be skilled in handling a hot drink with one hand while protecting your electronics with the other. A good strategy is to boot up the computer as soon as the bell dings, then when you see the drink lady (or guy) coming, close the laptop and place it, get this, on your lap, then lower the tray table to protect the computer. When your drink arrives, grab the cup not with your hand in the usual cup-holding position. Rather, place your pinky and ring finger under the cup, your middle finger and thumb on the sides of the cup, and your index finger on the rim of the cup. This affords the most stability against spillage for a cup filled to an unknown level with any beverage.
Once you have the drink in hand, drink it as fast as possible then stash the empty cup in the seat pocket, and retrieve your computer from its protective zone.
5. Extra battery - if it is a long flight, or if it is a short flight and you are doing something that is battery intensive, such as editing or rendering, extra juice is important. My Dell Vostro came with a small battery. I added a 2nd larger battery and try to keep both topped off at all times. Make sure you set your battery alarm so you know when you have about 10% left, giving you time to shut down and change bricks.
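The battery-alarm habit in point 5 boils down to a simple decision rule. Here is a minimal sketch of that rule; the 10% cutoff mirrors the post, but the `battery_status` helper and its messages are my own invention, and the psutil call mentioned in the comment is just one optional way to feed it a real reading.

```python
# Toy decision rule for the in-flight battery alarm described above.
# Hypothetical helper: names and thresholds are illustrative only.

def battery_status(percent, low_threshold=10):
    """Return an advisory string for a given charge percentage."""
    if percent <= low_threshold:
        return "save and swap"   # ~10% left: shut down and change bricks
    if percent <= 2 * low_threshold:
        return "wrap up soon"    # getting close: finish the current render
    return "keep editing"

# On a real laptop you could feed this from the OS, for example
# psutil.sensors_battery().percent, but that dependency is optional here.
```

The point is just to decide the threshold before you board, so the alarm does the remembering while you are heads-down in the timeline.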
Once you are situated, hydrated and ready to work, aside from the T-Rex arm syndrome, you can go about your business. While I usually choose an aisle seat for easy access to the snack counter (what, you don’t fly Emirates Air?) a window seat lets you occasionally have a look out at the heavens above and the clouds below. You can indeed be both productive and relaxed in your edit suite with a view.
Thanks for reading. It is time to turn off my portable electronic device. Good thing I fully charged my
Posted by: Mike Cohen on Apr 20, 2009 at 7:10:47 pm
While this is the title of a popular show on BBC, I borrow the name for this post because I have come across some treasures of my own in my soon to be former office/loft/former attic.
It is time to relocate my office temporarily. In cleaning up my mess, I have come across boxes of old papers - mostly junk destined for the bin. Here is the first gem, directions for setting up your edit suite. Honestly I neither remember writing this nor doing most of it. How quickly we get used to newer technology! Enjoy:
The Finer Points of ACE 25 Editing
I. Before you can edit, you must align the source decks through the switcher.
A. Place tapes in each source deck, making sure the decks are in "remote", are attached and assigned to the ACE 25, and have the proper crosspoints on the switcher.
B. From the ACE 25, hit "ALL" and "PLAY" to roll all your decks.
C. Now, choose a deck, usually MIIA, and dissolve between that and color bars on the Program bus.
D. Does it dissolve smoothly without a shift?
- If it dissolves smoothly, move on to the next source, and try a dissolve.
- If it does not dissolve smoothly, follow these directions:
1. Place Color Bars on Program.
2. Look at your switcher in the Waveform Monitor.
3. Make sure the Waveform is on "External Reference."
4. On the waveform, press "magnify" and, use Vertical, Horizontal and Variable Gain to place the leading edge of sync on a large vertical line on the 0 IRE horizontal line, with the bottom of the signal resting on the -40 IRE horizontal line.
5. Now that you know where the sync (reference) is supposed to be (because no one should be messing with the sync settings of the color bar generator), cut to your first video source on the switcher.
6. Using a small, flat-head screwdriver or "greenie", turn the H-PHASE pot on the source VTR's TBC so that the sync is As Close As Possible to where the color bars are (on the waveform).
7. Now that the sync is the same going into your first source as it is for the switcher, try a dissolve between your source and the color bars. Incidentally, the color bars and house reference are generated from the same unit, so color bars represent correct house reference. This is why reference is called "reference." Get it?
8. It should be OK. You may have to adjust the V-PHASE pot on the source TBC so the position of the picture lines up, such as when editing animation or artwork.
9. Finally, your SUB-CARRIER, or color phase needs to be checked. The best way to do this is to perform a dissolve from the source to the Color Bars on the switcher, stopping the fader bar before it completes the transition. Look at the Vectorscope. Are all six vector color points in the correct boxes? If they are, you're good to go. If not, make sure your setup, video, chroma and hue pots are all in detent position. Now, using your small screwdriver, turn the SUB-CARRIER (SC) phase pot, the Coarse control, to get the vectors in the correct range, then the fine to rotate to exactly the correct position.
9a. Everything up to here has been for synchronization of your source - that is, confirming that the electronic signals are compatible for A/B-Roll editing. You also must check your color bars on each tape to ensure that the TBC Hue, Saturation, Luminance and Setup controls are in the correct ranges on the Waveform and Vectorscope. Without setting up your source tape's bars, you may not get correct video color and luminance values in your edited tape. This is why you record bars in the field. What? You didn't record bars in the field? Naughty! Thus, your scopes serve multiple purposes, both from an engineering point of view, as well as a visual check against your less accurate eyeballs!
10. Now, check your other sources, using the First correctly timed source as a guide, not the color bars.
11. Once you've gone through this process for each source, you should be ready to begin editing.
II. But first, you must setup your record VTR. If it is a 1" Machine:
A. First black a new tape (for insert editing, or for your first edit in assemble):
1. Find a used or new reel of 1" type C tape.
2. Load your tape onto a VPR-80 1" machine, properly threading the tape according to the diagram inside the front cover. Don't forget to turn on the TBC and make sure the input is correct.
3. Go into "Setup" and set the time code, Press 20, enter, 58:00:00, enter, backspace.
4. Set Full Frame time code, while in Setup, Press 24, enter, 1, enter, backspace.
5. Press "Setup" to exit the menu and press "Ready" to start the machine a runnin'.
6. Make sure you have video going into your 1" Machine. Patch your switcher OUT to 1" IN on the video patch bay. Patch your audio as well (either from the audio mixer OUT to 1" IN, or directly from the source deck to the 1" as the case may be).
7. Now, you must check the RF level going into the 1" machine. With the machine in READY mode, press STOP. Now press SETUP, 1, ENTER and PLAY and RECORD. If the tape starts recording, press STOP and start over. The tape should not roll. When done properly, the red RECORD button should be illuminated, and you should hear a fast clicking noise coming from the scanner.
7a. Now, with a small screwdriver, turn the REC RF LEVEL pot so the needle in the Video/RF meter is in the center of the green area.
8. Press STOP.
9. Punch up BLACK on your switcher.
10. Press PLAY and RECORD on the 1" machine, and record until you have adequate length for your program.
11. Be sure to record an extra minute or two past the intended length.
III. Non-videotape sources.
A. The main non-videotape source is the Targa 2000 Photoshop computer, which we use as a still store. This is an RGB signal that goes through the Integrated Graphics Module (IGM) to be encoded into composite video, which then feeds the Vista switcher and the patch bay.
B. Go to the IGM and open the front panel. You will see a SC Phase pot, with two lines, one marked GVG the other Vista. With your small screwdriver, or your fingernail if need be, turn the pot to whichever switcher you are using. The SC Phase should be the only necessary adjustment for this source. If you do a transition, and there is a shift from the previous source, you may carefully adjust the Video Position. Remember, Video Position is a visual adjustment.
Time to Edit
I'm not going to teach you how to edit, you should already know this. If you have followed the above steps carefully, you should be able to produce a beautiful program without any color or H Phase shifts. Good luck!
The above should take about 30 minutes on a good day, or all day on a bad day. We had some gremlins which seemed to live in the equipment racks, so some days were pretty dicey.
I had forgotten all about the RF level check on the 1". I have never forgotten, however, the task of disassembling the VPR 80 scanner head assembly, replacing a $600 video record head, or, who could ever forget, manually cleaning carbon dust out of the scanner motor brushes to delay purchasing new ones.
For readers who have only ever used NLE systems to edit, be thankful. For those who started with on-line editing and are now using an NLE of choice, I hope you all survived the online days with your sanity. There was nothing worse than trying to tell a client that these delays in the edit session are normal. Pay no attention to the exposed parts and the extra screws on the floor! Those were the days, Edith.
Now, where did I put my greenie? I have a SC pot to adjust!
Thanks for reading.
Posted by: Mike Cohen on Jan 17, 2009 at 1:16:07 pm
If you grew up in the late 70's or early 80's like I did, you could not go five minutes without seeing one of the now famous spelling commercials on tv. First came the Oscar Mayer jingle:
Oscar Mayer has a way with B-O-L-O-G-N-A.
Next came Tommy Lasorda spelling relief R-O-L-A-I-D-S.
No wonder I can spell so well!
The subject of this post, thus, is about the letters (and symbols) that mean a lot to me:
Relief from endless mouse clicks and eye movement in Premiere is spelled J-K-L-;
Let me explain. Back in the day I edited video on our old ACE 25 edit controller. For the benefit of those younger than, say, 25: we used to cut video machine to machine, using DOS-based editing controllers to not only sync up the decks, but also control via GPI signals the switcher, CG, DVE or ADO and in some cases the audio mixer. The ACE-25 actually had a built-in audio mixer. Thus during a preview, you could set the timing of the switcher effect and set up your audio levels. Then when you hit "execute", actually hitting the preview and record buttons together, the machines would pre-roll, roll and record your edit automatically. Pretty sophisticated work for an 8088 processor with half a megabyte of RAM and a lot of dust bunnies inside the CPU. Here's a picture of the ACE 25 and the VPR-80 1" to get you in the mood:
The whole reason for this trip back to Hill Valley, circa 1955, was to talk about motion memory. In other words, do something enough times and you can do it with your eyes closed. Operate the ACE-25 long enough, and you know where the keys are without looking, even the keys that change function depending upon what the CRT display shows. Touch typing is the same, although as I get older and my typing gets faster, I Make the sammee errorsa with greater frequencyu./
Enter Nonlinear editing. The AVID has always had the custom keyboard with labeled functions. For Premiere you can buy such a keyboard, however some of the built-in factory keyboard commands are logical, while others cause two serious problems. Anytime you need to use the mouse, for functions that are used with any frequency, you are putting yourself at risk for RSI. Also, anytime you need to move your eyes from the screen to the keyboard, just to change finger positions, you lose focus. Do this 500 times a day and you could lose hours of productivity.
Now to the subject of this post. J-K-L are the default Premiere keys for Play in reverse (J) - Stop (K) - and Play forward (L). A lot of the editing I do is cutting a full reel down using the razor tool, then ripple deleting the whole sequence to get my first edit. I have thus changed the default Razor at Cursor command from CTRL + / to ; - thus I can easily park my ring finger of my left hand on J, made easier to find by that little pimple on the key, my middle finger on K and my index finger on L, and without moving my eyes or my mouse, move my index finger to ; to razor at the cursor. I still need to use the mouse to select and delete a clip, but there must be a way to do that with keys also. I have thought about changing SAVE to the H key, so I can save my work without changing positions of hands or eyes. And to boot, you can hold down K and either hit J or L one press at a time to go forward or back one frame at a time, meaning you don't need the arrow keys, and you can hold down K and press and hold J or L to move forward or back slowly. This is like switching a machine gun from burst to full automatic. (As if I have so much as seen or touched a machine gun; I read all of Andy McNab's books and pretend to know what I am talking about!).
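The J-K-L behavior described above can be sketched as a toy state machine. This is purely illustrative: the key behavior follows my description of Premiere's defaults, while the `Transport` class and its methods are invented names for this sketch, not any real Premiere or Avid API.

```python
# Toy model of J-K-L transport control: L plays forward, J plays in
# reverse, K stops, and K held down plus a tap of J or L nudges the
# playhead one frame at a time. Illustrative sketch only.

class Transport:
    """Tracks shuttle speed in frames per tick and the playhead position."""

    def __init__(self):
        self.speed = 0          # current shuttle speed (frames per tick)
        self.frame = 0          # current playhead position (frames)
        self.k_held = False     # True while K is held down

    def press(self, key):
        if key == "K":
            self.k_held = True
            self.speed = 0                  # K alone stops playback
        elif key == "L":
            if self.k_held:
                self.frame += 1             # K+L: nudge one frame forward
            else:
                self.speed += 1             # L: play forward (tap again for faster)
        elif key == "J":
            if self.k_held:
                self.frame -= 1             # K+J: nudge one frame back
            else:
                self.speed -= 1             # J: play in reverse

    def release(self, key):
        if key == "K":
            self.k_held = False

    def tick(self):
        self.frame += self.speed            # advance playhead by current speed
```

A remapped semicolon would then just be one more `press` branch that cuts the clip at `self.frame`, which is why keeping all four keys under one hand pays off.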
Here, in the very latest HD video quality known to us insiders as Samsung Cell Phone Video, here's what I call "dance of the digits":
So in summary, learn your keyboard shortcuts, but change them to suit your workflow and habits. I have weaned myself away from some of the Media 100 shortcuts, which I depended upon for the first year or so of Premiere usage. However CS4 has changed selected functions, such as Target higher or lower track, so it's time to retrain my brain yet again. I am going to have to start doing crossword puzzles in order to keep my brain firing on all cylinders. Maybe I can run an RS-422 cable from my cerebral cortex to my dusty old ACE-25 and call upon the unprecedented power of MS-DOS 3.0!!
Thanks for reading.
Posted by: Mike Cohen on Jan 5, 2009 at 5:41:57 pm
I'd like to say I have been so busy catching up on work that I have not had a chance to write new posts. This is partially true.
Partially, I took a week to visit my folks in insanely sunny Florida, helped my wife through some medical troubles (no, for the last time, I did not film it) and have in fact been pretty busy at work.
Oh, and I discovered I know a lot of people on Facebook.
But back to the important topic of workflow.
As described in excruciating detail in previous posts, I make the most of to-do lists, post-it notes, scraps of paper, e-mails to myself, Excel spreadsheets and various other attempts at self-organization.
I recently completed a project which was an excellent exercise in organization. I will describe it in generic terms, but give some specific examples of learning points.
The Documentary/Promo/Movie Trailer to Promote a particular career
Ok, I guess that wasn't too generic. It is an interesting project.
We pitched a casual documentary style approach, using inspirational interview clips and relevant b-roll, good music, and little to no narration.
Once we had cleared the various PR hurdles, we got three great days of shooting at several medical schools and hospitals, including a c-section. I developed a list of questions, and while conducting each 30 minute interview came up with follow-up questions designed to get people to talk about what they do best (which is not talking about what they do best; what they do best is doing what they do best). It is my job as producer to draw out performances, even and especially in unscripted candid interviews. We also tried some possibly hokey segments, some of which will never see the light of day!
The next step was to digitize (capture) all of the raw footage: four 66-minute DVCPRO tapes and about 15 mini DVCAM tapes. We shot primarily with two V1U cameras in DVCAM mode (incidentally, the two cameras did not match as I'd hoped they would) and shot a few interviews with the DVCPRO, although we could have left it at home and saved gas.
After 3 days of digitizing, while doing other work of course, the next phase begins - logging. Rather than logging the tapes before capturing (digitizing) I capture and then log.
I take each interview subject and isolate the unedited footage on its own sequence. In some cases we shot an interview with two V1U cameras, with the lenses practically touching, one wide and one tighter, to facilitate editorial or time-based edits without jump cuts or dissolves. This is a good way to simply edit out long pauses, ums, ahs, coughs, or retakes. But one can also compress a long thought into a short one. Since the V1U has no timecode output, we try to have each interview subject clap their hands, which is an easy way to sync things up. We generally had a lav going into only one camera, so you find the good audio, fill left or fill right, and turn off the track from the other camera.
With each person on his or her own sequence (timeline) I next chop up the timeline into topics. In other words, I edit out the sound of me asking a question, so I am left with the person answering the question, with black spaces. I like to do things methodically, so I do this for every sequence, before actually viewing the material in real time.
Before proceeding to the next step, just for some psychological reason, I like to know how much material I actually have to now go through. So I ripple delete the spaces on all the timelines, so I can write a time next to each person's name. My yellow lined paper now looks like this:
Harrison Fjord - 22:00
Barbara Edyen - 7:00
Bruce Willjyis - 4:15 (boring)
Peter Jaquson - 11:15
you get the idea - I can now tell myself, "Self, you have 1hr 33 minutes of interviews to watch."
Not so bad.
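The tally itself is simple arithmetic, and easy to script. Here is a minimal Python sketch (using the joke names and times from the list above) that totals a log of mm:ss durations:

```python
def to_seconds(mmss):
    """Convert a 'mm:ss' duration string to seconds."""
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

# durations jotted on the yellow lined paper
log = {
    "Harrison Fjord": "22:00",
    "Barbara Edyen": "7:00",
    "Bruce Willjyis": "4:15",
    "Peter Jaquson": "11:15",
}

total = sum(to_seconds(t) for t in log.values())
print(f"{total // 60}:{total % 60:02d} of interviews to watch")
```

These four alone come to 44:30; the rest of the speakers on the full list are what got the total up to 1 hour 33 minutes.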
With everything chopped up, I now go back to actually listen to the material and make notes. On yellow lined paper, I write the person's name and then a few words for each unedited chunk of interview:
1 - why he got into his career - money of course!
2 - met his wife on a project
3 - When he knew this was the career for him - he could produce mediocre work and still get rich
4 - etc
5 - so forth
6 - so on
After this step, I now have a few sheets of lined paper. Now to select my, er, selects. I do this on paper, placing a check mark next to the clips I want to use. I go through each timeline, and just eyeballing the clip number, move the selects to a higher track. With this done for every speaker, I then copy and paste the selects to a new timeline, and watch it all in order. I save this timeline as edit 1.
Now I save as edit 2, and start weeding out the, er, weeds? My goal is as short as possible to get the message across. The goal was stated as between 3 and 20 minutes, whatever works. I wrote on my yellow paper:
Edit 1 - 23:00
Edit 2 - 17:00
Edit 3 - 14:00
As the amount of material is reduced it becomes increasingly difficult to make cuts. I got to about edit 5 and maybe 7 minutes of really good gems.
The page down, spacebar to play, and delete keys make things a bit easier in this process, although you need to use the mouse anyway.
Now, for about three weeks (3 months), I had been brainstorming ideas on how to actually cut this together. The brainstorming started before we actually shot anything, without knowing whether we would get the kind of material I was envisioning. The plan came together - somebody call Howling Mad Murdock.
With my shortest-humanly-possible-without-losing-some-nice-moments version in front of me, I came up with an editing format I was happy with, and realized I need to add some more time in the form of b-roll, SOT and some more interview segments from a second set of questions. Time rose back to about 15 minutes. Some efficient use of the three camera setup for one SOT sequence and some thoughtful cutting of the interview segments got me down to about 11 minutes. I next added a title sequence and conclusion and hit 12 minutes and change.
I watched this edit (7) another time or two and tweaked some edits on each pass.
Now to add the music and hopefully make it more engaging.
I recently added a bunch of new CDs to my Firstcom contract, so I grabbed a few of the new titles, and found some contemporary sounding music. I decided there should be music under the whole program, including the interviews and SOT segments. Since there is no 12 minute track in the Firstcom library (I know, Sonicfire Pro could do it) I decided to use different tracks based upon the mood of the music and the subject matter being discussed (someone call Steven Shmeeldurg, maybe he can use that technique!).
Some of the Firstcom discs include just the audio CD, so you need to rip the music. Others include a DVD-ROM of AIF files, including both the full mix and the separate instrument tracks. The separate tracks make things more fun and give you more control over the mood. This also helps transition from one piece of music to another - you can bring in the drums or piano before the previous song fades out - hopefully this makes it less jarring. But kids these days are used to quick changes, right?
With the music added, I spent a few more hours perfecting the mix, and then it was time to render out to FLV for web viewing. Oh wait, have to color correct the multiple cameras, right. Premiere has numerous color correction tools, and it took a little while to find the right combination of 3-way Color Correction, Proc Amp, HSL, Levels and Equalize (not all of those, and not the same combination on all clips). Not bad for a first pass; we can tweak it on the final edit. Remember, this is just the first edit.
I posted it online before heading home for the evening (incidentally, all of the above took about 4 days of focus.)
Once at home, I watched the full video over my DSL - always a good idea to check out your work via a home computer setup. Although we have a cable modem attached to our network at the office, the home DSL experience is a good test.
It looked pretty good, so I e-mailed the client a link.
The next day most of the feedback was very good, a few comments about music choices and some of the interview clips, but these things are very easy to fix. I was also asked for a script. I quickly made a two-page Word doc listing the times and a brief summary of each sound bite, just so people could refer to this while reviewing. Once we lock things down for the final, a full transcript will be needed for approval. It is a good idea to have a transcriptionist in your rolodex (what's a rolodex?) for these purposes.
I should add that during the final day of editing, I was getting the inexplicable "Sorry, a serious error has occurred, Premiere needs to close." error, usually when doing anything in Premiere involving doing anything with any function. Not good when you are almost done with a project. On a few occasions I lost about 10 minutes of work. It seemed the faster I worked, the less frequent were my manual saves, and Premiere's auto saves were set to 20 minutes.
After a few frustrating incidents, I set auto save to 1 minute intervals - a little annoying, but even with frequent crashes I did not lose too much work. I dealt with this hassle so I could finish the project.
Once the video was online for client viewing, we determined a few things about my computer. First, someone had installed AOL instant messenger without permission - whether or not this was the culprit, it wasn't helping. Next we tested the RAM and that checked out. So next was a reinstall of all Adobe products. This seems to have fixed the problem, although I have still had a few Serious Error crashes, but nothing like before. I'm sure we will figure out the problem eventually.
Thanks for reading.
Posted by: Mike Cohen on Aug 5, 2008 at 6:03:08 pm
So, day two of the shooting extravaganza went great. We started in the OR with about a dozen people and 4 cameras, then broke into two crews for the rest of the day.
I hired two actors to role play about 15 different training scenarios. While we had about 25 pages of scripts, we improvised some new scenes and modified or deleted existing scenes. Overall it was loads of fun, single-camera film style setups. I even did a little acting.
As the day progressed from 7:30am call time to 4:30 wrap, people were getting a bit punchy and goofy, and there are some great outtakes and giggles, which keeps everyone engaged and working together.
I say this every time I participate in this type of shoot - it is some of the most fun one can have as a job.
Upon arrival back at the hotel, I went for a little walk around the Denver State Capitol grounds. I had toured the Capitol during a previous visit, however I neglected to take any photos at the time.
So I snapped a few shots of the Capitol and surrounding area, then walked around a community festive gathering before meeting my colleagues for a great P.F. Chang's dinner.
Most of the pictures I use in my blogs are cell phone pictures. It is often much easier to snap a photo with the phone, which is always on my person, than to carry my pocket digicam everywhere I go. And the pictures are not too bad at web sizes. And, since I bought this amazingly tiny 1 gigabyte memory card, I take a lot more useful pictures without worrying about running out of internal phone memory.
Not to mention the fact that I can e-mail the pictures directly to my Flickr page, so I do not have to carry around a card reader when I travel - if, that is, I want to blog while I travel. Even more Star Trek is the ability to e-mail camera phone video clips directly to my YouTube account. I am looking forward to the day when I can e-mail video from my HD camera directly to my office computer. One day perhaps.
Incidentally, check out PicLens, a plugin for Firefox. You have to see it to get the concept - a way cool way to browse internet photos and videos.
And so my friends, this completes the summary of this adventure. Tomorrow it is back to CT, with more adventures to come.
Thanks, as always, for reading.
Posted by: Mike Cohen on Jun 8, 2008 at 9:17:53 pm
Recently I was asked to make a video loop to play on the hotel television system during this week's convention. No problem, I had already begun receiving videos. The format requirements were simple: DV tape, DVCAM tape or authored DVD.
In reality, I received videos in the following formats: DV tape, Authored DVD, Windows Media, MPEG-1, MPEG-4, H.264, DIVX - all the usual suspects.
In most cases this is not a problem; Premiere Pro 2.0 will import just about every format. A few files had to be converted to another format due to the wrong audio frequency (32 vs 48 kHz). Two videos came in without their audio, so I used Squeeze to convert the original file to an MP3, imported the MP3 and lined it up on the timeline.
Next problem: the videos in a non-720x480 format, about half of them, cannot be stretched to full screen without losing image quality, which in effect would make the authors look bad. So I decided to make the project 16:9. I used a Jumpback from Digital Juice as the background, put the name of the medical society on the left and right sides of the screen ESPN style, and then, depending upon the size of each video, centered the image at the maximum size possible for each file. Not too bad - it makes it look like it is supposed to be shrunken. Compared to the 720x480 and even the 640x480 videos, the smaller ones don't look so small, because everything is part of a larger display.
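The "maximum size possible" placement is just a fit-and-center calculation. Here's a hedged sketch in Python (my own helper for illustration, not anything built into Premiere) that finds the largest centered size that preserves aspect ratio and never scales a clip past its native resolution:

```python
def fit_centered(src_w, src_h, frame_w, frame_h):
    """Largest size that fits in the frame, preserving aspect ratio and
    never upscaling, plus the top-left position that centers it."""
    scale = min(frame_w / src_w, frame_h / src_h, 1.0)  # 1.0 cap: no blow-ups
    w, h = round(src_w * scale), round(src_h * scale)
    return w, h, (frame_w - w) // 2, (frame_h - h) // 2

# a 320x240 web clip centered in a 720x480 SD frame: shown at native size
print(fit_centered(320, 240, 720, 480))  # -> (320, 240, 200, 120)
```

The same function letterboxes or pillarboxes larger sources automatically, since the `min()` picks whichever dimension limits the scale.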
After every two author videos is a brief promo clip from the sponsor. Their production group edited the promo in HD, so I asked them for an anamorphic 16:9 DVCAM tape, which imported into the Premiere 16:9 SD project. With the clip conformed to 16:9 it filled the frame perfectly.
On the hotel system, the DVD player feeding the cable system correctly letterboxes the DVD.
On the Plasma screens scattered around the hotel, the standalone DVD players correctly play the DVD anamorphic.
So the learning point here is: given a mixed grab bag of video formats, one can make it look appropriate, make each author look as good as possible, and serve more than one display scenario with one project.
Now if I could just find my room key!
Thanks for reading.
Posted by: Mike Cohen on Apr 13, 2008 at 7:55:44 am
No, that's not the name of a new company - but it sounds good actually.
I actually was thinking of the term "go fast boats" as used in the Miami Vice movie (it has been on HBO in a loop). Basically, fast racing boats used for smuggling.
This week was a go fast production week.
Monday - Pack my gear, print Google maps of two hospitals and a client's offices in Massachusetts. Fuel up the Wagon Queen Family Truckster (Saturn ION) for a mere $30, and hit the road. I also hit the library to stock up on books for my wife and hit the supermarket to get her some provisions.
Lately my best friend has been a thermos bottle. I brew some coffee using a French press, add a few spoonfuls of hot cocoa powder and a little milk to the thermos bottle, then fill it up with the brew. This stays hot and comforting all day long. I pull over at every rest stop, or about every half hour, and have a small cup using the screw-on lid from the bottle. By doing so, I guarantee that I need to stop at every rest stop for obvious reasons.
First stop Brockton Hospital to visit my dear Grandpa Izzy. After an hour or so of visiting, I hit the road for Burlington, MA. Checked into my hotel, a Candlewood Suites. I specifically chose this hotel because it offers a microwave, fridge, stove and even a dishwasher. Although only staying for two days, it is much more enjoyable to me to have breakfast in my room. The hotel has a little food pantry with non-hotel prices for cold cereal, milk, muffins, cookies, cans of soup, juices and the like.
Tuesday - Meet client at a local hospital at 7:30am, get changed into scrubs, get to the OR, setup my gear, plug my DV recorder into the video laparoscope, test the recording, then go to the cafeteria for some toast and mediocre coffee, then back to the OR for the case. Lately I have been shooting surgery with 2 cameras - one overhead, one on sticks.
After the case, I packed up my gear and went back to my hotel to check e-mail, make some phone calls and grab a sandwich. Then I headed back to Brockton to see Izzy for a few more hours and help move him to a nursing home for a (hopefully) temporary stay.
Next day was up to the client's offices for some tabletop product shots, lunch, and some more shooting and brainstorming.
Wed evening I drove back to CT, with a few stops for bad gas station coffee (I may need to start traveling with my French Press and a 12 volt water kettle for the car) and a stop at Trader Joe's for some raspberry jam and gluten free pasta. Got home, not really hungry, I watched this week's episode of New Amsterdam and part 3 of the fantastic John Adams miniseries. Check it out.
Thursday AM - Fire up the trusty laptop, plug in a USB hard drive with 300 gigs free, and capture all my raw footage from this week. While the tapes were loading, I did some more e-mails and did the dishes. Got to the office around 12:45pm and spent the rest of the day on correspondence for other projects, and started chopping up my video from this week. Oh, I also had a conference call at 7:30am!
Thursday evening at home, with the rough narration and script in hand, I cut the first edit of the promo, finishing around 11pm. I rendered an AVI out of Premiere, then used Squeeze to make a WMV file (scaled down slightly from native size - this project is 16:9 SD), uploaded that to our web server for the client to download, and shut off the computer around 12:45am.
Friday AM - got to office around 10:30am - more correspondence and followup on other projects, reviewed some DVDs from a colleague, checked the progress on a 2500 DVD in-house duplication project (slow going) and then started preparing some digital stills and graphics for the next edit of the promo. Got home at 5:30pm, watched 2 episodes of Gene Simmons Family Jewels then fired up the computer for hopefully the final edit of the promo. Final narration from the narrator arrived, new music requested, and some new graphics. Finished at midnight, plus the WMV render got me to bed around 1am.
Now it is Saturday at 9:50am, and I write this blog post while awaiting final edits, so I can make a DVD loop and get to FedEx by 4pm. It is about a 20 minute drive based upon prevailing traffic conditions and weather, so I need to burn the DVD no later than 2:30pm. Presumably I could take the laptop to go and finish burning as I drive, but that's pushing it.
Tonight, as mentioned in my previous post, is the 15 year reunion for my college tv station. Then the rest of this week I get to not drive anywhere besides the office. Joy!
Thanks for reading.
Posted by: Mike Cohen on Mar 29, 2008 at 7:10:14 am
While I have never traveled as much as I did in 2000, my job continues to send me on a few different kinds of trips.
Recently I received a new laptop, with the hope that time spent locked in a metal tube 5 miles up could be a bit more productive than reading the latest Harry Potter book. Oh yeah, we are out of new Harry Potter books, which is a good thing because those things are heavy.
We selected the Dell Vostro. For the price it is a good value. Core 2 Duo, 2gigs ram, 160gig 7200rpm hard drive and thankfully, Windows XP Pro. Loaded up with Premiere 2.0, Photoshop CS3, Encore 1.5 and other useful software, this thing has paid for itself already.
Here are a few other useful programs I have installed.
Audacity - this is an open source sound editor. Very useful for recording temporary narration (scratch tracks).
Bulk Rename Utility - just google that to find it. A handy little app which does just what it says. I primarily use this when dealing with PowerPoint files. Inevitably we are sent long Powerpoint presentations to integrate into a video. Time permitting I redo the slides in Photoshop or Premiere, however sometimes with some tweaking the slides can be used straight out of Powerpoint. Powerpoint exports slides as slide1.bmp, slide2.bmp etc. So open the handy program, set it to change "slide" to "projectname" and in one keystroke it is all set. Then you can import the files into your project.
Another useful application is for digital camera stills, which always seem to be named DSC10034.jpg. Same thing, change "DSC100" to "projectname_" and you suddenly are much more organized.
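For the curious, the core of such a rename utility is only a few lines. A minimal Python sketch of the same idea (the folder path and prefixes below are hypothetical examples, not part of the actual utility):

```python
import os

def bulk_rename(folder, old_prefix, new_prefix):
    """Rename e.g. slide1.bmp -> projectname_1.bmp for every matching file."""
    for name in os.listdir(folder):
        if name.startswith(old_prefix):
            os.rename(os.path.join(folder, name),
                      os.path.join(folder, new_prefix + name[len(old_prefix):]))

# hypothetical usage: bulk_rename("C:/exports", "slide", "projectname_")
```

Same trick works for the DSC-numbered camera stills: swap the prefix and the whole folder is organized in one call.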
Video Inspector - this little program will open just about any video file and attempt to tell you what it is. For example, the extension .MPG can be any number of formats. If you just open a MPG file in Windows Media Player, it may play but you get no useful information about it. Video Inspector (there surely are other useful similar programs, probably 100's) tells you the dimensions, the bit rate, the audio format and the codec, if known.
Filezilla - If I am on the road, or just in my living room, I need access to various servers. While Firefox has a FTP plugin, I like using FileZilla. Self explanatory. It tends to time out on the display, even while a transfer continues.
Pidgin - If you must use IM, this is a much less obtrusive app than AIM, which tries to install un-needed stuff and makes noises from embedded ads.
The best feature of this computer is its long battery life. Two batteries gets me cross country, or very nearly.
Most important software is of course Premiere Pro. My first trip with this computer, back in early December '07, was in the midst of editing a job for a client, who was anxious to see the results of the shoot we did a few days earlier. On the flight down to Tampa I managed to get most of the first edit cut, with some further tweaking at the hotel, and then each day for the next few days. I render a medium res WMV file out of Premiere and post it to our password protected client website for easy download.
Given the generous internal hard drive, I was able to use Premiere's Project Manager to create a manageable version of a long-form project and shove it on the laptop also. On my Tampa to Phoenix flight (5.5 hrs) I worked on this video. Granted, my arms are normal length and American Airlines' seats are designed for tiny people, so doing anything besides simple cuts and static titles gets a bit carpal tunnelly. However the 10 minutes of edited content allowed me to make up some lost time on this project.
Finally, while it is sweet to be able to cut video on a plane, or in my hotel room, the laptop also allows me to take work home when necessary. Prior to the new machine, I would take a portable hard drive home and work on my home computer. However, when you move a Premiere Project from one machine to another, it has a hard time finding files.
So here I am, sitting on my sofa, finishing up a project. Much better than sitting in the office on a Saturday evening!
Yes, it's Saturday evening. We're having lamb chops. Stop by if you're in the neighborhood.
Thanks for reading.
Posted by: Mike Cohen on Feb 9, 2008 at 4:25:57 pm
Remember your Mentors. I had a few great ones during college, and I believe everyone starting out should have one.
Prior to going to college, that is high school, I was not really interested in learning, or reading, aside from the required readings.
However a few key instructors in college, even before actual production classes, activated some dormant gene in my brain. Thus a love of reading and learning was born.
Prof. Markham - I could be mistaken about his name, it was 17 years ago, but my first semester Freshman year history professor was hated by most of his students. Why? Because, apparently, he did not follow the text and suggested such outlandish ideas as going to the library and reading about topics he mentioned in class. I did what the man said, and not only did I ace his class, I figured out how to learn, which is something my high school teachers never covered.
Prof. Schofield - Sophomore year we were required to take a lit 101 class. I was assigned her class, and what a happy coincidence. Something about the way she helped us get to the heart of a novel's themes cannot really be explained. Again, the semester's experience taught me how to read, and how to write finely crafted literature analysis. I'm not saying I could do it today, but at the time it was magical. I even won a writing award for one of my papers. So enthused was the class, that most of us took her class the next semester, concentrating on war literature, and women's war experiences. I tracked Prof. Schofield down using Google a few years ago just to express my gratitude.
Once getting into production classes, I encountered Jim Keener, the long time television instructor at U of Hartford. Jim's method of teaching was...read the book, take the tests, and then ask questions. Sounds simple enough, but it works. Over the next 3 years I developed a great rapport with Jim, helping to teach a Summer class, working as a paid admin assistant in the TV studio, and using his laser printer to print cover letters before graduation. What I learned most from Jim was not so much production skills, but rather the aesthetics of media and the critical thinking approach to problem solving required for having a job after graduation. I was lucky enough to have Jim and his wife Martha attend my wedding a few years later.
The other staff member in the tv studio was Mike Martin, the tv studio technician. From Mike I learned the technical skills not covered in any of the classes, such as using a waveform and vectorscope, basic video and audio cable making, VTR maintenance and troubleshooting of all kinds. Mike took me to the 1993 AVID roadshow, the first time I saw a nonlinear editor, and also to a demo of SGI computers, which I would use soon after in my job at Cine-Med.
All of these experiences were instrumental in my acclimation to my new job upon graduation. Timing decks, and even opening up the back panel of an Ampex 1" machine to replace capacitors, as well as dismantling VTR head motor brush assemblies and changing video heads, were less daunting thanks to my college mentors.
Thanks for everything.
Posted by: Mike Cohen on Aug 29, 2007 at 7:00:22 pm
Having run out of present day things to talk about, allow me to get back into the groove of recalling past experiences.
We got a project doing a promotional video for a local hospital. The on-camera hosts were Skitch Henderson, former leader of the Tonight Show band, and his wife Ruth. Day one included the on-camera intro, using the trusty TRS-80 powered teleprompter. Being entertainers, they nailed the intros, and we were off to shoot other b-roll around the hospital. At one point we needed a "patient" in bed, so I began my tradition of appearing in videos. I donned my hospital gown and acted like a sick patient, whatever that means.
Later that year we were editing a video about the pulmonary system, and we needed video of a singer. So I put my baseball hat on backwards, threw a blue gel behind me in the audio booth, and pretended to sing. Not one of my best performances. And no, this clip will not be appearing on YouTube!
Another interesting project was a series of videos about the immune system. When you hear how T cells and natural killer cells work to identify and kill enemy combatants (viruses) it sounds like a perfectly orchestrated war. Perhaps we should have a molecular biologist in charge of the Pentagon!
The analogy which was cooked up compared the immune system to the ocean. Ocean of Symmetry: The Delicate Balance Trilogy. Sounds exciting, I know, but put your credit card away for a few minutes.
We had a plan to use our primary subject matter expert, an immunologist from New Jersey, as our on-camera presenter. We went to Sandy Hook, New Jersey, to the beach where they used to test bombs. We set up our trusty Jimmy Jib, and planned to shoot some dramatic intro and bridge segments with the doctor. Only problem was, it was an extremely windy day, so any sound we recorded would be inaudible. Thus, we shot a few crane shots of the crashing waves and called it a day.
Our next segment was a roundtable discussion with our host and two others. We set this up in a conference room and did our best to shoot this with two cameras and two mics.
A couple of months later, we finally re-shot the on-camera host on a beach in CT. We hired an actor. Again, it was a windy day. This time we had him do his lines both in close up and wide shots. He used a microcassette recorder and an earpiece so he did not have to memorize his lines. Then we re-recorded his lines in the relative quiet of a car. He basically did on-location ADR, and the end result matches up nicely.
This was the year everything started to make sense. As I described in previous posts, Cine-Med was nice enough to hire me out of college. My first job was duping tapes, shipping and receiving, really the most important job we have, which is filling orders and keeping customers happy. Within a few months I was promoted to editor and started learning to shoot surgeries.
I started shooting on my own in early 1995, both surgeries and other types of videos, and really learned how to think on my feet, something they don't teach you in college.
My first trips were with a production assistant, and included New York, St. Louis and Boston. My first solo trip was to St. Louis, and it was uneventful. I will not go into detail on this particular surgery, let's just say it is not dinner table conversation.
In fact my next shoot was in a similar area of the body, and was to this day the most disgusting thing I have seen. Well, maybe the second most disgusting, you'll have to wait until 2000 to hear about that one!
In 1995 I became the main guy to edit surgeries. Step one was obviously the video shoot. At this time our two cameras were an Ikegami HL-95 docked to an MII recorder and an HL-55 (2/3" chips!) with a portable BetaSP deck.
Now seems like a good place to talk about air travel. We had, and still have, camera cases from Porta Brace, which hold the camera, power supply, some batteries and accessories. Fully loaded this can be a 20-30 pound load. We used to carry this whole case on the plane and stick it in the overhead compartment, without so much as a batted eyelid from anyone. I usually had a pair of vicegrips, and a Leatherman in the case as well. In fact the only time the security folks questioned the contents of my case was when my set of mini screwdrivers was missing one screwdriver.
Anyway, how times change.
The goal was to shoot skin to skin, that is from incision to skin closure, but not record the whole procedure. Tapes were 20 or 30 minute loads, at a unit cost of around $35 each, so we did not want to burn too many tapes just to edit out much of the material. So we tried to use one 20 minute tape per 1 hour of surgery. This involved telling the surgeon when I was stopping and starting the camera, and of course the occasional missed shot. Who hasn't hit the record button to start the tape, only realizing later the tape was already running and you actually stopped the tape?
Upon my return it was time for the first edit. I would start a new EDL in the ACE-25 edit controller, load my first tape, throw the deck into remote, and start logging the shots.
Thanks to Google Image search, and the people whose pictures these are, here are some links to get you in the mood:
My method was to add 30 seconds of black, 30 seconds of bars, 30 seconds of black, then leave 7 seconds for title page 1, 7 seconds for page 2 and 4 seconds for the company logo. Next I hit play, and then on the fly hit MARK IN, wait for a good place to cut, hit MARK OUT, then hit NEXT which would create an entry in the new EDL. Having backtimed my first segment of Black so the program starts at 1:00:00:00, with each new edit added to the EDL, the record time would build automatically. By the time I reached my last tape, the goal was to have a total record time of less than one hour.
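If you have never backtimed anything, the arithmetic is just timecode converted to frames and back. Here is a rough sketch in Python (my illustration of the math, assuming 30 fps non-drop-frame timecode; the ACE 25 did all of this internally, and certainly not in Python):

```python
FPS = 30  # assuming 30 fps non-drop-frame, a simplification of NTSC

def tc_to_frames(tc):
    """Convert 'HH:MM:SS:FF' to a total frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_tc(total):
    """Convert a frame count back to 'HH:MM:SS:FF'."""
    f = total % FPS; total //= FPS
    s = total % 60; total //= 60
    m = total % 60; h = total // 60
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

# Leader: 30s black + 30s bars + 30s black + 7s title 1 + 7s title 2 + 4s logo
leader = (30 + 30 + 30 + 7 + 7 + 4) * FPS

# Backtime the first segment of black so the program starts at 1:00:00:00
program_start = tc_to_frames("01:00:00:00")
print(frames_to_tc(program_start - leader))  # → 00:58:12:00
```

With a 108-second leader, the tape rolls at 00:58:12:00 so the first frame of program lands exactly on the hour.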
Next step is to black some 1" tape, although this could be done while building my EDL. For those of you under the age of 30, we used to use 1" machines, similar to the old reel to reel 1/2 track audio recorders you may have seen in old Led Zeppelin videos. You load the tape from the source reel to the takeup reel. 1" tape came in 2 hour reels from Ampex, Sony or another manufacturer whose name escapes me.
Once your tape is blacked, you let the edit controller work its magic. You cue up the 1" tape to the beginning of timecode (you could just let the edit controller do this, but the braking mechanism during rewind would occasionally malfunction, which could ruin your tape), then load your first tape and execute the EDL. You then just need to babysit the edit to make sure all the edits happen.
The edit controller talks to the video switcher, so Black, Bars and the videotape decks are automatically routed to the record deck. Then you check your edits, add your titles using an insert edit, and make your VHS time-code dub for the client.
Subsequent edits consist of following the client's time code editing notes, such as "delete from 1:23:45 to 1:23:54" and so on, and sometimes "move 1:20:06-1:20:32 to 1:15:18 then freeze on last frame for 10 seconds and label superior epigastric artery" or something like that.
In most cases I would make the edit revisions on the original EDL, remembering to first save a copy as Edit1a. The "a" means it is a revision of Edit 1, but not yet rippled to become Edit 2. You recall an edit to the work area, alter your in or out points on the source, then hit NEXT to send the edit back to the EDL. Then you ripple that edit to make the record timecode continuous, taking into account the trims you made to the source durations. This was the most difficult concept to teach new hires, because let's face it, adding and subtracting timecodes, even with a computer, is not something most people can relate to. Kind of like high school calculus (sorry Mrs. Zangari, you tried your best!).
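For the new hires reading along, a ripple is just bookkeeping: when one edit's source duration changes, every later record time shifts by the same delta. A toy sketch in Python (my own invented data layout, not the ACE 25's actual EDL format; 30 fps non-drop-frame assumed):

```python
FPS = 30  # assuming 30 fps non-drop-frame

def tc(s):
    """'HH:MM:SS:FF' to a frame count."""
    h, m, sec, f = (int(x) for x in s.split(":"))
    return ((h * 60 + m) * 60 + sec) * FPS + f

def tcs(n):
    """Frame count back to 'HH:MM:SS:FF'."""
    f = n % FPS; n //= FPS
    sec = n % 60; n //= 60
    return f"{n // 60:02d}:{n % 60:02d}:{sec:02d}:{f:02d}"

# Each EDL event: [source in, source out, record in] as frame counts
edl = [
    [tc("00:05:00:00"), tc("00:05:20:00"), tc("01:00:00:00")],
    [tc("00:12:10:00"), tc("00:12:40:00"), tc("01:00:20:00")],
    [tc("00:20:00:00"), tc("00:20:15:00"), tc("01:00:50:00")],
]

def ripple(edl, event, new_src_in, new_src_out):
    """Trim one event's source points and shift all later record times."""
    old_dur = edl[event][1] - edl[event][0]
    edl[event][0], edl[event][1] = new_src_in, new_src_out
    delta = (new_src_out - new_src_in) - old_dur
    for e in edl[event + 1:]:
        e[2] += delta

# Tighten event 1 from 30s down to 20s: event 2's record in moves up 10s
ripple(edl, 1, tc("00:12:15:00"), tc("00:12:35:00"))
print(tcs(edl[2][2]))  # → 01:00:40:00
```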
Once your edit revisions are made in the EDL, you have 2 options. Option 1 is to pick up the new edit where the 1st edit left off, and assemble the edit using insert edits (you could use assemble edits for either the 1st or 2nd edits, but timecode breaks can be a headache). Option 2 is to black a new tape and make a new edit 2 master. I should mention we often re-used 1" reels from the previous year's 1st, 2nd and 3rd edits, so we always had a pile of reels ranging from 15 to 60 minutes, cut to length. Again, for those of you born after The Empire Strikes Back was released, once an edit on 1" tape was complete, we would cut the tape with scissors so we could use the rest of the reel for another project.
This process of editing revisions continued until the client was happy with the final edit. For the final edit, narration is added and then the final video is laid over the narration (that is a whole 'nother discussion), and then a decision whether or not to do an A/B roll has to be made. I happen to think that surgery videos look best with dissolves on most edits. However, doing dissolves in a linear edit suite is a half day to full day project. First you need to load every edit in the EDL and change it from a cut to a dissolve, making every other edit a new reel number and adding a dissolve effect in the EDL. Once you had the dissolve set and you hit NEXT, you then needed to delete the previous cut incarnation of the 2nd edit, and then, using the auto-generated match frame, continue on to the next edit.
Having made dissolves on all the edits, you print out the EDL on the attached dot matrix form feed printer. Have I mentioned what I'd like to do to the guy who invented form feed printers? No? Ok, use your imagination. Reloading the printer paper was a headache, because the printer was in the rack with the edit controller CPU, and there was not much room to work, not to mention lots and lots of cables everywhere.
So you take your printout and a highlighter, and highlight all the edits which are on reel B. Then you set up a machine-to-machine editing system, where the record machine (MII deck) has a built-in edit controller, connected via 9-pin serial cable to the source machine (either MII or BetaSP). You also attach a BNC cable for video, a BNC for timecode, and audio if needed. You need to run the same timecode because the EDL assumes you will have the same timecode on your B-roll as on your A-roll for each edit, unless you are dissolving from Tape 1 to Tape 2, which only happens once if at all.
By comparison, the early AVID systems were not broadcast quality, so you did your edit, then output an EDL, and your B-Roll list. Thus, you could assemble your B-Roll tape using continuous time code, simply following the source times provided by the AVID. The EDL would keep track of everything for you. Now back to the old way...
First you black the beginning of your B-roll tape. Then you set up your first edit as an assemble edit with external timecode (is this free-run or rec-run? Anyone? Bueller?). Thus, for each edit, the timecode on your B-roll changes to match the timecode on the source tape. You need to add pre-roll and post-roll to each copied piece of video, usually 7 seconds, to accommodate the pre-roll and post-roll of the edit controller, not to mention the pre-roll of the record deck. The pre-roll of the 1" machine could be a little finicky.
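The B-roll pull list follows mechanically from the EDL: every other edit gets flagged as reel B, and each flagged piece gets dubbed with 7-second handles at matching timecode. A sketch of that bookkeeping (hypothetical numbers, 30 fps non-drop-frame assumed; the real highlighter-and-printout version had no keyboard):

```python
FPS = 30
HANDLE = 7 * FPS  # 7 seconds of pre/post-roll around each copied piece

def tc(s):
    """'HH:MM:SS:FF' to a frame count."""
    h, m, sec, f = (int(x) for x in s.split(":"))
    return ((h * 60 + m) * 60 + sec) * FPS + f

def tcs(n):
    """Frame count back to 'HH:MM:SS:FF'."""
    f = n % FPS; n //= FPS
    sec = n % 60; n //= 60
    return f"{n // 60:02d}:{n % 60:02d}:{sec:02d}:{f:02d}"

# Source in/out for each edit, in program order
edits = [
    (tc("00:05:00:00"), tc("00:05:20:00")),
    (tc("00:12:10:00"), tc("00:12:40:00")),
    (tc("00:20:00:00"), tc("00:20:15:00")),
    (tc("00:31:05:00"), tc("00:31:30:00")),
]

# Every other edit goes to reel B; dub it with handles at matching timecode
for i, (src_in, src_out) in enumerate(edits):
    if i % 2 == 1:
        print(f"B-roll dub: {tcs(src_in - HANDLE)} to {tcs(src_out + HANDLE)}")
```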
With your B-roll made, you should be able to execute the EDL automatically. Hypothetically. In a perfect world. Should with a capital S. Occasionally, sometimes.
Obviously problems arise anytime you are making machines do the work of people, but over time you learn to anticipate problems based upon past experiences.
Fast forward to 1999 when I got our first Media 100 XR version 5. When I learned how to do a full A/B dissolved edit on a whole program in approximately 5 keystrokes, I think my heart grew five sizes that day.
Well, that does it for today. In my next post I will continue with 1995, relating some more travel adventures and the result of spending my evenings on AOL (not yet the internet).
Thanks for reading.
Posted by: Mike Cohen on Aug 1, 2007 at 11:46:44 am
For about 2 months I kept putting off the project I completed today. As with most things one procrastinates about, the actual work turned out to be not so bad.
This project should have been a straightforward edit. The client reviewed the TC VHS dubs, wrote accurate times, and sent me the tapes. Only problem is, I did not shoot the video. Nor did I make the time code dubs.
1. The person who made the TC dubs read time-code off the VHS tape - or they set a tape counter and let it run as a superimposed time-code-like readout. If that person were to do the edit, presumably he or she would use the original raw footage VHS (don't ask) and line up the counter with the dubs. Perhaps that person is still using a machine to machine edit bay. Arghhh!
2. Because here in Connecticut we use these newfangled computers to do our editing, the VHS method ain't gonna happen.
3. Step one was to copy all of the raw VHS tapes to mini-DV tapes. (VHS was recorded in an operating room, one of the few remaining hospitals that uses VHS - no names mentioned of course!)
4. I copied several 2-hour VHS tapes to six 83-minute mini-DV tapes.
5. Now I have actual timecode, so all that is left to do is sync it up with the VHS TC dubs.
6. Actually, having gotten this far, I took a 2 month break to work on other things!
7. Beginning with the first TC VHS dub, I found the first edit point provided by the client. Then I tried to find the same point on the 1st DV tape. Easier said than done, as surgery is not conducive to scanning visually at high speed.
8. So I first need to find a sync point on both tapes. On the VHS TC dub I find something easily identifiable: the view of the laparoscope entering the abdomen. Because the scope goes through a blue plastic cannula, the blue concentric circles make a striking enough image. I pause the VHS.
9. Find the same spot, give or take a few frames, on the first DV tape.
10. Write down the following equation:
VHS IN-1 00:01:24:22
DV IN-1 00:02:07:20
Offset = +42:28
In college we were actually tested on such time-code calculations.
However, in college we did not have Google.
Today, we have Google, which was nice enough to direct me to a free downloadable time-code calculator.
11. Given the offset, I find the first edit point on the VHS tape, pause the tape, do the calculation to find the corresponding value on the DV tape, cue up the DV tape a few seconds before the edit point, eyeball the two screens to make sure it is the same shot, play the VHS and capture the DV clip into Premiere. I watch the VHS as it plays, so I can eyeball when to stop capturing the DV tape.
12. If the next edit point on the VHS and DV tapes is less than say 1 minute away, I just let both tapes continue playing, nearly in sync, and eyeball the next edit point and capture on the fly.
13. If the next edit point is more than 1 minute away, I find the next edit point on one of the tapes, do the math, find the point on the other tape, stir and repeat.
14. This goes on until the end of the DV tape is reached.
15. Every so often the offset was offset and a new offset needed to be calculated.
16. Having spent most of the day capturing all the shots from all the tapes, it was time to make the edit.
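The offset math in steps 10 through 13 is plain timecode subtraction and addition. A little calculator along the lines of the one Google found for me might look like this (assuming 30 fps non-drop-frame, which conveniently ignores any VHS counter drift):

```python
FPS = 30  # assuming 30 fps non-drop-frame

def tc_to_frames(tc):
    """Convert 'HH:MM:SS:FF' to a total frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_tc(n):
    """Convert a frame count back to 'HH:MM:SS:FF'."""
    f = n % FPS; n //= FPS
    s = n % 60; n //= 60
    return f"{n // 60:02d}:{n % 60:02d}:{s:02d}:{f:02d}"

# Sync point found on both tapes (the blue cannula shot)
offset = tc_to_frames("00:02:07:20") - tc_to_frames("00:01:24:22")
print(frames_to_tc(offset))  # → 00:00:42:28

# Any edit point the client marked on the VHS dub maps to the DV tape:
vhs_edit = tc_to_frames("00:05:10:00")
print(frames_to_tc(vhs_edit + offset))  # → 00:05:52:28
```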
With the cursor at the start of the timeline, I select the first clip in the bin, hit "2" which is my keyboard shortcut for insert, and in about 5 minutes have the whole program edited.
Took another half hour or so to trim the edits, as I intentionally took some extra video on each clip to account for dissolves and inevitable needs to extend shots once the narration materializes.
Then added some titles to designate the different sections, used the handy combination of "page-dn" and "ctrl+D" to add dissolves on every edit.
Final task is to export each segment as an AVI, deinterlaced, no audio, with a new timecode overlay matching the timecode of the sequence. Final product will be FLV files to upload to our customized video review website.
An alternate approach to this madness would be to capture the DV tapes and VHS dubs in their entirety and sync everything up as different video tracks on the timeline. I have done it that way, and it takes twice as long. Plus when the sync goes off you are delayed further.
Another way would have been to capture the raw VHS and the TC VHS in their entirety, and sync them up likewise. However, I like things with timecode, should you ever need to go back to the source, or heaven forbid you lose your media drive but were smart enough to back up the project files to another drive. Can't batch capture off VHS. Now if only I had my ACE 25...
Thanks for reading.
PS - Now seems like a good time to plug my brother's blog - totally unrelated, but very interesting.
Posted by: Mike Cohen on Jul 30, 2007 at 11:02:45 am
I am sitting on a plane, heading to Florida for a much needed vacation. My parents are excited to show me their canasta skills!
This got me thinking about my travel experiences, related to video production of course. Let me begin by saying that for as many hundreds of airplanes as I have been on, it still amazes me that you can strap yourself into a metal tube with wing-shaped gas tanks, travel at 300-500 mph, then land and come to a stop without an explosion. It is a miracle. Now on with the show...
Upon taking my job duping tapes, my other job duty was as a production assistant on the road, with the intent of learning the skills for my future job duties as a videographer.
In 1994 the primary type of video we made was the surgical teaching video, a 15 to 20 minute narrated surgical procedure.
My first shoot was to Providence, RI to help record a narration for a nearly complete video on breast cancer surgery. I met our head guy Jim at his apartment in Middletown, and rode with him to Providence. We were shooting an on-camera intro and narration. Upon our arrival in his very cramped office, my job was to set up the camera (HL55) and deck (BVW-100? Portable BetaSP deck) along with the teleprompter. Our teleprompter was (and is still used once a year) a black and white CRT monster, in an equally monstrous metal case. The concept is simple: the camera mounts to a cast iron plate, which holds the monitor. A piece of glass is placed at an angle so as to reflect the text from the monitor while the camera lens shoots through the glass. I know most video folks understand how this works, but your average TV viewer may not. Those two glass things in front of the President during the State of the Union speech are not bulletproof shields, but teleprompters.
Anyway, since this was 1994, we had a very high-tech Tandy laptop to feed the prompter with text. This "Fisher-Price computer" ran on DOS and had some convoluted controls to output video from a serial port to a BNC cable. Editing text was tedious, but hey, it worked. These days if I need the prompter, which is rare, I just use my current laptop with Word, making the font size really big.
The shoot went fine, and Jim was impressed that I knew where to plug all the cables. Obviously he had not read my previous blog posts!
The next shoot was to UNC Medical Center in Chapel Hill for a similar shoot: on-camera narration for a video which I would soon become intimately familiar with. About a month later, Jim suddenly decided to leave the company for a major sports broadcasting network, leaving a sweet job opening in his wake. My boss knew I had done some editing, so he gave me the chance to show my stuff. Jim had left the 1st edit of a very complicated video, which I was to revise and finish. The main elements of the video were the on-camera and narration segments mentioned above, some complex 2D and 3D animation, a lot of text builds and stock footage.
Luckily my college experience with EDLs came in handy, as in this pre-digital era, the EDL was the only way to interpret someone else's work. Each EDL line had room for about 256 characters of comments, maybe less. Jim had made notes regarding stock footage tapes or ADO or CG settings, and I suspect there were some notes on the printed script.
So I first had to figure out the editing system itself. Having used the Paltex, Grass Valley and Sony 9000 edit controllers, figuring out the ACE 25 was not difficult. The Ampex Vista switcher was another story. While the master and M/E buses were standard, all of the controls for wipes, dissolves and compositing effects and keyers were contained in an LCD menu system. The ADO was another unique device. Very powerful actually, but it took some getting used to the quirks. Finally the ALEX character generator was a very powerful CG unit, although the saved pages loaded slowly, and the GPI trigger sometimes did not work.
The machine room had 2 VPR-80 1" machines, 3 MII decks, a Betacam deck, an SVHS deck and your usual scopes and terminal equipment, and a bird's nest of wires.
I just realized I have digressed from travel. Oh well, this was a key moment which must be recalled in continuity.
I did have help from the #2 editor, although he did not know much and he too departed.
So there I was working 10-12 hours a day, gradually getting through the video. The main problem was that in rebuilding a video on tape, you need to do insert edits of the revisions accurately. If you want to open up 10 seconds mid-video, you need to make a sub-master.
Also I wanted to maintain the smooth flow of the video, which meant a lot of match frame edits and dissolves. This was easier said than done. The ACE 25 had a function where before performing an edit you tell the machine which field you want the edit to happen on. Failure to do this results in a head switch error, which upon playback means that the entire picture shifts to the left or right slightly, making a match frame edit a disaster.
Another skill I needed to refine was the timing of decks. Normally the decks were timed, however given the heat generated in the machine room and other factors such as loose sync cables and little men who lived in the racks and obviously messed with things overnight, deck timing was a daily event. For the modern day digital editor, this may be a lesson for you.
Assume you have a Betacam deck on switcher input VTR1, and color bars on the BARS input. We assume that the BARS come from the sync generator, so the synchronization of the BARS matches house sync. Upon playback of a Beta tape, you notice that the colors seem off. This can either be due to a TBC setting or the subcarrier phase adjustment. So you dissolve on the switcher between the two inputs, and if there is a color shift mid-edit, then it is the subcarrier phase. If there is no color shift mid-edit, but rather a visual abnormality with the color based upon your eyeball, then you need to adjust the TBC.
If it is the subcarrier, you need to set the switcher to alternate between the two inputs with a 1 second interval. This is so you can look at the vectorscope and see where the subcarrier differs between the two inputs. Again, assuming the BARS are correct, you stick a tweaker (small screwdriver) into the subcarrier adjustment pot on the Beta deck's TBC, and rotate it until the vectorscope shows no differential between the two input signals. Luckily the subcarrier pot is usually incremental, so there are only about 8 different positions, making it relatively easy to get a lock.
If the problem is the TBC color phase, not the subcarrier phase, you play your Beta tape. Hopefully you recorded 30 seconds to 1 minute of bars on your tape. Playing the bars, you adjust the TBC hue control until the color vectors hit their marks on the vectorscope.
However if the camera was not properly white balanced, the color bars mean squat, so in this case you eyeball it.
The other phase to worry about, and actually a more common problem, is horizontal or H phase. Incidentally, the symbol for phase is the Greek letter phi, a circle with a line through it. H phase was apparent if, on an effect like a dissolve, the two sources shifted horizontally one way or the other. The means of adjusting the H phase is a bit more complex, although similar to the subcarrier.
First you punch up the house sync, such as blackburst, on the switcher. You adjust the waveform monitor to show just the sync signal. If memory serves, you turn the zoom knob until the sync interval is magnified to the full 1 volt display, and the details of this are a bit murky. Then you run your switcher effect to cycle between your house sync and your deck, and then adjust the H phase pot until the scope locks into position. The H phase has unlimited positions, so you need to watch the scope carefully.
Once the sync signals are set, you can also check the other TBC levels. Saturation is adjusted using the vectorscope, getting the amplitude of the chroma into the little boxes. The luma and setup are for your video levels. The white should never exceed 100 IRE, and the black or setup level should be set to 7.5 IRE in NTSC. Again, if the actual recorded camera image is lacking chroma, luma or is the wrong hue regardless of color bar setup, then eyeball it, but stay within legal levels.
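For the modern reader, the legal-level rule boils down to a trivial check. This is a toy restatement of the analog limits, not anything our scopes actually computed:

```python
WHITE_MAX = 100.0  # IRE, NTSC legal white
SETUP = 7.5        # IRE, NTSC black/setup level

def is_legal(luma_ire):
    """Check a luma sample against NTSC legal limits."""
    return SETUP <= luma_ire <= WHITE_MAX

print(is_legal(109.5))  # → False: superwhite, pull it down
print(is_legal(50.0))   # → True
```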
So based upon getting to work at 8:30am, it is now lunch time and we have not yet edited anything. Oh and when you put in another tape, you may need to repeat some or all of the above.
2D and 3D animation was a new concept for me. This being the pre-digital, non-DDR days, animation was recorded via VLAN to a deck. Specifically, animation sequences were modeled, rendered and composited in the Alias suite from SGI, the precursor to Maya, running on an Indigo workstation. I will have to look up the equivalent processing power, but the RISC processor running UNIX was probably like a Pentium 2, at a cost in 1994 dollars of about $30,000. Once the frames were rendered, the VLAN software and hardware sent each frame of animation along with RS-422 deck control signals to an MII recording deck. For each frame, a 1-frame insert edit was performed, and each insert took about 15 seconds. So a 10 second animation took 10 x 30 frames x 15 seconds = 4,500 seconds, or 75 minutes. You cannot be in a hurry. We tried to lay off the animations overnight, but god forbid the power goes out or the little men living in the racks get bored.
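The arithmetic, in case you want to double-check my suffering (the 15 seconds per frame is my recollection, not a spec):

```python
FPS = 30             # NTSC frames per second (non-drop, close enough here)
SECS_PER_FRAME = 15  # roughly what each 1-frame insert edit took on the MII deck

def layoff_minutes(animation_seconds):
    """Total time to lay an animation to tape, one insert edit per frame."""
    return animation_seconds * FPS * SECS_PER_FRAME / 60

print(layoff_minutes(10))  # → 75.0 minutes for a 10-second animation
```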
So assuming you have all your animation on tape, that becomes one piece of the puzzle, along with CG, stock footage and narration and on-camera segments. In most cases it should have been straightforward insert edits, but since this was re-doing someone else's work (mind you, someone who had used this edit system for 5 years or so), it was laborious. The worst parts were when I would get to an animation, look at the source animation, and see it was different than what was on the master tape. Jim had cleverly used the ADO and a series of stills and ADO layering effects to composite more elements into animations than existed in the source elements. It would have been easier to re-composite the animations in Alias and re-render the frames, but that would probably have taken much longer given the time to lay the frames back to tape. So I had to figure out the compositing in the ADO.
Basically the ADO had two channels, front and back. Either channel could have different video, routed via the ADO router within the Vista switcher. Each channel could be frozen, and a luma key generated in the switcher could be re-composited within the ADO. So you could grab a still frame of the composite image in one channel, display that channel in the switcher, then use the open channel to add a new composite layer, re-grab that, and so on, building a complex composite of stills. I believe the ADO also saved transparency, today known as alpha channel, then known as luck. Each still frame was often the end of a new ADO animation. These days you would do it with multiple video tracks in Premiere or After Effects. This was the old school way.
After about a month of this, the video went out for final review, and then I had to do a final round of edits to complete the video. Having proven my ability to work under pressure, I was assigned editing projects with much less complexity, namely surgery videos. Using my EDL knowledge, ability to add and subtract timecode in my head, and an intimate knowledge of the editing system, life was good.
Well, the captain just informed us of our initial descent into Fort Lauderdale, so I must sign off for now. In my next post I will continue the travel adventures, perhaps relating the associated editing experiences in continuity. I am an editor after all, so continuity is important.
Thanks for reading. Oh my inner ears…
Posted by: Mike Cohen on Jul 10, 2007 at 10:09:31 am
The first video I ever made was "American Money" a 3 minute video for 8th grade history class. I did almost no research or preparation, to the point where I was drawing each frame before shooting. Oddly enough I have met producers with the same preparation skills!
Seriously, it was one of several school projects using video. Note to reader, this was in 1986 using a video camera attached to a boom box sized "portable" VHS recorder.
The secret to these early videos was in-camera editing. Shoot what you want in the sequence it needs to be and don't screw up!
The next video was "30 Minutes: Ancient Rome," with Dave N. as host and myself and Joe C. as correspondents. This was a jump forward in technology to an 8mm camcorder, roughly the size of a toaster oven. This was such high technology that I was not allowed to touch the thing. My dad, to my great embarrassment, had to do the camera operation. This project included more preparation, mainly consisting of Dave N. writing everything, and giving me the most boring parts to read, on camera, with no visuals. The one fun part was I got to build a Roman house using foam core, and demonstrate it on-camera.
Despite these two experiences, I decided to go to college for broadcasting.
In college our first assignment in EFP class was to make an in-camera edited video of a sequence. It was actually a decent way to learn about editing, which we did not learn until later using the very high tech u-matic edit bay.
So what in-camera editing experiences have I had lately? Since the advent of cheap tape I just leave the camera rolling. In shooting medical procedures you don't want to miss something during the second it takes for the tape to start rolling. But in the old days when tape was non-cheap, we did a certain amount of in-camera editing. At the least it made the actual rough cut faster, since there was less material to work with.
Cheap though DV tape may be, logging just the shots I need is indeed more tedious without in-camera editing.
Insert profound life lesson here.
Posted by: Mike Cohen on Jun 25, 2007 at 1:22:09 pm
I have a passion for my job, which entails training for medical professionals such as surgeons, nurses and administrators, not to mention various industries.
Technology is great, but how you apply your skills is what pays the bills.
Years ago I canceled my Media 100 support contract upon discovering what a treasure trove of helpful advice can be found on the Creative COW website. I am proud to be a part of this fantastic community.
In my blog I talk a little about media production, a lot about travel and workflow, and occasionally about cooking, nature and my four-legged friends.
Follow me on Twitter: med_ed_mike
I'm also on LinkedIn if you can't get enough of me!