Over the past year I've been testing out Frame.io's Camera 2 Cloud system in various situations, primarily using it in service of our live event coverage for ProVideo, namely Cinegear, NAB, and Adobe MAX. During those events we'd been using the Teradek Cube 655 to transmit footage to our editor as it was shot, which worked, but the Teradek was kind of finicky every time I tried to set one up anew, as the Cube wasn't built for C2C originally. That was a sticking point for me, but the avalanche of Pros outweighed the minor inconvenience of a Con. Luckily, purpose-built hardware is now on the market, and I got to test out one solution, the Atomos Shogun Connect.

![]() The Atomos Shogun Connect, seen here in dramatic fashion.

If you're unfamiliar, Camera 2 Cloud is a system in which you can film something, and a standard but specific piece of hardware you've likely already got on set (a Teradek, a monitor/recorder, etc.) will send proxies of that footage, near-instantly, to your editor, who has the Frame.io C2C tab open in their NLE of choice. Every time you stop recording, about that length of time later your editor gets a notification that there's new footage downloading, and it's available right there for them to start a cut. Once they get the original footage from set, they can simply re-link the clips as they normally would. Frame.io has a technical guide if you'd like to check that out.

This is bonkers for a couple of reasons, but I always like quoting Michael Cioni, who said "it makes the linear process of shooting and then editing parallel." Time is money in this industry, right? Getting started on the rough cut while it's being shot shaves a lot off the overall timeline without adding undue stress to the editor. I also love the idea of having another set of eyes, completely detached from the hectic nature of a set, watching each clip as it comes in, analyzing it, and letting the production know in real time when something's wrong: missed focus, a bad clip for VFX, whatever the case may be. This also means other non-production-essential members of the team can remain remote and offer notes without having to be in Video Village or the DIT tent or what have you.

Now, for our event coverage we wanted to be as close to "first to air" as possible, so we actually didn't even use the original camera negative; we went straight to the web with the proxies, and guess what: it was fine. Since we were using my C500, all I had to do was set the camera to Rec709 (I actually sent a nice custom viewing LUT that I had made down the HDMI cable to "pre-grade" the footage) and make sure the microphones were on Channels 1 & 2 and monitoring correctly, so the editor had both my and the interviewee's audio on separate channels for processing purposes. As we were shooting for web and not even using my internal recording, the full-fat footage saved to the CFexpress cards and the internal 2K proxies on the SD card were nice redundant backups. If I wanted to get really crazy, I could have had each clip save to both CFexpress cards, bringing the in-camera total of redundant clips up to three, on top of the version in the cloud.

![]() My C500mkII mounted with the Teradek Cube 655, next to a Sclera Wifi unit.

The first time we did this, we had the proxy encoder on the Teradek set to 15Mbps, and we found that to be too big to transfer at a reasonable speed in such a spotty connection area as the inside of a convention center with a bunch of WiFi signals everywhere. We found that 10Mbps was a perfect balance of quality and speed, and later found that 8 or even 5Mbps was perfectly acceptable. Remember, these are simple interviews or to-camera pieces going straight to YouTube, so we don't need 4K clarity or anything. That being said, I did have a Pearlescent filter on the lens one time, and the bloom looked kind of bad in that environment and didn't hold up amazingly once proxy-fied, so maybe don't do that. In any case, if you go watch the videos yourself, I bet you wouldn't notice any strange blocking or anything even remotely giving away that we were using such a low-bitrate proxy. In some cases, out-of-focus things really look very out of focus, but as long as that's the background and not someone's face, it's fine.

For this off-label use of the system it's a perfectly acceptable trade-off… but what if you wanted those original camera negatives instead? Lucky you: the RED V-Raptor (and the Fujifilm X-H2 with a grip) can send R3D files in up to 8K resolution and ProRes proxies natively. That's right, if you really wanted to (and without any additional hardware beyond the camera itself), you could simply send those R3Ds straight from the camera, wired or wirelessly, into your editor's hands, and now you've got an additional full-size backup to your cards and drives on set, on top of everything else.
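If you want to gut-check those bitrate numbers for your own connection, the napkin math is simple: file size scales with bitrate times duration, and upload time scales with size over uplink speed. Here's a minimal sketch; the 5-minute clip length and the ~4Mbps congested-convention-center uplink are my illustrative assumptions, not measured figures from the shoots above:

```python
def proxy_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    """Approximate proxy file size in megabytes for a given encoder bitrate."""
    return bitrate_mbps * duration_s / 8  # megabits -> megabytes

def transfer_time_s(size_mb: float, uplink_mbps: float) -> float:
    """Approximate upload time in seconds over a given uplink."""
    return size_mb * 8 / uplink_mbps

# Compare the encoder settings mentioned above for a 5-minute interview clip,
# assuming a congested ~4 Mbps uplink (hypothetical number for illustration):
for bitrate in (15, 10, 8, 5):
    size = proxy_size_mb(bitrate, 5 * 60)
    t = transfer_time_s(size, 4.0)
    print(f"{bitrate} Mbps proxy -> ~{size:.0f} MB, ~{t / 60:.1f} min to upload")
```

Under those assumptions, dropping the encoder from 15Mbps to 5Mbps cuts the upload from roughly 19 minutes to about 6, which is why the lower settings kept us closer to "first to air."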