
Blackmagic Camera ProDock: The exact iPhone accessory I’ve been looking for… almost!

A few weeks back Apple held a keynote unveiling the iPhone 17 range, and to say that I am impressed is a bit of an understatement. Scratch that—I’m absolutely buzzing, the kind of buzzing you get when you’ve just seen the future of your favourite hobby/side hustle (which, let’s be honest, is practically your main hustle these days, right?).

The iPhone 17 Pro has been engineered to be a powerhouse for professional video production, truly putting cinema-grade tools directly in the user’s hand. And for a former university student who once tried to stream a full campus event using a dodgy laptop and a prayer, this is the stuff of dreams.

The Tech That Makes My Heart Sing

At the core of this beast is the new A19 Pro chip with advanced vapour chamber cooling, and honestly, the cooling is the unsung hero here. We can talk about teraflops all day long, but if your phone turns into a hand warmer after two minutes of 4K 60 fps recording, none of it matters. Previous iPhones, bless their stylish hearts, could get a little… delicate under heavy load. You’d hit record on a hot summer day and your phone would politely tell you, “Nah, mate, I’m good, maybe try again when you’ve found an ice bath.”

The vapour chamber isn’t just a marketing buzzword; it’s the difference between sustained performance and a frustrating 3-minute clip limit. That stable thermal profile is what makes the whole “Pro” moniker actually mean something in a professional, run-all-day environment, ensuring lightning-fast and sustained performance when handling massive media files and intensive video processing.

And speaking of the camera system, wow. The Pro camera system now has 48MP Fusion sensors across all three rear cameras and offers the longest-ever Telephoto zoom on an iPhone, providing an unparalleled 8x optical-quality range up to an equivalent 200mm focal length for incredible composition flexibility. Eight times! I remember lugging around a monstrous 70-200mm lens on a DSLR and having my back hate me for days. Now, this incredible compositional flexibility, which is essential for things like documentary work or capturing action from a safe distance, is just… in my pocket. That Telephoto lens alone is a game-changer for those quick, unplanned shots where you just need to reach out and pull the action closer without looking like you’re trying to sneak onto the pitch.

Support for ultra-stabilised 4K 60 fps video in Dolby Vision, plus recording at up to 4K 120 fps in Dolby Vision, delivers the resolution and frame rates essential for cinematic quality. Let’s not mince words: 4K 120 fps is pure gold for slow-motion capture. If you’ve ever tried to capture a subtle moment and slow it down in post-production, you know how quickly your footage can turn into a blurry mess. Hitting 120 frames a second at that resolution means buttery smooth, detailed slow-motion, letting you stretch a single second into five beautiful ones without any jankiness.
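
If you want to sanity-check that maths, here’s a trivial little sketch (plain Python, nothing iPhone-specific) of how a 120 fps capture stretches when you conform it to a slower timeline:

```python
# How much does 120 fps footage stretch when conformed to a slower timeline?
def slowmo_factor(capture_fps: float, timeline_fps: float) -> float:
    """Return how many seconds one second of capture occupies on the timeline."""
    return capture_fps / timeline_fps

print(slowmo_factor(120, 24))  # 5.0 -> one real second becomes five on a 24 fps timeline
print(slowmo_factor(120, 30))  # 4.0 -> or four seconds on a 30 fps timeline
```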

ProRes RAW and Apple Log 2: Mastering the Data

For the professionals (and the aspiring prosumers like myself), the iPhone 17 Pro’s software features solidify its position as a serious production tool, moving beyond just great video capture into proper workflow territory.

It is the first smartphone to support ProRes RAW, an industry-leading Apple-developed codec that grants the highest level of quality and post-production control. Now, if you’re new to this whole world, think of video files like a cake. When you record standard H.265 video, Apple has already baked the cake, sliced it, and put a pre-chosen layer of icing on top. You can maybe scrape some of the icing off, but you can’t change the recipe much. ProRes RAW is like getting all the ingredients separately. It gives you the raw sensor data, allowing you to change the “bake” after the fact—things like white balance and exposure can be finely adjusted in post-production with minimal loss in quality, because you’re working with so much more information. This level of control is priceless when you’re shooting run-and-gun and don’t have time to perfectly set up every single shot.

This deep integration into pro workflows is evident through support for a wider colour gamut with Apple Log 2, broadcast frame rates, and the ability to record open gate.

Let’s unpack Log 2. It’s basically Apple’s way of ensuring you capture the maximum dynamic range (the difference between the brightest and darkest parts of the image) the sensor is capable of. When you look at Log footage, it looks flat, washed out, and low contrast—but that’s the point! It’s storing all that precious tonal information, especially in the highlights and shadows, so when you get into DaVinci Resolve or Final Cut Pro, you can shape the image, recover details, and apply a professional grade that simply wasn’t possible before. Trying to grade compressed footage that already has all its contrast and saturation baked in is a recipe for banding and headaches; using Log 2 footage, even from a phone, is going to save colourists hours and make the final product look infinitely more cinematic. My anxiety levels just dropped by 10% thinking about those future grading sessions!
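
To make the “flat but information-rich” idea concrete, here’s a toy sketch using a generic log-style curve. To be clear, Apple hasn’t published the Apple Log 2 transfer function in anything I’m quoting here, so the constants below are made up purely to show the shape of the idea:

```python
import numpy as np

# A purely illustrative log-style encode -- NOT the real Apple Log 2 curve.
# The shape is what matters: scene brightness is compressed so highlights and
# shadows both fit into the recorded code values without clipping.
def toy_log_encode(linear, a=0.09, b=0.55):
    # linear: scene-referred values where 0.18 is middle grey, >1.0 is highlights
    return a * np.log2(linear + 0.01) + b

scene = np.array([0.01, 0.18, 1.0, 4.0, 16.0])   # deep shadows .. bright highlights
print(np.round(toy_log_encode(scene), 3))
# Ten-plus stops of scene brightness all land between roughly 0.0 and 1.0 with
# the contrast squashed -- which is exactly why ungraded log looks "flat".
```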

And then there’s open gate. This is a feature usually reserved for high-end cinema cameras and it means the phone is recording the entire image area of the sensor, including parts that would typically be masked off for a standard 16:9 widescreen video. Why do we love this? Flexibility! If you’re shooting for vertical TikTok/Reels content as well as horizontal YouTube videos, shooting open gate lets you decide the final crop later. If your subject moves slightly out of frame? No sweat, you’ve got extra breathing room above or below to re-frame in post. It’s like having a safety net woven out of pixels.
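
Here’s a quick sketch of the reframing maths. I’m not quoting an official open-gate resolution; the 4:3-ish frame size below is an assumption just to show how much room you get for a horizontal and a vertical cut from the same capture:

```python
# Rough illustration of why open gate helps reframing. The 4480x3360 frame is
# an assumed 4:3-ish sensor readout, not a published spec.
def crop_for_aspect(src_w, src_h, aspect_w, aspect_h):
    """Largest centred crop of the source that matches the target aspect ratio."""
    target = aspect_w / aspect_h
    if src_w / src_h > target:           # source is wider than target: trim the sides
        return int(src_h * target), src_h
    return src_w, int(src_w / target)    # source is taller than target: trim top/bottom

open_gate = (4480, 3360)
print(crop_for_aspect(*open_gate, 16, 9))  # (4480, 2520) horizontal master
print(crop_for_aspect(*open_gate, 9, 16))  # (1890, 3360) vertical Reels/TikTok cut
```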

Critically, the iPhone 17 Pro includes support for Genlock and timecode, a feature vital for multi-camera shoots and super-precise synchronisation. This is explicitly enabled when using professional accessories like the Blackmagic Camera ProDock. These features, combined with breakthrough battery life for extended shoots (which, let’s face it, is a huge win—no more external battery packs the size of bricks!), make the iPhone 17 Pro a highly capable and workflow-compatible choice for both independent creators and Hollywood-level productions.

The Blackmagic ProDock Reality Check: A Step Towards Broadcast

When Apple mentioned all of this in the keynote, my mind jumped straight into gear thinking about utilising the iPhone 17 Pro in Multicam production. This is where the magic happens, folks. You can have a three or four camera setup that costs less than a single mid-range traditional cinema camera, all perfectly in sync.

So, I went straight over to both Apple’s and Blackmagic’s websites to get a little more detail on the accessory that makes this possible.

The Blackmagic Camera ProDock is the professional solution that truly turns the iPhone 17 Pro into a cinema-grade camera rig, providing the essential connections needed for a production environment.

The Sync Saviours: Genlock and Timecode Deep Dive

Most notably for a multi-cam setup, the ProDock features BNC connections for external genlock and timecode input. This isn’t just a nice-to-have; this is the essential connection.

If you’ve ever tried to sync four different camera clips in post-production using nothing but their audio waveforms, you know the pain. You might clap a slate (or your hands, in a low-budget scenario), and you can align the spikes in the waveform perfectly, but here’s the problem: digital cameras, especially consumer ones, all run on their own internal clocks that drift. They might start almost in sync, but over an hour, they’ll drift by a few frames. That’s a nightmare to edit, forcing you to nudge and check constantly.

Timecode is the digital clock stamp that gives every frame a unique address (Hour:Minute:Second:Frame). The ProDock lets you input a master timecode signal (from a device like a Tentacle Sync or an Ambient Lockit) to all your iPhones, ensuring that even if you stop and start recording, the time stamp on the footage aligns perfectly across all devices. This means you can drop all your footage into DaVinci Resolve or Premiere Pro, hit ‘Synchronize by Timecode’, and boom—all your shots are ready for the multi-cam edit. It’s a genuine time-saver that pays for the dock many times over.
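
Under the hood, ‘Synchronize by Timecode’ is doing something conceptually as simple as this little sketch: turn each clip’s starting timecode into an absolute frame count, then slide everything relative to the earliest clip (assuming a straightforward 25 fps project with no drop-frame maths):

```python
# A back-of-the-envelope version of "Synchronize by Timecode": convert each
# clip's start timecode into an absolute frame number, then offset every clip
# relative to the earliest one.
FPS = 25  # assume a 25 fps project for simplicity

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

clips = {"iPhone A": "10:03:12:04", "iPhone B": "10:03:15:19", "iPhone C": "10:02:58:00"}
starts = {name: tc_to_frames(tc) for name, tc in clips.items()}
earliest = min(starts.values())
for name, start in starts.items():
    print(f"{name}: slide {start - earliest} frames down the timeline")
```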

Genlock (or reference sync) is the final piece of the puzzle and what separates film-style timecode from live broadcast. While timecode gives the frames a unique name, Genlock locks the actual timing of the sensor itself. It ensures that every camera starts scanning each new frame at exactly the same instant, so the sensors run in lockstep. Without it, your cameras are technically shooting frames out of phase, and while that is fine for post-production editing, it’s absolutely vital for live switching. If you’re cutting between non-genlocked cameras, the switcher has to momentarily hold or drop a frame to align them, which can result in a tiny, almost imperceptible hiccup in the program output. The BNC Genlock input on the ProDock is a massive nod to the broadcast world, and Blackmagic deserves a medal for including it.

Beyond synchronisation, the ProDock significantly enhances the phone’s utility on set. It includes a full-sized HDMI output for connecting to external on-set monitors. This is crucial for the crew because trying to frame a shot, check focus peaking, or verify the exposure using the tiny (albeit glorious) iPhone screen is an exercise in futility. The director or focus puller needs a proper 7-inch or 17-inch monitor to be sure. The ProDock gives us that clean, high-resolution feed.

For audio, it adds a 3.5mm stereo input for connecting professional external microphones and a 3.5mm headphone jack for monitoring. While the iPhone’s internal mic is surprisingly good, if you want high-quality dialogue or sound effects, you need a proper external microphone, usually going through a mixer or wireless system. The 3.5mm input is functional, but as we’ll get to later, it’s not quite the pro standard (we’re missing that lovely, chunky XLR, aren’t we?).

Furthermore, it expands the recording capability with two fast USB 3.2 ports for connecting high-capacity external SSDs, massively extending recording time. This is where the maths hits you: 4K 120 fps in ProRes RAW is going to eat storage faster than I eat a packet of biscuits on a bad day. An hour of that high-resolution, high-frame-rate footage could easily push into the terabyte range. The USB 3.2 connection provides the bandwidth to handle that sustained, high-speed write without dropping a frame, which is paramount for professional integrity.
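
Here’s the back-of-the-envelope version of that maths. The data rate below is an assumed ballpark for high-frame-rate ProRes RAW rather than an official Apple figure, and I’m assuming the ports run at USB 3.2 Gen 2’s 10 Gb/s:

```python
# Rough storage/bandwidth sanity check. The assumed data rate is a ballpark
# figure for high-frame-rate ProRes RAW, not an official Apple number.
assumed_gbps = 3.5                       # assumed sustained write rate, gigabits/s
usb32_gen2_gbps = 10.0                   # USB 3.2 Gen 2 link rate, before overhead

gb_per_hour = assumed_gbps / 8 * 3600    # gigabytes written per hour
print(f"~{gb_per_hour:,.0f} GB/hour at {assumed_gbps} Gb/s "
      f"(~{gb_per_hour / 1000:.1f} TB), using {assumed_gbps / usb32_gen2_gbps:.0%} "
      "of a USB 3.2 Gen 2 link before overhead")
```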

The entire unit is powered by a locking 12V DC input (a proper professional standard, thank goodness!) and features multiple 1/4-20 pin lock mounting points for easy rigging, transforming the highly capable iPhone 17 Pro into a fully integrated, professional production camera.

The Dream ProDock Mk2: My Live Production Wishlist

This is all amazing and a huge step in the right direction, but as the focus was clearly on making the iPhone a post-production camera (shoot now, edit later), we have lost out on a few features which I think could have made the ProDock a game-changer in live multi-cam production.

So I thought I’d get these thoughts out and share them all with you. You never know, someone at Blackmagic may see this and think these ideas are worth something for the Mk2 or a separate new product aimed purely at live production. We can dream, can’t we?

1. Networking: The Power of Ethernet

I’m hugely disappointed that the ProDock doesn’t include a dedicated Gigabit Ethernet port. This could have opened up a number of truly revolutionary opportunities, especially when pairing the camera with an ATEM switcher (which, let’s be real, is Blackmagic’s ecosystem goal anyway).

Camera Control and Tally

First and foremost, a simple Ethernet connection would enable camera control and Tally from the switcher. Imagine this: you have the iPhone 17 Pro on a tripod, connected to the ProDock, and a single Ethernet cable running back to your control room where the ATEM lives.

  1. Camera Control: The switcher operator (the Shader) could remotely adjust the camera’s settings—Iris (for exposure), Focus, White Balance, and Shutter speed. This ability to instantly and remotely match the look of multiple cameras is what separates a professional live broadcast from a shaky amateur production. It means the camera operator can focus purely on composition, while a dedicated shader handles the technical matching.
  2. Tally: This is so basic, yet so vital. Tally is the little red light that turns on when a camera is live on air (the Program feed) or on preview (the Preview feed). It tells the camera operator, “Hey, everyone is looking at you, don’t pick your nose!” Without it, the operator is flying blind, and that little red light is a sanity saver on set. Embedding Tally data in an Ethernet connection is cheap, easy, and essential for a true professional tool.
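
And to show just how little data Tally actually needs, here’s a tiny self-contained sketch of the logic (deliberately not using any real ATEM API, just the decision a switcher has to communicate to each camera):

```python
# Tally logic in a nutshell: each camera needs at most two bits of information.
# A self-contained sketch of the logic, not code for any real switcher API.
def tally_states(num_cameras: int, program: int, preview: int) -> dict[int, str]:
    states = {}
    for cam in range(1, num_cameras + 1):
        if cam == program:
            states[cam] = "RED (live on air)"
        elif cam == preview:
            states[cam] = "GREEN (up next)"
        else:
            states[cam] = "off"
    return states

# Camera 2 is on air, camera 3 is queued up on preview:
for cam, state in tally_states(4, program=2, preview=3).items():
    print(f"Camera {cam}: {state}")
```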

ST 2110 IP Video Streams: The Future of Broadcast

But let’s go one step further and get really technical and exciting. If Blackmagic was really thinking ahead, that Ethernet port could open up ST 2110 IP video streams.

Now, if you’ve never heard of ST 2110, you’re not alone. It’s the new backbone of high-end broadcast facilities, designed to replace the old SDI (Serial Digital Interface) standard that has dominated TV studios for decades. Think of SDI like a big, thick water pipe that carries everything—video, audio, and all the metadata (like timecode and control)—all mixed together in one stream. It works, but it’s inflexible.

ST 2110 (a suite of standards developed by the SMPTE) is different. It breaks the video signal down into separate essence streams over a standard IP network:

  • Video (ST 2110-20)
  • Audio (ST 2110-30, based on AES67)
  • Ancillary Data (ST 2110-40, which is where your Tally and control data lives)

Each of these three elements is sent as its own independent stream, all perfectly synchronised using a master timing signal called PTP (Precision Time Protocol), which locks everything down to the microsecond. This is why this system is so robust and flexible. The network can route and multicast these streams anywhere instantaneously. Turning the iPhone into a native ST 2110 source would make it instantly compatible with high-end broadcast infrastructure, putting it in the same league as cameras costing tens of thousands of pounds.
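
The PTP handshake itself is surprisingly simple maths. Here’s a simplified sketch of the classic offset-and-delay calculation, with made-up timestamps just to show the arithmetic:

```python
# Simplified PTP maths: master and device exchange timestamped messages and the
# device works out both the link delay and its clock offset from the master.
# t1: master sends Sync, t2: device receives it,
# t3: device sends Delay_Req, t4: master receives it.
def ptp_offset_and_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) - (t4 - t3)) / 2     # how far the device clock is ahead
    delay = ((t2 - t1) + (t4 - t3)) / 2      # one-way network delay
    return offset, delay

# Timestamps in microseconds (made-up numbers, purely to show the arithmetic):
offset, delay = ptp_offset_and_delay(t1=1000.0, t2=1012.0, t3=1020.0, t4=1028.0)
print(f"offset = {offset} µs, path delay = {delay} µs")
```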

Mini NAS Functionality: Sharing is Caring (and Faster!)

And finally on the networking front: turning the iPhone into a mini NAS (Network Attached Storage).

The fact that the ProDock has two fast USB 3.2 ports for external SSDs is great, but getting that footage to the editors still involves physically unplugging a drive and carrying it across the studio. It’s clunky and prone to human error (i.e., me dropping it). If the ProDock had an Ethernet port and could host those attached SSDs as a network share, video editors could be downloading, reviewing, or even editing the footage in real time while the camera is still shooting (or at least, immediately after the shoot wraps). No more “sneaker-netting” the drives around the building!
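
As a crude illustration of how low the bar is, here’s the world’s simplest stand-in for that idea: sharing a mounted SSD folder over HTTP with nothing but Python’s standard library. A real mini-NAS mode would obviously use SMB or NFS, and the mount path here is hypothetical:

```python
# The crudest possible stand-in for "footage on the network": serve the mounted
# SSD directory over HTTP. A real mini-NAS mode would use SMB/NFS, but the
# workflow win is the same. The mount path below is made up.
import functools
import http.server
import socketserver

FOOTAGE_DIR = "/Volumes/SHOOT_SSD_01"   # hypothetical mount point for the attached SSD
PORT = 8080

Handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory=FOOTAGE_DIR)
with socketserver.TCPServer(("", PORT), Handler) as httpd:
    print(f"Editors can browse this machine on port {PORT} to pull dailies")
    httpd.serve_forever()
```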

2. Storage: Why M.2 is King

I mentioned external storage is necessary, but if we’re designing the perfect Mk2, I would scrap the external USB SSDs and integrate a proper slot in the ProDock for an M.2 NVMe SSD.

M.2 drives are tiny, wafer-thin sticks of storage, and NVMe (Non-Volatile Memory Express) is the protocol that allows them to communicate with the CPU at warp speed. They are significantly faster and much more compact than their traditional external SSD cousins (which usually require a full metal enclosure and a cable that can, and often will, fail at the worst possible moment).

An integrated M.2 slot would have been an amazing way to expand the recording space for the iPhone in a clean, secure, and streamlined manner. No more dangling cables. The drive is protected inside the dock, and it removes a major point of failure—the connection cable itself. Plus, if you pair that integrated storage with the network connectivity I just mentioned, you could literally have this unit functioning as a mini NAS all the time.

3. Video I/O: The Unbreakable Backbone of 12G SDI

I think by leaving out 12G SDI In & Out from the ProDock, Blackmagic really missed a trick here—and this is probably my biggest, most genuine disappointment.

Most serious production houses, broadcast facilities, and even high-end film production units are going to be using SDI rather than HDMI. HDMI is fantastic for consumer devices, computer monitors, and cheap on-set viewers, but it’s fragile; the connectors are weak, the cables are limited in length (you usually can’t go more than 50 feet without a booster), and it doesn’t have a secure, locking mechanism.

SDI (Serial Digital Interface), on the other hand, is the workhorse of the industry. It uses a BNC connector that locks into place with a twist—tighter than my jeans after Christmas dinner, I tell you. More importantly, you can run SDI signals for hundreds of feet on a simple coax cable without a repeater, and it’s incredibly resilient to interference.

Why 12G?

We need 12G SDI because that is the standard needed to carry a full 4K 60p video signal over a single cable. Leaving it out means anyone trying to integrate the iPhone into a professional flypack or OB (Outside Broadcast) truck has to put another expensive converter right after the ProDock, adding clutter, complexity, and another potential failure point.
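
The bandwidth maths backs this up. Assuming the usual broadcast flavour of UHD (10-bit 4:2:2), the payload alone blows past what 3G or 6G SDI can carry on a single link:

```python
# Why "12G"? Rough payload maths for UHD 60p, assuming 10-bit 4:2:2 video
# (the usual broadcast flavour), before sync/blanking overhead is added.
width, height, fps = 3840, 2160, 60
bits_per_pixel = 20          # 10-bit luma + 10 bits of shared chroma in 4:2:2
payload_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Active picture alone: ~{payload_gbps:.1f} Gb/s")   # ~10 Gb/s
# Add blanking and you're well past what 3G (3 Gb/s) or 6G SDI can carry on
# one cable -- hence 12G SDI for single-link UHD 60p.
```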

Closing the Loop: Camera Control and Tally Return

But here is the absolute game-changer that 12G SDI In & Out enables, which HDMI cannot: The return path for Camera Control and Tally.

I mentioned that the ProDock has Genlock In and Timecode In via BNC. But guess what? A single SDI cable, when connected to a Blackmagic ATEM switcher (or many other professional switchers), can carry all the necessary signals:

  1. Video Out: The ProDock sends its clean 4K signal out to the switcher via SDI Out.
  2. Return Data (In): The switcher then sends its Program video (what’s on air), Talkback audio (for the camera operator’s headset), Tally data, and Camera Control Unit (CCU) data back to the camera over the same SDI cable via the SDI In port.

This is the holy grail of professional live multi-cam. It means your camera operator can hear the director, see the Tally light up, and the shader can control the look of the camera remotely—all through one clean cable run. By only providing BNC In for Genlock/Timecode (which is an analogue signal) and not a full SDI In/Out loop, Blackmagic forces professionals to rely on HDMI or an external converter, which defeats the purpose of making it a truly “Pro” Dock.

I really think these three changes could make the ProDock a game-changer in multi-cam production, especially for groups with a low budget such as schools or colleges, community groups, or churches. When you’re dealing with volunteers and limited funds, having a self-contained, high-quality, and synchronised camera system that works seamlessly with affordable switchers like the ATEM Mini line would be revolutionary.

The Low-Budget Dream: Cinematic Glass and Digital Jiggery Pokery

So let’s stop dwelling on what’s missing and talk about the low-budget dream that is still possible, but is about to get a whole lot cooler with a little extra gear.

If you want to take your cinematic look to the next level—moving beyond sharp focus everywhere (which is what phones usually give you) to that dreamy, buttery background blur known as bokeh—you need to look at Beastgrip’s Beastcage & DOF Adapter MK3.

This combination allows you to pair your iPhone with any Canon EF mount lenses you already have (or, better yet, any of the gorgeous, cheap, vintage manual lenses you can find on eBay with the right adapter ring).

The Magic of the Ground Glass

The DOF (Depth of Field) Adapter is where the real magic happens. It doesn’t physically link the lens to the iPhone sensor; instead, it projects the image from the larger DSLR lens onto a tiny piece of frosted glass, called a ground glass or focusing screen, which sits inside the adapter. The iPhone’s camera then focuses on this projected image.

The results are stunning:

  1. Shallow Depth of Field: Because the image is projected through a large-aperture lens (Beastgrip recommends fast glass like f/1.2 to f/2.8), the background melts away beautifully, drawing the viewer’s eye exactly where you want it. It instantly gives you that “classic film” aesthetic.
  2. Character: Using vintage prime lenses (like an old Helios 44-2, if you know, you know!) brings unique characteristics like lens flares, swirling bokeh, and specific colour shifts that modern, clinically perfect phone optics just can’t replicate.

It’s important to remember that using the DOF Adapter requires two things: a fast full-frame lens and manual focus control. Canon’s newer STM lenses, for example, won’t work because they rely on the camera body’s electronics to move the glass. You need a mechanical focus ring. The entire process becomes wonderfully tactile and challenging: you have to manually pull focus on the lens itself, often relying on the iPhone’s little screen (or ideally the HDMI-out monitor) to nail the focus perfectly.
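
To put some numbers on why pulling focus on fast glass is so unforgiving, here’s a rough depth-of-field estimate using the standard thin-lens approximation (illustrative figures, not anything from Beastgrip’s spec sheet):

```python
# Quick-and-dirty depth of field estimate (thin-lens approximation, subject
# distance well short of hyperfocal). Numbers are illustrative only.
def total_dof_mm(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Approximate total depth of field; coc_mm ~0.03 mm for a full-frame image."""
    return 2 * f_number * coc_mm * subject_mm**2 / focal_mm**2

subject = 2000  # subject two metres away
for f_number in (1.8, 2.8, 8.0):
    dof_cm = total_dof_mm(50, f_number, subject) / 10
    print(f"50mm at f/{f_number}: ~{dof_cm:.0f} cm of usable focus")
# Roughly 17 cm at f/1.8 versus nearly 80 cm at f/8 -- which is why pulling
# focus by hand on fast glass is both beautiful and terrifying.
```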

The Motorised Control Fantasy

Now, combining the iPhone 17 Pro + ProDock + Beastgrip adapter is where the ultimate dream of low-cost broadcast gets a bit of “jiggery pokery.”

If we had that essential 12G SDI In/Out or the Ethernet connection I wish for, and the Beastgrip had a large manual lens attached, you could potentially (with a bit of jerry-rigging and ingenuity) control motorised lenses from the ATEM, provided camera control was available across either the SDI or Ethernet links.

Think of it: there are existing motor systems (like the Nucleus-Nano or various low-cost follow-focus motors) that can be clamped onto the focus and iris rings of the EF lens on the Beastgrip. If the ATEM Camera Control data (which normally tells a Blackmagic URSA to move its internal motors) could be translated into a command that tells those external motors to spin, you would have a fully remote-controllable cinematic camera rig powered by an iPhone! The shader could remotely adjust focus and iris from the control room using the ATEM panel, letting the camera operator concentrate entirely on keeping the phone steady and the composition perfect. That is a $50,000 professional feature achieved with an iPhone, a $325 dock, and a few hundred quid of accessories. The thought of it gives me shivers.
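
For what it’s worth, the “translation layer” I’m imagining could be as simple as this entirely hypothetical sketch: take a normalised focus value of the sort a camera control protocol might carry and turn it into a step target for an external follow-focus motor. Every name and number here is invented:

```python
# Entirely hypothetical translation layer: map a normalised focus value
# (the kind of 0.0-1.0 figure a CCU protocol might carry) onto a step target
# for an external follow-focus motor. Every number here is made up.
def focus_to_motor_steps(focus_normalised: float,
                         steps_at_close_focus: int = 0,
                         steps_at_infinity: int = 4200) -> int:
    focus_normalised = max(0.0, min(1.0, focus_normalised))
    span = steps_at_infinity - steps_at_close_focus
    return steps_at_close_focus + round(focus_normalised * span)

# Shader nudges the focus control to 0.37 on the panel -> motor target:
print(focus_to_motor_steps(0.37))   # 1554 steps on this imaginary motor
```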

The University Nightmare: A Ghost Story of Dropped Frames

I honestly think these three small things—Ethernet, M.2, and 12G SDI—could make an amazing difference. And I know this because I lived the nightmare of trying to produce multi-cam live streams using only mobile phones and consumer tech when I was at university.

We were trying to cover various events across campus, from theatre performances to sports matches, and the budget was, shall we say, non-existent. We had three phones (not even all the same brand, which was a nightmare for colour matching) and we were using a piece of software called MIMOLive on a single laptop to act as our switcher.

The setup was a mess:

  1. The Cameras: Three different phones, each streaming their video feed back to the laptop over the campus Wi-Fi network.
  2. The Control Room: One laptop running MIMOLive to mix the feeds and add titles.
  3. The Internet: The same laptop was then simultaneously streaming the final mixed program out to YouTube, also using the campus Wi-Fi.

The experience was, quite frankly, a disaster—a high-stakes game of “will the Wi-Fi drop out this time?” The connection between the cameras and the laptop was constantly battling for bandwidth with every other student on the network. We’d get random spikes in latency, where one camera feed would suddenly freeze for a terrifying second before jumping forward, throwing the audio-video sync completely out of whack. The laptop’s single CPU was trying to juggle decoding three simultaneous H.264 video streams, mixing them in real time, rendering motion graphics (which were slow and clunky), and encoding the final stream for YouTube.

It was just too much for the computer to handle. The fans sounded like a jet engine preparing for take-off, the frame rate dropped constantly, and the whole stream was unstable, occasionally just dying with a polite little error message that made me want to scream. We spent half our time manually restarting the camera apps, hoping the timecode wasn’t too far off for the final edit. It was proof that while the idea of mobile multi-cam is great, the infrastructure simply wasn’t there yet in a consumer package.

The Dream Scenario Retold

But let’s imagine that same university scenario with the hypothetical ProDock Mk2:

  • The Cameras: Three iPhone 17 Pros in Beastcages with the DOF Adapter and great vintage lenses, all connected to the ProDock Mk2.
  • The Connection: Instead of Wi-Fi, three long, simple SDI cables (or even better, a single Ethernet cable running ST 2110 IP) go from each ProDock directly into a small ATEM Television Studio Pro HD switcher in the control room.
  • The Control Room: The ATEM handles all the mixing and frame synchronisation (thanks to the Genlock input!), taking the massive processing load off the computer. The shader uses a separate control panel to perfectly match all three cameras’ focus and iris remotely.
  • The Graphics & Stream: A dedicated computer is used purely to provide high-quality live graphics to the ATEM via an HDMI input. A separate streaming device (a HyperDeck Studio, perhaps, or a dedicated streaming encoder) takes the clean program output from the ATEM and sends it online.

This modular setup would have made the whole thing dramatically more stable, allowed for higher quality graphics and better camera work, and saved my sanity (and the audience’s patience!). This is something I have waited a very long time for—the convergence of high-end camera features with proper broadcast infrastructure at an affordable price point.

It looks like the wait is going to be a little longer, unfortunately. But the iPhone 17 Pro and the current ProDock have laid the essential groundwork. All Blackmagic has to do now is close that final loop and give us the connectivity that professional, low-budget, multi-cam productions desperately need.

What features do you think Blackmagic should prioritise in the next iteration of the ProDock? Is it the 12G SDI, the Ethernet, or something completely different that I haven’t even thought of? I’d love to hear your thoughts, so drop a comment on the post that brought you here!
