Hi all, really cool that you have taken an interest in this project, a lot of your comments below are very insightful and interesting. I work on the team that deploys this tech on set. We focus on how the video signals get from the video output servers to the LED surfaces, coordinating how we deal with tracking methods, photogrammetry, signal delivery, operator controls and the infrastructure that supports all of this. As noted in some of the comments, the VFX industry solved tracking with mocap suits a long time ago for the post-production process. What we are exploring now is how we can leverage new technology, hardware, and workflows to move some of the post-production processes into the pre-production and production (main/secondary unit photography) stages.<p>If you are interested in checking out another LED volume setup, my team also worked on First Man last year. This clip shows a little more of how we assemble these LED surfaces as well as a bit of how we use custom integrations to interface with mechanical automation systems. [<a href="https://vimeo.com/309382367" rel="nofollow">https://vimeo.com/309382367</a>]
Here is the Unreal Engine tech they are using: <a href="https://www.unrealengine.com/en-US/spotlights/unreal-engine-in-camera-vfx-a-behind-the-scenes-look" rel="nofollow">https://www.unrealengine.com/en-US/spotlights/unreal-engine-...</a> . This is a video of it in action: <a href="https://www.youtube.com/watch?v=bErPsq5kPzE&feature=emb_logo" rel="nofollow">https://www.youtube.com/watch?v=bErPsq5kPzE&feature=emb_logo</a> .
This is super interesting stuff and I've been following it for some time. I just wrote it up with a bit more context:<p><a href="https://techcrunch.com/2020/02/20/how-the-mandalorian-and-ilm-invisibly-reinvented-film-and-tv-production/" rel="nofollow">https://techcrunch.com/2020/02/20/how-the-mandalorian-and-il...</a><p>It's not just ILM and Disney either, this is going to be <i>everywhere</i>. It's complex to set up and run in some ways but the benefits are enormous for pretty much everyone involved. I doubt there will be a major TV or film production that doesn't use LED walls 5 years from now.
"The solution was ... a dynamic, real-time, photo-real background played back on a massive LED video wall and ceiling ... rendered with correct camera positional data."<p>Gee, that sounds a lot like a holodeck. We've come a long way from using Wii Sensor Bars[0] for position tracking.<p>[0] <a href="https://www.youtube.com/watch?v=LC_KKxAuLQw" rel="nofollow">https://www.youtube.com/watch?v=LC_KKxAuLQw</a>
I wonder how the photogrammetry aspect will intersect with intellectual property laws. The example used - scanning in a Santa Monica bar so that you can do reshoots without revisiting the location - would be an obvious example that might raise someone's hackles ("because it's our bar you're using to make your money" for instance). If you add that bar to your digital library, do you have to pay them royalties each time you use it? Is it any different to constructing a practical replica of a real-life location?<p>Can someone wearing a few cameras walk through a building and digitise it completely without getting the owner's permission? Here in Australia, "copyright in a building or a model of a building is not infringed by the making of a painting, drawing, engraving or photograph of the building or model or by the inclusion of the building or model in a cinematograph film or in a television broadcast," for instance. (Copyright Act 1968 §66)
Fascinating, but this article needs a proofread, damn...<p>"The virtual world on the LED screen is fantastic for many uses, but obviously an actor cannot walk through the screen, so an open doorway doesn't work when it's virtual. Doors are an aspect of production design that have to be physical. If a character walks through a door, it can’t be virtual, it must be real as the actor can’t walk through the LED screen."<p>Not to mention the multiple paragraphs that are basically re-stated immediately afterwards. It's like they hit publish in the middle of editing.
The Mandalorian was probably a very likely candidate for this kind of approach, since it's essentially a western, meaning a lot of wide landscape shots.<p>The LED screen approach works nicely for fairly uncomplicated background geometry, like a plain. Try shooting Spiderman climbing up walls on that, and things will get tricky fast.<p>As the article notes, slow camera moves are a plus as well. The reason given is technical, but I also wonder how far you could really push the camera motion even if tracking lag wasn't an issue. The background is calculated to match the camera's viewpoint, so I expect it would be very disorienting for the actors if the camera was moving at high speeds.
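The "background calculated to match the camera's viewpoint" part is usually done with an off-axis (generalized) perspective projection: the wall is treated as the projection plane and the frustum is recomputed every frame from the tracked camera position. A minimal sketch of that math, with made-up wall dimensions and camera position (not the actual production pipeline):

```python
import numpy as np

def offaxis_frustum(eye, lower_left, lower_right, upper_left, near=0.1):
    """Near-plane frustum bounds for a tracked camera looking at a flat wall.

    Standard generalized perspective projection: the wall itself is the
    projection plane, so the rendered image stays geometrically correct
    from the camera's position as it moves.
    """
    # Orthonormal basis of the wall plane
    vr = lower_right - lower_left                      # right axis
    vu = upper_left - lower_left                       # up axis
    vr, vu = vr / np.linalg.norm(vr), vu / np.linalg.norm(vu)
    vn = np.cross(vr, vu)                              # normal, toward camera

    # Vectors from the camera to the wall corners
    va = lower_left - eye
    vb = lower_right - eye
    vc = upper_left - eye

    d = -np.dot(vn, va)                                # camera-to-wall distance
    # Frustum extents projected onto the near plane
    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d
    return left, right, bottom, top

# Hypothetical 4 m x 2.5 m wall; camera 3 m back, 1 m right of center
eye = np.array([1.0, 1.25, 3.0])
l, r, b, t = offaxis_frustum(
    eye,
    np.array([-2.0, 0.0, 0.0]),   # wall lower-left corner
    np.array([ 2.0, 0.0, 0.0]),   # wall lower-right corner
    np.array([-2.0, 2.5, 0.0]),   # wall upper-left corner
)
# An off-center camera yields an asymmetric frustum (left != -right)
```

Feeding those bounds into the renderer's projection matrix each frame is what makes the flat wall read as a window into the 3D scene, but only from that one tracked position.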
"Once Upon a Time" (2011-2018, with Robert Carlyle as Rumplestiltskin!) was shot on virtual chroma-keyed sets with real time integrated pipeline tools to preview how it would look.<p><a href="https://en.wikipedia.org/wiki/Once_Upon_a_Time_(TV_series)" rel="nofollow">https://en.wikipedia.org/wiki/Once_Upon_a_Time_(TV_series)</a><p>The tech behind Once Upon a Time’s Frozen adventures<p><a href="https://www.fxguide.com/fxfeatured/the-tech-behind-once-upon-a-times-frozen-adventures/" rel="nofollow">https://www.fxguide.com/fxfeatured/the-tech-behind-once-upon...</a><p>Once Upon a Time” TV Series VFX Breakdown<p><a href="https://web.archive.org/web/20180623020817/http://www.animationboss.net:80/once-upon-a-time-tv-series-vfx-breakdown/" rel="nofollow">https://web.archive.org/web/20180623020817/http://www.animat...</a>
For those wondering, this appears to be not nearly as expensive as I thought. The 20" square panels used are available for around $1000 each if you buy a lot of them used. Compared to a typical production budget for a high-quality franchise, it's surprisingly cheap to build one of these walls. The equipment to run it, on the other hand, is likely not cheap at all.
Sony had a big demo of this in their CES booth a few weeks ago. They showed the camera moving around the Ghostbusters' Ecto-1 car and the background moving as well. You could see from the overhead screens what the final composite looked like. It was pretty awesome, given it was all set up in a booth at a trade fair. [1]<p>As expensive as all this is now, I think this is really going to make an impact in lower-budget movies. Not having to fly all over the world or build massive sets to film convincing scenes might be a really good thing.<p>1. <a href="https://www.youtube.com/watch?v=kh2Q9pCxuJw" rel="nofollow">https://www.youtube.com/watch?v=kh2Q9pCxuJw</a>

Funny that this is practically the same concept as shooting in a studio with the exterior background painted on the walls, as was done in old movies. The progress is only in the technology: now the backdrop is created by a game engine and displayed on giant LED panels, whereas back in the 1930s it was painted by hand.
This reminds me of The Mill BLACKBIRD - <a href="https://www.youtube.com/watch?v=OnBC5bwV5y0" rel="nofollow">https://www.youtube.com/watch?v=OnBC5bwV5y0</a>
"Postproduction was mostly refining creative choices that we were not able to finalize on the set in a way that we deemed photo-real."<p>Does anyone know how they were able to swap out the in-camera version of the background originally shown on the LED wall with something more convincing later? Seems like it'd be tough since it's not a green screen!
I think the demise of Lytro was a huge missed opportunity for the film industry. They had this and a number of other features in their cinema camera before they became defunct a few years ago.<p><a href="https://www.youtube.com/watch?v=4qXE4sA-hLQ" rel="nofollow">https://www.youtube.com/watch?v=4qXE4sA-hLQ</a>
I've had a peripheral interest in virtual sets and real-time compositing by way of a colleague from grad school.<p>A quick visual summary of this tech: <a href="http://www.lightcrafttech.com/portfolio/beauty-beast/" rel="nofollow">http://www.lightcrafttech.com/portfolio/beauty-beast/</a><p>This video was from a pilot several years ago, and it didn't make it to air, but it was visually wonderful.
This seems slightly limited in its current form in that they have to choose either to shoot the rendered background as-is (making it harder to modify in post-production) or to turn the areas around the actors into green screens (sort of defeating the purpose).<p>I wonder if they could use some sort of trick like projectors synced with high-fps cameras to make the real-time rendering invisible to the cameras instead?
The finest LED panel pixel pitch I've seen to date is 0.7mm...wouldn't this result in a lower-resolution capture of the projected background, specifically when they're trying to shoot at 4K or above? Also, how do they sync the scan rate of the background video being played back with the camera recording the footage?
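Rough back-of-envelope on the pitch question (my numbers, not from the article): at 0.7 mm pitch, the wall only out-resolves a 4K-wide sensor while the stretch of wall filling the frame is wider than about 2.9 m:

```python
# Back-of-envelope: how wide a stretch of LED wall must fill the frame
# before a 0.7 mm pixel pitch matches a 4K-wide (4096 px) sensor.
pitch_mm = 0.7
sensor_width_px = 4096

min_wall_width_m = pitch_mm * sensor_width_px / 1000
print(min_wall_width_m)  # ≈ 2.87 m
```

Frame any tighter than that and individual LED pixels start to resolve (and moiré with the sensor grid), which may be part of why these volumes favor wide shots with the wall held out of focus.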
This technique will produce potentially significant rendering artifacts in the final image. The backdrop is correct only from the position of the camera. A reflection from any other surface will not be geometrically correct (as seen in the image from the article). I suspect even the ambient lighting would contain noticeable distortions.
So I guess the next step will be movies made entirely inside virtual-reality environments, with actors in mocap suits playing their virtual avatars, right?
I don't care how "groundbreaking" the graphics pipeline is. I watched a couple of episodes and had to force myself to keep watching to, I don't know, give it a chance?<p>I wonder when The Industry will figure out that the story is more important than the graphics. You don't buy books for a beautiful cover and typesetting... at least not most of you, and not most of the time.
Please send me a link to someone who watched the original Star Wars, the later trilogy, and finally the last “attempt”, and really appreciated it.
I even watched “Rogue One” on the biggest screen available around me, with high expectations, and I feel really sad because of that.