Thesis Blog Part 1 | Figuring Out the Photogrammetry Pipeline

Work In Progress / 18 January 2019

Hey guys, I'm currently developing my thesis project, an Environment Look Development project based on Balinese culture. I'm originally from Bali, so I took the chance over the break to do some photogrammetry for the project. While I did learn the basic workflow during my internship at Turn 10 Studios, this is my first time dealing with it from scratch on my own. In addition, not many students at SCAD are currently proficient at developing photogrammetry-based assets, so it's a challenge for me to figure out the pipeline for my thesis. Let's get started!

1. Scanning the Object

Before we start aligning images, it's best to begin by creating a Color Profile. X-Rite provides free ColorChecker software that allows us to extract the profile data from DNG images. One of the best practices is to always bring a color checker with you when working on a photogrammetry project.

Now that we have the color profile data, we can batch apply it using Adobe Bridge. Just open one image through the Camera Raw plugin and copy its attributes to the rest of the shots. This lets us apply the same color correction everywhere without doing it manually, one image at a time.

Finally, we can start turning it into 3D. I chose Reality Capture as it provides faster iteration and a less noisy mesh compared to Agisoft PhotoScan. Working on a laptop also gives Reality Capture more of an advantage in flexibility over Agisoft.

Another benefit of using Reality Capture over Agisoft is that the UI is much easier to navigate. A nice dark-themed UI, along with in-program tutorials, made me love Reality Capture right away. It took me around an hour from aligning the images to exporting the high-poly mesh from the software.

Not to talk badly about Agisoft, but it would definitely have taken me a lot longer if I had used it over Reality Capture. In the end, speed trumps everything in production.


2. Retopologizing the Scans

With all the scanning work done in Reality Capture, it's time to retopo the mesh. Working with such a high-poly mesh is a nightmare, especially if you're using Quad Draw like me. Fortunately, I found this cool trick by FlippedNormals about GPU Cache in Maya.


Importing the scanned data itself took me around 10 minutes (sped up for the GIF). Even after the mesh was imported, navigating in Maya was really difficult. But for this trick to work, we still need to import the mesh before we begin anything.

Now that we have the mesh imported, it's time to export it as a GPU Cache. Go to the Cache menu and choose GPU Cache to export the selected mesh as an Alembic-based cache.

All you need to do now is re-import the cache back into Maya and start retopologizing over it. As you can see, the cache doesn't add any poly count to your scene while still preserving the high-poly detail from the scanned asset. This way, Maya is a lot friendlier to navigate and use.
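If you prefer scripting this step, here's a rough Maya Python sketch of the same export and re-import; the mesh name and cache path are just placeholders:

```python
# Rough Maya Python sketch of the GPU Cache trick (mesh name and paths are placeholders).
import maya.cmds as cmds

cmds.loadPlugin('gpuCache', quiet=True)  # the exporter lives in the gpuCache plugin

# Export the high-poly scan as an Alembic-based GPU cache.
cmds.gpuCache('scan_highpoly',
              startTime=1, endTime=1,   # static mesh, single frame
              optimize=True,
              writeMaterials=True,
              dataFormat='ogawa',
              directory='D:/thesis/cache',
              fileName='scan_highpoly_gpu')

# Re-import it as a lightweight gpuCache node to retopo against.
cache_node = cmds.createNode('gpuCache', name='scan_highpoly_gpuCacheShape')
cmds.setAttr(cache_node + '.cacheFileName',
             'D:/thesis/cache/scan_highpoly_gpu.abc', type='string')
```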


3. Baking and Re-texturing the Scanned Asset

Once I was done with the retopo, I imported it into Substance Painter to bake the textures. I prefer doing this over re-importing it back into Reality Capture, as I'll still need to do another texture pass on top of the scanned data, mainly for the parts that I couldn't scan well enough. The diffuse map is baked from the vertex color on the high-poly mesh, the roughness/specular map is extracted from the diffuse's green channel in Nuke, and the rest are baked directly in Substance.
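For the roughness extraction specifically, here's a minimal Nuke Python sketch of the idea; the file paths are placeholders and the grade values are eyeballed per asset:

```python
# Minimal Nuke Python sketch: derive a roughness map from the diffuse green channel.
# Paths and grade values are placeholders.
import nuke

read = nuke.nodes.Read(file='D:/thesis/textures/scan_diffuse.exr')

# Copy the green channel into R, G and B so we get a grayscale base for roughness.
shuffle = nuke.nodes.Shuffle(red='green', green='green', blue='green', alpha='white')
shuffle.setInput(0, read)

# Adjust the levels to taste before writing the map out.
grade = nuke.nodes.Grade(white=0.8, gamma=1.2)
grade.setInput(0, shuffle)

write = nuke.nodes.Write(file='D:/thesis/textures/scan_roughness.exr', file_type='exr')
write.setInput(0, grade)
```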

Here's a screenshot of how I work with layering the materials. I also created a metallic mask in Photoshop to give me more control when shading the mesh later. Once satisfied, I just exported everything using the Packed UE4 preset from Painter. This is one of my main reasons for going straight to Substance for texturing: as we know, Substance Painter supports Unreal Engine 4, even though it uses a different shading model than what we use in offline render engines.
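If you ever need to pack the channels by hand instead of relying on the preset, the idea is just merging the grayscale maps into a single RGB texture (occlusion, roughness and metallic in R, G and B). A rough Pillow sketch with placeholder file names:

```python
# Rough sketch of UE4-style channel packing with Pillow (file names are placeholders).
from PIL import Image

occlusion = Image.open('scan_occlusion.png').convert('L')
roughness = Image.open('scan_roughness.png').convert('L')
metallic  = Image.open('scan_metallic.png').convert('L')

# R = occlusion, G = roughness, B = metallic, matching the packed ORM convention.
packed = Image.merge('RGB', (occlusion, roughness, metallic))
packed.save('scan_ORM.png')
```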

4. Look Development in Unreal Engine 4

Setting up the material in Unreal is quite tricky. The default material doesn't give us much freedom in creating a good 'render'. So I've created a Master Material that lets me at least adjust the intensity of each map, along with adding texture variation. The next thing I'll do is build an overlay setup for grunge to add more detail.
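As a rough illustration of the instancing side, here's how Material Instances of a master material like this can be created and tweaked from the editor's Python console; the asset paths and parameter names (like 'RoughnessIntensity') are placeholders, not the actual names in my material:

```python
# Rough UE4 editor Python sketch: create a Material Instance of a master material
# and tweak a few exposed parameters. Paths and parameter names are placeholders.
import unreal

asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
parent = unreal.EditorAssetLibrary.load_asset('/Game/Thesis/Materials/M_ScanMaster')

instance = asset_tools.create_asset('MI_StoneCarving', '/Game/Thesis/Materials',
                                    unreal.MaterialInstanceConstant,
                                    unreal.MaterialInstanceConstantFactoryNew())

mel = unreal.MaterialEditingLibrary
mel.set_material_instance_parent(instance, parent)
mel.set_material_instance_scalar_parameter_value(instance, 'RoughnessIntensity', 1.2)
mel.set_material_instance_scalar_parameter_value(instance, 'DisplacementAmount', 0.005)

unreal.EditorAssetLibrary.save_loaded_asset(instance)
```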

Here's a screenshot of how the material tree looks:

This setup helps me create Material Instances quickly, with adjustable attributes for look development purposes. I've also added a tiny amount of displacement, 0.005 to be exact, just to give a small touch of depth. Enough with the blog post; here's the final render straight from Unreal Engine 4:

Thanks for reading my blog, I'll be updating it weekly as my thesis project progresses. 'Till next time!

Realtime Car Commercial Blog, Part 9. Wrapping Up

Work In Progress / 14 November 2018

Hi everyone, back to my real-time integration project blog! As the quarter draws to an end, it's time to wrap up this experimental project. For the past week I've been mostly focusing on fixing the compositing in Nuke while helping with the usual pipeline troubleshooting in Unreal.

After struggling to match the color of the plates, my professor, Bridget Gaynor, gave me some tips on color correcting the image using a color offset. This helped the mood match by a lot. My hardest struggle before was matching the contrast between the two plates, using grade, gamma and gain to no avail. By offsetting the shadow value by a small amount, I managed to match the color much better. Here's the final Nuke tree for the first shot:

The first two backdrops are for controlling the HDRI transition that I showed in last week's blog update, with the addition of a Z-depth pass for the final touch. Each pass was exported by my teammate, Antonio Gil, by hacking through Unreal Engine 4.
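To illustrate the offset trick Bridget suggested, here's a minimal Nuke Python sketch; the node names and values are placeholders, since the real numbers are eyeballed per shot:

```python
# Minimal Nuke Python sketch of the shadow-offset trick (names and values are placeholders).
import nuke

cg = nuke.toNode('CG_Beauty')            # the rendered car from Unreal
plate = nuke.toNode('LiveAction_Plate')  # the live action back plate

# Lifting the shadows with a small offset ('add') matched the plates better
# than fighting the contrast with lift/gamma/gain alone.
grade = nuke.nodes.Grade(add=0.012, gamma=1.05)
grade.setInput(0, cg)

merge = nuke.nodes.Merge2(operation='over')
merge.setInput(0, plate)   # B input: background plate
merge.setInput(1, grade)   # A input: graded CG
```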

The second shot is more complicated, especially with the transition FX happening in this sequence. I mostly worked on color correcting and reformatting the tree to make it look more pleasant, but my other teammate, Haley Jones, did most of the heavy lifting for this shot. She did a lot of roto, which helped me integrate it tremendously.

Last, but not least, the final shot's Nuke tree. This one is more straightforward as it doesn't have much to comp. We just merge the color-corrected beauty over the plate, and I added a color constant to match the feel of the previous shots.

So, without further ado, here's the final integrated video of our real-time car:

And with this, I'll end the project blog here. It has been a long roller coaster ride trying to build a VFX pipeline in Unreal Engine 4 from scratch. A huge shout out to my team members: Haley, Taizyd, and Antonio, for helping me make this project worthwhile. Do check out their websites for their amazing portfolios!

Realtime Car Commercial Blog, Part 8. Importing FX and Compositing

Work In Progress / 06 November 2018

Hey guys, welcome back to my real-time car commercial blog! At week 8, we finally get to work fully on compositing. Over the past week I've been setting up a new lighting rig inside Unreal Engine 4, getting a nicer rim light and more defined reflections on the car itself. In addition, I've attached several lights to the tail and head of the car. Here's a little preview of how the lighting rig looks in engine:

Now that we have proper lighting, it's easier to integrate the car into the back plate. I set up a lighting transition using two beauty passes with different HDRIs attached, making it feel like the car is moving into the tunnel. Here's how the first shot looks integrated in Nuke:

The car FX was handled by our talented tech artist Taizyd; you should check his blog on how he developed the shader FX! The glitch FX was inspired by Disney's Wreck-It Ralph, using shaders to move the mesh around. The transitioning HDRI was driven by a custom depth pass that we exported from Unreal. Here's how it looks in Nuke:

The transform nodes were used due to a mismatch between Unreal's aspect ratio and Nuke's. Thanks to the glitching FX, moving the translation value of the CG object wasn't a huge problem. The hardest part of the compositing comes from matching the values of the CG and the live action back plate. While we don't need to worry about render time, we've spent most of our time trying to figure out how to render out specific passes like the AOVs we get from offline renderers.
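The HDRI transition itself is conceptually just a mix of the two beauty passes driven by the custom depth matte. A rough Nuke Python sketch of that idea, with placeholder node names:

```python
# Rough Nuke Python sketch: blend two beauty passes using the custom depth matte.
# Node names are placeholders for the actual tree.
import nuke

beauty_outside = nuke.toNode('Beauty_HDRI_Street')
beauty_tunnel  = nuke.toNode('Beauty_HDRI_Tunnel')
depth_matte    = nuke.toNode('CustomDepth_Pass')

# Keymix picks between the two beauties per pixel based on the mask input,
# so the lighting appears to transition as the car drives into the tunnel.
keymix = nuke.nodes.Keymix()
keymix.setInput(0, beauty_outside)  # B input
keymix.setInput(1, beauty_tunnel)   # A input
keymix.setInput(2, depth_matte)     # mask input
```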

In addition, movable or dynamic objects can't use any static or baked lighting, at least in Unreal Engine 4. So we are very limited in terms of render quality, having no access to ray tracing at all. The only way we could fake it was by hacking through the engine to export the render passes. Another problem arises in shot 2, a panning shot where the big FX happens:

With so little light information in the plate, it becomes a very challenging task to integrate a real-time CG object into it. I tried brightening up the back plate to no avail; when I pushed the light value brighter, the plate looked grainy and didn't feel right. Thus, I decided to keep the dimmed back plate for the shot. We plan on adding an FX element that our tech artist made earlier in the project: a light streak FX inspired by The Speed of Light demo by Epic Games at SIGGRAPH 2018.

Well then, that's it for the update this week! We'll be wrapping up the project next week, so stay tuned for our experimental real-time integration project!

Realtime Car Commercial Blog, Part 7. Troubleshooting Fest

Work In Progress / 31 October 2018

Hi guys, welcome back to my weekly blog updates for the real-time car commercial project! As usual, the past week has been full of troubleshooting and development. We've been stuck with a camera mismatch between Maya, Unreal Engine 4 and Nuke. For some reason, the camera that was used to match the track in Maya and assembled in Unreal doesn't match the plate in Nuke. Thus, I went back and tried to re-track it in Maya.


I found that there was a slight miscalculation with the crop factor of the Blackmagic: the 22.00 x 11.88 mm sensor size we had been using has an aspect ratio of 1.85, while the rendered back plate has an aspect ratio of 1.78. After more research, I found that the official sensor size for the Ursa Mini is 25.34 x 14.25 mm, which rounds to 1.78 (16:9).
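Just to double-check the numbers, a quick sanity check (assuming a 16:9 plate such as 1920x1080):

```python
# Quick sanity check of the sensor aspect ratios mentioned above.
old_sensor = (22.00, 11.88)   # sensor size we had been using (mm)
ursa_mini  = (25.34, 14.25)   # official Ursa Mini sensor size (mm)
plate      = (1920, 1080)     # assumed 16:9 back plate resolution

for name, (w, h) in [('old', old_sensor), ('Ursa Mini', ursa_mini), ('plate', plate)]:
    print(name, round(w / h, 2))
# old 1.85, Ursa Mini 1.78, plate 1.78 -> the official sensor size matches the plate
```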

Thus, we exported that from Maya and hoped it would work in Unreal and Nuke. And guess what? Nope, that didn't change anything; it still doesn't match the camera in Nuke. At this point, we decided to just go with the previously working camera and ended up cropping the edges to make it look correct.

With the integration done by my teammate, Haley (check her blog updates to learn more about her compositing nodes), I went ahead and started a rough color grading pass. I tried making it look more cinematic with an orange and teal color palette, using simple Grade and ColorCorrect nodes in Nuke.

In addition, I tried another approach using a LUT in Premiere Pro. In this grading version, the warm tones keep their value, so it looks less dull than the Nuke one. I think moving forward I'll stick with doing post production in Premiere Pro or After Effects, as it's easier to control.

Another problem I noticed from this pass is the lack of lighting information that Unreal can render out as an image sequence. The car looks dull, especially after applying the color grading to the plate. The next big thing on my list is to rework the lighting, giving the car more depth and overall contrast. I found this tutorial online about how to light a night scene.

I realized that right now we're just brute forcing the scene with our HDRI, and it's not sufficient at all. By watching these videos on lighting a night scene for film, I understand that it takes several additional lights that aren't originally in the scene to create the effects we want.

So I added several lights to the scene in UE4 as a quick test. These additional lights help build contrast on the car and create another point of interest, since the engine of the F40 is uncovered. I'll continue building on this light setup and hopefully get it finalized by next week.

This is it for this week's update, see you next week!

Realtime Car Commercial Blog, Part 6. Refining the Real-time Pipeline

Work In Progress / 23 October 2018

Hey everyone, back again with my weekly updates on the car commercial project! As usual, I've been mostly dealing with R&D, especially since we don't have enough resources and knowledge regarding the real-time VFX pipeline. Over the course of the week, I've been trying to troubleshoot the problem with importing animation into Unreal.

One of the main issues I found was that Unreal has trouble understanding a skeletal mesh with multiple root points. Our car rigs have a secondary position root control, added by our rigger, and this ended up giving me several issues when importing the animation data into Unreal. In the end, we removed the secondary root control, and that fixed the animation issue.
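One way to sanity-check a rig before export is to count the root joints in the skeleton. This rough Maya Python sketch only looks at joints, so a stray control node parented above the skeleton would need a similar check of its own:

```python
# Rough Maya Python sketch: flag skeletons that have more than one root joint,
# since Unreal's FBX import expects a single root for a skeletal mesh.
import maya.cmds as cmds

roots = []
for joint in cmds.ls(type='joint') or []:
    parent = cmds.listRelatives(joint, parent=True)
    # A joint whose parent is not another joint is the root of its chain.
    if not parent or cmds.nodeType(parent[0]) != 'joint':
        roots.append(joint)

if len(roots) > 1:
    cmds.warning('Multiple root joints found: {}'.format(roots))
```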

Here's a gif showing the animation working inside Unreal Engine 4:

We still have several issues to fix, such as the lighting and reflection captures, which don't match my previous setup. While it seems like a tiny update this week, most of my time was spent researching how to import animation correctly. I tried using FBX, Alembic cache, and even animated geo cache, but nothing worked until I figured out the problem: the multiple root controls on the skeletal mesh.

In addition, here's a closer look at the Nuke tree from my previous compositing test. I'll work on finalizing the compositing with my teammate, Haley, this week.

I guess that's it for this week's update, see you next week!

Realtime Car Commercial Blog, Part 5. Camera Track Fix

Work In Progress / 16 October 2018

Hey guys, back to my real time commercial blog updates!

I've been trying to fix some camera issues that we found while using the cine camera in Unreal Engine 4. While the cine camera was created to mimic real-world cameras, I found that there is a focal length difference between the two. One thing my professor pointed out was the crop factor of the Blackmagic camera that we used. The Ursa Mini has a 1.7 crop factor, which translates to 1.7 x (real focal length). Here is an example of how that factors into the calculation:
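As a rough sketch of the math, with a made-up 35 mm lens (not our actual lens):

```python
# Rough sketch: applying the 1.7 crop factor to get the full-frame equivalent
# focal length to use when matching the camera. The 35 mm lens is a made-up example.
crop_factor = 1.7
real_focal_mm = 35.0

equivalent_focal_mm = crop_factor * real_focal_mm
print(equivalent_focal_mm)  # 59.5 -> the field of view behaves like a ~60 mm lens
```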

Now that I have the correct focal length information, I can finally finalize the camera match. I had one of my teammates extract the final camera track data from Nuke as an FBX, and I then matched the track in Maya using some basic geometry as a guide. This helped me see if there were any perspective or distortion issues with the camera. Here is the final previs for the transition FX from the Ferrari F40 to the LaFerrari:

In addition, I also extracted a few render passes to composite the CG car in Nuke. This test still uses the old camera data with the wrong focal length information, but I wanted to show that it's possible to extract render passes, such as Z-depth, shadow, occlusion, etc., from Unreal Engine 4. I set up these passes for the composite test, but I have another team member working on a proper render pass setup from UE4.

Here is the Nuke graph for tweaking the Z-depth pass. For those who aren't familiar with Nuke, the green area shows the in-focus part of the image, in this case the F40. While this is a small part of the compositing we'll be doing, it's great to finally see a real-time engine being used in a proper VFX pipeline.

That's it for this week, see you on the next update!

Realtime Car Commercial Blog, Part 4. Compositing Test

Work In Progress / 09 October 2018

Hey guys, back with my weekly blog update on my real time car commercial project!

I've been working on finalizing the F40 textures in Substance Painter and the final material lookdev in Unreal Engine 4. I've added subtle surface details to the car while still making it look pristine. Here's a screenshot of how it looks using Iray inside Substance:

Here's how it looks in Unreal Engine 4, rendered in real time using an updated studio light rig. I've also updated the car paint shader with a dark tone layer to give an additional fake occlusion pass, and the additional roughness map helps me recreate the wavy reflections of Ferrari's car paint. I've also updated the post process volume so it gives higher resolution AO, shadows and bounce light reflections.

I've also rendered an image-based lighting turntable using the HDRI we captured on River St. My biggest issue with the HDRI is the shadow quality shown in real time. Based on this result, the HDRI will most likely only be used for reflection capture on the car, and I will have to use Unreal's light system to get a proper shadow pass.

In addition to the refined shaders and textures, I've also started playing with compositing in Unreal. I tried to recreate the image plane projection setup that I usually use in Maya, using a plane and the cine camera in Sequencer. This setup works as a compositing test, but I'll need to update it after getting the final live action back plate.

One of the more important things for matching the camera track was getting the exact camera settings inside Unreal. After researching the Blackmagic Ursa sensor type along with the lens settings online, I came up with this data to use:

Lastly, I started studying the magic behind 'The Human Race', made by The Mill in 2017. The real-time commercial has been our biggest project inspiration, especially as our team was specifically put together to experiment with the possibilities of a real-time engine. One of the more noticeable parts of their setup was a plugin that feeds live action footage straight into Unreal, which looks similar to my own image plane setup.

A few things I noticed from the making-of video are some of the render passes they exported for compositing. Looking at their lighting setup, it seems they only use a directional light, which can be seen in the shadow pass. They also exported a simple reflection pass using a glass-like material on the ground plane, while most likely compositing the real-time HDRI captured by their massive Blackbird rig.

Another thing to note: they always seem to put motion blur on whenever the car is in front of the camera, limiting what we can see clearly in a close-up angle. They also had a harder time showing a photo-realistic render of the car in a dark area, especially the shot in the tunnel where you can see the limitations of real-time rendering, which they ended up coating with additional FX work to distract our eyes from the car itself.

This is by no means meant to disrespect what they did; they've shown an incredible achievement using a real-time engine. My main goal in studying their work is to set a realistic goal for our own project. While we are being mentored by people from The Mill, we don't have access to a workstation setup like the one they had when making the short film. In the end, I feel like we're heading in the right direction for the commercial project.

Well then, see you on the next blog update!

Realtime Car Commercial Blog, Part 3. Look Development

Work In Progress / 02 October 2018

Hey everyone, welcome back to the next update on my blog!

Continuing where we left off last week, we finally shot a real HDRI using a fisheye lens borrowed from my professor. We went to River Street at 3 am to give ourselves more time to prepare the camera. We shot 4 sequences with -/+4 EV brackets for each HDR and stitched them together using an equirectangular projection in PTGui. Here's one of the HDRIs we took.
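PTGui handled the stitching for us, but just to illustrate the exposure-merging half of the process, here's a rough OpenCV sketch for one bracketed view; the file names and exposure times are placeholders:

```python
# Rough OpenCV sketch: merge one bracketed sequence into an HDR image.
# File names and exposure times are placeholders; PTGui did the actual
# stitching of the views into the final equirectangular map.
import cv2
import numpy as np

files = ['bracket_-4ev.jpg', 'bracket_0ev.jpg', 'bracket_+4ev.jpg']
exposure_times = np.array([1/1000.0, 1/60.0, 1/4.0], dtype=np.float32)

images = [cv2.imread(f) for f in files]

# Recover the camera response curve, then merge the brackets into linear HDR.
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(images, exposure_times)

merge = cv2.createMergeDebevec()
hdr = merge.process(images, exposure_times, response)

cv2.imwrite('bracket_merged.hdr', hdr)
```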


Now that I have the HDR plates to work with, it's time to polish my Ferrari F40 materials. I've unwrapped the car model and assigned a total of 59 Material IDs. Since I know I'll be working back and forth between Substance and UE4, I decided to do my lookdev directly inside Substance Painter for convenience.

Look developing a car isn't as easy as it looks. Based on my experience interning at Turn 10, the amount of reference really dictates how good the car will look in real time. Thus, I spent most of my time working closely with my references, trying to get similar reflections and bounce light values. And as I mentioned last week, still images just don't do justice to the reflections, and video references have different lighting situations, so it's not a simple task to match the lighting scenario within Painter.



One of the hardest problems to overcome was the car paint surface detail on the F40. It doesn't have flakes; instead it has additional normal detail that creates those wavy reflections. I tried emulating the same effect using two procedural normal noises in Substance Painter for my first shader pass. The Iray render doesn't show much since it's a limited engine, but it shows me that the method is working. I'll probably move to Substance Designer to generate a procedural texture for my final car paint material.



Last but not least, I worked on assigning materials in UE4. I created a master material for the non-transparent objects using a simple setup: I put in a bunch of texture parameters and multiply them with scalar parameters so I can adjust them on the fly later.
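If you wanted to script that kind of setup instead of building it by hand in the material editor, a rough sketch using UE4's editor Python might look like the following; the asset path, parameter names and node positions are my own placeholders, and I'm writing this from memory of the API, so treat it as a starting point rather than a verified recipe:

```python
# Rough UE4 editor Python sketch: a texture parameter multiplied by a scalar
# parameter, wired into Base Color. Asset path and parameter names are placeholders.
import unreal

asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
material = asset_tools.create_asset('M_CarMaster', '/Game/Car/Materials',
                                    unreal.Material, unreal.MaterialFactoryNew())

mel = unreal.MaterialEditingLibrary

tex = mel.create_material_expression(material,
                                     unreal.MaterialExpressionTextureSampleParameter2D,
                                     -600, 0)
tex.set_editor_property('parameter_name', 'BaseColorTex')

intensity = mel.create_material_expression(material,
                                           unreal.MaterialExpressionScalarParameter,
                                           -600, 250)
intensity.set_editor_property('parameter_name', 'BaseColorIntensity')
intensity.set_editor_property('default_value', 1.0)

multiply = mel.create_material_expression(material,
                                          unreal.MaterialExpressionMultiply, -300, 0)
mel.connect_material_expressions(tex, 'RGB', multiply, 'A')
mel.connect_material_expressions(intensity, '', multiply, 'B')
mel.connect_material_property(multiply, '', unreal.MaterialProperty.MP_BASE_COLOR)

mel.recompile_material(material)
```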


For translucent objects, I used the M_Glass material from the automotive pack as a base. I deleted the Base Color nodes and replaced them with a vector parameter, multiplied by an emissive function, and added another two texture parameters for normals and AO. This setup was used to create the head and tail light materials for the car. The downside of this setup is that UE4 currently treats the object as fully see-through when viewed up close, but for an early pass it works.


In addition, I also created a studio lighting setup. I tried to emulate how people photograph real cars in the real world by using several rectangular lights in UE4: one huge softbox on top of the car, two fill lights from each side, and another one behind the camera to give a softer gradation on the body.

 
Here's the final turntable, using the studio lighting setup I created, rendered in real time in Unreal Engine 4.


Realtime Car Commercial Blog, Part 2. R&D

Work In Progress / 25 September 2018

Hello everyone, welcome back to my blog! 

After getting our concept approved by our mentors, we started researching the Composure feature in Unreal Engine 4. We went out and shot live action footage on top of our school garage to test the pipeline. We took a spherical photo of a chrome ball and converted it into a lat-long format using Nuke.


In addition, we also exported camera tracking data from the footage we took that day. It took us a while to get the tracking data working in Unreal Engine 4, as Composure works fairly differently from the usual Sequencer pipeline.

In the end, we found that the lack of information regarding camera height, lens distortion, and exposure really affects how Unreal Engine composites the scene.


Moving on to shading R&D, I gathered a few references for the Ferrari F40. I decided to use videos as my main reference, as car paint reflections are hard to capture in still frames. One of the more interesting details of the F40 is the high-frequency normal detail on the car paint surface, as you can see in the images below:


The second most important piece of research was how the lights react in the real world, and I found a good cinematic video that shows this. The tail lights on the F40 bounce pretty clearly, especially at night, because they sit much closer to the ground. The head lights also show an interesting feature: they have a wide profile while the base never gets overexposed. This detail will definitely be hard to recreate in real time.


Last, but not least, I also gathered some references of the engine and interior. Considered Enzo Ferrari's last supercar, the F40 was not built for luxury: the interior has minimal treatment, showing only a limited number of materials. This contrasts with the engine at the back of the car, which shows a good amount of material separation.


With all the references gathered, I moved on and prepared the model for use. I created 46 Material IDs and unwrapped them using tri-planar UV projection in Maya over the weekend, because I know the challenges of rendering a photo-realistic car in real time. I learned the importance of good material separation and subtle texture detail on cars from my mentors back when I was interning at Turn 10 (Forza).
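For anyone curious how that kind of pass could be batched, the closest scripted equivalent I know of in Maya is the Automatic projection limited to a few planes; a rough sketch with a placeholder group name (the result still needs manual cleanup and layout afterwards):

```python
# Rough Maya Python sketch: batch a quick multi-plane (tri-planar style) UV
# projection over every mesh under a placeholder group, as a starting point
# before manual cleanup and layout.
import maya.cmds as cmds

meshes = cmds.listRelatives('F40_geo_grp', allDescendents=True, type='mesh') or []
for shape in meshes:
    transform = cmds.listRelatives(shape, parent=True)[0]
    # Automatic projection with 3 planes approximates a tri-planar unwrap.
    cmds.polyAutoProjection(transform + '.f[*]', planes=3, layout=2,
                            insertBeforeDeformers=True)
```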


 Okay, that's it for the update this week. See you next time!

Realtime Car Commercial Blog, Part 1. Concept

Work In Progress / 18 September 2018

Hi everyone, I'm currently working on a collaborative project mentored by the wonderful people from The Mill! The goal of the project is to create a car commercial that we can each put in our own reels. As part of the lighting group, I will be responsible for the look development of the car, along with the environment lighting.

As a group, we came up with the main concept of:

“The evolution of a vehicle brand over time”

With Ferrari as our brand of choice, we decided to work on the iconic Ferrari F40 and Ferrari LaFerrari. It was a hard task to find good quality 3D models on the internet, but I was fortunate to find two good ones on dmi-3d.net.


The idea of a transforming car came from a 2017 Mercedes campaign titled AMG Project ONE: The Future of Driving Performance.


As we live in Savannah, we wanted to incorporate the historical value of Savannah with the well-known Ferrari brand for our commercial project. Thus, we decided to use River St. as our main live action plate, which we're going to composite later in Unreal Engine 4.

We went out to scout and gather some shot references at around 4:00 am. We chose to go that early because there are few to no people around at that time who might interrupt our reference hunting session. Here are some photos I took with my iPhone.


That's it for today, I'll be uploading my progress of the project every week!