Friday, 5 June 2015

VFX Summary and Evaluation

Digital tools are merely the latest instance in a long history of imaging technologies that have been designed to take viewers through the looking glass into domains of novel perceptual experience. (Stephen Prince, 2012)

In television, film and even animation, there is often a use for visual effects. These are usually 3D graphics created to add depth to a shot, build up whatever emotion is set in the scene, and lend extra light and weight to the actors or props. When a VFX shot is broken down, each layer shows how an effect is applied to a particular part of the object or background, such as added lights, glows and blur. Seeing the layers in order of development, you can tell the difference between the shot before and after it has been composited. For my own visual effects project, I had to write an idea for a simple short film, record my own footage for the shots, and then add my own effects at the end. I will explain, from experience, the process of how visual effects are made, and then run through the production and evaluation of my own project.

In the world of VFX, a pipeline is all the people and processes, in their order or flow, in a production. To understand VFX and be able to communicate with other artists in the VFX pipeline, you have to first grasp that VFX is only one small piece of a much larger picture. Many VFX artists quickly wind up in hot water when they fail to realise their role in the grand scheme of production. (Jon Gress, 2014)

A visual effects project starts with the movie footage being filmed first. The shots must be filmed at a steady pace to avoid jerkiness and sudden blur from the camera. The movie shots are then edited down into frame-by-frame images, which are used for camera tracking. Tracking the camera detects every movement made in each frame, which makes the later stages easier. Any small jerky movements detected in the process can be adjusted manually in the program. Manual adjustments also help with any footage containing reflective objects, such as computer screens, windows and anywhere filmed near water. The footage is tracked in 2D, while the program can display the completed tracks in a 3D view. After tracking every motion in the sequence, the shots are exported into a file for use in 3D software, provided you first select one of the tracks so the software can work out a camera angle for the next stage.
Matchmoving, also known as camera tracking, is a major aspect of modern visual effects. It's the key underlying process that allows visual effects artists to convincingly insert computer generated elements and characters into a live-action plate, so that everything appears to live in a consistent three-dimensional world. (Richard Radke, 2012)
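The core idea behind a 2D tracker can be sketched in a few lines of Python with NumPy. This is a toy illustration of the principle, not how MatchMover actually works: each tracking point is a small template image, and every new frame is searched for the patch that matches it best.

```python
import numpy as np

def track_point(frame, template, search_top_left, search_size):
    """Find the best match for `template` inside a square search window
    of `frame`, using sum-of-squared-differences (lower = better).
    Returns the (row, col) of the top-left corner of the best match."""
    th, tw = template.shape
    r0, c0 = search_top_left
    best_score, best_pos = float("inf"), None
    for r in range(r0, r0 + search_size - th + 1):
        for c in range(c0, c0 + search_size - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            score = np.sum((patch - template) ** 2)
            if score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Toy example: a bright 2x2 "feature" that moved one pixel to the right
frame1 = np.zeros((10, 10)); frame1[4:6, 4:6] = 1.0
frame2 = np.zeros((10, 10)); frame2[4:6, 5:7] = 1.0
template = frame1[4:6, 4:6]
print(track_point(frame2, template, (2, 2), 7))  # → (4, 5)
```

Running this per point, per frame, gives the 2D motion paths that a real tracker then uses to solve the camera.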
As many of these programs are used in 3D animation, for example Maya, the tracked footage can be used to add additional 3D objects to the scene. For any items to be placed in the foreground of the final shot, a flat plane is needed, which will be green-screened out later on. Animating the 3D objects should be simple, provided they are in the right position and angle to fit the film footage as viewed from the camera. Once the animation works, the scene can be rendered out for compositing. Before this, the software also provides settings controlling how the scene is rendered. This is important, as the compositing software needs to read these files in the next stage. The render settings include passes, which are added to the selected objects, such as shadows and reflections, along with the essential matte, beauty and diffuse passes. The passes are then saved onto a pass contribution map, which is added to the render. Once the settings are ready, the batch render can be processed into the correct file format for compositing. A batch render takes a long time, depending on how many frames are in the shot.
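The point of rendering in passes is that the final "beauty" image can be rebuilt, and tweaked, from its separate light contributions in the compositor. A minimal sketch of the idea, using tiny NumPy arrays in place of full-resolution EXR layers (real renderers combine passes in more involved ways):

```python
import numpy as np

# Toy 2x2 grayscale "passes"; in a real pipeline these are EXR layers
diffuse    = np.array([[0.4, 0.5], [0.6, 0.3]])
reflection = np.array([[0.1, 0.0], [0.2, 0.1]])
shadow     = np.array([[1.0, 0.7], [1.0, 1.0]])  # 1.0 = fully lit

# A common simplified recombination: add the light contributions,
# then attenuate by the shadow pass.
beauty = (diffuse + reflection) * shadow
print(beauty)
```

Because the passes stay separate until this step, a compositor can, say, dim only the reflections without re-rendering the whole scene.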

Compositing is sort of an umbrella term we use to cover a number of different technologies that allow the creation of a new image (or composite) from multiple unrelated elements. These can usually be thought of as separate layers that are sandwiched together to create the new image. Anyone who has created a still graphic in Photoshop will be familiar with the concept of layers. (John Jackman, 2007)

In the next stage, in the compositing software, we import the image sequences and can now add the effects. The rendered files from the previous step go at the front, while the original image sequence is used as the back. The compositing software offers a huge list of tools to try and add to your film. A node map in the program shows where everything is connected and lets you route it where you want. Each effect runs from the image into a blend node, where the front and back inputs are connected, and then on to the final output. With the effects in place and previewed, you can finish the film by rendering it out as a full-quality movie file, and your film is saved with all of the effects you created.
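The blend of front and back layers described above usually comes down to the standard "over" operation. A minimal sketch with NumPy, assuming the foreground colours are premultiplied by their alpha:

```python
import numpy as np

def over(fg, fg_alpha, bg):
    """Standard 'over' composite: the foreground covers the background
    in proportion to its alpha (fg is assumed premultiplied)."""
    return fg + bg * (1.0 - fg_alpha)

bg = np.array([1.0, 1.0, 1.0])   # white background pixel (RGB)
fg = np.array([0.5, 0.0, 0.0])   # premultiplied red at 50% alpha
print(over(fg, 0.5, bg))         # → [1.  0.5 0.5]
```

Every layer sandwiched in the node map is, in effect, another application of this one formula.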

After learning everything I needed to know, I could start on my own project. The idea for my piece is a rivalry between two gaming consoles: one a handheld device, the other a game engine on a PC. Their battle shoots up a pair of beams, sending their avatars up to meet face-to-face and fight. The end result is that they are equally matched, due to the consoles having to be updated frequently, and a truce is made. Following the same VFX process, I filmed my footage at home, in a room with enough space to show the two consoles facing each other and enough daylight for clear footage. Once everything was filmed, I attempted to match-move the footage. There were many moments where the tracking points were less reliable, but I managed to adjust them fairly easily, as well as using a contour to block out the reflection from the console screens. For the characters, I modelled a mouse pointer and a Mii model (representing the Nintendo console). Both characters are equipped with swords so they look like they are in a fight. For the Mii model, I created a quick character rig for the arms to help him move the sword. During this, I found out that rigging is also used in other forms of visual effects. The soundtrack I used is the Duel of the Fates track from Star Wars.

Rigging with the joint tool is most often associated with character animation, not necessarily effects. However, joints are really just another deformer. Sure, they get their own set of menus and a whole bunch of cool options, but when you get right down to it, their main purpose is to deform geometry. (Eric Keller, 2008)

Rendering the footage with these objects took a long time, since each frame is rendered one at a time and the process was taking up a lot of memory. Once it finally finished, I could composite in the glow effects I wanted for the lights. These were easy to add, while for the characters inside the beams it was difficult to find a tool to make them more visible. The settings on the compositing software confused me as I tried to render the final footage, but I eventually managed to complete it all and compile it into the film. I edited the footage down into the form of a teaser trailer, as the later scenes in the storyboard were put aside in case of time constraints.

After finishing the project, I asked for some feedback on the film. I printed out a questionnaire for the audience to fill in after watching it. The results came back as an overall average score. While they enjoyed the film, they said there were a few major flaws. One concerned the characters: while they agreed the characters were portrayed well, they lacked appeal and were not visible enough inside the holograms. So I would need to add another effect somewhere to adjust their appearance, or tone down the glowing effect on the holograms. They also pointed out a few moments where the holograms were clipping into the background shot, meaning there must have been some errors I missed during the process. Approval of the effects was mixed, with some strongly agreeing while others disagreed. Beyond that, the soundtrack that accompanied the film did not blend well. So, while it was a fair score, it is clear that I need to make more improvements to the visual effects and editing.

In summary, visual effects are an interesting set of tools in film and animation. They bring better graphics to certain shots, and make a scene grander and more colourful. As for myself, it was not a bad experience, but I did struggle. After learning from experience and from the audience feedback, working with visual effects is not something I would look into further, but I would consider a little more practice, as it would come in useful for other animation projects in the future.


Reference List
1. Prince, S. (2012) "Digital Visual Effects in Cinema: The Seduction of Reality", US: Rutgers University Press, p. 11.

2. Gress, J. (2014) "[digital] Visual Effects and Compositing", New Riders, p. 24.

3. Radke, R. (2012) "Computer Vision for Visual Effects", Cambridge University Press, p. 207.

4. Brinkmann, R. (2008) "The Art and Science of Digital Compositing", US: Morgan Kaufmann, p. 250.

5. Jackman, J. (2007) "Bluescreen Compositing: A Practical Guide for Video and Moviemaking", Oxford: Focal Press, p. 5.

6. Keller, E. (2008) "Maya Visual Effects: The Innovator's Guide", Canada: John Wiley and Sons, p. 83.

Final VFX Video


VFX - Final render and post-production

At the final stage, I am finished with the batch renders. I have now begun adding the effects to the film, using Autodesk Composite. Here, I have two layers of the film: the front shows the lights and characters, and the original film footage is at the back. For the effects, I am just using a glow effect on the light beams as they travel through the consoles and bring the characters up.
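The glow effect itself boils down to a simple recipe: isolate the bright pixels, blur them, and add the blurred result back over the image so the light appears to bleed. A rough NumPy sketch of that recipe (not Composite's actual implementation; threshold and strength values are arbitrary):

```python
import numpy as np

def glow(image, threshold=0.8, strength=0.5):
    """Simplified glow: keep only the bright pixels, blur them with a
    3x3 box filter, then add the blur back over the original image."""
    bright = np.where(image > threshold, image, 0.0)
    padded = np.pad(bright, 1)          # zero border so the filter fits
    blurred = np.zeros_like(image)
    for dr in (-1, 0, 1):               # accumulate the 3x3 neighbourhood
        for dc in (-1, 0, 1):
            blurred += padded[1 + dr: 1 + dr + image.shape[0],
                              1 + dc: 1 + dc + image.shape[1]]
    blurred /= 9.0
    return np.clip(image + strength * blurred, 0.0, 1.0)

# A single bright "beam" pixel bleeds light onto its neighbours
img = np.zeros((5, 5)); img[2, 2] = 1.0
result = glow(img)
```

Real glows use a wide, soft (Gaussian) blur rather than one 3x3 box pass, which is what gives the halo its falloff.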


The film will then be rendered out and edited into the final film. As I will be making this in the form of a teaser trailer, I have also made a set of text cards using Photoshop.

VFX Project - Production

So far the process is going well. Although I was worried at first that I would have to film the footage a second time, the original footage works fine.

During the process, I tracked through all of the shots, and the results were positive. While a few points came out red, they still work okay. To stop the screens from affecting the tracking process, I placed a contour over the screens, which blocks any tracking movement inside it.

For the characters, I have modelled them separately. They represent a mouse pointer for the PC and a Mii figure for Nintendo. The Mii model has also been rigged to control his arms, and a sword is equipped on each character.


After the MatchMover process, I created what is essentially a cube. This will be the glowing effect which will merge out of the laptop and 3DS screens, as beams of light sending the two characters up. The black screen blocking the footage will be used for the green-screen effect when compositing the film.
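The green-screen (keying) step mentioned above amounts to turning one colour into transparency: every pixel close to the key colour gets alpha 0, everything else stays opaque. A bare-bones sketch with NumPy (the tolerance value is arbitrary; real keyers also soften edges and suppress colour spill):

```python
import numpy as np

def key_out(image, key_color, tolerance=0.1):
    """Return an alpha matte: 0 wherever a pixel is within `tolerance`
    of the key colour, 1 everywhere else."""
    distance = np.linalg.norm(image - np.asarray(key_color), axis=-1)
    return (distance > tolerance).astype(float)

# 1x2 RGB image: one pure-green pixel, one grey pixel
img = np.array([[[0.0, 1.0, 0.0], [0.5, 0.5, 0.5]]])
alpha = key_out(img, key_color=(0.0, 1.0, 0.0))
print(alpha)  # → [[0. 1.]]
```

The resulting matte is then used as the foreground alpha when the layers are blended together in the compositor.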


VFX project - Pre-production

And now, I have been given the task of creating my own short film.

The premise for my idea will be simple and quick. The plot involves two consoles, a Nintendo 3DS and a laptop, preparing for battle to find out which is more powerful. Originally, the plot would end with both sides being equally matched, as they both required an update. However, I will focus on the beginning, and finish it as a trailer.

Here is the storyboard for the original concept.

Wednesday, 3 June 2015

VFX Research - Prometheus


Here, we have a breakdown of the effects used in the film Prometheus, created by MPC.
A good example at the beginning of the video is the scene with the spaceship, broken down into many layers. Starting from the background, which is completely dark and mostly blank, we then see the addition of the moon, followed by its own lighting effects, partially blocked by another planet to make it look like an eclipse. The planet at the bottom of the screen is one huge image plane showing the mountains, later covered by another set of image planes for the clouds.

The spaceship is on a separate layer, which also has its own set of effects. First the footage shows it in a black, silhouette-like state, until brighter effects are composited onto the ship. Additional light effects are added to the engines to show they are on full power, while the effects on the outside of the ship are adjusted to show where the moonlight reflects off the hull, with darker, shadowy tones where the light does not touch. The visual effects composited here also help make the spaceship feel large and heavy as it travels through the background of this scene.

Another example in the video highlights a number of image planes and textures that have been edited to be more transparent in the background, brightening or darkening the tone of that shot.

Autodesk Matchmover and Composite

As we learned about this unit and how it works, we went through a tutorial on the process of making visual effects for film.

To start off, we first needed to record some video footage. Using a high-quality camera, such as the Nikon D3100, we filmed a few short clips, primarily turning at an angle to get the 3D feel later on in our test project. With the footage gathered, we compiled the clips into a sequence of frames using Adobe Premiere Pro, and imported the sequence into Autodesk MatchMover. Once the sequence is imported, we set up automatic tracking using the 2D tracker menu, and the software deletes all the soft tracks, tracks every frame in 2D and solves a possible camera movement. Some points in each frame might need manual adjustment during or after the process. This is due to how the video quality varies on the recording, so it is best to avoid any fast or jerky movements with the camera.

Once the process is complete, we saved it as a file for Maya, so we can use the footage to add any features we want in the scene.
Using the 3D objects we want for the scene, we had to angle them into the right position so they move in the right place with the film footage we had just recorded and tracked in MatchMover. Once the outcome was approved, we altered the render settings so the output is saved as frame-by-frame OpenEXR files, while also selecting mental ray as the renderer. With mental ray selected, we can choose which passes we want in the scene, while creating a pass contribution map for the associated passes to be linked with. Once the map is selected on the master layer of the render, we can start a batch render. The batch render could take over a day, depending on the number of frames in the scene.
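The "over a day" figure is easy to sanity-check, because batch-render time scales roughly linearly with frame count. A quick back-of-the-envelope helper (the per-frame time here is a made-up example, not a measurement from this project):

```python
def estimate_render_hours(num_frames, seconds_per_frame):
    """Rough batch-render estimate: total time grows linearly with the
    number of frames, which is why long shots can take over a day."""
    return num_frames * seconds_per_frame / 3600

# e.g. a 10-second shot at 25 fps, assuming ~6 minutes per frame
print(estimate_render_hours(250, 360))  # → 25.0 (hours)
```

Even a modest per-frame cost multiplies out quickly, which is why render settings and pass choices are worth getting right before pressing the batch-render button.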

Once the render is done, we can import the EXR files into Autodesk Composite. This software enables us to create layers of different features over the film, using only the selected parts we want to add as visual effects. The program shows a node map where we can connect each path to a section, such as the input, mask or matte, with everything connecting to the final output. There, we can add glowing effects, brighter or darker aspects, and 3D moving objects with their own effects, without tampering with the original video footage.