Nuke

Week 1: 3D Tracking.

3D tracking uses the three axes, i.e. X, Y and Z: X denotes the width, Y denotes the height and Z represents the depth. This helps us establish a 3D point in the video.

But first, to track, the video needs to be undistorted. To do this, we added the LensDistortion node, and the undistortion can then be done in two ways. The automatic way is to go under Analysis > Grid Detect > Current Frame, click Detect, then click Solve under Editing and Drawing. The other way is the manual method: in the LensDistortion node, under Analysis > Editing and Drawing, click Add Lines and draw lines along the width and height of a building or the horizon, which helps the software understand the distortion in the video and fix it, then click Solve.
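
The same setup can be started from Nuke's Python API. This is only a rough sketch: the node class name ('LensDistortion2') is an assumption and varies between Nuke versions.

```python
# A hedged sketch: create a LensDistortion node on the selected plate.
# 'LensDistortion2' is an assumed class name; older Nuke builds use
# 'LensDistortion', so check the script editor if it errors.
import nuke

plate = nuke.selectedNode()               # the distorted Read node
ld = nuke.createNode('LensDistortion2')   # assumed class name
ld.setInput(0, plate)
# Grid detection and solving are then run from the node's Analysis tab.
```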

To track, we need to add a CameraTracker node. The node will analyse the video and then place points; it is also better to mask out any reflective surfaces so the tracking will be accurate. Add a Roto if there are any reflective surfaces and connect it to the Mask input of the node. In the CameraTracker there are a lot of options. In Source there are two options: one is Stills and the other is Sequence. Stills is one frame and Sequence is the whole video. In Mask there are various options, but if anything has been masked out the setting should be changed to Mask Alpha. The other options are for the camera and its settings.

In the CameraTracker settings, under Features, the Number of Features is the number of tracking points, and Preview Features must be switched on so that the points can be viewed. Go back to the CameraTracker tab, click Track and then click Solve; the error should not go over 1.0. Then go to AutoTracks and click Delete Unsolved and Delete Rejected.
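
For reference, the same knobs can be set from Python. A hedged sketch: the class name ('CameraTracker1_0') and knob names ('numFeatures', 'previewFeatures') are assumptions read off the UI labels and may differ in your Nuke version.

```python
# A hedged sketch of configuring the CameraTracker from Python.
import nuke

ct = nuke.createNode('CameraTracker1_0')   # assumed class name
ct['numFeatures'].setValue(200)            # Number of Features (assumed knob)
ct['previewFeatures'].setValue(True)       # show the points in the viewer
# Track and Solve are then run from the node's buttons; after solving,
# check that the solve error stays under 1.0 before exporting.
```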

After solving, the green points will be visible. Select any point and right-click it, and a menu will open. Select Ground Plane > Set Origin; this will be the centre of the 3D world. Now select a few points which are on the ground, again go to Ground Plane and click Set to Selected; this way the software will recognise the ground.

In the CameraTracker, go to Export, choose Scene+, and make sure you have switched on Link Output.

In the CameraTracker, select one or more points, right-click and go to Create. There will be various options. If you want to add a 2D object onto a surface, click Card and place the card where you want the poster to be, then add the image or video to the img input of the card. By pressing Tab on the Viewer you can access the 3D view and move objects in 3D space.
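
Creating the card can also be done by hand in the node graph. A minimal sketch, assuming the Card node's class is 'Card2' and using a hypothetical poster file:

```python
# A minimal sketch: a poster on a Card, placed in the tracked 3D scene.
import nuke

poster = nuke.nodes.Read(file='poster.jpg')    # hypothetical file path
card = nuke.nodes.Card2()                      # Card node class
card.setInput(0, poster)                       # img input of the card
card['translate'].setValue([0.0, 1.0, -3.0])   # placeholder position
# Connect the card to the Scene exported from the CameraTracker, then
# press Tab in the Viewer to place it precisely in 3D.
```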

For our homework, we had to play around with the camera tracker and this is what I created.

3D Tracking homework.
Homework nodes

My biggest takeaway from here, apart from 3D tracking, is how to manually change the anchor point using the corner points. Roos and I are working on a group project, and I wanted help with aligning the video in the right perspective, so I asked Gonzalo for help, and he showed me this easy way.

In the Corner Points settings, untick all the Enable boxes and move the points according to your requirements. You can then go to From and click Copy 'to', go back and click Copy 'from', then tick all the Enable boxes again. This will allow us to edit to the right alignment.
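
The same trick expressed in Python, as a hedged sketch against a CornerPin2D node (the node name 'CornerPin2D1' is hypothetical, but the to1-to4/from1-from4 knobs are standard on CornerPin2D):

```python
# Copy the moved 'to' corners into the 'from' corners, mirroring the
# Copy 'to' / Copy 'from' buttons described above.
import nuke

cp = nuke.toNode('CornerPin2D1')   # hypothetical node name
for i in range(1, 5):
    cp['from%d' % i].setValue(cp['to%d' % i].value())
```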

Corner points after moving
After changing corner points.
Showreel VFX Fundamentals

Showreel Term 1

Design For Animation

Week 10: Research.

This was the last class for the term. The class was about guidelines for the audio-visual presentation: we all had to share our knowledge with an audience through an audio-visual presentation based on our research. We had 3-5 minutes to inform the audience about our research topic.

After all the research and finding the right resources to complete my critical report, this is how my research has come out.

For my video presentation, I took videos from YouTube and edited them in DaVinci Resolve.

I liked the topic I picked because I learned a lot more about green screen and chroma key, and their origin. It started with painting a piece of black glass to create an optical illusion, and now there is hardly a movie which doesn't use chroma key. It's unbelievable how easily, with a click of a button, we can remove a whole background or a person and more, but to get here there was so much trial and error and so much time spent on it, and we don't even appreciate it.

Nuke

Week 10: Real Scenarios in Production.

The class started by reviewing our Balloon Festival work, which we had all been working on for a long time.

The class was more based on theory and less on using the software. Gonzalo gave us some tips which will help us grow and understand the workflow of a VFX company. The first tip he gave us was to review our work before publishing it. I felt it was the best advice, because a lot of times I have handed work to clients without checking it, because they wanted it ASAP, and later realised that there was a jump cut or a black screen.

The stages in production:

  • Temps/Postviz: This team helps to get the VFX out in low quality to establish the tone of the movie.
  • Trailers: For a movie to get good PR, trailers have to be published on YouTube and IMDb. So this team tries to gather shots and clips which can be published.
  • Finals: When the client is done with all the corrections and changes, this becomes the final.
  • QC: Also known as quality control; the VFX supervisor looks at every pixel and makes sure that it looks good enough to publish.

Every company has its own way of communicating and organising with its employees. Google Docs and Sheets, ftrack, and Shotgun are a few of the tools which VFX companies use to communicate. All of them have their pros and cons, but Shotgun is built solely for VFX.

There are different ways of reviewing dailies:

  • Desk dailies review: this is a more informal way of reviewing; present are the VFX supervisor, VFX producer, line producer and lead comp.
  • Small cinema dailies: this is a little fancier; the line producer generally calls the meeting. In these dailies we will see the lead CG, lead comp, VFX supervisor, VFX producer, line producer and sometimes the client.
  • Big cinema dailies: lead CG, lead comp, VFX supervisor, VFX producer, line producer, editorial, lighting lead, client producer, client VFX supervisor, client director and sometimes the client.
Maya

Week 10: Rendering The Face.

Nick wasn’t well so he didn’t come in, but he took the class online. We started off by adding lights to our scene so we could render it out. We used a three-light setup, made with spotlights.

3 light setup

Then we had to make a tongue: start with a cube > add subdivisions > use the vertices to give it the tongue shape > put it in the mouth and use the edges to make sure it fits > in the Attribute Editor add a colour and a name > in the Outliner put the tongue under jaw_rotate so it moves with the mouth.
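
The starting cube and parenting can be done in a couple of lines of maya.cmds; a minimal sketch, assuming the jaw group is named 'jaw_rotate' as in class:

```python
# Create a subdivided cube to sculpt into a tongue, then parent it
# under the jaw so it follows the mouth. Names follow the class setup.
import maya.cmds as cmds

tongue = cmds.polyCube(name='tongue', width=2, height=0.5, depth=4,
                       subdivisionsX=4, subdivisionsY=2, subdivisionsZ=6)[0]
# ...shape the vertices and edges by hand to fit the mouth, then:
cmds.parent(tongue, 'jaw_rotate')   # moves with the jaw rotation
```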

With tongue

Next was the colour correction. Open the Hypershade > Graph on the toolbar > Graph Materials on Selected Objects, and a node graph will open > press Tab and type colorCorrect > connect the out colour of your face texture to the in colour of your colorCorrect > get the right tone for your face by playing around with the colour offset > connect the out colour from the colorCorrect to the specular colour in your standard surface > add one more colorCorrect > connect the out colour of the second colorCorrect to the coat colour and base colour and change the colour offset to get the right tone > after making the changes, go to the presets on the material and click Skin > under Subsurface make sure the tone is changed to red to give it subsurface scattering.
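
A hedged maya.cmds sketch of that network. The material name 'aiStandardSurface1' and the texture node 'face_file' are assumptions, and 'colOffset' is the colorCorrect node's colour offset attribute:

```python
# Sketch of the colour-correct graph: face texture -> colorCorrect ->
# specular / base / coat colour of the standard surface.
import maya.cmds as cmds

cc1 = cmds.shadingNode('colorCorrect', asUtility=True)
cc2 = cmds.shadingNode('colorCorrect', asUtility=True)
surf = 'aiStandardSurface1'                     # assumed material name
cmds.connectAttr('face_file.outColor', cc1 + '.inColor')
cmds.connectAttr(cc1 + '.outColor', surf + '.specularColor')
cmds.connectAttr('face_file.outColor', cc2 + '.inColor')
cmds.connectAttr(cc2 + '.outColor', surf + '.baseColor')
cmds.connectAttr(cc2 + '.outColor', surf + '.coatColor')
# Nudge the tone with the colour offset, as described above:
cmds.setAttr(cc1 + '.colOffset', 0.05, 0.0, 0.0, type='double3')
```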

The next step is to render the file. Go to Render Settings, change the Frame/Animation ext setting to "name.#.ext", under Frame Range put the keyframes you want to render, and select the image size you want. Also tick Merge AOVs and Half Precision.
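
The same settings expressed via maya.cmds, as a sketch; the Arnold driver attributes ('mergeAOVs', 'halfPrecision') are assumptions based on the UI labels:

```python
# Mirror the Render Settings described above from script.
import maya.cmds as cmds

cmds.setAttr('defaultRenderGlobals.animation', 1)          # name.#.ext naming
cmds.setAttr('defaultRenderGlobals.putFrameBeforeExt', 1)
cmds.setAttr('defaultRenderGlobals.startFrame', 1)         # example range
cmds.setAttr('defaultRenderGlobals.endFrame', 120)
cmds.setAttr('defaultArnoldDriver.mergeAOVs', 1)           # assumed attr
cmds.setAttr('defaultArnoldDriver.halfPrecision', 1)       # assumed attr
```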

Render Settings

I edited this video in DaVinci Resolve because I don't have After Effects, and DaVinci is free and does the same job. I added a background, gave it a blur, and colour-corrected the face to match the background.

With the edges on
Final render

Modelling the face was fun and hard: fun because we used a lot of new tools, and hard because of all the crashing and forgetting to save the file and doing the whole thing again and again. I really liked the Shape Editor, which we used to make the facial expressions move, but sometimes I would leave two of the targets on and both would get affected, so I had to undo the whole thing and do it again. Overall it was a fun experience.

Design For Animation

Week 9: Report Structure and Referencing.

This week was about how to write our research and how to use Harvard referencing.

There are a few things to keep in mind while writing: we can't use sources which don't have credibility or which are just opinions, like personal blogs, YouTube videos or film reviews.

Sir also showed us the structure of the research. The word count starts from the contents page and ends at the conclusion. 

  1. Title: subtitle
  2. Acknowledgements (optional)
  3. Abstract 
  4. Keywords
  5. Contents page
  6. Introduction 
  7. Literature review 
  8. Main body of text
  9. Conclusion 
  10. Appendix (optional)
  11. Bibliography
  12. Image list (optional)

In Harvard referencing, you have to have all your references, like any quotations or paraphrases, at the end of your report, and if you quote more than 40 words you should set the quote apart from the main body and leave 1 cm of space around it on each side.

Title: The origin of green screen technology.

Abstract: With just a push of a button or a click of the mouse, we can remove a background or add entirely new scenes to a movie with the help of computers and cameras. In our modern world, it's easy to forget that the very first motion pictures were, themselves, essentially special effects. It has generally been forgotten how these special effects were created.

Keywords: Chroma Key, Green Screens, Matte, Blue Screen, Williams Process, Norman Dawn.

Contents page: working on it

Introduction: The use of green screens in weather forecasts is well-known. In post-production, the weather map is added to the green background against which the forecaster is positioned. Movie productions that rely on separately filmed or animated background shots are another common application for green screens. Chroma keying is a technique for changing a monochromatic background to a different one.

Literature review: working on it

Main body of text:

The background of contemporary greenscreen was the history of optical illusions. At the end of the 19th century, Georges Méliès was one of the first prolific filmmakers in history; according to Ezra (2019), he was a man who devoted his life to learning the craft of illusion. Méliès used a visual technique that is the primitive forerunner of what we now think of as greenscreen compositing in his 1898 film Four Heads Are Better Than One, in which he used mattes for multiple exposures. Parrill (2011) said this was the very first matte used in moving pictures.

Méliès would use a piece of glass with black paint on it to “black out” certain scenes in his movie. This is referred to as a “matte”, and it was created to exclude all light from the film so that it would not be exposed. Then Méliès would stop the film, rewind it, and this time expose only the area of the frame that had previously been covered by the matte (Mullen and Rahn, 2010). The double exposure was created entirely within the camera and could combine two or more distinct images into a single frame.

The issue with mattes was that the camera had to remain motionless at all times, and nothing could cross the matte line, the boundary between the real-time action and the matte painting. Black matting, a technique that Frank Williams first invented in 1918, “was used to shoot the couple against a blank background and then create a travelling matte to composite them against a transforming background” (Editing and Special/Visual Effects, 2016) in the movie Sunrise (1927). The film would then be duplicated to highly contrasted negatives until a black-and-white silhouette was visible.

The Williams Process, often known as the black back matte effect, was utilised in 1933 for the movie The Invisible Man. In order to capture the scenes in which the invisible man was stripping off his clothes, the actor had to be photographed while wearing a full black suit and posing in front of a black surface; this is also known as a “self-matte” (Editing and Special/Visual Effects, 2016). Even after more efficient processes were introduced, this effect continued to be used because it was so memorable. There were problems with the Williams Process as well: any shadows on the subject would disappear in the matte.

Around 1925, C. Dodge Dunning created a novel alternative that employed two colours, lighting a backdrop screen with blue and the foreground subject with yellow. The Dunning-Pomeroy process used the blue and yellow light to produce a travelling matte by applying coloured filters and dyes. The Dunning process was first used in King Kong (1933), in the scenes where King Kong comes through the big village gates (Mitchell, 2004).

Many effects artists adopted the concept of utilising a blue screen to isolate an element photographically to produce colour composites as the years went by and colour film became available. “One of the early examples of the process was that developed by Lawrence Butler for Alexander Korda’s The Thief of Baghdad in 1940. The colour blue was first used for the simple reason that there is very little blue in skin tones. The blue wavelength can be isolated while still getting a fairly acceptable colour rendition for faces” (Sawicki, 2011).

In the past, only video systems were covered by the phrase “chroma key.” That is no longer the case. A keyer was a mathematical procedure used in early video mixers to make a variety of colours in a video signal transparent. Of course, weather map special effects are used frequently in television newsrooms all around the world.

When movies began digital post-production in the late 1990s, green began to overtake blue as the dominant screen hue. Why green? In general, green was less expensive and easier to light than blue, registered as brighter on electronic displays, and worked well outdoors (where a blue screen might match the sky). Additionally, since digital cameras have begun to replace film cameras, many digital sensors now employ a Bayer pattern, which has twice as many green photosites as blue photosites. Because of this, current digital cameras are significantly more sensitive to the green portion of the spectrum, making it slightly simpler to pull a matte from the greenscreen.

Conclusion: Special effects have been used by filmmakers to advance the medium since the beginning. The only thing that matters in filmmaking, without a doubt, is what is on the screen. It ultimately comes down to opening a window into another world, from Edwin S. Porter’s matted railway station window to the contemporary action spectacular. All of these effects are simply the means to get there.

Bibliography:

Ezra, E. (2019). Georges Melies. Available at: https://books.google.co.uk/books?id=YXICEAAAQBAJ&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false. [Accessed 22 Nov. 2022].

Parrill, W.B. (2011). European Silent Films on Video. Available at: http://ebookcentral.proquest.com/lib/ual/detail.action?docID=2066564. [Accessed 3 Dec. 2022].

Mullen, C.J. and Rahn, J. (2010). View finding: Perspectives on New Media Curriculum in the Arts. Available at: https://books.google.co.uk/books?hl=en&lr=&id=_F3FyK-rRTMC&oi=fnd&pg=PA31&dq=M%C3%A9li%C3%A8s+would+use+a+piece+of+glass+with+black+paint+on+it+to+%22black+out%22+certain+scenes+in+his+movie.&ots=YWsfK0Y8vU&sig=mnXABVIKT4fUduKlXYzU245-l1Q&redir_esc=y#v=onepage&q=M%C3%A9li%C3%A8s&f=false. [Accessed 4 Dec. 2022].

Sawicki, M. (2011). Filming the Fantastic: A Guide to Visual Effects Cinematography. Available at: https://ebookcentral.proquest.com/lib/ual/reader.action?docID=739038&query=green+screen [Accessed 3 Dec. 2022].

Editing and Special/Visual Effects (2016). Available at: http://ebookcentral.proquest.com/lib/ual/detail.action?docID=4647677 [Accessed 3 Dec. 2022].

Mitchell, M. (2004). Visual Effects for Film and Television. Taylor & Francis Group. Available at: http://ebookcentral.proquest.com/lib/ual/detail.action?docID=226841. [Accessed 15 Dec. 2022].


So I need to work on completing my main body; I have to write about the technology used now to separate the green from the other colours in our software. I don't exactly know what a literature review is; I think if I read about it I will understand it. I found out that we have access to ProQuest Ebook Central, which was really helpful for finding ebooks more easily.

Nuke

Week 9: Clean Up.

The class started off by talking about blur. The right way to blur things is not to use the Blur node but to use Defocus. The difference between them is that Defocus works like a lens: when an object is not in focus we get a bokeh effect. Blur, on the other hand, just blurs out everything.

ZBlur, with the help of a ramp, gives a gradient blur on a plate, but only in 2D space. ZDefocus works in 3D space: there is a small dot on the screen marking the focal point, and that point can be dragged anywhere in 3D space to set the focus. There are more options in ZDefocus, like adding depth of field. Changing the output setting to Focal Plane Setup helps you identify which areas are in focus and which are not, which I felt was really nice.
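
As a rough sketch of that step in Nuke's Python API; the class name 'ZDefocus2' and the 'output' knob values are assumptions taken from the UI:

```python
# A hedged sketch: flip ZDefocus into its focal-plane preview and back.
import nuke

zd = nuke.createNode('ZDefocus2')             # assumed class name
zd['output'].setValue('focal plane setup')    # visualise what is in focus
# ...drag the focal point in the viewer, then switch back to the result:
zd['output'].setValue('result')
```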

We later moved on to RotoPaint; this node is used to clean up and clone out objects in the plate. RotoPaint is a brush tool for painting on a plate. There are different options you can change under RotoPaint, like transform, motion blur, shape, stroke, clone, lifetime, tracking and node. We saw a few options under strokes, like brush size, spacing, and write-on start and end.

How to use RotoPaint for clean-up: first use the RotoPaint tool to paint out the area, then use a Roto to draw a box around the cleaned-up area and Premult the patch. Then add a FrameHold set to the same frame where the clean-up was done, add a Tracker and create a Transform (match-move) so that the clean-up holds throughout the video.
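
The patch chain can be sketched as nodes. A minimal version, assuming the plate is the selected Read node and frame 1001 is the clean frame:

```python
# A minimal sketch of the clean-up patch: paint, freeze, crop, premult,
# then merge the patch back over the plate. The tracked Transform from
# the Tracker's match-move export would sit just before the Merge.
import nuke

plate = nuke.selectedNode()
paint = nuke.nodes.RotoPaint(inputs=[plate])     # paint out the object
hold = nuke.nodes.FrameHold(inputs=[paint])
hold['first_frame'].setValue(1001)               # freeze the clean frame
roto = nuke.nodes.Roto(inputs=[hold])            # box around the patch
premult = nuke.nodes.Premult(inputs=[roto])
merge = nuke.nodes.Merge2(inputs=[plate, premult], operation='over')
```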

So for our homework, we had to clean up a video.

Without clean up

We also had to complete our Balloon Festival project.

This had a line on the screen so I had to fix it.
And this is with it fixed.

Maya

Week 9: Lip Syncing.

Last week Nick told us to get audio so that we could sync it with the face we modelled. So I thought I would use a shot from The Lion King, but then Nick told me to use something which is not animated. So I picked this.

The Departed.

After I downloaded the video, we had to put it in After Effects or Premiere Pro and then export it as a JPG sequence, and the audio had to be converted to WAV format to put it into Maya. To import the video as the background of the Maya viewport, we had to go to View > Image Plane > Import Image > select the first JPG where you have saved the sequence > Import, then go to the image plane attributes and tick Use Image Sequence.

If you have edited your video and the JPG sequence doesn't start from 1, then in Frame Offset put the number it starts from and it should line up. To import the sound you need to right-click on the timeline, go to Audio > Import Audio, select your file, and then your audio should be there.
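
The same import can be scripted. A hedged maya.cmds sketch with hypothetical file paths and names:

```python
# Create an image plane from the first frame, enable the sequence, and
# attach the WAV to the timeline. Paths and names are placeholders.
import maya.cmds as cmds

ip = cmds.imagePlane(fileName='shots/departed.0001.jpg')
shape = ip[1]                                   # [transform, shape] returned
cmds.setAttr(shape + '.useFrameExtension', 1)   # Use Image Sequence
cmds.setAttr(shape + '.frameOffset', 0)         # bump if JPGs don't start at 1
cmds.sound(file='shots/departed.wav', name='departedAudio')
```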

The first step in lip syncing is to open the mouth of the model when your reference character opens his/her mouth, and keyframe it at the right time. Nick told us a better way to do it if we don't have a reference: keep your hand under your jaw and say the lines, and when you feel your jaw pushing down on your hand, add a keyframe there. The next step is to open the Shape Editor and use the sculpting tool to make the facial expressions, and keyframe them along with the mouth opening and closing.
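
In script form the keyframing boils down to something like this; the names 'jaw_rotate' and 'blendShape1.smile' are assumptions based on our class rig:

```python
# A minimal sketch: key the jaw rotation against the audio, then key a
# Shape Editor target weight alongside it. Node names are assumptions.
import maya.cmds as cmds

cmds.setKeyframe('jaw_rotate', attribute='rotateX', time=10, value=0)
cmds.setKeyframe('jaw_rotate', attribute='rotateX', time=14, value=-12)
cmds.setKeyframe('blendShape1.smile', time=10, value=0.0)
cmds.setKeyframe('blendShape1.smile', time=14, value=0.8)
```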

Lip syncing.
whit3fck (2013). The Departed - Maybe. Maybe not. Maybe fuck yourself. YouTube [Online Video]. Available from: https://www.youtube.com/watch?v=7JYJhWIwGUw&ab_channel=whit3fck [Accessed 9th November 2022].

Design For Animation

Week 8: Literature Reviews and Writing Approaches.

This week was about literature reviews and how to develop your research topic. So before we lock down on a subject, we should ask ourselves a few questions: What motivated you to do the research? What will the reader learn? And how might the inquiry connect with previously established research? Then we saw the structure of the critical report, which we all had a lot of questions about. Nigel also gave us academic resources like Google Scholar, JSTOR and Animation Studies 2.0, and he also spoke about how to write introductions and conclusions, which was helpful.

So I have been writing my research for a long time, but I didn't know that I had to post updates on my blog. So now I will try to update what I have done.

The topic I chose is the history of visual effects, but it was too much to cover in the research, so now I have planned to stick to only writing about the green screen. I found a lot of papers about the green screen but very few about its origin. Then I came across [digital] Visual Effects and Compositing by Jon Gress, which briefly spoke about matting, and I felt that was where it all began, but a lot of people say that it began with the film The Thief of Baghdad by Lawrence Butler in 1940. Now I need to find more academic resources to help complete the paper, and they are hard to find.

Maya

Week 8: Facial Expressions.

We had to fix our model's jaw so that we could make the puppet movement with our model. When I say puppet movement, I mean the jaw going up and down. After making sure that our joints were in the right place, we started working on the facial expressions.

So to get the facial expressions we use the Shape Editor tool. When you click on the Shape Editor a new window opens; you need to add a target and make sure you have clicked the Edit button, then go to sculpting and use the tool to get an expression you like. When you're happy with the face, switch Edit off. Then you can move the control back and forth to achieve the expression. This way we had to do a smile, a frown and a blink.
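
Under the hood the Shape Editor drives a blend shape deformer. A hedged sketch with hypothetical object names ('face_mesh', 'smile_target'):

```python
# Create a blend shape on the head with one sculpted target, then dial
# and key its weight like the Shape Editor slider. Names are hypothetical.
import maya.cmds as cmds

bs = cmds.blendShape('smile_target', 'face_mesh', name='faceShapes')[0]
cmds.setAttr(bs + '.smile_target', 0.7)            # dial the expression in
cmds.setKeyframe(bs + '.smile_target', time=1, value=0.0)
cmds.setKeyframe(bs + '.smile_target', time=12, value=1.0)
```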

The tools we used in this class: Shape Editor, Sculpting, Pull, Paint Skin Weights and Timeline.