Collaborative Showreel

Critical Reflection.

At the beginning of the collaborative unit I had some self-doubt, because I was not sure whether I would be able to deliver what the team required, or deliver it in time to meet the deadlines. I felt this way because I was fairly new to 3D modelling. After this unit, I can confidently say that I no longer have any self-doubt and can work comfortably with 3D modelling.

Teamwork is a major part of the collaborative unit. We all came from different courses and fields of expertise, as well as different cultural backgrounds, so we had to find common ground to complete the project to the required standard. One of the ways we achieved this was by holding regular meetings, so all of us could check in on each other and get updates on progress. I felt that when one of us missed these meetings now and again, it was hard to bring that person up to date, and a lot of time was wasted getting them up to speed. This helped me understand the importance of attending every meeting.

Having a good workflow is an important tool for a team. It helps teammates understand what they need to work on and what the next task is, and it helps divide the work equally so that everyone can be a part of it. I feel that our team didn’t have a good workflow: I was asked to model a specific object, but in the final video a different model downloaded from the internet was used and my model never appeared. This was never discussed; the decision to use a different model was made unilaterally. I also saw that one teammate would do all the work and sometimes even end up doing work that had been assigned to someone else, which perhaps made the division of work unfair.

Communication is the backbone of a team and keeps teammates on the same page. This was a major factor missing from our team. We had a language barrier with some members, and we would sometimes struggle to communicate our ideas with each other, which perhaps led to misunderstandings. Looking at the outcome now, I feel the VR team and the 3D modelling team didn’t communicate adequately with each other. For example, I had modelled the whole bedroom, textured it and handed it over to the VR team, and only at the final submission did I realise that none of the textures I had provided were used. I was also asked to make a simulation of virus cells, and even that was not used. This taught me two things: first, not to assume that your model will be used, and second, to follow up with your teammates and ask what is happening with your model or whether they need help with it.

The collaborative unit taught me a lot of new things, like working with new software, understanding how VR works, and how to perform better in a team. All in all, it was a good experience.

Week 10: Editing and Compositing.

We planned to edit the video over the weekend, but our VR had some problems, so we had to record the VR again and match it with the live action. I had 2 days to edit and composite the video. 

I had to wait until Xiaoyan gave me the VR recording; she wanted to edit the audio and send it to me separately. I got all the videos and audio on Monday and then started removing the green screen.

To remove the green screen I used Nuke. The first method I tried was the IBK keyer, but I was not that happy with the result. The curtain stand was made of chrome, so light bounced off it and the stand reflected the green screen; while removing the green, even the stand became transparent.

The next thing I tried was getting a clean plate of the green screen without any props or actors, so I could do a spill replace using divide and multiply. Later, I combined the audio with the video and also did a basic colour correction.
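
For reference, the broad shape of that node tree could be rebuilt with Nuke’s Python API as below. This is only a minimal sketch: the file names are placeholders, the divide/multiply despill is an approximation of what I did by hand, and the actual comp was assembled interactively in the node graph.

```python
# Minimal sketch of the keying setup described above, built with Nuke's
# Python API. File names and the despill wiring are assumptions.
import nuke

plate = nuke.nodes.Read(file="greenscreen_plate.mov")  # footage with actor and props
clean = nuke.nodes.Read(file="clean_plate.mov")        # empty green screen plate

# IBK key: IBKColour builds a clean screen, IBKGizmo pulls the matte
ibk_colour = nuke.nodes.IBKColourV3()
ibk_colour.setInput(0, plate)

ibk_key = nuke.nodes.IBKGizmoV3()
ibk_key.setInput(0, plate)       # fg input
ibk_key.setInput(1, ibk_colour)  # screen colour input

# Despill: divide the plate by the clean plate, then multiply by a
# neutral replacement colour
divide = nuke.nodes.Merge2(operation="divide")
divide.setInput(0, clean)   # B
divide.setInput(1, plate)   # A

replacement = nuke.nodes.Constant()
replacement["color"].setValue([0.45, 0.45, 0.45, 1.0])

multiply = nuke.nodes.Merge2(operation="multiply")
multiply.setInput(0, divide)       # B
multiply.setInput(1, replacement)  # A
```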

I felt that the audio had some noise and it didn’t sound clean, so I put the audio into Audition and did a quick cleanup.

The first clip is the raw audio and the second is the cleaned-up version.

I also had a problem matching the audio to the video. I shot the video at 29.97 fps, and Xiaoyan sent the audio, which had been made against 24 fps. I didn’t know how to fix the problem; I tried rendering the video out of Nuke at 24 fps, but that still didn’t work. When I spoke to Gonzalo, he asked me to get the audio working file, put it on the timeline and see if that worked. When I asked Xiaoyan for the working file, unfortunately she had already deleted it. The next thing Gonzalo advised was to make sure all the elements had the same fps (the videos, the audio and the timeline), but even that didn’t help.

Later that day I came home and tried to fix the problem, and I did exactly the opposite of what Gonzalo said, and I got lucky and it did work! I changed the fps of the timeline to 24 fps, the video to 29.97 fps and the audio to 29.97 fps.
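
The arithmetic behind the drift is easy to see. The sketch below uses a made-up clip length just to show why the same number of frames covers different amounts of time at 29.97 fps and at 24 fps:

```python
# Illustrative only: why mismatched frame rates drift out of sync.
# The 300-frame clip length is a made-up example.
video_fps = 29.97   # frame rate the live action was shot at
audio_fps = 24.0    # frame rate the audio was exported against

frames = 300
video_seconds = frames / video_fps   # about 10.01 s at 29.97 fps
audio_seconds = frames / audio_fps   # 12.5 s at 24 fps

print(f"video: {video_seconds:.2f}s, audio: {audio_seconds:.2f}s, "
      f"drift: {audio_seconds - video_seconds:.2f}s")

# Whichever element gets conformed, this is the speed factor that has to
# be applied somewhere for the two to line up again.
print(f"retime factor: {video_fps / audio_fps:.3f}")
```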

This is what the final video looks like.

Week 9: Shooting.

To get the props to college, we had planned to go to the location by tube and then get a cab back with the props. When we reached Film Medical Services we saw this.

Film Medical Services.

The problem was that the props were far too big to fit into a cab. A person from the store said that there is a company called Deadline that could pick them up and deliver them to us, so we booked them for a pickup the next morning and planned to shoot at 6 pm.

We had one more problem: the floor in the room was also green, so when removing the green screen the floor would become transparent too, which would make the actor look like they were floating. We weren’t sure what to do about this, so I spoke to Christos and he arranged for something to cover the floor.

Floor covered.

We all had a role to play during the shoot, and we had a worksheet so that everyone knew what their task was.

Work Sheet

I had to do a few things before the shoot: the first was to set up the camera. For the shoot we used a Blackmagic 4K camera. I was new to the Blackmagic ecosystem, so I had to watch YouTube videos on how to use it. I am familiar with Sony, Canon and Nikon, but not with Blackmagic, so I took this chance to get to know it better.

My second task was to set up the lights. We used the lights the room had to offer and kept the colour temperature between 6000K and 7000K; I went with these settings because hospitals generally have cooler lighting. In addition, we used two Amaran P60c RGB lights whose colours we could control from a phone, which helped us a lot and was easier than using gel sheets.

For the audio, I chose a boom mic with a Zoom H4 recorder and played around with the settings to get cleaner audio.

I had to do the post-production, such as compositing the green screen and editing the video, so I told Xiaoyan that we should only capture dialogue from the actors when shooting and that I would add the sound effects in post, but she wanted to play the sound effects during the shoot, which I felt was not a good idea.

After the shoot.

Week 8: Booking Props and Simulation Video.

We held a meeting at noon on February 27th, mainly to discuss the content of the video as well as the props. We decided to rent props to improve our performance.

Meeting 9

Jiageng and Xiaoyan suggested buying the bed, IV stands and curtains, but these would potentially go to waste after the shoot as we would need to throw them away. I wasn’t a fan of the idea, not only because it was expensive but also because it would be wasteful. I was able to find a company online called Film Medical Services; they rent out medical props by the week, and this was exactly what we were looking for.

This week, I also had to make the simulation video for act 3, which would be projected on a circular screen. The idea I pitched was that the actor would stand in the centre of the circle while different pieces of medical equipment, like a vial, a cannula and pills, attack him, and whenever these models hit the actor in the video, the virus cells would shrink. (Act 3 is the treatment.)

Maya Simulation
Maya Simulation

Because I knew how to work with Maya, I started working on the simulation there using nParticles, but I couldn’t achieve what I wanted. I even watched a few YouTube videos, but I was still unable to do it, and when it finally worked, Maya crashed, so I gave up. On Monday I asked Nick for help and he showed me how to work on it, but it was still crashing. Then my teammate Saurabh suggested Houdini. I had never used it before, but he showed me how to use its particle tools.

Houdini Simulation

To create the simulation, I first created a sphere and connected it to a POP Network. Inside the POP network I created a POP Source, which I used to control the emission, then added a POP Force and later a POP Drag so the particles wouldn’t flow too smoothly, which would make the simulation look fake. I then wired the stream into the ‘wire pops into here’ input and connected it, together with the POP Object, to the POP Solver.

For the triangular virus cells, I needed to add an Attribute VOP and an Attribute Randomize so that they don’t all float at the same angle. I created five simulations, because the plan was to hit the actor with five different models, so that every time a model hit the actor the cells would get smaller.
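
For anyone curious, the rough shape of that network could also be built with Houdini’s Python API. This is only a sketch under assumptions: node names and parameter values are illustrative, the default POP Network already contains a POP Object and POP Solver, and the final wiring of the stream into the solver was done in the network editor rather than by script.

```python
# Rough sketch of the POP setup described above (illustrative names/values).
import hou

obj = hou.node("/obj")
geo = obj.createNode("geo", "virus_sim")

# Emitter geometry: a simple sphere
sphere = geo.createNode("sphere", "emitter")

# POP Network; its first input supplies the source geometry
popnet = geo.createNode("popnet", "cell_particles")
popnet.setInput(0, sphere)

# Source -> force -> drag stream inside the POP network
source = popnet.createNode("popsource", "emit_cells")
force = popnet.createNode("popforce", "push_towards_actor")
drag = popnet.createNode("popdrag", "break_up_smooth_motion")

force.setInput(0, source)
drag.setInput(0, force)

# Push the particles roughly towards the actor (values are made up);
# the POP Drag keeps the motion from looking too smooth and fake.
force.parmTuple("force").set((0, 0, -2))

# The default POP Network already contains a POP Object and POP Solver;
# in practice the stream was wired into the solver in the network editor.
popnet.layoutChildren()
geo.layoutChildren()
```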

Houdini

While rendering in Houdini for the first time, I had some problems; for example, some cells had no texture and the cells were flickering. I didn’t know what the problem was, but I spoke to IT and they helped me fix it. The problem was that in Houdini the render option for Declare Materials was set to “Save Only Referenced Materials and Shaders” when it should have been “Save All Materials and Shaders”.

The final render took 1 whole day to complete, but it was worth it.

Render Video

We rehearsed at the final venue, and this time we ran through the entire process in the venue with the lighting, the sound and the camera.

My job was to set up the lights and camera to make sure that everything went smoothly on the day of the final shoot. The layout was designed so that the VR design matched the live-action set-up.

In the beginning, I thought of it more like a short film, with multiple cuts and cameras and shots stitched together. Xiaoyan suggested a theatre-style approach, so we went with a single point-and-shoot setup. For the lighting, she wanted the base lighting plus two RGB lights so she could change the colours when the mood of the story changed.

Week 7: Texturing and Rehearsal.

We had our first offline rehearsal on February 23rd, attended by Minghong Zhang, Xiaoyan Dong, Jiageng Guo and Zhongge Sui, as well as Wanwei Zhong and myself. The rehearsal was at college. We mainly rehearsed the first and second acts to get the actors into their emotions and help them understand their roles.

In the beginning, we didn’t have a proper location to rehearse or shoot the final video, so I thought of shooting it at college, but we weren’t sure which room we could use. I then spoke to Pav, who said we could rehearse in M312 and shoot the final video in W108. We saw the space later and it was perfect.

The team had asked me to shoot and edit the video, so I had to make sure everything was working before the shoot, and that the VR and the live action would blend well and match each other in post.

In the first brief, the plan was to project whatever was happening in the VR onto the background using a projector, but that would have made it difficult to light the stage for the live action. The next week, in my Nuke class, Gonzalo showed us how to work with a green screen, which made me think about shooting the whole project in front of one. I had also worked with chroma keying before. I pitched the idea to Xiaoyan and we decided to go ahead with it.

During the rehearsal, I quickly took a shot against the green screen and asked Zhang to make a short video from the VR to see if the green screen would work. For the demo I showed the team, I removed the background using Premiere Pro.

Demo

I started texturing all the models after I got confirmation from the team that they looked fine and would not need any changes. I textured the bedroom first because it had a lot of elements.

For these models, I applied base colours and presets from Arnold’s aiStandardSurface material, like chrome, velvet, glass and the hair physical shader, and changed a few settings such as transparency, bump mapping, roughness and metalness.
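
As a rough illustration of the kind of settings involved, a chrome-style aiStandardSurface could be created and assigned like this with Maya’s Python commands. The object name is hypothetical and the Arnold plug-in is assumed to be loaded; in practice I set these materials up in Hypershade rather than by script.

```python
# Sketch of assigning a chrome-style Arnold shader via Python.
# "curtainStand_geo" is a hypothetical object name.
import maya.cmds as cmds

# Create the shader and a shading group to assign it with
shader = cmds.shadingNode("aiStandardSurface", asShader=True, name="chrome_mtl")
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name="chrome_SG")
cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader")

# Chrome-like settings: fully metallic with a low roughness
cmds.setAttr(shader + ".baseColor", 0.9, 0.9, 0.9, type="double3")
cmds.setAttr(shader + ".metalness", 1.0)
cmds.setAttr(shader + ".specularRoughness", 0.1)

# Assign the shading group to the geometry
cmds.sets("curtainStand_geo", edit=True, forceElement=sg)
```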

For the models shown above, I added textures from the internet and applied them in Hypershade. The UVs were a little complicated because I didn’t know how to work with them, so I learnt about them on YouTube, and after watching a lot of videos I was able to understand how to work with them. The VR team advised me not to give any texture to the inside of the photo frame or the mirror.

Texture

The most complex UVs were for the pillow, because it had a lot of faces. I learned two ways to unwrap the UVs. The first is to open the UV Editor toolkit, go to Create and choose Automatic, Cylindrical or Spherical projection. The second is to use Camera-Based projection. Neither of these worked perfectly, so I also had to use sew and stitch.
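
The same projections can also be run through Maya’s Python commands; the sketch below shows the idea. The object name and edge range are hypothetical, and the actual unwrapping (including the camera-based projection) was done interactively in the UV Editor.

```python
# Minimal sketch of the UV projection options mentioned above.
# "pillow_geo" and the edge range are hypothetical.
import maya.cmds as cmds

obj = "pillow_geo"

# Automatic projection: several planar projections laid out together
cmds.polyAutoProjection(obj, layoutMethod=1, scaleMode=1)

# Cylindrical (or spherical) projection over the whole mesh
cmds.polyProjection(obj + ".f[*]", type="Cylindrical")
# cmds.polyProjection(obj + ".f[*]", type="Spherical")

# Sew and move shared UV edges back together where a projection
# split the shell apart
cmds.polyMapSewMove(obj + ".e[0:10]")  # illustrative edge range
```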

Week 6: Making new Models and Fixing Modeling Problems in Unity.

The sixth meeting was held at 20:45 on the 15th of February. In this meeting, we specifically discussed the models needed for act III, its scenes and its interaction gameplay.

We needed to add more models in the bedroom so that there would be more interaction with the live action. I was asked to model a coat rack and wind chimes. 

The coat rack was created using simple shapes like cylinders, a cone and a torus. To make the curves for the handles, I drew an EP curve and extruded a single face of a cylinder along it.

For the glass wind chime I used simple shapes like cylinders, boxes and spheres. To make the glass spheres, I deleted a few faces from the bottom of the spheres and extruded them outwards. For the card, I used a box and applied Circularize to a few faces to make a hole.

Later that week, on February 20 at 4:00 pm, we had another meeting. We discussed some concepts for the fourth act and some issues with rehearsals and shooting locations. We also talked about some issues with the previous models at this meeting.

Meeting 7

We had a few problems at hand, the first being the location to shoot the live-action. We discussed booking a theatre for that purpose, but this was out of our budget.

The next problem was that none of the FBX files I sent to Jiageng had the desired smooth surfaces, and the bed curtains came through as a box. That is the reason for my confused expression in the picture above.

I couldn’t figure out what the problem was, so I started looking for solutions on YouTube and Google. Later, I found out that if you smooth a surface by pressing 3 in Maya, that smoothness is only a viewport preview and won’t carry over to other software. You need to apply an actual Mesh > Smooth to the geometry and then export it to FBX.
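
A minimal sketch of that fix with Maya’s Python commands is below; the object and file names are hypothetical, and I actually applied the smooth and exported through the Mesh > Smooth and File > Export Selection menus.

```python
# Sketch of baking the smooth into the mesh before exporting FBX.
# Object and file names are hypothetical.
import maya.cmds as cmds

obj = "bedCurtain_geo"

# Pressing 3 only previews smoothing in the viewport; polySmooth
# actually subdivides the geometry, so other software sees it.
cmds.polySmooth(obj, divisions=2)

# Export the selected object as FBX (requires the fbxmaya plug-in)
cmds.loadPlugin("fbxmaya", quiet=True)
cmds.select(obj, replace=True)
cmds.file("C:/project/export/bedCurtain.fbx",
          force=True,
          options="v=0;",
          type="FBX export",
          preserveReferences=True,
          exportSelected=True)
```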

Unity

We also saw a demo of the VR and its interactions, and Jiageng shot a small video.

Demo

Week 5: Modelling the Cells and Medical Equipment.

The fourth meeting took place on February 9th at 7:30 pm, after the VR mid-crit, where the VR team had received some feedback from their mentors. In the meeting they discussed some concepts and details related to the third act and drew clearer divisions between the virtual part of the piece and the live performance.

Meeting 4
Act Breakdown

We also planned to make two videos, one coming at the end of act III and one between act IV and act V. Near the end of act III there will be an interactive VR part for the treatment.

The plan for the end of act III was to place the patient in the middle of a 360° screen where he sees the video. There will be three to four 3D models around him, and when he reaches for these models the video will change or react.

The second video was for act IV, at the end of the VR world. This would simply be played on a plain screen with no interactions; the actor would just sit on the bed and realise that he is about to leave the virtual world.

Act III

For act III we had to model cells, and for the video we also had to model a few pieces of medical equipment for the actor to interact with so that the video could change.

To make the model of the vial, I took a cylinder, changed its divisions, deleted the top face and then extruded the edges to make the inside of the model. To get the shape of the vial, I inserted an edge loop and shaped it. For the cap of the vial, I used another cylinder and made the rubber seal from the centre face, then extruded the inverse of the edge. For the edges, I added an edge ring to keep them smooth. For the liquid inside the vial, I used a cylinder and sculpting tools like pull, sharpen and soften.

The cannula can be broken down into many parts. I started off with the body, then extruded the cylinder and used Circularize to make clean holes in the wings of the body. I used the same method to make the Luer connector, and the port cap was created with the same tools.

To make the needle grip, I combined multiple cylinders to shape the body. For the grip itself, I used a cylinder and extruded and moved the edges and faces to give it its shape.

Cannula

Then I added the needle and the needle cap, and also created the tubing by drawing an EP curve and extruding a single face of a cylinder along it.

The next task was to start working on the cells. First came the DNA, which was fun and easy to create. I used a sphere, gave it a noise texture for an uneven look and applied MASH to it. I then increased the number of points, offset them along X and gave them a 360° rotation in X, after which I flipped the result to make the other strand. To make the connectors, I used Duplicate Special.

For the RBC, I used a sphere, pulled the centre in with soft selection and used a sculpting tool to add imperfections. For the WBC, I gave the sphere a noise texture and placed some smaller cells inside it.

For the green virus, I did the same thing as for the WBC, but this time I used sculpting tools to pull out spikes. For the blue virus, I used a box, gave it a noise texture and used sculpting tools to make spikes; for the inside of the cell, I added a helix.

The 5th meeting was with the actors, but I was unable to attend. The people who attended were Minghong Zhang, Xiaoyan Dong, Jiageng Guo, Zhongge Sui and Wanwei Zhong. In this meeting, they introduced the basic concept and process of our project to the actors, who also briefly experienced our demo.

5th meeting

Week 4: Modelling the Bedroom.

This week I focused on making the bedroom and all the elements required to make the interaction work.

I started making the mattress using a box and added edge rings to make it more curved. To make the box spring, I took a box, extruded the face, inverted it and pushed it down, and then added edge rings to make slats.

Next, I worked on making the headrest; for this I took help from YouTube.

Headrest.

I then started adding legs to the bed, looking at a few images of beds to get an idea of how to make them. I saw that some beds had curtains, which gave them a royal look, which is what we were going for. For the legs, I used a cylinder and played around with edge rings and extrusions. I also created the posts for the bed so the curtains could be added.

Reference

For the curtains, I used nCloth. I had never used it before, so I took help from YouTube, and the video helped me complete it. I faced a lot of problems while making it; for example, I couldn’t see the wrinkle effect when I moved the anchor points, and later, when I played the simulation, it didn’t work as I wanted it to. Because it was a box, the other side would come through to the opposite side, and this appeared as black spots.

After playing around with the settings for a long time, I finally managed to make the curtains. I still couldn’t fix the black spots, so I went into edge mode and pushed all the overlapping edges back to the other side. I also modelled a cord tieback to hold the curtains together and then added a hook.

Pillow

To make the pillow, I used nCloth again. I used a cube and flattened it out, added nCloth to the object, set the gravity to 0 and then chose a pressure of about 0.25 to get the pillow to puff up and appear more realistic.
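
Roughly the same setup can be reproduced with Maya’s Python commands; this is a sketch with a hypothetical object name, and the real pillow was set up from the FX menu and the Attribute Editor.

```python
# Sketch of the pillow nCloth setup: flattened cube, zero gravity,
# a little pressure to puff it up. Object name is hypothetical.
import maya.cmds as cmds
import maya.mel as mel

# Flattened cube as the pillow base
pillow = cmds.polyCube(width=4, height=0.5, depth=3,
                       subdivisionsX=20, subdivisionsY=2, subdivisionsZ=15,
                       name="pillow_geo")[0]

# Turn it into an nCloth object (createNCloth is a MEL runtime command)
cmds.select(pillow, replace=True)
ncloth = mel.eval("createNCloth 0")[0]

# Zero out gravity on the nucleus solver so the pillow does not drop
nucleus = cmds.listConnections(ncloth, type="nucleus")[0]
cmds.setAttr(nucleus + ".gravity", 0)

# Internal pressure makes the flattened cube puff up like a pillow
cmds.setAttr(ncloth + ".pressure", 0.25)
```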

Making the side table was fairly easy compared to the other objects. I used basic shapes like a box and a cylinder, then extruded and adjusted the sizes and edges, and did the same thing with the legs.

Bedside stool

For the bedside stool, I used the same legs as the side table and gave it the same button style that I used for the headrest.

I used a torus and then extruded and inverted it to make the frame; for the mirror I did the same, but to get the shape of the mirror I pressed “B” to enable soft selection and moved the geometry.

For the lamp, I started off with the base using a cylinder; for the body I used a cylinder again and sculpted it by changing the size and adding edge rings to give it smooth curves.

Bedroom.

The third meeting of the group was held at 8pm on Monday, 6/2/2023. We discussed the composition and details of act II.

In this meeting, Xiaoyan proposed a complete script for act II and marked the interactions that needed to be built into it. The interactions are:

①There will be a sound when the wind chimes are touched

②Player can pick up the picture frame

③The player will touch the music box and the model of the little girl will pop up

④The player walks to the mirror and can see the little girl in the mirror saying hello to the player

Also in this meeting, we discussed some issues with the special effects video. The team asked me if I was able to shoot the video, and I was happy to comply.

The items in blue text had to be modelled, and this was easier to understand than act I, so Saurabh and I decided to divide the models equally.


Polygon Pig (2018), Maya – Create Curtains with nCloth [Online Video]. Available at: https://www.youtube.com/watch?v=BtrP4lB_dqg&ab_channel=PolygonPig [Accessed 8th February 2023].


Iris Ogli 3D Artist (2019), 3D Modeling Tutorial – Modeling Chesterfield Mattress in Autodesk Maya 2023 [Online Video]. Available at: https://www.youtube.com/watch?v=hV4E3NKHW2A&ab_channel=IrisOgli3DArtist [Accessed 8th February 2023].

SharmaJi97 (2020), Lamp [Online]. Available at: https://sketchfab.com/3d-models/lamp-9687d800e7374d49ae6047a35f71d25f [Accessed 8th February 2023].

Week 3: Scripting and Art Styles.

This week we had two meetings to discuss the project, build the storyline and assign our roles. 

First meeting content:

For the performance, we all thought it would be better to combine virtual and realistic forms.

We came to the conclusion that the theme of our performance should be Hospice Care.

We selected the structure of the script and the general story content. The structure is divided into three parts:

  • Part one: inside the hospital; this will be in realistic form.
  • Part two: going into the virtual world (which has 3 acts).
  • Part three: back to the real world (realistic form).

We agreed to use Trello to stay updated on our work and the deadlines.

Trello
Meeting 1

The second meeting was conducted online.

Content:

We spoke about art styles and decided that act one would be fairly realistic.

Continued building and refining the story.

We also discussed means of interaction, for example what objects could be used to interact with the actor.

We allocated roles and jobs for each team member.

The roles

Xiaoyan Dong
MA VR
Roles:  Screenwriter, Director, Costume and Stage Designer, Character Modeler, Unity Developer.

Jiageng Potter
MA VR
Roles: Unity Developer, Programmer, Interaction Designer, Sound Designer.

Waikar Saurabh
MA VFX
Roles: 3D Modeler, VFX Designer.

Yitong Jiang
External collaborator
Costume Assistant.

Zhongge Sui
External collaborator
Actor.

Wanwei Zhong
External collaborator
Actress.

Vitus Lasrado
MA VFX
Roles: 3D Modeler, VFX Designer.

Meeting 2

In these meetings, I came to understand the technical part of the story, and I feel the story is really cool. I was also told that this has not been done by anyone before, and I am excited to see the final outcome. I can’t describe the technical part any further until we are done with our project. I asked for a mood board so that I could understand the kind of tone we are going for, as this will help me with the modelling.

Below are a few links to works that managed to combine VR and live action; our project will take inspiration from them.

Story Structure

Muse (2021), Muse – Enter The Simulation (with Stageverse) [Online Video]. Available at: https://www.youtube.com/watch?v=wo8C7b_c_VE&ab_channel=Muse [Accessed 8th February 2023].

Royal Shakespeare Company (2017), The Tempest | Cinema Trailer | Royal Shakespeare Company [Online Video]. Available at: https://www.youtube.com/watch?v=BZKtQAIE4ew&ab_channel=RoyalShakespeareCompany [Accessed 8th February 2023].