The beard – Version 1

The first version of the beard was meant to look like a pharaoh’s beard that is attached to the chin. I researched pharaoh beards and talked to Jacob about what he wanted the beard to look like. The essential idea was a pharaoh’s beard made out of metal with a mechanical inside consisting of microchips and watch parts, so that it would follow the movie’s cyberpunk aesthetic. The lid parts of the beard were meant to move like claws, depicting the actor’s emotions accordingly. The claw-like metal lids proved very difficult to build; I did not know how I should build them or what kind of mechanics they would need in order to be believable and to work in the real world. I tried several different methods, shapes and mechanics.

After a while I decided to work on the inside of the beard, because I did not want to waste any more time on the lid mechanics. I looked at pictures of motherboards and microchips to find out how I could recreate one in Blender. I noticed that a motherboard has a lot of diodes and resistors, so I set out to build one of each component and duplicate the meshes with the Array modifier. For easier distinction, and to be able to quickly show the model to Jacob, who throughout this project acted as my first client, I gave the different meshes materials in different colours.
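
As a rough illustration of that approach, here is a minimal Blender Python sketch: one placeholder component repeated with an Array modifier and given a flat-coloured material. The names, sizes and colours are assumptions; the real components were modelled by hand in the viewport rather than scripted.

```python
import bpy

# Hypothetical stand-in for one motherboard component; sizes, counts and
# colours here are placeholders.
bpy.ops.mesh.primitive_cube_add(location=(0, 0, 0))
resistor = bpy.context.object
resistor.name = "Resistor"
resistor.scale = (0.1, 0.05, 0.05)

# duplicate the component in a row with an Array modifier
array = resistor.modifiers.new(name="Row", type='ARRAY')
array.count = 10
array.relative_offset_displace[0] = 1.5  # gap between the copies

# flat-coloured material so the different components are easy to tell apart
mat = bpy.data.materials.new(name="ResistorColour")
mat.diffuse_color = (0.8, 0.2, 0.1, 1.0)  # RGBA in Blender 2.8+; three values in 2.7x
resistor.data.materials.append(mat)
```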

After roughly a month of trying to make a believable pharaoh beard with a motherboard and cables inside of it, the decision was made to rethink the beard design from scratch and, rather than only attaching it to the chin, make it a full beard attached to the sides and the jaw. I understand the decision to change the beard’s design: it took a big workload off Lyon, who would otherwise have had to animate it or create the mechanical insides through a 2D animation, and it took a lot of work off James, who would have had difficulties with the matchmoving. But this required me to make new sketches, so I started on them immediately.

Test Pieces

In the early stages of the project I thought that I had to do the matchmoving along with my modelling and texturing process. So, in order to remember how to do matchmoving, I tracked the free footage that I used in the first semester, downloaded from hollywoodcamerawork.com. I quickly got the hang of normal camera tracking again and was confident that I could do it for the short movie.

Together with James, I tried out facial tracking, since at that point neither of us knew how to do it, or if and how much it differs from camera tracking. James relied completely on me to find that out and test it. I tried tracking my face the way I knew it from camera tracking. The footage was filmed with James’ iPhone and all in one go, so I imported the footage of me moving my head into Premiere Pro and cut it up into clips of just a few seconds, in order to make it more trackable in Blender. James then used the footage to learn how to track in Blender and later on how to do facial tracking in Blender.

Once I was at home, I wanted to see what colour the tracking markers on skin should have in order to be trackable. Since I had seen white markers on white skin, blue markers on white skin and black markers on white skin when doing my research, I wasn’t sure which would be the best and the right approach, so I got white and black paint and asked my housemate to lend me his face and his camera for an evening. This test was done in quite a haste, because we were due to shoot the night scenes where we needed tracking points only a few hours later. It turned out that black-on-white tracking markers worked best. I tried different lighting conditions, as the shoot these would be needed for was going to happen in the dark, lit by nothing but orange street lights.

The poor quality of the footage made it not too easy to track, but I got it done eventually.

Jon suggested using the Lattice tool to warp my beard model to our actor’s face. I followed a tutorial on YouTube and read about the Lattice tool on Wikibooks. I used the 3D model that I had downloaded to try out the Lattice tool and got the hang of it quite quickly. I remembered that I had used a Mirror modifier on the beard mesh, which I had applied, so I wanted to see if that affects the Lattice tool in any way. I used a low-poly sphere, cut it in half and used a Mirror modifier on it, which I then applied. I then used the Lattice tool on it and noticed that it creates a hole in the middle where the two halves of the sphere should connect.
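
For reference, a minimal scripted version of that Mirror-then-Lattice setup. The object names, the modifier name "Mirror" and the lattice resolution are assumptions, not the actual project file.

```python
import bpy

# Assumes the beard mesh is the active object and still carries a Mirror
# modifier named "Mirror".
obj = bpy.context.object
bpy.ops.object.modifier_apply(modifier="Mirror")  # apply the mirror first

# add a lattice at the mesh's location
bpy.ops.object.add(type='LATTICE', location=obj.location)
lattice = bpy.context.object
lattice.data.points_u = 3
lattice.data.points_v = 3
lattice.data.points_w = 3

# hook the mesh to the lattice with a Lattice modifier
mod = obj.modifiers.new(name="Lattice", type='LATTICE')
mod.object = lattice
# moving the lattice control points in Edit Mode now warps the mesh
```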


Monday 25th Digi Workshop

Today we had Dr. Mark Lochrie in to give us a little inspirational talk about the Internet of Things. He talked a bit about the team he works with, which is called the Media Innovation Studio. He also introduced us to a little electronics DIY kit called littleBits.

It was a very easy-to-use kit, with many, many possibilities of what you can make with two starter kits. Some of us built a remote-controlled car; others started a rave show by connecting all of the lights, adding what I think was a sound sensor and playing some club music. Graham managed to build a little keyboard with a speaker and played ‘Jingle Bells’ on it, and Jacob managed to build a synthesiser and create a club-worthy beat with ups and downs. In conclusion I can say that littleBits brought out talents in each of us that we, or at least I, didn’t expect. In short: littleBits brings out the DJ in all of us.

As for my own work, I don’t yet know how I could use this technology in combination with 3D modelling, but I will think about it thoroughly once I get the time.

BB-8 Droid

I’m going to start blogging properly, so what I’ve wanted to do for quite some time now is to recreate the BB-8 droid from Star Wars. It is a seemingly simple shape, made from a UV sphere for the bottom and another one cut in half for the top, with a little altering such as extruding and scaling. I think that BB-8’s look is mostly created through its texture. So the mesh is going to be a very simple one, but the texture is going to be the challenging part, which is good, because I’m confident in my mesh-building abilities, but not in my texturing abilities. BB-8 has three main colours: white, orange and a dark grey. The surface is also panelled. I won’t use the technique with the modifiers that I used for Ship-E, because it created quite a few problems with the texture later on, so what I’ll try this time is to create the panels with a bump map. BB-8 is altogether a little dirty and shows a little wear and tear, which I’ll create in Substance Painter later on.
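
As a starting point, here is a quick Blender Python sketch of that blockout: one UV sphere for the bottom and a second, smaller sphere with its lower half cut away for the top. The positions, sizes and names are placeholders.

```python
import bpy

# Bottom: a plain UV sphere
bpy.ops.mesh.primitive_uv_sphere_add(location=(0, 0, 1.0))
body = bpy.context.object
body.name = "BB8_Body"

# Top: a second, smaller sphere with its lower half removed
bpy.ops.mesh.primitive_uv_sphere_add(location=(0, 0, 2.0))
head = bpy.context.object
head.name = "BB8_Head"
bpy.ops.transform.resize(value=(0.6, 0.6, 0.6))
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.bisect(plane_co=(0, 0, 2.0), plane_no=(0, 0, 1),
                    clear_inner=True)  # keep only the upper half
bpy.ops.object.mode_set(mode='OBJECT')
```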

The final piece

Concept sketch

In a meeting with my tutor in week 10, we came up with the sketch at the top as the design for the ship. It has steampunk elements, such as the exhaust, labelled as ‘number 4’ and ‘Rocket’ on the sketch, the steam tanks and even the design for the fins. The fins and the exhaust were inspired by this design for a steampunk rocket.

Before I did my proper research on this piece, I thought that it was an actual physical model of a wooden space rocket. It turned out that this is a 3D model, which just goes to show how important good textures and good lighting are for the photorealism of 3D models. The American company that made this 3D model is called Reimage and is situated somewhere not far from Atlanta. Unfortunately, they did not write any information on their blog as to how they made it, how long it took or what software they used.

I started with the ship’s most prominent shape, a sphere. The main idea was to combine Steampunk with Sci-Fi elements to create an interesting space ship with a lot of potential for texturing. The sphere would look a little too plain if just left like this, so we decided to give it a panelled surface, which also helps to combine the two main styles. To do that I followed this short and easy tutorial, which solely uses Blender modifiers. The steps were: go into Edit Mode, select the edges you want to become the edges of your panels, then press Ctrl + E and choose ‘Mark Sharp’. Tab out of Edit Mode, go to the modifier menu, add the ‘Edge Split’ modifier and untick the ‘Edge Angle’ box. The previously selected edges can now be seen on the model. After that you just add a ‘Solidify’ modifier and a ‘Bevel’ modifier, turn ‘Auto Smooth’ on and tell the Bevel modifier to use angle offset. This was a really easy way of creating a panelled surface, which I had always thought would be a rather complicated and time-consuming process. The technique certainly worked, but it created several problems with the texture afterwards.
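
For reference, the same modifier stack can be set up through Blender’s Python API. This is only a sketch for the Blender versions used at the time (2.7x), assuming the panel edges have already been marked sharp; the thickness value is a placeholder.

```python
import bpy

# Assumes the sphere is the active object and the panel edges have already
# been marked sharp in Edit Mode (Ctrl + E > Mark Sharp).
obj = bpy.context.object

# Edge Split: split only along the marked-sharp edges, not by angle
edge_split = obj.modifiers.new(name="EdgeSplit", type='EDGE_SPLIT')
edge_split.use_edge_angle = False
edge_split.use_edge_sharp = True

# Solidify gives the panels their thickness, Bevel rounds off their borders
solidify = obj.modifiers.new(name="Solidify", type='SOLIDIFY')
solidify.thickness = 0.02

bevel = obj.modifiers.new(name="Bevel", type='BEVEL')
bevel.limit_method = 'ANGLE'  # the 'angle offset' option from the tutorial

# Auto Smooth lives on the mesh data, not on a modifier
obj.data.use_auto_smooth = True
```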

The workflow

My tutor helped me to understand how much work I had in front of me and how little time, by writing down what the workflow and schedule would look like for a CGI showreel.

Week 11
Monday -> Wednesday: complete mesh
Thursday -> Friday: texture paint

Week 12
Shooting footage + matchmove

Week 13
Render + edit

3D workflow
3D model + UV unwrap in Blender
Export .fbx file to Substance Painter
Paint in Substance Painter
Export bitmaps to Blender
Import model and textures into the matchmoved footage in Blender

I’m not sure exactly how it confuses the software, but UV mapping in Blender as well as in Substance Painter was quite a challenge. The light gets reflected in very odd ways because of the panels, which I only came across far into the work process.

I wasn’t sure how I should build the fins, because I had to make a fairly round object out of a cube. The method I chose was the ‘stamp method’ with the help of a Boolean operation. This worked alright, but it created fins with an unbelievably high face count, which, on top of me using a Subdivision Surface modifier on everything at that time, in the end crashed my computer several times and made Blender all but unusable. There were also problems with the side faces of those fins: if you look closely at the picture above you can see several triangular white flares on the model. I couldn’t get rid of them, so in the end I had to redo the fins by scaling and creating edges, which I then dragged into place.
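
For reference, a minimal sketch of the Boolean ‘stamp’ setup mentioned above; the object names are placeholders and not taken from the project file.

```python
import bpy

# The fin block should be the active object before applying the modifier.
fin = bpy.data.objects["FinBlock"]    # the cube being shaped into a fin
stamp = bpy.data.objects["FinStamp"]  # the rounded cutter object

boolean = fin.modifiers.new(name="Stamp", type='BOOLEAN')
boolean.operation = 'DIFFERENCE'
boolean.object = stamp

# applying the modifier bakes in whatever face count the cutter produces,
# which is how the fins ended up so heavy
bpy.ops.object.modifier_apply(modifier="Stamp")
```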

I had to learn the hard way that the Subdivision Surface modifier makes everything look pretty and nice, but really is not a good idea to use. I haven’t used it since, and after completing a model with its texture I have now learned that textures dictate the look of the model, not the smoothness of the mesh.

‘With this many faces you would even crash a Pixar computer, and that is quite a skill to have!’

I then proceeded to add to the model and built the pipes, the tanks and the exhaust. For the exhaust and the tanks I used a simple cylinder. To build the exhaust, I selected the top face of the cylinder and extruded it outwards, then scaled it to create the raised ‘step’ where the exhaust ends. Then I extruded the top face inwards to create the hollow effect of a tube and scaled the face to give the tube an even thickness. To finalise this mesh, I bevelled the edges in order to make it look less computer-generated and more handmade.
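
A rough scripted equivalent of those exhaust steps, assuming the cylinder’s top face is already selected in Edit Mode; all offsets and scale factors are placeholders.

```python
import bpy

# Extrude the top face up and flare it to form the raised 'step'
bpy.ops.mesh.extrude_region_move(TRANSFORM_OT_translate={"value": (0, 0, 0.2)})
bpy.ops.transform.resize(value=(1.2, 1.2, 1.0))

# Extrude in place and scale down to set the wall thickness of the tube
bpy.ops.mesh.extrude_region_move(TRANSFORM_OT_translate={"value": (0, 0, 0)})
bpy.ops.transform.resize(value=(0.85, 0.85, 1.0))

# Push the inner face down to hollow out the tube
bpy.ops.mesh.extrude_region_move(TRANSFORM_OT_translate={"value": (0, 0, -0.6)})

# with the rim edges selected, a slight bevel softens the computer-generated look
bpy.ops.mesh.bevel(offset=0.02)
```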

The way I extruded my meshes had somehow messed up the normals. This wasn’t visible in Blender, but showed when I imported the .fbx file into Substance Painter. The problem the messed-up normals created was that some of my objects were half see-through. Unfortunately I forgot to take a screenshot of that mistake, but you can imagine it as the mesh, with paper-thin walls, being cut in half so that you can see the inside. I first noticed this phenomenon on the exhaust, so we tried to find out what it was. We looked at the UV mapping to see if the bug was there and came to the conclusion that the UV maps were alright.

Jon then looked at the normals of the mesh in Blender and it turned out that the normals were all over the place. This could easily be fixed by simply clicking ‘Recalculate Normals’ in the menu on the left. I exported the file anew as an .fbx and imported it into Substance Painter. The transparency had disappeared from the exhaust, but then I noticed that the tanks and pipes had the same problem. So I went back into Blender to check the normals of those meshes, and they were all over the place, same as the exhaust’s earlier. I recalculated the normals of every mesh, imported the model into Substance Painter again, and had fixed that little problem with only a few clicks.
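
The script equivalent of that fix is short; a minimal sketch, run per affected mesh:

```python
import bpy

# With the affected mesh active: select everything in Edit Mode and
# recalculate the normals so they all point outwards.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.normals_make_consistent(inside=False)  # 'Recalculate Normals'
bpy.ops.object.mode_set(mode='OBJECT')
```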

Before I knew about the normals mishap, I first UV unwrapped the model and made sure that the square pattern on each mesh was roughly the same size. I remembered most of the steps of UV unwrapping from the Unreal Engine 4 workshop and from the microchip model that I made previously to this one, but when I did get stuck I looked at a UV unwrapping tutorial from the Curious Engine.
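
For completeness, the basic unwrap step in script form, assuming the seams have already been marked; the margin value is a placeholder.

```python
import bpy

# With the mesh active and seams already marked (Ctrl + E > Mark Seam)
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.unwrap(method='ANGLE_BASED', margin=0.02)  # margin keeps UV islands apart
bpy.ops.object.mode_set(mode='OBJECT')
```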

After debugging the model and making sure that everything was in place, I imported the final .fbx file into Substance Painter. Substance Painter is a rather easy-to-use texture painting software that works very similarly to Photoshop with its layer system. I again followed a tutorial series from the Curious Engine about how to use Substance Painter and add different levels of detail and dirt to the model. I chose a dark red as the main colour for the ship, because in colour theory red represents many things, such as war, danger, strength, power and determination, as well as passion, desire and love. Dark red specifically represents willpower, anger, courage and wrath, which are all traits and words associated with a dystopian future, and therefore it fits the overall theme quite well.

I played around with the colours of each mesh and created my own copper material to use on the pipes, tanks and exhaust. I got feedback that the yellow fins don’t work too well with the design and the idea of the ship, and that my copper could use a few tweaks because it looked like rose gold instead of copper. The yellow fins made the ship look like a toy, which is why I decided to make the fins the same material as the main body; it also makes sense that in a dystopian future you wouldn’t spend money on a different colour of paint for your space ship. I added dust and a lot of leakage and rust to give the ship a story and make it interesting to look at. The copper I used in the end was a smart material from Substance Painter which looked weathered and oxidised and was exactly what I needed.

I then exported the textures and normal maps and brought them back into the model in Blender with the help of a Simple PBR shader that I found on the internet. It really made things much easier, because all you have to do is create an Image Texture node, load the texture bitmaps, such as Roughness, Metallic and the Normal map, and plug them into the shader.
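
A sketch of that hookup through Blender’s Python API. The project used a downloaded ‘Simple PBR’ node group, so here the built-in Principled BSDF (available in newer Blender versions) stands in for it, and the texture file paths are placeholders.

```python
import bpy

# Wire exported Substance Painter maps into a material (placeholder paths).
mat = bpy.data.materials.new(name="ShipPBR")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
bsdf = nodes["Principled BSDF"]

def add_map(path, input_name, non_color=False):
    # load an image texture and plug it into the given shader input
    tex = nodes.new('ShaderNodeTexImage')
    tex.image = bpy.data.images.load(path)
    if non_color:
        tex.image.colorspace_settings.name = 'Non-Color'
    links.new(tex.outputs['Color'], bsdf.inputs[input_name])
    return tex

add_map("//textures/ship_basecolor.png", 'Base Color')
add_map("//textures/ship_roughness.png", 'Roughness', non_color=True)
add_map("//textures/ship_metallic.png", 'Metallic', non_color=True)

# the normal map goes through a Normal Map node first
normal_tex = nodes.new('ShaderNodeTexImage')
normal_tex.image = bpy.data.images.load("//textures/ship_normal.png")
normal_tex.image.colorspace_settings.name = 'Non-Color'
normal_node = nodes.new('ShaderNodeNormalMap')
links.new(normal_tex.outputs['Color'], normal_node.inputs['Color'])
links.new(normal_node.outputs['Normal'], bsdf.inputs['Normal'])
```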

I mentioned earlier that the panelling created problems with the texture; the problem is depicted below. It turned out that the normal map was too strong and that all I had to do was turn it down from 1000 to 200, which I have done for every texture in every tracking scene.

After my textures seemed to work fine in Blender, I went out to shoot some footage that I could track. I learned that tracking footage needs to be one very smooth movement, preferably a forward or backward movement. I have really shaky hands and therefore most of the footage I shot was unusable. I managed to get roughly 6 seconds out of one shot though, which is still very shaky, but it wasn’t too hard to track. I also found a German website that offers free-to-use HD footage and used two of those clips. They were very easy to track, but I had to convert them from 50 fps to 25 fps with Premiere Pro. Since I had never used Premiere Pro before, I used another tutorial from the Curious Engine to help me with it. It was quite easy to follow and done in about 5 minutes.

After that I tracked the footage as I did the test pieces, inserted the ship into it and let it render. One render started with 2.08 minutes of render time for one frame. I calculated that this piece, which had 250 frames, would take about 8.5 hours to render. The render time per frame gradually increased with each frame, so in the end my pieces all took 2 to 3 hours longer than what I had calculated.
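
A quick check of that estimate, taking the figures above at face value:

```python
# Render-time estimate: minutes per frame times the frame count.
minutes_per_frame = 2.08
frames = 250
total_hours = minutes_per_frame * frames / 60
print(round(total_hours, 1))  # ~8.7 hours, in the same ballpark as the ~8.5 h figure
```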

I recorded my matchmoving workflow twice, but I am unable to link to the videos in this document, because I haven’t uploaded them yet. The upload time for one of them was 13 hours, so I’ll upload them after the Christmas break and put them on my blog.