Sefki Ibrahim / Realtime Digital Double with Character Creator and TexturingXYZ, Part 2

Applying the Displacement

Back in Character Creator, I select the GoZ button to export the model to a new ZBrush scene, dedicated to texturing the head at high resolution. When you press GoZ, the same window as before will appear; this time, however, choose the settings shown below.

These settings allow us to work on the head separately from the body, so we can apply the displacement maps to a very dense mesh. Once in ZBrush, I subdivided the head geometry to level 7, around 16 million polygons. Remember when I said we would be projecting our sculpt later on? Well, this is that time.

I imported my level-5 model (use your highest subdivision level) and projected it onto this new head base. The best way to approach this is to start at the lowest subdivision of the new head and select ProjectAll, then step up a subdivision level and project once more. Repeat this until the two polycounts match. You shouldn’t encounter any geometry errors or ‘spikes’ since the heads align; however, it’s worth double-checking, particularly around the mouth corners.
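To make the order of operations concrete, here is a minimal Python-style sketch of that loop. ZBrush has no Python API, so `set_subdiv_level` and `project_all` are hypothetical stand-ins for the SDiv slider and the ProjectAll button:

```python
def set_subdiv_level(mesh, level):
    """Stand-in for Tool > Geometry > SDiv slider (a manual step in ZBrush)."""

def project_all(mesh, source):
    """Stand-in for Tool > SubTool > Project > ProjectAll (a manual step)."""

TARGET_LEVEL = 7  # the new head's highest subdivision (about 16M polys here)

def transfer_detail(new_head, sculpt):
    # Start at the lowest level, project, step up one level, project again,
    # and repeat until the new head reaches the sculpt's polycount.
    for level in range(1, TARGET_LEVEL + 1):
        set_subdiv_level(new_head, level)
        project_all(new_head, source=sculpt)
```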

On the left is the initial import, subdivided up to level 7; on the right is the result after projecting my sculpt.

Next, I created a new layer and imported displacement-R as an alpha. In the displacement map section, select the map and change the intensity to something low; the values I used for R, G and B were 0.2, 0.1 and 0.03, respectively. Double-check that you are happy with the intensity, then press Apply DispMap. Repeat this process on a new layer with displacement-G, making sure to turn off any other displacement layers as you do so.

Once you have added all of these displacement layers, you can fine-tune them by sliding each layer’s intensity until you get a look you like.
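As a rough illustration of what the stacked layers add up to, here is a NumPy sketch that combines the three channels with the intensities above. The file names are placeholders, and I’m assuming the usual TexturingXYZ convention of R, G and B carrying progressively finer frequencies of detail:

```python
import numpy as np
from PIL import Image

def load(path):
    # File names below are placeholders for your own channel exports.
    return np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0

disp_r = load("displacement_R.png")
disp_g = load("displacement_G.png")
disp_b = load("displacement_B.png")

# The layer intensities used above; mid-grey (0.5) is treated as
# "no displacement", matching how an alpha displaces around a midpoint.
combined = (0.2  * (disp_r - 0.5) +
            0.1  * (disp_g - 0.5) +
            0.03 * (disp_b - 0.5))
```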

I then continued to work on the face, with many more hours of proportion changes and subtle detailing, until I eventually felt I had produced a strong likeness of the actor. Creating an accurate portrait doesn’t happen overnight…well, for most of us. I tested many iterations in Character Creator, and even in Arnold, to ensure I was getting the best likeness possible. My advice: don’t get stuck in ZBrush the whole time!

The normal map was then exported at 4K using the settings below.

The thing to note here is that you can press All to update the face in Character Creator once again.

In case you’re confused at this point about why you have two ZBrush scenes: the first scene was necessary for the initial block-out, where you establish the main proportions, the positioning of the eyes, teeth, etc. In this new scene, we can refine the face and push the likeness as far as possible; you don’t have to revisit scene 1 from this point on. It takes some time to get used to the back and forth between Character Creator and ZBrush, so I recommend experimenting with the workflow.

Hair Card Generation

Now, before we move on to the fun stuff (look-dev and SkinGen), it’s time to take a break from the face itself and work on card-based hair for Ed. In CG, anything involving hair takes time, whether you’re using XGen or hand-placing hair cards. I should also say that this is my first time attempting hair cards!

  • The first step was to create the hair textures using Substance Designer; credit for this method goes to Ambience Lee. I began by creating a Scratch Generator node and, with it selected, turned down some of its attributes, shown below. The network below shows how a Directional Warp, driven by Perlin noise, was used to give the hair a little wiggle (a rough sketch of this warp follows this list).

  • From this point on, I created seven different clump variations (Scratch Generator variations) and blended them into a square tile. It’s essential to build variations for different purposes: the base hair cards should be thick and free of noise to cover the scalp, whereas the flyaway hairs should be less dense and noisier, since they provide definition and character to the groom.
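To make the Directional Warp idea concrete, here is a small NumPy sketch of the same operation: each pixel of a stand-in strand tile is offset sideways by a noise field, which is essentially all a directional warp does. A blurred random field stands in for the Perlin noise:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

h, w = 512, 512
# Stand-in for the scratch-generator tile: straight vertical strands.
strands = np.tile(np.sin(np.linspace(0, 60 * np.pi, w)), (h, 1))

# Blurred random field as a cheap stand-in for Perlin noise.
rng = np.random.default_rng(7)
noise = gaussian_filter(rng.standard_normal((h, w)), sigma=25)
noise /= np.abs(noise).max()

# Directional warp: shift each sample horizontally by the noise value,
# scaled by an intensity akin to the node's intensity slider.
ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
intensity = 8.0  # maximum sideways offset in pixels
wiggled = map_coordinates(strands, [ys, xs + intensity * noise],
                          order=1, mode='wrap')
```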

I proceeded to connect this hair texture tile to a curve node for the transparency, a normal map node, an ambient occlusion node and a colour node. Here are the textures generated from Substance Designer. The roughness and specular maps were created by grading the ambient occlusion map in Photoshop.
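The Photoshop grade can be anything that suits your groom. As one plausible version, here is a NumPy sketch that derives roughness and specular maps from the AO; the file names and the exact curves are my own placeholders, not the article’s values:

```python
import numpy as np
from PIL import Image

# "hair_ao.png" is a placeholder name for the baked AO map.
ao = np.asarray(Image.open("hair_ao.png").convert("L"), dtype=np.float32) / 255.0

roughness = np.clip((1.0 - ao) ** 0.8, 0.0, 1.0)  # occluded areas read rougher
specular  = np.clip(0.25 + 0.5 * ao, 0.0, 1.0)    # keep spec in a narrow band

Image.fromarray((roughness * 255).astype(np.uint8)).save("hair_roughness.png")
Image.fromarray((specular * 255).astype(np.uint8)).save("hair_specular.png")
```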

  • Starting with a simple plane in Maya, with just one subdivision along its width and two along its length, I built a card per hair clump in the UV tile, as in the image shown below (a Maya sketch of this setup appears below). From this, I could simply duplicate a hair card and place it on the head, manipulating the vertices to follow the curvature of the scalp. I made sure to build the hair cards on one side only and mirror them at a later stage; finally, I fine-tuned the hair cards asymmetrically to produce a more natural result.
  • Once I was happy with the hair cards, they were imported into ZBrush and manipulated further using the Move brush, which makes it easy to shift more substantial chunks of hair. I also subdivided the hair cards for a smoother result.

Make sure to delete the lower subdivision levels so that Character Creator reads in this new subdivided hair. (It’s probably a good idea to duplicate the base and export it for safekeeping!)
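Here is the minimal Maya sketch promised above for the card-building stage; the names and transforms are placeholders, and in practice you would place each duplicate by hand:

```python
# Run from Maya's script editor. Names and transforms are placeholders.
import maya.cmds as cmds

# One subdivision across the width, two along the length, as described.
card = cmds.polyPlane(name="hairCard_01", width=1, height=4,
                      subdivisionsX=1, subdivisionsY=2, axis=(0, 0, 1))[0]

# Duplicate and hand-place copies over one side of the scalp ...
copy = cmds.duplicate(card, name="hairCard_02")[0]
cmds.move(0.5, 0.2, 0.0, copy)

# ... then mirror the finished side across X before the asymmetric pass.
grp = cmds.group(cmds.ls("hairCard_*", type="transform"), name="hairCards_L")
mirrored = cmds.duplicate(grp, name="hairCards_R")[0]
cmds.setAttr(mirrored + ".scaleX", -1)
```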

Before jumping back to Character Creator, we can adjust the hair card UVs in ZBrush to ensure our texture maps are read in correctly. With the hair subtool selected, navigate to UV Map and, under Adjust, select Flip V. Then press the All button to update the Character Creator scene.
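Flip V is just a sign change on the V coordinate, presumably needed because the two applications use opposite vertical texture conventions. Illustrated on a toy UV array:

```python
import numpy as np

# Three UV points along one edge of a card, as a toy example.
uvs = np.array([[0.1, 0.0],
                [0.1, 0.5],
                [0.1, 1.0]])
uvs[:, 1] = 1.0 - uvs[:, 1]  # Flip V: v becomes 1 - v
```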

  • Now in Character Creator, I navigated to the Material tab and set the material type to Digital Human Hair, then plugged the maps into their correct slots.

  • The last thing to note is the tangent map. The Reallusion site explains that red represents a vertical hair flow, whereas green represents a horizontal hair flow. We want a vertical hair flow, so I created a solid red map in Photoshop and imported it into the slot. If you play with the roughness strength, you’ll notice that the shine of the hair now moves correctly.
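You don’t even need Photoshop for a constant map; a two-line Pillow script produces the same solid red tangent map (the file name and 4K size are my assumptions):

```python
from PIL import Image

# Pure red = vertical hair flow, per Reallusion's tangent map convention.
Image.new("RGB", (4096, 4096), (255, 0, 0)).save("hair_tangent.png")
```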

Dynamic Texture Editing with Character Creator SkinGen

Before activating the SkinGen editor, I started by attaching the textures for the head material, which should have the Digital Human Head shader type applied. Character Creator allows a maximum texture size of 4K, so even if you import an 8K map, it will downsize the resolution.
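If you would rather control that resampling yourself, you can downsize an 8K map before importing it. A quick Pillow sketch, with placeholder file names:

```python
from PIL import Image

# Pre-downsample an 8K map to Character Creator's 4K ceiling with a
# high-quality filter instead of relying on the automatic downsize.
head_8k = Image.open("head_diffuse_8k.png")
head_8k.resize((4096, 4096), Image.LANCZOS).save("head_diffuse_4k.png")
```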

I proceeded by editing the eye, skin and teeth shaders in the modify section. In this section, I changed the subsurface scale and was able to alter the roughness in different regions of the face, which is incredibly efficient and artist-friendly. 

Below the Shader Settings, Character Creator also offers the artist the flexibility to apply custom maps and masks for various attributes, such as micro-normal detail (micro-normal maps are also provided by Texturing XYZ), subsurface weight and plenty more.

I created a custom sclera and iris map and replaced the default diffuse texture to get a closer match to the reference; it goes without saying how vital the eyes are in any portrait. You can also make use of CC’s various eyeball presets to get the look you want without any extra work.

The high-fidelity detail of the normal map won’t show through until we enter SkinGen mode. So before you activate it, use this time as an opportunity to ensure the roughness of the skin looks correct and that an appropriate micro-normal map is connected. Of course, once in SkinGen, you can return to make changes to the material if necessary.

SkinGen is an interactive editor: presets from the content gallery can be dragged and dropped into your scene, and their effects are displayed in real time. Presets range from skin detail to blemishes, blood spatter and more.

The option to insert masks is what makes this tool so dynamic. You can choose from Character Creator’s vast library of region masks to alter specific regions of the face and closely match the look you want. Below, you can see that I connected the Utility-R map (from the set of textures created earlier; it is intended to control the melanin of the skin) to the mask slot of a sunburn preset. Within this layer, I can control many attributes, such as the opacity of the layer itself or the colour of the sunburn.
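Conceptually, a masked layer like this is just a per-pixel blend. Here is a NumPy sketch of the idea, with placeholder file names, an invented tint and a straight lerp standing in for however SkinGen actually composites:

```python
import numpy as np
from PIL import Image

base = np.asarray(Image.open("skin_diffuse.png").convert("RGB"),
                  dtype=np.float32) / 255.0
mask = np.asarray(Image.open("utility_R.png").convert("L"),
                  dtype=np.float32) / 255.0

sunburn = np.array([0.72, 0.35, 0.30])   # layer colour (placeholder tint)
opacity = 0.4                            # the layer's opacity slider

w = (opacity * mask)[..., None]          # per-pixel blend weight
result = (1.0 - w) * base + w * sunburn  # tint shows where the mask is bright
Image.fromarray((result * 255).astype(np.uint8)).save("skin_sunburn.png")
```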

As you can see, I went a little overboard with the number of layers in my editor. I recommend playing around with the different presets and observing how they alter the look of the skin.

Under the Modify tab, switching from Skin to Makeup allows us to add a variety of eyebrow, eyelash and make-up presets. In the content editor, under Makeup, I chose a bushy male eyebrow and brought it into the scene. From there, I tweaked some of the settings: I reduced the effect of the normal map and increased the brightness and hue of the diffuse to give a slight grey-blonde tint to the hairs.

For artists looking for even more flexibility, be sure to check out the SkinGen Tools. You can drag and drop, say, a Tile layer into the scene, import your alpha and tile it across the character.
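Tiling itself is as simple as it sounds. This NumPy sketch repeats a small alpha across a 4K canvas, assuming a placeholder 512-pixel alpha that divides evenly into 4096:

```python
import numpy as np
from PIL import Image

alpha = np.asarray(Image.open("pore_alpha.png").convert("L"))  # e.g. 512x512
reps = (4096 // alpha.shape[0], 4096 // alpha.shape[1])
Image.fromarray(np.tile(alpha, reps)).save("tiled_alpha_4k.png")
```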

And that is it! 

I’m relatively new to the real-time space, but this project has allowed me to jump head-first into the development of a real-time character, and Character Creator’s diverse range of tools has eased the jump for me somewhat. SkinGen, in particular, is a powerful feature. I have also tested some of the maps baked out of Character Creator in an offline renderer (Arnold), and the results look great.

Animating in iClone for Unreal Engine

Here are some tips to get you animating your character via Reallusion’s variety of tools and plug-ins. 

Facial Expression Editing in Character Creator

Now, with our fully rigged character in Character Creator, we can begin by simply dragging and dropping different facial expressions into the scene via the expression content. It’s a really quick and easy way to apply different facial expressions and instantly bring your character to life.

For more control, you can also animate your character’s face using Edit Facial. Simply select a region of the face and drag your mouse around to activate different blendshapes and pose your character. 

Facial Puppeteering and Motion Capturing in iClone

In iClone, you can animate your character’s face using the Face Puppet tool: first select a suitable facial profile for your character, then select an emotion to have the character perform it.

Animating in Unreal Engine via iClone Unreal Live Link

Moreover, to bring your character into an engine, you can use the iClone Unreal Live Link. This incredible tool sends the character to Unreal Engine, seamlessly building all of the shaders, parameters and the skeleton setup. Additionally, the animation, cameras and lights are all synchronised in real time. The one-to-one translation of Ed Harris from Character Creator to Unreal Engine is pretty mind-blowing!

Reallusion offers another great plug-in called Motion LIVE, which lets you animate the character with body and facial motion-capture devices. The setup demonstrated in the video uses an iPhone, which tracks the face with a depth map and analyses subtle muscle movements to drive the character live. You can then use Unreal Live Link to carry your facial motion capture into the engine.

Finally, with Reallusion’s motion content packs, you can apply a full-body animation to your character, get them moving and watch on in pride. Ed is now alive!

I hope you guys have enjoyed this article and tutorial series and have found it a useful stepping stone into using Character Creator and Texturing XYZ to build some really awesome looking characters. Also, thanks to the Reallusion team for letting me be a part of this project! 

Feel free to follow my work: https://www.artstation.com/sefki_i

Download a free trial of Character Creator and iClone.

Part 1 - Part 2