Zbrush to Houdini - Texture and Normal Map Workflow

This is a quick set of notes/instructions explaining, for complete Houdini/ZBrush newbies (that would be me), how to move a model from ZBrush to Houdini and ensure that textures and normals make it along the way. This includes the creation of a very simple shader network that takes the normal and texture map files created by ZBrush and uses them in Houdini. Houdini will be used to create the UV coordinates and export them in an .obj file. Prior to this evening I wasn't sure how hard this was going to be, and I'm stunned at how easy it is (or perhaps I'm finally starting to grasp how the various parts of all of this are supposed to fit together).

Create the model in ZBrush. If you're using ZSpheres, make sure to subdivide up a number of levels *first*, then resume modeling.

After subdivision


The reason is so that you have at least one level of subdivision, which "tightens" up the initial subdivision level 1 geometry. Without this, things may appear a little weird when you export the model to Houdini (or Maya, Modo, et al.). I'll use the "Super Average Man" model supplied with ZBrush 3.1 for this. Don't forget to make the tool into a PolyMesh3D. Do your ZBrush editing/sculpting as you normally would.
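As a back-of-the-envelope check (my numbers, not anything ZBrush reports directly): for an all-quad mesh, each subdivision level splits every quad into four, which is why the sculpting detail lives at the higher levels while level 1 stays light enough to export cleanly.

```python
def faces_at_level(base_faces: int, level: int) -> int:
    """Each subdivision level splits every quad into four,
    so the face count grows by a factor of 4 per level.
    Level 1 is the base (unsubdivided) mesh."""
    return base_faces * 4 ** (level - 1)

print(faces_at_level(1000, 1))  # 1000
print(faces_at_level(1000, 4))  # 64000
```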

Bring the subdivision levels back down to level 1, and export the model as an OBJ file. The default export settings seem to be OK for this.

Exporting

Bring the model into Houdini. A simple way is to create a Geometry node in your scene, drop down a File SOP, and import the new .obj file you just created.
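For the curious, the .obj file the File SOP reads is just text: vertex positions (`v`), UV coordinates (`vt`), and faces (`f`) that index into those lists. A toy parser (my own sketch, not anything Houdini exposes) makes the format concrete:

```python
def parse_obj(text):
    """Minimal sketch of the OBJ records that matter here:
    positions (v), UVs (vt), and faces (f)."""
    verts, uvs, faces = [], [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            verts.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "vt":
            uvs.append(tuple(float(x) for x in parts[1:3]))
        elif parts[0] == "f":
            # each face corner may be "v", "v/vt", or "v/vt/vn";
            # the position index is always the first field
            faces.append([int(corner.split("/")[0]) for corner in parts[1:]])
    return verts, uvs, faces

sample = """v 0 0 0
v 1 0 0
v 0 1 0
f 1 2 3
"""
verts, uvs, faces = parse_obj(sample)
```

Note that the model as exported from ZBrush at this point has no `vt` lines yet — that's exactly what the next step adds.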

At this point we have the base model in Houdini. We need to create UV coords that we can then make use of in ZBrush for the Normal and Texture maps. One simple way to create some UV coords is to add a UV Unwrap SOP to your file node.

Switch to UV view and you can see what's happened: Houdini has unwrapped your geometry into the unit UV square. Make sure that the UV Unwrap node is toggled as the render/display node, and then export the geometry in OBJ format. This saves the geometry, which is unchanged, along with the UV coordinates added by the UV Unwrap SOP.
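A quick sanity check on the exported UVs (my own check, not a Houdini feature): after UV Unwrap, every `vt` coordinate in the .obj should fall inside the unit square.

```python
def uvs_in_unit_square(uvs):
    """True if every (u, v) pair lies inside the 0..1 UV square."""
    return all(0.0 <= u <= 1.0 and 0.0 <= v <= 1.0 for u, v in uvs)

# hypothetical vt lines pulled from the exported .obj
vt_lines = ["vt 0.25 0.75", "vt 0.5 0.125"]
uvs = [tuple(float(x) for x in line.split()[1:3]) for line in vt_lines]
print(uvs_in_unit_square(uvs))  # True
```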

Exporting with UVs

In ZBrush, import the newly saved obj file, making sure that you're at subdivision level 1.

Note that in the Tool palette's Texture options, the EnableUV button is disabled, meaning that ZBrush picked up the UV map in the new obj file.

At this point we can create the Normal map. Open up the ZMapper plugin. Make sure that the Object Space.nmap option is selected.

Click the Normal/Cavity Map tab (bottom far-left tab in the UI) and, with the default options, click the Create NormalMap button on the far right. This will take a few seconds while the normals are calculated. Once it's finished you can exit ZMapper. We now have a Normal map selected in the Texture area of our tool. Select the map and export it as a tif file.
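The tif stores each normal's XYZ components remapped into RGB, which is where the characteristic pale blue/purple tint comes from. A sketch of the standard encoding (the usual convention, not ZMapper's actual code):

```python
def encode_normal(n):
    """Remap a unit normal's components from [-1, 1] into [0, 255] RGB."""
    return tuple(round((c + 1.0) * 0.5 * 255) for c in n)

def decode_normal(rgb):
    """Inverse mapping: [0, 255] RGB back to [-1, 1] components."""
    return tuple(c / 255.0 * 2.0 - 1.0 for c in rgb)

print(encode_normal((0.0, 0.0, 1.0)))  # (128, 128, 255)
```

A normal pointing straight along +Z encodes to that (128, 128, 255) color, which is why flat areas of a normal map look uniformly blue.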

Let's verify that the normals are OK by creating a shader network and a shader to use. Create a SHOP Network in your geometry container.

Enter the SHOP Network node, and add a "VOP VEX Displacement SHOP" node.

This will use the normal map data to displace the geometry. Make the node a Material by selecting it and hitting Shift+C - this wraps your displacement node in a Material node and attaches it to a sub-output node.

Enter the displacement node, and enter the following network:

All that's happening is the creation of a UV parameter - a special parameter that is a vector type and has its node name and Parameter Name set to "uv". Case matters: "UV", "Uv", or "uV" won't work - it's got to be "uv". This will store the current UV coordinate from the geometry network as it's being evaluated for rendering. Make sure that it's set to invisible, as we do not want to promote it in our Material parent node.

This is pushed into a UV transform node. We need to flip the V coordinate to properly handle the data output by ZBrush. There is an option in ZMapper to do this as well, but it's worth noting what can be done in Houdini without altering the ZBrush data.

The transform output is converted from a vector to floats, which are sent to the "s" and "t" inputs of a Bump Map node - this is what creates the displacement. Create another parameter for specifying the Normal Map file name by middle-clicking on the "tmap" input of the Bump Map node.
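In plain terms, the transform-plus-split stage computes the following (a hypothetical helper showing the same math, not a Houdini call):

```python
def zbrush_uv_to_st(uv):
    """Flip V to match ZBrush's texture orientation, then split the
    vector into the scalar s and t values the Bump Map node's
    inputs expect."""
    u, v = uv
    s, t = u, 1.0 - v
    return s, t

print(zbrush_uv_to_st((0.25, 0.25)))  # (0.25, 0.75)
```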

Finally connect the "dispN" output of the Bump Map to the "N" input of the final output node ("output1").

Jump up one level to the Material node, right-click, and select "Promote Material Parameters".

This should make the texture map parameter visible here. Enter the tif normal map file you created earlier from ZBrush.

Back in the Geometry container network, add a Material node to your geometry, and select the material you just made in your SHOP network.

At this point you should be able to render with Mantra and see the effects of the normal map, despite the low res geometry.
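What the displacement is doing conceptually: each point is moved along its shading normal by an amount driven by the normal map, which is how low-res geometry picks up high-res surface detail at render time. A minimal sketch of that operation (my simplification of what the shader computes):

```python
def displace(P, N, amount):
    """Move a point P along its normal N by the given amount -
    the core of what a displacement shader does per shading point."""
    return tuple(p + amount * n for p, n in zip(P, N))

print(displace((1.0, 2.0, 3.0), (0.0, 0.0, 1.0), 0.5))  # (1.0, 2.0, 3.5)
```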

At this point I'll go back to ZBrush and poly-paint the model.

When that's done, go to the Tool palette's Texture sub-palette and click Col>Txr. This will create a texture map, based on the UV coords we created earlier, colorized according to what you've painted on the model/tool. Export this texture as a tif for use in Houdini.
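Conceptually, Col>Txr rasterizes the polypaint colors into the texture using each vertex's UV coordinate. A toy sketch of that idea (a crude point splat, not ZBrush's actual baking code):

```python
def bake_vertex_colors(uvs, colors, width, height):
    """Toy sketch of UV-based color baking: splat each vertex's
    polypaint color into the texel its UV coordinate maps to."""
    image = [[(0, 0, 0)] * width for _ in range(height)]
    for (u, v), color in zip(uvs, colors):
        x = min(int(u * width), width - 1)
        # image rows run top-down, so V is flipped
        y = min(int((1.0 - v) * height), height - 1)
        image[y][x] = color
    return image

# one red-painted vertex at UV (0, 1) lands in the top-left texel
img = bake_vertex_colors([(0.0, 1.0)], [(255, 0, 0)], 4, 4)
```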

Export

To use this in Houdini we need to modify our SHOP material. Go back to the material node and edit it. Add a new "VOP VEX Surface SHOP" node and attach it to suboutput1's "next" input.

Enter the surface node and create the following network.

We'll start with a setup similar to the previous normal displacement network. Add a uv parameter, flip its V component, and separate out the individual U and V components. These go into the "s" and "t" inputs of a texture node. The texture's output color is sent into the "diff" (diffuse color) input of a Lambert node (you could use something else; this is just a simple example). The "clr" output is then connected to the "Cf" input of the final output node. Render with Mantra and voila - a textured model!
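The Lambert node at the end of the chain is doing simple diffuse shading: the texture color scaled by how directly the surface faces the light. A sketch of that single-light case (my simplification; the real node handles multiple lights and more):

```python
def lambert(diffuse, normal, light_dir):
    """Single-light Lambert shading: Cf = diff * max(0, N . L).
    Both normal and light_dir are assumed to be unit vectors."""
    ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * ndotl for c in diffuse)

# surface facing the light: full texture color comes through
print(lambert((1.0, 0.5, 0.25), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))
```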

Credits:
The normal map stuff was gleaned from this post at odforce for starters. I found a tutorial on ZMapper elsewhere. The bits about initial subdivision in ZB I picked up from the 3D Buzz ADP tutorials.

More Stories By Jim Crafton

Jim Crafton is a software developer currently doing a variety of work in C++, C#, and Java. He is the author of the Visual Component Framework (more at http://vcf-online.org/), an advanced C++ application framework. He's also interested in graphics, particularly 3D graphics using tools like Houdini and ZBrush.
