Spatial.io Avatar Workflow
For your viewing pleasure: my garment development and Spatial.io upload process, starting with MakeHuman, moving to Blender, transferring to CLO3D, back to Blender, and finally rendering and uploading to Spatial.io.
I made this document as notes to myself, to record the tools I was using and reflect on the different methods; it is a working document that I update periodically and am happy to answer queries about.
While its focus on Spatial.io means there is an intrinsic link between this process and Unity, this workflow does not include the use of Unity, as I wanted to see how simple the experience was without it. The Unity element of Spatial.io is fairly similar to creating 3D games, though I have found some elements, such as the Rigidbody, work slightly differently. It also does not reflect on the integrations with Unreal, though again this is an interesting area that I am doing other research on as part of my PhD.
1. Creating the body
A key first step is creating the avatar, although you can use one of the pre-made ones available in CLO. I created the avatar body in MakeHuman, although you can also use DAZ, add the MakeHuman plugin to Blender, or make one from scratch yourself in Blender or any other 3D software.
2. Editing the model in Blender to replace the head
3. Adding the avatar to CLO
The best way is to add the avatar via Convert to Avatar, which can be found under Avatar (rather than File) in the top menu. This allows you to apply CLO3D accessories, motions, and hair to your imported avatars.
If an avatar is not in a T or A pose it is usually not possible for CLO to add automatic placement points, so you can instead just add the avatar under “File”; for this method you will need to ensure you have a rigged avatar before you import it. This second method, and not having a T or A pose, is not recommended and can pose a challenge for 3D garments; it may be limiting in terms of design, but for the purposes of this design it wasn’t important.
Once saved, the avatar can be added to future project scenes. Converted avatars are usually put in a folder called “Converted” automatically unless you manually ask the programme to save them elsewhere.
4. Garment creation in CLO3D
Dress the avatar in CLO and export it as an FBX file.
5. Into Blender
Import the FBX into Blender, select the garment and go into Edit Mode. In Edit Mode press A to select all of the garment, then press F3 and search for Merge by Distance to remove unwanted vertices.
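If you prefer to script this clean-up, below is a minimal Blender Python sketch of the same step (the object name “Garment” and the merge threshold are example values, not part of the original workflow):

    import bpy

    # Make the imported garment the active, selected object ("Garment" is an example name).
    garment = bpy.data.objects["Garment"]
    bpy.context.view_layer.objects.active = garment
    garment.select_set(True)

    # "Merge by Distance" is the remove_doubles operator; the threshold is an example value.
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.mesh.remove_doubles(threshold=0.0001)
    bpy.ops.object.mode_set(mode='OBJECT')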
5.1 Weight painting and armature
From Object Mode, select the body first, hold Shift and then select the garment, then go into Weight Paint mode.
From here go to the Weights menu and select Transfer Weights; this is the first step in allowing the garment to move in sync with the body.
When the operator panel appears at the bottom there are two settings to alter: ensure Vertex Mapping says ‘Nearest Face Interpolated’ and Source Layers Selection says ‘By Name’.
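The same weight transfer can be run from Blender’s Python console. This is only a sketch, assuming the body and garment objects are named “Body” and “Garment”; the data_transfer operator’s parameter values can differ slightly between Blender versions, so treat it as a starting point rather than a definitive script:

    import bpy

    body = bpy.data.objects["Body"]        # example names
    garment = bpy.data.objects["Garment"]

    # Mirror the manual selection: body selected, garment selected last (active).
    bpy.ops.object.select_all(action='DESELECT')
    body.select_set(True)
    garment.select_set(True)
    bpy.context.view_layer.objects.active = garment

    # Transfer Weights with "Nearest Face Interpolated" mapping, matching layers "By Name".
    bpy.ops.object.data_transfer(
        use_reverse_transfer=True,          # pull weights from the selected body into the active garment
        data_type='VGROUP_WEIGHTS',
        use_create=True,
        vert_mapping='POLYINTERP_NEAREST',  # "Nearest Face Interpolated"
        layers_select_src='NAME',           # "By Name"
        layers_select_dst='ALL',
        mix_mode='REPLACE',
    )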
Select the garment first, hold Shift and select the armature, then press Ctrl + P and from the Set Parent To menu select ‘With Empty Groups’.
*If this doesn’t work, remove the parent by pressing Alt + P, then in Edit Mode press A to select all of the garment, press F3 and search for Merge by Distance to remove unwanted vertices, and then back in Object Mode parent the garment to the armature again.
** To reset the armature to its original position, clear the rotation of all bones by pressing A twice to select them all, then Alt + R to clear the rotation.
Move the bones if required.
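The parenting and reset steps can also be scripted. A small sketch, assuming the garment and the imported armature are named “Garment” and “Armature”:

    import bpy

    garment = bpy.data.objects["Garment"]      # example names
    armature = bpy.data.objects["Armature"]

    # Garment selected first, armature active, then Ctrl + P > With Empty Groups.
    bpy.ops.object.select_all(action='DESELECT')
    garment.select_set(True)
    armature.select_set(True)
    bpy.context.view_layer.objects.active = armature
    bpy.ops.object.parent_set(type='ARMATURE_NAME')   # "Armature Deform With Empty Groups"

    # Reset the armature: select all pose bones and clear their rotation (Alt + R).
    bpy.ops.object.mode_set(mode='POSE')
    bpy.ops.pose.select_all(action='SELECT')
    bpy.ops.pose.rot_clear()
    bpy.ops.object.mode_set(mode='OBJECT')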
5.2 Adding materials and altering values
Add materials, textures and normal maps using Blender nodes; these will have to be baked in order to export to other programmes. To bake, change the render engine to Cycles and ensure the device is set to CPU.
Select the object, then in the node editor select the image that the texture is being baked to; this node should be disconnected from everything at this point.
Pick the type of bake based on what element or type of material you are baking.
Bake a diffuse map for the colour, change the type to Normal to bake a normal map, and to Roughness for the texture map. If you have emission on your model, bake an emission map too, as it may not show correctly without one. This process will take time depending on the quality and complexity of the image. Finally, join the baked image textures in place of the original Blender nodes.
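As an illustration of the bake set-up described above, here is a rough Blender Python sketch for baking a single diffuse map (the object name, image name and resolution are assumptions; repeat with type='NORMAL', 'ROUGHNESS' or 'EMIT' and a fresh image for the other maps):

    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'
    scene.cycles.device = 'CPU'

    obj = bpy.data.objects["Garment"]              # example object name
    bpy.context.view_layer.objects.active = obj
    obj.select_set(True)

    # Add a disconnected Image Texture node to hold the bake and make it the active node.
    nodes = obj.active_material.node_tree.nodes
    bake_node = nodes.new('ShaderNodeTexImage')
    bake_image = bpy.data.images.new("garment_diffuse_bake", width=2048, height=2048)
    bake_node.image = bake_image
    nodes.active = bake_node

    # Bake only the diffuse colour, then save the result next to the .blend file.
    bpy.ops.object.bake(type='DIFFUSE', pass_filter={'COLOR'}, use_clear=True)
    bake_image.filepath_raw = "//garment_diffuse_bake.png"
    bake_image.file_format = 'PNG'
    bake_image.save()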
5.3 Rendering the image
Press Numpad 0 to see the camera view.
Position the camera.
Position the lighting.
To make the film/background/world transparent, go to the Render Properties panel, scroll down and find the “Film” tab. Expand this and there is a box labelled “Transparent”; checking this box will render the background as transparent.
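These settings can also be toggled from Python; a minimal sketch (the output path is an example):

    import bpy

    scene = bpy.context.scene

    # Render Properties > Film > Transparent: render the world background as alpha.
    scene.render.film_transparent = True

    # Save the render as a PNG with an alpha channel, ready to upload as a transparent image.
    scene.render.image_settings.file_format = 'PNG'
    scene.render.image_settings.color_mode = 'RGBA'
    scene.render.filepath = "//garment_render.png"
    bpy.ops.render.render(write_still=True)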
5.4.1 Exporting to FBX
With the baked texture images applied in place of the material nodes, go to File > Export and choose FBX, then pick a place to save the model.
Compress the files along with the textures into a zipped file. This file is what you will upload to Spatial.io to add to your space.
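If you want to automate the export and packaging, a rough sketch is below (the file names, paths and texture naming pattern are assumptions):

    import bpy
    import zipfile
    from pathlib import Path

    export_dir = Path(bpy.path.abspath("//"))     # the .blend file's folder, as an example
    fbx_path = export_dir / "garment.fbx"

    # Export the currently selected objects; COPY mode keeps the texture files with the model.
    bpy.ops.export_scene.fbx(
        filepath=str(fbx_path),
        use_selection=True,
        path_mode='COPY',
    )

    # Zip the FBX together with the baked texture images for upload to Spatial.io.
    with zipfile.ZipFile(export_dir / "garment_upload.zip", "w") as zf:
        zf.write(fbx_path, fbx_path.name)
        for tex in export_dir.glob("*_bake.png"): # example naming from the bake step
            zf.write(tex, tex.name)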
6.1 Renders and model upload
For this show I decided I wanted to render 2D images, in line with the vaporwave retro theme, instead of exporting the models.
Once uploaded as transparent PNGs it was easy to resize and move items into the desired positions and lock them in place. The images display on both sides, so they do not need to be placed to hide the back.
Alternatively, because the textures are baked into UV maps, you can zip the exported FBX model and the texture images together into one file and then upload the zip file instead of the 2D images.
I used this technique for the 3D models in the distance of the exhibition, but found that it did not work well for every garment, which is why I used the 2D renders instead.
6.2 Inside Spatial.io notes
Some image angles were not included, or designs were altered, to suit Spatial’s rules on appropriate dress, which prohibit nipples, genitalia and bare posteriors.
Having tried the Spatial.io web-based gallery upload system, I found it very accessible, but would rather use the Unity version in future for the customisation options, 3D support and the ability to add a variety of code and gamification.
The exhibition is live on Spatial if you would like to explore.