AmesyExport Roadmap

After adding and planning more functionality to AmesyExport, I realized that it’s likely to become one of those projects that suffers from feature creep and never really gets finished. To prevent that, I’ve planned out a roadmap from where I am to a version 1, at which point I’ll be happy to call it finished.

V0.1 = Export Functionality

V0.2 = Multi-File Export Functionality

V0.3 = Multi-Object, Multi-File Export Functionality

V0.4 = Dummy Object Insertion

V0.5 = Optional Z-Up Export

V0.6 = Support for Skinned Meshes

V0.7 = Dummy Bone Insertion

V0.8 = Keyframe Baking and Animation Export

V0.9 = Automatic Rigid Skinning

V1.0 = UI Overhaul and Release Video

Can’t wait to get this done! A lot of this is new territory, so it should be a great learning experience.



Multi-File, Multi-Object Export Questions

Now that I have multi-file export working, I’d like to add support for exporting multiple objects in one fbx, as not being able to do this really limits the tool.

This seems like quite a difficult problem, however, as the multi-file function takes each object as an entry in a list, and I’d rather not combine or group objects until export time.

The three things I need to work out are:

  • How to store multiple objects as a single entry in a list – possibly using tuples?
  • How to group objects at export time only
  • Making sure I undo the grouping after export – I should be able to add this to the undo chunk?
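A rough plain-Python sketch of how the list could work (no Max/Maya calls here, and all object names are made up), with each entry stored as a tuple so that single and grouped exports share one structure:

```python
# Hypothetical sketch: every entry in the export list is a tuple of object
# names. A one-object tuple behaves like the current multi-file export;
# a multi-object tuple would be grouped at export time, written out as a
# single fbx, then ungrouped again as part of the undo chunk.
export_list = [
    ("crate_low",),                # single object -> its own fbx
    ("door_frame", "door_panel"),  # grouped at export time -> one fbx
]

def output_names(entries):
    """Name each fbx after the first object in its entry (an assumption)."""
    return [objects[0] + ".fbx" for objects in entries]

output_names(export_list)   # -> ["crate_low.fbx", "door_frame.fbx"]
```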

I also need to work out how I’d like meshes to combine – should it be driven by the scene setup, or by material ID?

Things to think on!

AmesyExport V0.2

Had today off from work for the Easter weekend, so decided to spend it productively and add a multi-export function to my exporter.

Rather than selecting objects and running the export tool, the user now adds objects to a list, and the tool outputs each object as a separate fbx with Unreal settings, triangulated and set to the origin.


The next step is to add support for multiple objects within one file.

You can download it here:


Telecomms Box Asset

Been practicing PBR in Photoshop some more. I wanted to do something in a realistic style, so created this asset from a photo I took near my flat.


Had a go with Marmoset Toolbag for rendering. I don’t think I’ve quite got the hang of it yet, but it seems like a great piece of software.


How Do Normal Maps Work?

As I’ve been practicing normal mapping recently, I wanted to find out a little more about how normal maps for game assets work at a lower level and how they interact with shaders.

Surface Normal Direction

In a standard RGB normal map, each channel corresponds to a surface normal direction.

R = X Surface Normal, G = Y Surface Normal, B = Z Surface Normal

Tangent Space

Tangent space is used to specify coordinates across a poly face. The first two axes are mapped in a similar way to UV coordinates, while the third axis represents the normal of the face, aka the direction that the face “sticks out” in.

X = Tangent (U), Y = Bitangent (V) , Z = Face Normal (N)

R = U, G = V, B = N

The tangent and bitangent axes point in the directions that the U and V values increase in across the face.
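To make those axes concrete, here is a minimal plain-Python sketch (not engine code; the function name is my own) of how the T, B and N axes take a tangent-space normal into world space – each world component is just a weighted sum of the three axes:

```python
def to_world(tangent, bitangent, normal, ts_normal):
    """Transform a tangent-space normal into world space using the
    face's tangent (T), bitangent (B) and normal (N) axes."""
    x, y, z = ts_normal
    return tuple(x * t + y * b + z * n
                 for t, b, n in zip(tangent, bitangent, normal))

# A face whose tangent frame happens to line up with the world axes:
T, B, N = (1, 0, 0), (0, 1, 0), (0, 0, 1)
to_world(T, B, N, (0, 0, 1))   # a flat map pixel points straight along N
```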


World Space

World space explicitly states the normals of an object in relation to the world. Regardless of object orientation, world-space normals will always face the same direction in the world.

Tangent Basis

Light rays are calculated in world space, whereas the normals in the map are stored in tangent space. So I guess we have a problem here! This is where the tangent basis comes in. The tangent, bitangent and normal axes together form a matrix that is used to convert between world space and tangent space.

Using the tangent basis, incoming light rays are compared against the normal directions in the map, which then determines how each pixel is lit.

There are many different ways to calculate the tangent basis. It is important to match these between baking and shading applications in order to get correct results.

Some engines use the tangent basis the other way around, converting the tangent-space normal from the map into world space within the shader, before comparing it against light rays.
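Whichever space the comparison happens in, once the normal and the light direction are in the same space the “compare” step boils down to a dot product. A simple Lambert-style sketch (plain Python, not any particular engine’s shader):

```python
def dot(a, b):
    """Dot product of two 3D vectors."""
    return sum(x * y for x, y in zip(a, b))

def diffuse(normal, to_light):
    """Lambert diffuse term: 1.0 facing the light, clamped to 0.0 facing away."""
    return max(0.0, dot(normal, to_light))

diffuse((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))    # facing the light -> 1.0
diffuse((0.0, 0.0, 1.0), (0.0, 0.0, -1.0))   # facing away -> 0.0
```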



Unpacking Normals

When saving a normal map out from a baking or painting application, it is limited to a positive range per channel, e.g. 0–255. When this is used in a shader, it is read as 0–1. However, being directional coordinates, normals need to use negative values too, so we “unpack” the map, remapping each channel into the -1 to 1 range and recovering the full signed normal vector.
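The remapping itself is tiny – a plain-Python sketch, assuming 8-bit channels:

```python
def unpack(c):
    """Remap an 8-bit channel value (0-255) into a signed component (-1 to 1)."""
    return c / 255.0 * 2.0 - 1.0

unpack(0)     # -> -1.0, the fully negative direction
unpack(255)   # ->  1.0, the fully positive direction
unpack(128)   # ->  roughly 0.0, a component pointing straight out
```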

Why Are Normal Maps Blue?

The (0, 0, 1) colour created on bake represents a flat surface in a normal map, with the full blue value representing a normal pointing straight out of the surface. When this is remapped into 0–255 to become a texture map, it becomes (128, 128, 255), giving us a purpley blue colour.
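The arithmetic behind that colour is just the reverse of the unpacking step (a small plain-Python sketch):

```python
def pack(n):
    """Remap a signed normal component (-1 to 1) into an 8-bit channel value."""
    return round((n + 1.0) / 2.0 * 255.0)

[pack(c) for c in (0.0, 0.0, 1.0)]   # -> [128, 128, 255], the familiar blue
```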