Without custom content created by artists, games would be far less enjoyable to look at, and players would find it more difficult to become immersed in the world the game is trying to portray. Using both external applications and the tools provided with Unreal Engine 3, artists have complete freedom to create content that brings the game environment to life.
Unreal Engine 3 provides a pipeline for importing geometry, including static meshes and skeletal meshes, animations, and morph targets, as well as other elements such as cameras and scenes through Matinee, using Autodesk's FBX file format. Because the pipeline is built on FBX, content can be created in virtually any 3D application that supports exporting FBX files, instead of relying on proprietary exporters or formats. The FBX pipeline provides a unified workflow for importing geometry and textures, as well as animations and morph targets in the case of Skeletal Meshes, all at once through a single interface. The process is simple:
- Click the Import button in the Content Browser.
- Navigate to and select the FBX file containing the content to import.
- In the Import dialog that appears, choose the appropriate settings and click the button to confirm. (See FBX Import Properties for complete details on the available options.)
- The import process will begin, displaying the progress:
Static meshes are non-animated render meshes, or collections of related triangles, created in an external modeling application, such as 3dsMax or Maya, and then imported into UnrealEd through the Content Browser. These meshes can be used for several different types of Actors placed in a level. Most commonly, a static mesh will be the visual representation of a StaticMeshActor to create world geometry and decorations, an InterpActor to create moving objects, or a KActor to create rigid body physics objects. In practice, any Actor that has a StaticMeshComponent can use a static mesh. For a guide on creating and importing static meshes, see the FBX Static Mesh Pipeline page.

Static Mesh Editor
Once imported, static meshes can be viewed, and certain aspects modified, in the Static Mesh Editor. This is where global properties of the mesh can be modified, LOD meshes can be imported, UV sets can be managed, and simplified collision geometry can be added or removed.

Collision
Static meshes can have collision calculated against the actual triangles of the render mesh if desired, but generally they use separate, simplified collision geometry to reduce the complexity of the collision calculation and improve performance. In the image below, the red and green wireframes show the geometry used for collisions. It is much simpler than the shaded render mesh, but defines the general shape, which is good enough for collision calculations in most situations. Simplified collision geometry can be created in an external modeling application and imported along with the render mesh, allowing for completely customized collision geometry. The Static Mesh Editor also contains tools for adding simplified collision geometry to imported meshes. These tools are less flexible, but can work well in certain circumstances. An overview of collision for static meshes can be found in the Collision Reference.
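As a toy illustration of why simplified collision is cheaper, the sketch below (hypothetical Python, not engine code) tests a point against a single bounding box derived from a mesh's vertices instead of against every render triangle:

```python
# Hypothetical sketch (not engine API): a simplified collision shape is
# one cheap test instead of many per-triangle tests.

def make_aabb(verts):
    """Build an axis-aligned bounding box from a mesh's vertices."""
    xs, ys, zs = zip(*verts)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def point_in_aabb(point, aabb):
    """One containment check, regardless of how many triangles the render mesh has."""
    lo, hi = aabb
    return all(l <= c <= h for c, l, h in zip(point, lo, hi))

verts = [(0, 0, 0), (2, 0, 0), (0, 2, 0), (0, 0, 2)]  # a small example mesh
box = make_aabb(verts)
```

The imported collision hulls in the editor play the same role: a coarse shape standing in for the full render geometry.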
Level of Detail (LOD)
A level of detail system is built in, allowing each StaticMesh asset to render different meshes depending on the size of the mesh on the screen. This can be a great optimization: you can have meshes of varying complexity, each with their own material(s), rendering a detailed mesh up close and switching to meshes with lower detail as the camera gets farther away and intricate details become unnecessary. LOD meshes for a StaticMesh asset can be imported and set up within the Static Mesh Editor.
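The screen-size-based switching described above can be sketched as follows (hypothetical code; the names and thresholds are illustrative, not engine API):

```python
# Hypothetical LOD selection sketch: each entry pairs a mesh with the
# minimum fraction of the screen it should occupy before the next,
# simpler mesh is used instead.

def select_lod(lods, screen_size):
    """lods: (name, min_screen_size) pairs, sorted from most to least
    detailed; screen_size: projected size of the mesh in [0, 1]."""
    for name, min_size in lods:
        if screen_size >= min_size:
            return name
    return lods[-1][0]  # fall back to the simplest mesh

lods = [("LOD0", 0.5), ("LOD1", 0.2), ("LOD2", 0.0)]
```

Nearby meshes (large on screen) get the detailed LOD0; distant ones fall through to cheaper meshes.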
Skeletal meshes are meshes weighted to a skeleton, or hierarchy of bones or joints, created in an external modeling/animation application and then imported into UnrealEd through the Content Browser. These meshes can be animated; or rather, animation can be applied to their skeletons to animate them. They can also have morph targets applied, and their bones can be controlled individually by Skeletal Controllers. They can also have special locators, called Sockets, attached to and/or offset from bones in the skeleton, to which items can be attached. Skeletal Meshes are typically used for characters, weapons, vehicles, and any other items that require complex animations beyond simple translation and rotation. For a guide on creating and importing skeletal meshes, see the FBX Skeletal Mesh Pipeline page.

Physics Assets
Physics Assets are objects that contain the physical setup for a Skeletal Mesh, used for calculating its physics simulation and collision. They are essentially a collection of rigid bodies joined together by constraints to mimic the shape and desired movement abilities of the Skeletal Mesh. Physics Assets are created and modified in the PhAT physics editor, a tool for graphically editing the bodies and constraints that constitute the Physics Asset. For a guide to using the PhAT editor, see the PhAT User Guide.

Animations
Animations, or animation sequences, are collections of keyframe data specifying the translation and rotation of a single bone at a specific time. Each animation sequence contains all the necessary keyframes for all of the bones in the skeleton for the particular animation. Unreal Engine 3 uses these animation sequences to control the animation of Skeletal Meshes in the game. A collection of related animation sequences can be grouped into a container called an AnimSet.
Usually, all of the animations for a particular Skeletal Mesh, or group of Skeletal Meshes sharing a common skeletal setup, will be in a single AnimSet. Animations can be previewed and modified in certain ways within the AnimSet Editor in UnrealEd. For a complete overview of viewing and modifying animations, see the AnimSet Editor User Guide. For a more detailed overview of the animation system within Unreal Engine 3, visit the Animation Overview page. For a guide to importing animations into Unreal Engine 3, see the FBX Animation Pipeline.

Animation Trees
Unreal Engine 3 uses the idea of 'blend trees', called AnimTrees, to blend together multiple sources of animation data. This method allows you to cleanly organize the way in which a number of animations are blended together, and lets you easily add more animations in predictable ways as you progress. AnimTrees are typically used for players and other characters to control and blend between the different movement animations. They also provide the ability to use Skeletal Controllers and Morph Targets, which are detailed below. Vehicles also make use of AnimTrees for any animations that may be needed, and for visible localized damage through the use of morph targets and direct control of bones through skeletal controllers. For information on creating and editing AnimTrees, see the AnimTree Editor User Guide.

Skeletal Controllers
Skeletal Controllers, or SkelControls, allow you to modify a bone or set of bones in a skeletal mesh programmatically. For instance, there are skeletal controllers for transforming a single bone arbitrarily, forcing one bone to look at another bone or location, setting up an IK limb solver, and so on. SkelControls are set up in the AnimTree Editor and can be connected together to form chains, each affecting a single bone, where each SkelControl is applied to the result of the previous.
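The chaining rule, in which each controller receives the result of the previous one, can be sketched like this (hypothetical code; the controller names are illustrative, not engine API):

```python
# Hypothetical SkelControl chain sketch: each controller takes the bone's
# current transform and returns a modified one; controllers in a chain
# are applied in order, each to the output of the one before it.

def apply_skel_control_chain(bone_transform, chain):
    for control in chain:
        bone_transform = control(bone_transform)
    return bone_transform

# Example "controllers" acting on an (x, y, z) bone position:
def offset_up(pos):        # e.g. an arbitrary bone-translation controller
    x, y, z = pos
    return (x, y, z + 10.0)

def clamp_height(pos):     # e.g. a limiting controller applied afterward
    x, y, z = pos
    return (x, y, min(z, 5.0))
```

Because ordering matters (the clamp sees the already-offset position), the chain layout in the AnimTree Editor determines the final bone transform.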
For an overview of Skeletal Controllers and using them to manipulate Skeletal Meshes, see the Using Skeletal Controllers page.

Morph Targets
Morph Targets are a way to modify a Skeletal Mesh in real time with finer control than bone-based skeletal animation. A static Morph Target is a version of an existing skeletal mesh with the same underlying geometry in terms of vertices and faces, but with the vertices positioned differently. For example, you might create a 'smiling' version of a character in your 3D modeling program and import it as a 'smile' morph target. In the game, you can then apply this morph target to modify the vertices on the face and make the character smile, with a great deal of control over how each vertex moves. Morph targets are also commonly used for creating the appearance of damage on vehicles or other objects. For more information on Morph Targets and their use, visit the Morph Targets page.

Sockets
Sockets are essentially named locators that can be attached to, and offset from, a bone of a Skeletal Mesh. They provide a simple means of attaching other objects, such as particle effects or other meshes, to Skeletal Meshes at specific locations; sockets can be set up by artists in the AnimSet Editor and referenced by programmers in code. A detailed overview of sockets can be found on the SkeletalMesh Sockets page.
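The bone-plus-offset idea behind sockets can be sketched as follows (hypothetical code; real sockets also carry a rotation, which this translation-only sketch omits):

```python
# Hypothetical socket sketch (illustrative names, not engine API):
# a socket stores an offset relative to a bone, so an attached object's
# position is the bone's position plus that offset.

def socket_position(bone_pos, socket_offset):
    return tuple(b + o for b, o in zip(bone_pos, socket_offset))

# e.g. a hypothetical 'WeaponPoint' socket offset from a hand bone
hand_bone = (10.0, 2.0, 50.0)
weapon_socket_offset = (0.0, 1.0, 0.0)
```

As the bone animates, the socket (and anything attached to it) follows automatically, since its position is always derived from the bone's.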
The Materials system in Unreal Engine 3 is an extremely flexible system based on creating materials by linking nodes, called expressions, together to form networks and effects. Each expression evaluates down to a piece of shader code that performs a specific function. Expressions can be chained together, with the output of one used as the input of others. The final result of each expression chain plugs into one of several inputs on the base Material node, such as Diffuse, Emissive, Specular, Normal, and Opacity.
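The way outputs feed inputs can be sketched as a small node graph (hypothetical code; real expressions compile down to shader code rather than executing on the CPU like this):

```python
# Hypothetical expression-chain sketch: each node computes a value from
# the outputs of the nodes wired into it.

def constant(value):
    return lambda: value

def multiply(a, b):
    return lambda: a() * b()

def add(a, b):
    return lambda: a() + b()

# A small chain: (0.5 * 2.0) + 1.0, whose final result would plug into
# a Material input such as Diffuse.
chain = add(multiply(constant(0.5), constant(2.0)), constant(1.0))
```

Evaluating the root of the chain pulls values through the whole network, mirroring how the material compiler stitches each expression's shader snippet into the final shader.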
A Particle System is a content asset that consists of any number of emitters used to create effects such as fire, explosions, energy beams, etc. Emitters are objects that create, or "emit", particles. Each emitter emits a single type of particle, or TypeData, all sharing the same underlying rules for appearance and behavior specified by modules. Particle systems are created through the Content Browser and constructed using Cascade, the particle editor.

Sprite
By default, each particle emitted by an emitter is rendered as a camera-facing quad, or sprite, with the emitter's material applied. This is the default type of all emitters, so there is no need (nor is it possible) to manually apply this type to an emitter in Cascade. You can see the underlying geometry of each sprite in the image above; the image also shows that no matter how the camera is positioned, the quads always face toward it. Sprite emitters are often used to create volumetric effects, such as smoke, fire, and dust, as well as sparks, weather effects (rain, snow), flares, and many more.

Mesh
Emitters using the mesh data type emit particles that render a StaticMesh instead of the default sprite. Mesh emitters are commonly used to create debris from explosions or destruction, such as rock chunks, vehicle parts, or splintered pieces of wood, in addition to other effects that require the three-dimensionality provided by the use of StaticMeshes.

AnimTrail
The AnimTrail data type causes multiple particles, with quads extending between them connected end-to-end, to be emitted to form a trail in response to the animation of a SkeletalMesh. These can be used to enhance the appearance of motion due to animations, such as a streak from a slashing sword. See the AnimTrails page for more information on creating and using AnimTrail emitters.
Beam
Beam emitters emit multiple particles, with quads extending between them connected end-to-end, between a source and a target location to form a beam. These emitters can be used to create lasers or other beam-type effects.

Ribbon
The ribbon data type emits particles where each particle is connected to the previous one, with quads extending between them connected end-to-end, to form a ribbon-type effect. Ribbon emitters can be used to create more generic or arbitrary trails (as opposed to the animation-driven AnimTrails mentioned previously).

PhysX
The PhysX data types emit either sprites or meshes in a similar manner to the standard Sprite or Mesh data types, but the particles are controlled by a fluid physics simulation.
Modules define the appearance and behavior of the particles an emitter creates, controlling properties such as:
- How long each particle lives
- Where particles are spawned from
- Direction and speed particles move
- How large the particles are
- The color of the particles
- Direction and speed particles rotate
- Whether particles collide with geometry
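A simulation update built from behaviors like those above can be sketched as follows (hypothetical code; module names and the particle representation are illustrative, not engine API):

```python
# Hypothetical per-particle update sketch: each module-like rule
# contributes one aspect of behavior (lifetime, velocity, ...).

def update_particles(particles, dt):
    """Advance each particle by dt and drop those whose lifetime expired."""
    alive = []
    for p in particles:
        p["age"] += dt
        if p["age"] >= p["lifetime"]:
            continue  # lifetime module: the particle dies
        # velocity module: move the particle along its velocity
        p["pos"] = tuple(c + v * dt for c, v in zip(p["pos"], p["vel"]))
        alive.append(p)
    return alive

particles = [
    {"age": 0.0, "lifetime": 1.0, "pos": (0.0, 0.0, 0.0), "vel": (1.0, 0.0, 0.0)},
    {"age": 0.9, "lifetime": 1.0, "pos": (5.0, 0.0, 0.0), "vel": (0.0, 0.0, 0.0)},
]
```

Each module in Cascade independently adjusts one such property, which is why stacking modules composes behavior without rewriting the emitter.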
Post Process Effects are used in Unreal Engine 3 to apply full-screen effects, such as bloom, depth of field, motion blur, etc., to the rendered scene. These effects are created by chaining together different modules, each of which performs a specific effect (though some modules combine multiple effects for performance efficiency). Post process effects are created in the Content Browser and edited in the Post Process Editor, a node-based editor that allows the various available effect modules to be added and arranged. For additional information on specific types of effects available in Unreal Engine 3, see the following pages:

Bloom
Bloom is a type of glow effect that occurs when looking at very bright objects against a much darker background.
[Comparison images: Without Bloom / With Bloom]
[Comparison images: Without Blur Effect / With Blur Effect (BlurKernel = 2)]
[Comparison images: Without DoF Effect / With DoF Effect]
[Comparison images: Scene Depth Visualized / Distance-Based Desaturation / Sepia Tone Scene Tinting]
[Comparison images: Without Motion Blur / With Motion Blur (and quick camera movement)]
[Comparison images: Without Ambient Occlusion / With Ambient Occlusion / Ambient Occlusion Only]
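A minimal 1-D grayscale sketch of a bloom-style pass (hypothetical code, not the engine's implementation): pixels above a brightness threshold are extracted, blurred, and added back so bright areas appear to glow.

```python
# Hypothetical bloom sketch on a 1-D row of grayscale pixels.

def bloom(pixels, threshold=0.8):
    # 1. bright pass: keep only the brightness above the threshold
    bright = [max(p - threshold, 0.0) for p in pixels]
    # 2. blur the bright pass (simple 3-tap box blur)
    blurred = []
    for i in range(len(bright)):
        window = bright[max(i - 1, 0):i + 2]
        blurred.append(sum(window) / len(window))
    # 3. add the blurred glow back onto the original scene
    return [p + b for p, b in zip(pixels, blurred)]

scene = [0.1, 0.1, 1.0, 0.1, 0.1]  # one very bright pixel
```

After the pass, the bright pixel's glow bleeds into its neighbors, which is the visual signature of bloom.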
Audio playback in UE3 is handled by a collection of sound actors, through Kismet, or through code. For an overview of the audio system in Unreal Engine 3, see the Audio System page.

Sound Waves
Sound waves are the base assets that contain the actual sound wave data to be played. These are imported through the Content Browser.

SoundCues
The behavior of the sounds being played is defined within SoundCues. SoundCues are objects that allow nodes to be chained together, each performing some operation on a sound wave or defining some aspect of the sound's behavior. Most of the engine relies on SoundCues to play sounds instead of the basic sound waves. For more information on creating and editing SoundCues, see the SoundCue Editor User Guide.

Sound Actors
Sound Actors can be placed into levels to create environmental and ambient sounds. There are several different sound actors provided with Unreal Engine 3. Some act on sound waves directly, while others require SoundCues.
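The node-chaining idea behind SoundCues can be sketched like this (hypothetical code; the node names are illustrative, not engine API):

```python
# Hypothetical SoundCue-style node chain: each node wraps another node
# and adjusts the playback parameters that eventually reach the mixer.

def wave(name):
    # leaf node: the raw sound wave, at default volume and pitch
    return lambda: {"wave": name, "volume": 1.0, "pitch": 1.0}

def modulate_volume(node, factor):
    def play():
        params = node()
        params["volume"] *= factor
        return params
    return play

def modulate_pitch(node, factor):
    def play():
        params = node()
        params["pitch"] *= factor
        return params
    return play

# Chain: wave -> volume node -> pitch node, like nodes wired in a cue.
cue = modulate_pitch(modulate_volume(wave("explosion"), 0.5), 1.2)
```

Triggering the cue evaluates the whole chain, so the same wave asset can be reused across many cues with different behavior.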