
The PlayerInput Object is responsible for converting input from the player into data that Actors (like PlayerControllers or Pawns) can understand and make use of. It is part of an input processing flow that translates hardware input from players into game events and movement with PlayerInput mappings and InputComponents.

For an example of setting up Input, refer to the Setting Up Inputs documentation.

Hardware Input

Hardware input from a player is very straightforward. It most commonly includes key presses, mouse clicks or mouse movement, and controller button presses or joystick movement. Specialized input devices that don't conform to standard axis or button indices, or that have unusual input ranges, can be configured manually by using the RawInput Plugin.


PlayerInput

PlayerInput is a UObject within the PlayerController class that manages player input. It is spawned only on the client. Two structs are defined within PlayerInput: FInputActionKeyMapping, which defines an ActionMapping, and FInputAxisKeyMapping, which defines an AxisMapping. The hardware input definitions used in both ActionMappings and AxisMappings are established in InputCoreTypes.


ActionMappings

ActionMappings map a discrete button or key press to a "friendly name" that is later bound to event-driven behavior. The end effect is that pressing (and/or releasing) a key, mouse button, or keypad button directly triggers some game behavior.
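To illustrate the mechanism, here is a minimal, self-contained sketch (not engine code; the names `ActionRouter`, `MapAction`, and `HandleKeyPress` are invented for this example) of a key press being routed through a mapping table to a friendly name, and from there to a bound handler:

```cpp
#include <functional>
#include <map>
#include <string>
#include <utility>

// Sketch of the ActionMapping idea: one table maps a hardware key to a
// "friendly name", a second table binds that name to a callback.
struct ActionRouter {
    std::map<std::string, std::string> KeyToAction;               // e.g. "SpaceBar" -> "Jump"
    std::map<std::string, std::function<void()>> ActionToHandler; // e.g. "Jump" -> handler

    void MapAction(const std::string& Key, const std::string& Action) {
        KeyToAction[Key] = Action;
    }
    void BindAction(const std::string& Action, std::function<void()> Handler) {
        ActionToHandler[Action] = std::move(Handler);
    }
    // Called once per discrete key press; fires the bound handler, if any.
    void HandleKeyPress(const std::string& Key) {
        auto It = KeyToAction.find(Key);
        if (It == KeyToAction.end()) return;
        auto Bound = ActionToHandler.find(It->second);
        if (Bound != ActionToHandler.end()) Bound->second();
    }
};
```

The point of the indirection is that gameplay code binds to the friendly name ("Jump"), never to the hardware key, so the key can be remapped without touching game logic.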


AxisMappings

AxisMappings map keyboard, controller, or mouse inputs to a "friendly name" that is later bound to continuous game behavior, such as movement. The inputs mapped in AxisMappings are polled every frame, even when they are merely reporting an input value of zero. This allows for smooth transitions in movement and other game behavior, rather than the discrete game events triggered by inputs in ActionMappings.

Hardware axes, such as controller joysticks, provide degrees of input rather than a discrete 1 (pressed) or 0 (not pressed). That is, they can be moved a small amount or a large amount, and your character's movement varies accordingly. While these input methods are ideal for providing scalable movement input, AxisMappings can also map common movement keys, like WASD or Up, Down, Left, and Right, to continuously polled game behavior.
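A minimal, self-contained sketch of the polling behavior (not engine code; `AxisTable` and `Poll` are invented for this example): several keys feed one axis name, each with its own scale, and the axis is evaluated every frame even when every mapped key is up, yielding 0.0:

```cpp
#include <map>
#include <string>

// Sketch of an AxisMapping table: axis name -> (key -> scale),
// e.g. "MoveForward" -> { "W": +1, "S": -1 }.
struct AxisTable {
    std::map<std::string, std::map<std::string, float>> Mappings;

    void MapAxis(const std::string& Axis, const std::string& Key, float Scale) {
        Mappings[Axis][Key] = Scale;
    }
    // Polled once per frame: sums Scale * current value of every mapped key.
    // A digital key reports 1.0 while held and 0.0 when up; an analog stick
    // axis would report a fractional value in between.
    float Poll(const std::string& Axis, const std::map<std::string, float>& KeyStates) const {
        float Value = 0.0f;
        auto It = Mappings.find(Axis);
        if (It == Mappings.end()) return Value;
        for (const auto& [Key, Scale] : It->second) {
            auto State = KeyStates.find(Key);
            if (State != KeyStates.end()) Value += Scale * State->second;
        }
        return Value;
    }
};
```

With W mapped at scale +1 and S at scale -1, holding W polls as +1, holding S polls as -1, and no input polls as 0 every frame.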

Setting Input Mappings

Input mappings are stored in configuration files, and can be edited in the Input section of Project Settings.

  1. In the Level Editor, select Edit > Project Settings.


  2. Click Input in the Project Settings tab that appears.

In this window, you can:

Change the properties of hardware axis inputs (Axis Config).

Add or edit ActionMappings.

Add or edit AxisMappings.
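These mappings are written to the project's Config/DefaultInput.ini file. For reference, entries typically look like the following (the particular action and axis mappings shown here are illustrative):

```ini
[/Script/Engine.InputSettings]
+ActionMappings=(ActionName="Jump",Key=SpaceBar,bShift=False,bCtrl=False,bAlt=False,bCmd=False)
+AxisMappings=(AxisName="MoveForward",Key=W,Scale=1.0)
+AxisMappings=(AxisName="MoveForward",Key=S,Scale=-1.0)
```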


InputComponents

InputComponents are most commonly present in Pawns and Controllers, although they can also be set in other Actors and in Level Scripts. An InputComponent links the ActionMappings and AxisMappings in your project to game actions, usually functions, set up in either C++ code or Blueprint graphs.

The priority stack for input handling by InputComponents is as follows (highest priority first):

  1. Actors with "Accepts input" enabled, from most-recently enabled to least-recently enabled.

    If you want a particular Actor to always be the first one considered for input handling, you can re-enable its "Accepts input" and it will be moved to the top of the stack.

  2. Controllers.

  3. Level Script.

  4. Pawns.

Once an InputComponent consumes an input, that input is not available to InputComponents further down the stack.
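The stack behavior above can be sketched with a minimal, self-contained example (not engine code; `StackEntry` and `RouteInput` are invented for this illustration): components are consulted from highest to lowest priority, and the first one to consume the input stops it from reaching anything below:

```cpp
#include <functional>
#include <string>
#include <vector>

// One entry in the priority stack. TryHandle returns true if this
// component consumed the input.
struct StackEntry {
    std::string Name;
    std::function<bool(const std::string&)> TryHandle;
};

// Walks the stack top-down; returns the name of the component that
// consumed the input, or "" if nothing handled it.
std::string RouteInput(const std::vector<StackEntry>& Stack, const std::string& Input) {
    for (const auto& Entry : Stack) {
        if (Entry.TryHandle(Input)) return Entry.Name;
    }
    return "";
}
```

Ordering the stack as input-enabled Actors, then Controllers, then the Level Script, then Pawns reproduces the priority order listed above: a Pawn only sees input that nothing higher in the stack consumed.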

Input Processing Procedure

The overall flow is: hardware input from the player is translated by PlayerInput mappings, routed through the InputComponent priority stack, and finally delivered to game logic.
Example - Moving Forward

This example is taken from the First Person template provided with Unreal Engine 4.

  1. Hardware Input from Player: The player presses W.

  2. PlayerInput Mapping: The AxisMapping translates W to "MoveForward" with a scale of 1.


  3. InputComponent Priority Stack: Proceeding through the InputComponent priority stack, the first binding of the "MoveForward" input is in the AFirstPersonBaseCodeCharacter class. This class is the current player's Pawn, so its InputComponent is checked last.

    void AFirstPersonBaseCodeCharacter::SetupPlayerInputComponent(class UInputComponent* InputComponent)
    {
        // Set up gameplay key bindings.
        InputComponent->BindAxis("MoveForward", this, &AFirstPersonBaseCodeCharacter::MoveForward);
    }

    This step could also be accomplished in Blueprints by placing an InputAxis MoveForward node in the Character's EventGraph. Whatever is connected to that node will execute when W is pressed.


  4. Game Logic: AFirstPersonBaseCodeCharacter's MoveForward function executes.

    void AFirstPersonBaseCodeCharacter::MoveForward(float Value)
    {
        if ((Controller != NULL) && (Value != 0.0f))
        {
            // Find out which way is forward.
            FRotator Rotation = Controller->GetControlRotation();
            // Limit pitch when walking or falling.
            if (CharacterMovement->IsMovingOnGround() || CharacterMovement->IsFalling())
            {
                Rotation.Pitch = 0.0f;
            }
            // Add movement in that direction.
            const FVector Direction = FRotationMatrix(Rotation).GetScaledAxis(EAxis::X);
            AddMovementInput(Direction, Value);
        }
    }

    Blueprint Implementation:


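The forward-vector math in step 4 can be sketched in isolation (a self-contained illustration, not engine code; `Vec3`, `ForwardFromYaw`, and `MovementInput` are invented here): with pitch zeroed out, the X axis of a rotation with yaw Y degrees is (cos Y, sin Y, 0), and the movement request is that unit direction scaled by the polled axis value:

```cpp
#include <cmath>

struct Vec3 { float X, Y, Z; };

// Forward direction for a rotation whose pitch has been zeroed:
// the rotated X axis depends only on yaw.
Vec3 ForwardFromYaw(float YawDegrees) {
    const float Rad = YawDegrees * 3.14159265358979f / 180.0f;
    return { std::cos(Rad), std::sin(Rad), 0.0f };
}

// Mirrors AddMovementInput(Direction, Value): scale the unit direction
// by the polled axis value (+1 for W with the mapping above).
Vec3 MovementInput(float YawDegrees, float AxisValue) {
    const Vec3 Dir = ForwardFromYaw(YawDegrees);
    return { Dir.X * AxisValue, Dir.Y * AxisValue, Dir.Z * AxisValue };
}
```

Because the scale carries through, an analog stick deflected halfway produces half the movement request of a fully held W key.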
Touch Interface

By default, games running on touch devices have two virtual joysticks (like a console controller). You can change this in the Input section of your Project Settings with the Default Touch Interface property, which points to a Touch Interface Setup asset. The default asset, DefaultVirtualJoysticks, is located in shared engine content (/Engine/MobileResources/HUD/DefaultVirtualJoysticks.DefaultVirtualJoysticks). There is also a left-stick-only version, LeftVirtualJoystickOnly, for games that do not need to turn the camera.


Note that you will need to check the Show Engine Content checkbox in the object picker's View Options to see these assets.


If you do not want any virtual joysticks, clear the Default Touch Interface property. You can also force the touch interface for your game regardless of the platform it is running on by checking Always Show Touch Interface (or by running the PC game with -faketouches).

Enhanced Input Plugin

For projects that require more advanced input features, like complex input handling or runtime control remapping, the experimental Enhanced Input Plugin gives developers an easy upgrade path and backward compatibility with the engine's default input system. This plugin implements features like radial dead zones, chorded actions, contextual input and prioritization, and the ability to extend your own filtering and processing of raw input data in an Asset-based environment.
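One of the features named above, a radial dead zone, can be illustrated with a small self-contained function (a sketch of the general technique, not the plugin's actual API; `ApplyRadialDeadZone` is invented here). Stick deflection inside the dead zone is ignored as a whole 2D vector, rather than per axis, and the remaining range is rescaled so output still spans 0 to 1:

```cpp
#include <algorithm>
#include <cmath>
#include <utility>

// Radial dead zone: ignore stick vectors whose magnitude is within
// DeadZone, and rescale the rest so input at the dead-zone edge maps
// to 0 while a fully deflected stick still maps to 1.
std::pair<float, float> ApplyRadialDeadZone(float X, float Y, float DeadZone) {
    const float Mag = std::sqrt(X * X + Y * Y);
    if (Mag <= DeadZone) return {0.0f, 0.0f};
    const float Scaled = std::min((Mag - DeadZone) / (1.0f - DeadZone), 1.0f);
    return { X / Mag * Scaled, Y / Mag * Scaled };
}
```

Applying the dead zone radially preserves the stick's direction; clamping each axis independently would instead snap diagonal input toward the axes.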
