A guide to Unity for Unreal Engine developers

1 Introduction

  • Unity is the world's most widely used real-time 3D engine, providing a high-quality creative experience for all developers.
  • This article provides detailed guidelines for Unreal Engine developers who want to switch to Unity, helping them get familiar with the engine quickly. The Unity engine can be used to create high-quality content with high image quality and low energy consumption, and to publish it cross-platform to any mainstream platform, including Apple Vision Pro and other cutting-edge devices, reaching a wider group of players.
  • Before starting to use Unity, users need to install the engine version that suits their needs. This can be done through Unity Hub, a standalone application that simplifies the way you find, download, and manage Unity projects and installations. On the "Installs" page, click the "Add" button to get the latest version of Unity.
  • Address: https://unity.cn/releases
  • The recommended IDE (integrated development environment) for Unity programming is Visual Studio; the latest version of Unity is packaged with Visual Studio Community 2019. However, since Unity compiles all scripts through the Unity Editor, there are no strict requirements on which IDE to use. That said, you can use any code editor you like, and many tools have ready-made Unity integration.
  • We will install Visual Studio and recommend this IDE because it provides convenient features such as IntelliSense and auto-completion, as well as debugging tools such as breakpoints and watchpoints. Since Visual Studio comes preconfigured for Unity, no setup is required. However, if you want to know more, you can check out the official guide and browse some tips.

2. Editor

  • When opening Unity for the first time, the user will see a layout similar to Unreal's. The editors for both engines feature fully modular, customizable windowing systems. This allows users to move, resize, and replace tabs and panels within the interface. Below are the important views in Unity and their counterparts in the Unreal Editor.

2.1 Scene view (viewport)

  • The Scene view is Unity's viewport that allows you to intuitively navigate and edit your scene. Selecting a game object displays the familiar 3D transform handles, the type of which can be selected using the buttons in the upper left corner of the toolbar (these buttons can also be used to select pivot options, switch between world/local orientation, etc.). In the center of the toolbar are play, pause, and frame skip buttons, allowing you to test your game directly in the editor. The scene view helper icon located in the upper right corner of the Scene view indicates the orientation of the view. Click an axis to align the view to that axis, and click the center cube to switch between front and perspective views.

2.2 Game view (Play in Editor)

  • By default, the Game view is docked behind the Scene view and provides Unity's "Play in Editor" function. In Unreal, when the game is launched in the editor, it plays in the active viewport: Unreal possesses the PlayerPawn and the viewport acts as the actual game view, while unpossessing allows you to edit the level as the game runs. Unity separates these two "modes" into the Scene view and the Game view. The Game view captures the cursor and responds to input just like a built game, while switching to the Scene view allows updates to be made at runtime - the two are often placed side by side to facilitate quick iteration.

2.3 Hierarchy window (World Outliner)

  • In Unreal, all Actors in the active sublevel are listed in the World Outliner. This gives you a way to organize, filter, and set the visibility of Actors. In Unity, this corresponds to the Hierarchy window, which provides the same search and visibility functionality while also providing a way to manage active scenes and add new game objects.

2.4 Project window (Content Browser)

  • Unity's Project window corresponds to Unreal's Content Browser and is used to display all available assets in the project. It provides a search feature that lets you filter and save searches to find assets more easily. Additionally, any external packages used by the project have their assets displayed in a separate folder below the project assets.

2.5 Inspector (Details)

  • The Inspector functions identically to the Details panel in Unreal. It allows you to view and edit component properties when clicking on a game object or prefab. Unlike the way Unreal opens a new window to edit asset settings, when you select an asset in the Project window, Unity displays relevant information and settings for the asset in the Inspector.

2.6 Console (Message View/Output Log)

  • The Console tab is docked behind the Project window and is used for debugging output from the game and editor. Through the Debug class in C#, you can output a series of messages using the following functions:
    • Log()
    • LogWarning()
    • LogError()
  • In the Console's top menu bar you can clear or filter messages and enable the "Pause on Error" function. The debug functions also have an optional Context parameter that lets you associate a game object with a message; when a message with a context is double-clicked, that game object is focused in the Scene view and Hierarchy. For example: Debug.Log("Just a Log!"); Debug.LogWarning("Uh oh, a Warning!"); Debug.LogError("Oh no! An Error!");
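The three logging functions and the context parameter described above can be sketched in a small component (the class name here is an arbitrary choice for illustration):

```csharp
using UnityEngine;

public class LoggingExample : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Just a Log!");
        Debug.LogWarning("Uh oh, a Warning!");

        // Passing a context object (here, this component's game object):
        // double-clicking this message in the Console focuses the object
        // in the Scene view and Hierarchy.
        Debug.LogError("Oh no! An Error!", gameObject);
    }
}
```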

2.7 Where is the Modes panel?

  • There is no equivalent of Unreal's Modes tab in Unity. Most object placement is done directly from the Project window. Special interactions such as foliage painting or terrain sculpting are often done in separate tool windows, or contextually after selecting the relevant object in the scene.

2.8 Other instructions

  • Project Settings can be accessed by selecting Edit > Project Settings…; here the user can edit input, physics, collision layers, editor behavior, and more.
  • Editor preferences are set by selecting Edit > Preferences…; this allows users to change external tools, hotkeys, and colors.
  • All tool windows available in a Unity project can be found via the Window menu option. This includes the default engine windows (Scene, Inspector, Hierarchy), as well as any windows added by plug-ins or project code. If users want to restore a closed tab, they can find it here.

3. Projects and Resources

  • Unity projects are set up similarly to Unreal projects, but there are significant differences in how assets are managed.

3.1 Where resources are stored

  • In Unity, all assets, including source code, are stored in the "Assets" folder, rather than being split between "Content" and source code folders as in Unreal.
  • The only exception is the "Packages" folder, which Unity's Package Manager uses to store installed packages (similar to Unreal's Plugins folder).
  • Package Manager:https://docs.unity3d.com/Manual/Packages.html

3.2 How resources are stored

  • In Unreal, assets are stored as UAssets, a format unique to Unreal that allows assets from different sources and file types to be imported as a uniform type. UAssets store both the data required by the asset and any engine-related data such as texture filtering or mesh collision. This also means that Unreal doesn't actually store raw assets in its project structure. Unity, by contrast, stores source files directly in the project and keeps engine- and editor-specific data for each asset in a separate ".meta" file. Behind the scenes, Unity processes imported assets into an optimized, game-ready format, which is what the engine actually uses at runtime. These processed assets are stored in the Library folder, which serves as a cache and does not need to be added to source control.

3.3 Supported resource formats

  • Unity supports a wide range of file formats, including common image, audio, video, and 3D model types; the full table of supported formats can be found in the Unity Manual.

3.4 Scene (Map)

  • Unity's Scenes are equivalent to Unreal's map files, containing all the data for a specific level. When working in the editor, you are usually editing a scene file (unless you are editing a single prefab in Prefab mode; see the "Edit prefabs using prefab mode" section). Like Unreal, Unity can load multiple scenes simultaneously. Scene files have a convenience: by default, they are registered on your computer as Unity assets, so double-clicking one in your computer's file browser opens it directly in the Unity Editor.

4. Actors, game objects, and components

4.1 Game Objects and Actors

  • In Unreal, the basic entity that exists in the game world is the Actor. In Unity, this corresponds to the GameObject. Actors are similar to GameObjects in that both accept components and can move, rotate, and scale in the world using their transforms (in Unity, the Transform component).
  • But there is an important difference between Unity and Unreal.

4.2 Actors in Unreal

  • Unreal has specialized Actors, such as Pawn and Character. Actors in Unreal can be extended and specialized in code to have special functionality built into the Actor itself.

4.3 Game Objects in Unity

  • Unity's game object is a sealed class that cannot be extended or specialized; the behavior of the game object is completely defined by its components. In Unreal you have a player character Pawn whereas in Unity you have a game object with player character components.
  • Game objects can be created via the GameObject menu in the menu bar or by using the plus button (+) at the top of the Hierarchy window. This instantiates the selected game object type into the scene, where you can then move it around or attach it to other game objects.

4.4 Components

  • Both Unity and Unreal use components, but their implementations differ slightly due to the way game objects work. Unreal has two types of components: Actor Components and Scene Components. An Actor Component simply adds behavior to the Actor, while a Scene Component also has a transform and exists in the world as a descendant of the Actor. Static mesh components are a common type of Scene Component; multiple static mesh components can be attached to an Actor to create more complex shapes in the world.

4.5 Components in Unity

  • Unity components function like Actor components, which means they do not have any physical presence in the world. Typically, the only entities in Unity that have transforms are game objects. To get functionality like the Scene component, you can drag one game object onto another in the Hierarchy window to create a hierarchical view of the game objects.

4.6 Example: Creating a house in both engines

  • A useful example that highlights this difference is creating a house in both engines:
    • In Unreal, you will make a "House" Actor, which has static mesh components such as floors, walls, roofs, etc.
    • In Unity, you will create a "House" parent game object. Then under the "House" game object, add child game objects such as floor, wall, roof, etc. - each with its own mesh renderer component.

4.7 Adding components in Unity

  • Components can be added to a game object through the Component menu in the menu bar or via the Add Component button in the Inspector. Clicking the Add Component button displays a search widget that you can use to find the component to add. Here you can also select the New Script button to immediately create a new component script and add it to the game object.
  • Components can also be added at runtime. To do this, use the generic AddComponent<T>() function, where "T" is the type of component to add.
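A minimal sketch of adding a component at runtime - here a Rigidbody, chosen only as an example:

```csharp
using UnityEngine;

public class RuntimeComponentExample : MonoBehaviour
{
    void Start()
    {
        // Attach a Rigidbody to this game object at runtime,
        // then configure it through the returned reference.
        Rigidbody rb = gameObject.AddComponent<Rigidbody>();
        rb.mass = 2f;
    }
}
```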

5. Blueprints and Prefabs

  • In Unreal, one of the capabilities of Blueprints is to create Actor instances with unique components and properties for use in your project. Blueprints you create are stored as assets and can be placed and spawned at will.

5.1 Prefabs in Unity

  • In Unity this is done using prefabs. A prefab is a hierarchical view of game objects saved as assets. Prefabs can be dragged and dropped directly from the Project window into the Scene view, or they can be generated from a reference in a script. When a prefab asset is updated, all instances of that prefab in all scenes are updated. However, if you just change the properties of a prefab instance in the scene, it will retain those modified properties.

5.2 Edit prefabs using prefab mode

  • Blueprints have their own asset window for editing, and similarly Unity provides Prefab mode, which lets you view and edit prefab assets outside the scene. This allows you to make local adjustments and add child game objects. Prefab mode can be accessed by double-clicking a prefab in the Project window or by clicking the right arrow next to a prefab instance in the Hierarchy.

5.3 Nodes

  • Unlike Blueprints, which have an embedded visual scripting system, Prefabs do not have any scripting capabilities or features. All of a prefab's behavior comes from the components of the game objects it contains. Create custom behaviors by writing C# scripts.

5.4 Nested Prefabs (Child Actors)

  • In Unreal, a useful component of Blueprints is the Child Actor component, which allows you to use an Actor as a component of another Actor. This is used for situations where two Blueprints must exist independently but are intrinsically linked - for example, a player character holding a sword. This is similar to Unity's nested prefab feature, which allows you to place prefabs inside other prefabs while still maintaining a connection to the original prefab. This means that if a child prefab is updated, all other prefabs nested within it will also be automatically updated.

6. Script programming in Unity

6.1 Similarities to Unreal Scripting

  • Unreal uses C++ for behavior and Blueprints for scripting, while all of Unity's scripts are written in C#. However, like Unreal, Unity scripts are primarily used to handle game events such as frame updates and overlaps.
  • For more information on how and when Unity events are executed, see Execution Order of Event Functions in the Unity Manual. https://docs.unity3d.com/Manual/ExecutionOrder.html
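The original article's table of example event functions did not survive extraction; the sketch below lists a few of the common Unity event methods a MonoBehaviour can implement (the class name is arbitrary):

```csharp
using UnityEngine;

public class EventExamples : MonoBehaviour
{
    void Awake()  { /* Called when the script instance is loaded. */ }
    void Start()  { /* Called before the first frame update. */ }
    void Update() { /* Called once per frame. */ }
    void FixedUpdate() { /* Called at a fixed rate for physics code. */ }
    void OnTriggerEnter(Collider other) { /* Another collider entered this trigger. */ }
    void OnDestroy() { /* Called when the object is destroyed. */ }
}
```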

6.2 Use Monobehaviour to write component scripts

  • As mentioned earlier, Game Objects are sealed classes and cannot support custom behaviors like Actors. Instead, all their behavior comes from components. Component classes can be created by extending Unity's MonoBehaviour class. MonoBehaviour is the base class for all component scripts and allows you to attach your code to game objects.

6.3 Example: Analyzing a Unity component script

  • Let's analyze the following component script, which logs various messages based on the events received:
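The script itself was shown as an image in the original article and did not survive extraction. Below is a minimal reconstruction consistent with the analysis that follows; the names startMessage, hitMessage, and the MessageLogger class are assumptions (only hitLimit is named in the text):

```csharp
using UnityEngine;

public class MessageLogger : MonoBehaviour
{
    // Serialized fields: configured in the Inspector, not assigned in code.
    [SerializeField] private string startMessage;
    [SerializeField] private string hitMessage;

    // Serialized with an initializer, which becomes the Inspector default.
    [SerializeField] private int hitLimit = 5;

    private int hitCount;

    void Start()
    {
        // Logged as soon as this game object becomes active in the scene.
        Debug.Log(startMessage);
    }

    void OnCollisionEnter(Collision collision)
    {
        // Logged each time a Rigidbody-carrying object hits our collider,
        // up to hitLimit times; the collision object is passed as context.
        if (hitCount < hitLimit)
        {
            hitCount++;
            Debug.Log(hitMessage, collision.gameObject);
        }
    }
}
```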

  • The script is set up as a fairly generic C# class that extends MonoBehaviour, but there are a few important things to note, starting with serialized fields.

  • At the top of the class body, the script declares two string variables for the component to log when it starts and when it is hit. However, these two strings are not assigned anywhere in the code. This is because the variables are serialized and can be configured as properties in the editor using the Inspector.

  • This is very similar to the use of UPROPERTY in Unreal. In Unity, you can make a variable appear in the Inspector by adding the [SerializeField] attribute above its declaration. By default, public variables are serialized and private variables are not, so the attribute is not needed for public variables. Even if a variable is serialized, you can still initialize it, as shown by the hitLimit variable; the initializer serves as the variable's default value when displayed in the Inspector.

6.4 Event methods

  • Here are the functions Unity will call in response to specific events:
    • Start() is called as soon as the component's game object is activated in the scene.
    • OnCollisionEnter() is called whenever a collider on this game object is hit by an object with a Rigidbody component attached.
    • Update() will be called every frame.
  • Note: If the Update() function is not needed, it is best to remove it from the script. This is similar to setting CanActorEverTick to false in Unreal and helps avoid unnecessary calls every frame.

6.5 Where is the equivalent of UObject?

  • Unreal's base object class is UObject, which can be extended to create objects outside the regular Actor/Component pattern. These objects are not spawned into the world, but can still be referenced by other objects and Actors, which is useful for containing data without polluting the level.

  • Using ScriptableObject

    • Unity's ScriptableObject supports creating data objects without spawning them in the scene. Like UObject, a ScriptableObject stores data and reduces dependencies between game objects. https://docs.unity3d.com/Manual/class-ScriptableObject.html
      But in Unity, a ScriptableObject can also be instantiated as an asset, similar to data assets in Unreal. This is a very powerful feature that completely separates static data from game objects.
  • Example: In-game potion shop

    • Imagine that you want to create a shop in your game that sells potions. Each potion is a prefab that stores the appearance of the potion and the scripts that control its behavior when used. When players enter the shop, they may see a menu that lists 30 potions for sale, along with each potion's name, price, and description. Storing this UI data on the prefab means that Unity needs to load all 30 potion prefabs into memory just to get the names and prices needed for the UI - and doing so also loads all the visual effects and script data for each potion, which the UI doesn't need at all. To avoid loading all this unnecessary data, we can instead use a ScriptableObject that contains the name, price, description, and a reference to the prefab holding the potion's visuals and behavior, separating UI data from game data. This way, the lighter description data can be loaded quickly and shared throughout the UI, and the heavier prefab is only loaded when the player actually equips the potion in game.
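The potion-shop pattern above might be sketched as a ScriptableObject like the following; the class and field names are illustrative assumptions, not from the original article:

```csharp
using UnityEngine;

// Asset instances can be created via Assets > Create > Shop > Potion Description.
[CreateAssetMenu(menuName = "Shop/Potion Description")]
public class PotionDescription : ScriptableObject
{
    // Lightweight data the shop UI needs: cheap to load and share.
    public string displayName;
    public int price;
    [TextArea] public string description;

    // Reference to the heavier prefab (visuals, behavior scripts);
    // only instantiated when the player actually uses the potion.
    public GameObject potionPrefab;
}
```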

6.6 Common script use cases

  • Here are some common use cases and patterns for Unreal and its Unity counterpart:

  • Create object instance

    • In Unreal, this is done via the Spawn Actor Blueprint node or the UWorld::SpawnActor() C++ function. In both cases, you pass in a class reference and initialization data, such as a name and location. In Unity, instantiation of game objects is done using the Instantiate() function, which accepts a prefab reference and a starting position/rotation. If you only need an empty game object, you can also use "new GameObject()" to quickly instantiate a new game object instance.
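Both forms of instantiation can be sketched in one component (the field and object names are arbitrary):

```csharp
using UnityEngine;

public class Spawner : MonoBehaviour
{
    [SerializeField] private GameObject prefab; // assigned in the Inspector

    void Start()
    {
        // Spawn a copy of the prefab at this spawner's position.
        GameObject instance = Instantiate(prefab, transform.position, Quaternion.identity);

        // Or create an empty game object directly and parent it.
        GameObject empty = new GameObject("RuntimeObject");
        empty.transform.SetParent(instance.transform);
    }
}
```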
  • Conversion between types

    • In Unreal, type conversion is mainly done through generated Blueprint conversion nodes or the Cast() function in C++. In Unity, you can convert using the "as" keyword, or use a C-style cast. With "as", a failed conversion yields null; a failed C-style cast throws an exception instead.
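A small sketch of the two casting styles, using built-in collider types as an arbitrary example:

```csharp
using UnityEngine;

public class CastExample : MonoBehaviour
{
    void Inspect(Collider col)
    {
        // "as" returns null if the conversion fails, so check before use.
        SphereCollider sphere = col as SphereCollider;
        if (sphere != null)
        {
            Debug.Log("Sphere radius: " + sphere.radius);
        }

        // A C-style cast would throw InvalidCastException on failure:
        // var box = (BoxCollider)col;
    }
}
```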
  • Destroy and disable objects

    • Both engines have garbage collection that cleans up unused references. In Unreal, some object types also have explicit Destroy functions that mark objects for deletion. In Unity, the UnityEngine.Object base class has a static Destroy function; when an object reference is passed in, this function destroys the object. This works for game objects and components, and for any object that inherits from the UnityEngine.Object base class. Game objects can also be disabled using SetActive(false). Components can be disabled individually as well, which still allows their code to exist but prevents calls to Unity event methods such as Update and OnCollisionEnter.
    • Static Destroy function: https://docs.unity3d.com/ScriptReference/Object.Destroy.html
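The disable-versus-destroy options above, sketched together (the method and parameter names are illustrative):

```csharp
using UnityEngine;

public class CleanupExample : MonoBehaviour
{
    void Cleanup(GameObject target)
    {
        // Disable the whole game object: no Update, rendering, or physics.
        target.SetActive(false);

        // Or disable a single component while the object stays active.
        var col = target.GetComponent<Collider>();
        if (col != null) col.enabled = false;

        // Mark the object for destruction, optionally after a delay in seconds.
        Destroy(target, 2f);
    }
}
```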
  • Access components

    • In Unreal, to find a component attached to an Actor, you can use the GetComponentByClass node in a Blueprint, or the FindComponentByClass method in C++. Both accept a class type that they use to find matching components. However, since components can be given names in both C++ and Blueprints, if you know the type of the Actor you can often simply access the component by name. In Unity, you cannot access a game object's components by name. Instead, you call the generic GetComponent<T>() function, which returns the first component of the given type found on the game object. Frequent calls to GetComponent can impact performance because each call iterates through every component on the game object, so calling it once (usually in Start) and storing the result in a variable is an easy way to optimize your code.
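The caching pattern described above can be sketched like this:

```csharp
using UnityEngine;

public class CachedComponentExample : MonoBehaviour
{
    private Rigidbody rb;

    void Start()
    {
        // Look the component up once and cache the reference...
        rb = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        // ...instead of calling GetComponent every physics step.
        rb.AddForce(Vector3.up * 10f);
    }
}
```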
  • Use tags

    • Unreal has a GameplayTag system that can be used to compare tags between objects for quick identification.
    • Unity has its own game object tagging system. Tags can be selected using the "Tag" drop-down menu in the Inspector, which is also where you can create new tags. Tags can then be accessed in code using GameObject.tag or compared with GameObject.CompareTag(). Tags can be edited in the project settings by going to Edit > Project Settings… > Tags and Layers.
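A short sketch of tag comparison, assuming a "Player" tag has been defined in Tags and Layers:

```csharp
using UnityEngine;

public class TagExample : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // CompareTag is preferred over reading .tag and comparing strings.
        if (other.CompareTag("Player"))
        {
            Debug.Log("The player entered the trigger.", other.gameObject);
        }
    }
}
```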
  • Find game objects and components

    • In Unreal, you can use GetAllActorsOfClass to search for Actor types in the world and then filter the results. In Unity, you can find game objects by name using GameObject.Find(string name), or search by tag using GameObject.FindWithTag(string tag). To find objects by component type, you can use the generic function FindObjectsOfType<T>(), where T is the component class to find; this returns an array containing the search results. In both engines, frequent calls to functions that search the world for objects can have a high performance cost and should not be used in code that runs every frame.
    • Generic functions: https://docs.unity3d.com/Manual/GenericFunctions.html
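The three lookup styles sketched together; "MainDoor" and the "Player" tag are hypothetical names for illustration:

```csharp
using UnityEngine;

public class FindExample : MonoBehaviour
{
    void Start()
    {
        // Expensive lookups: do them once at startup, never per frame.
        GameObject byName = GameObject.Find("MainDoor");
        GameObject byTag  = GameObject.FindWithTag("Player");
        Camera[] cameras  = FindObjectsOfType<Camera>();

        Debug.Log("Cameras in scene: " + cameras.Length);
    }
}
```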
  • Raycasting (tracing)

    • In Unreal, ray casting and shape casting are done using the Trace functions. Both shape and ray traces support tracing by channel or by object type, and each trace outputs a Hit Result structure that contains all relevant information about the hit. Unity has several functions for ray casting, including Physics.Raycast(), Physics.SphereCast(), and Physics.OverlapSphere().

    • Additionally, you can use RaycastAll() or SphereCastAll() to return all hit results, not just the first hit. For performance-sensitive contexts, these functions also have non-allocating versions (for example, Physics.OverlapSphereNonAlloc).
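A minimal ray-cast sketch; the RaycastHit output plays the same role as Unreal's Hit Result:

```csharp
using UnityEngine;

public class RaycastExample : MonoBehaviour
{
    void Update()
    {
        // Cast a ray forward from this object, up to 100 units.
        if (Physics.Raycast(transform.position, transform.forward,
                            out RaycastHit hit, 100f))
        {
            // RaycastHit carries the hit collider, point, normal, distance, etc.
            Debug.Log("Hit " + hit.collider.name + " at " + hit.point);
        }
    }
}
```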

  • Input and input management

    • In Unreal, input actions can be set up using Input Actions/Axes. These settings let you define various bindings for player actions (for example, "Jump" or "Throttle"). Input actions can then be bound to functions so code can react to input. Unity uses a similar system: code can read input from defined axes using the Input.GetAxis() function. These settings can be found in Edit > Project Settings… > Input Manager > Axes.

    • Input.GetAxis("Horizontal") is bound by default to the A/D keys and the left/right analog stick axis on a controller. Options such as dead zone, sensitivity, and inversion can be set for each axis. Although it is called an axis, it also supports button-style input (using the Input.GetButtonDown() function). Additionally, explicit keys can be queried using Input.GetKeyDown().
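A small movement sketch using the default "Horizontal", "Vertical", and "Jump" axes from the Input Manager:

```csharp
using UnityEngine;

public class InputExample : MonoBehaviour
{
    [SerializeField] private float speed = 5f;

    void Update()
    {
        // Axis values range from -1 to 1, as configured in the Input Manager.
        float h = Input.GetAxis("Horizontal");
        float v = Input.GetAxis("Vertical");
        transform.Translate(new Vector3(h, 0f, v) * speed * Time.deltaTime);

        // Button-style and explicit-key queries.
        if (Input.GetButtonDown("Jump") || Input.GetKeyDown(KeyCode.Space))
        {
            Debug.Log("Jump pressed!");
        }
    }
}
```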

  • Asynchronous code (delay/timeline)

    • In Unreal, the Delay and Timeline nodes provide simple ways to control event timing and modify properties over time. In Unity, these kinds of deferred execution can be handled using coroutines. Coroutines are special functions that execute independently of regular code; you can delay or pause a coroutine at will using the "yield" statement. Every coroutine returns an IEnumerator, which allows you to use yield to return some kind of pause or delay.
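The original article's coroutine example was lost in extraction; a minimal coroutine that prints a log after a 5 second delay might look like this:

```csharp
using System.Collections;
using UnityEngine;

public class CoroutineExample : MonoBehaviour
{
    void Start()
    {
        StartCoroutine(DelayedLog());
    }

    IEnumerator DelayedLog()
    {
        // Pause this coroutine for 5 seconds, then resume.
        yield return new WaitForSeconds(5f);
        Debug.Log("Printed after a 5 second delay!");
    }
}
```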
  • Event system

    • In Unreal, you can create and bind custom events to your classes using Blueprint's event dispatcher system or C++ delegates. In Unity, there are several ways to create and bind events. The most recent is the UnityEvents system, which provides a powerful way to bind handlers to events using the Inspector. When you define a serialized event in code, it appears as an exposed field in the Inspector (as described in "Serialized fields"), and you can drag and drop the game objects that should react to the event onto that field. If you need a more lightweight event system, Unity also supports delegates as well as standard C# events.
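A sketch of a serialized UnityEvent; the DoorSwitch name and Activate method are illustrative assumptions:

```csharp
using UnityEngine;
using UnityEngine.Events;

public class DoorSwitch : MonoBehaviour
{
    // Serialized UnityEvent: handlers are wired up in the Inspector by
    // dragging in game objects and picking a method to call.
    [SerializeField] private UnityEvent onActivated;

    public void Activate()
    {
        // Fires every handler bound in the Inspector (or via code).
        onActivated.Invoke();
    }
}
```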

7. Other engine properties

7.1 Physics and collisions

  • Unreal's simulation and collision properties are built directly into Primitive Components, which manage the channels for interacting with the component as well as data such as physical materials and mass. The collision boundary is defined by the visual mesh used by the component. Unity's built-in physics engine instead uses the Rigidbody component and Collider components to control physics simulation. Depending on the shape of the game object, there are several specialized colliders, including box, sphere, capsule, and mesh.
  • The Rigidbody manages the dynamic simulation of the game object, while the Collider provides the shape. The physics layers that define which objects interact are configured at the project level.
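The Rigidbody/Collider split above can be sketched by building a physics object at runtime:

```csharp
using UnityEngine;

public class PhysicsSetupExample : MonoBehaviour
{
    void Start()
    {
        // A Collider provides the shape; a Rigidbody makes it dynamic.
        gameObject.AddComponent<SphereCollider>();
        Rigidbody rb = gameObject.AddComponent<Rigidbody>();
        rb.mass = 1.5f;
        rb.useGravity = true;
    }
}
```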

7.2 Basic animation

  • In Unreal, skeletal animation is created using animation sequences/montages, typically controlled with Animation Blueprints and state machines.
  • Unity's Mecanim animation system works in a similar way. Mecanim allows you to import various animation clips and control their playback using a script-controllable state machine.

7.3 Multi-object animation and movie animation


  • Unreal's primary tool for cinematics and multi-object animation is the Level Sequencer. In Unity, the equivalent tool is Timeline. Similar to a Level Sequencer asset, a Timeline asset is a collection of property animations. Timeline works with a variety of systems, including animation, particle effects, sound, cameras, transforms, and materials.

7.4 User Interface (UI)

  • In Unreal, most user interface (UI) work is done using Unreal Motion Graphics UI Designer (UMG). UMG is a retained-mode UI system: you create UI objects in a hierarchical view, and each object handles its own data and events. UMG uses special Blueprints called Widgets to let you lay out and script your UI in a single asset.

  • Unity also has a retained mode UI system based on the Canvas component called Unity User Interface (Unity UI). Unlike UMG, this system requires no separate resources: just game objects with UI-specific components to control rendering, interaction, and layout. In a Hierarchy, all UI game objects are placed under another game object with a Canvas component, which manages how the UI is rendered and how to interact with it.

  • Unity UI: https://docs.unity3d.com/Packages/[email protected]/manual/index.html
  • Adding behavior to the UI is done by writing C# scripts or by wiring up UnityEvents (for components like Buttons) in the Inspector.


Origin blog.csdn.net/backlighting2015/article/details/134546343