Introduction

Before getting to the useful tips and tricks for Blender, I will briefly share my history with the program.

A few years ago I had a serious addiction to Blender, the software used to create 3D models and animations. I used it a minimum of 4 hours a day trying to recreate all kinds of things that crossed my mind, nothing particularly artistic, but I was able to create structures, furniture and other types of objects based on reference images.
3D modeling was something that made me surprise myself with my own capabilities; every time a render was completed I felt very proud of my creation. In retrospect I wasn’t doing very amazing things, but they were things I had made myself from scratch, and that was amazing.
All that time and effort spent 3D modeling with Blender and texturing with Substance Painter paid off, and today I can include those skills as part of my work as a freelance developer.

Below we are going to review 10 useful tips and tricks for Blender that have helped me speed up and improve the modeling process, accomplishing tasks faster and achieving better results.



#1 – Focusing the camera on the selected object in Blender

We start with a shortcut to center the view, or even the rendering camera, on the selected object. It is an extremely useful trick because it greatly improves agility when working in Blender. With this shortcut you can say goodbye to all that time spent trying to correctly place the camera on a 3D model, or even on a vertex, the center of an edge or a face.

To use it, simply select an object or an element of the mesh and press the dot key on the numeric keypad. You will see how the view centers on the selected element and also, when rotating the view, the selected element becomes the center of rotation.

#2 – Hide all objects or geometry except what is selected in Blender

If you are working on a Blender project that has many objects, or an object with a particularly complex mesh, it can be very useful to temporarily hide certain objects and leave visible only what you need to work on. With this simple shortcut you can easily hide everything except the selected objects in Blender, and reveal all the hidden objects again when you need them.

To isolate elements in Blender, simply select the object or mesh element you want to isolate and press SHIFT+H; this hides all other elements that are not selected. To make all hidden elements visible again, press ALT+H.



#3 – Tip to quickly parent and un-parent objects in Blender

When parenting objects, one of them becomes the parent object and the other object or objects we choose become the children. This causes the child objects to be automatically affected by the transformations applied to the parent object: for example, a movement applied to the parent will make all the children move with it, and the same happens for rotations and scale changes.

To quickly parent one object or a set of objects to another in Blender, go to the Outliner window where all the objects are listed, select the ones you want to parent and drag them onto the object you want to be the parent while holding down the SHIFT key; optionally you can also hold ALT to keep the transformations of the objects.

#4 – Render image with transparency in Blender (works for Cycles and Eevee)

On many occasions it is very useful to render only the visible parts of a 3D model and make the rest of the render transparent, for example when you want to create a GIF of yourself dancing and place it in an article about Blender tips.

In the Properties window go to the Render Properties tab and open the “Film” section; there you will find a checkbox called “Transparent”. Checking it makes the parts of the render where there is no 3D model transparent. Make sure you use an image format that supports transparency, such as PNG.



#5 – Display the normals of 3D models in Blender

The normals of a 3D model are mathematical elements that indicate in which direction each face of the model is pointing. Sometimes in the modeling process certain normals end up inverted, that is, pointing toward the inside of the 3D model, and this can cause problems with shading, which means problems in the visualization of a material applied to the model and erratic behavior with light sources. Another important problem arises if we are creating these 3D models to use in a game engine like Unity: in this engine 3D models are rendered with “backface culling”, which means that an inverted face will be invisible in the graphics engine and we will see through it. To solve this we just correct the normals of the 3D model, but first we need to be able to see them.

To display the normals of a 3D model it is necessary to be in EDIT MODE. Then, in the upper right corner of the Viewport, click on the arrow that opens the “Viewport Overlays” panel; almost at the end of it we find the “Normals” section with 3 icons to display the normals. I usually choose to display them in the center of the faces. We can also adjust the length of the normals.

#6 – Know the number of vertices, edges and faces in our scene in Blender

When creating 3D models we may want information about the geometry of the objects, for example how many vertices, edges, triangles or faces our model has. This can help us detect problems such as duplicate vertices and also keep track of the polygon count of our 3D model. If we are creating 3D models for a graphics engine like Unity, it can be important to keep the number of polygons within a reasonable amount for the model we are creating, especially if the application targets a mobile or virtual reality device, where the hardware imposes certain limitations.

To display information about the number of vertices, edges, faces and objects in Blender, go to the upper right corner of the Viewport, click on the arrow that opens the “Viewport Overlays” panel and check the “Statistics” box at the top.



#7 – Applying the same material to other objects in Blender

When we select an object and want to give it a color or a metallic appearance, for example, what we do is create a new material, which by default starts with the “Principled BSDF” shader, and we configure its values as we wish. But what happens if we have a second object and want it to have the same material? We might be tempted to create a new material and configure it with the same parameters; it is even possible to copy the parameters of one material and apply them to another.

But there is a better alternative: in Blender two objects can have the exact same material applied, that is, one or several material slots can point to the same material reference. This way we can create a particular material instance, call it “Pine Wood” for example, and reuse it in all the objects that need the pine wood texture. This not only avoids having many unnecessary copies of a material, it also allows us to modify the material and have the changes applied automatically to all the objects where it is used.

In this case the video is more illustrative, but let’s summarize the procedure. With an object selected we go to the Materials tab (sphere icon with a checkered texture). Clicking the + sign creates a new material “Slot” on the object. Here there are two options: one is to click on “New”, which creates a new material instance, completely independent of the others; the other, which is the one that interests us here, is to select an existing material by clicking the icon to the left of the “New” button and choosing from the list the material we want to assign to the slot.

#8 – Show animation bones always in front of other objects in Blender

When creating animations in Blender using animation bones, it is very useful to be able to see those bones at all times, even if they are hidden inside or obstructed by another object.

With the “Armature” object selected, go to the “Object Data properties” tab (which has a humanoid icon and is located above the tab with the bone icon), then go to the “Viewport Display” section and check the “In Front” checkbox.



#9 – Quickly create Edge Loops in Blender

When we gain some experience with Blender we come across the concept of an “Edge Loop”: basically it is a set of vertices on a surface that are connected together, with the last vertex of the set connected back to the first one. The key is that, of all the possible connections that meet these conditions, the Edge Loop is the loop that is connected in the most coherent way in relation to the surrounding sets of vertices. It is a concept somewhat difficult to explain, but easy to understand once we start working with it. An example of an edge loop is one of the rings that forms a sphere or a donut in Blender (the correct name is torus, but it looks like a donut); each ring is a set of vertices connected in a loop, and that is an edge loop.

To quickly create an Edge Loop in Blender, select an object, enter EDIT MODE and press CTRL+R, then move the cursor to the part of the geometry where you want to add the edge loop. At this point you can scroll the mouse wheel to increase the number of loops to add, or enter a number manually with the keyboard.

#10 – Easily select Edge Loops and remove them in Blender

There is a quick way to select Edge Loops which allows us to apply transformations to the model, for example increase the size of a particular Edge Loop or move it in one direction. We can also get rid of an Edge Loop while keeping the rest of the model intact; the latter is especially useful when we want to drastically decrease the polygon count of a 3D model to use it in a graphics engine like Unity.

To quickly select an Edge Loop in Blender we have to be in an object’s edit mode; then hold Left ALT and left-click on one of the edges that belongs to the Edge Loop you want to select. If you click on a vertex of the Edge Loop you may select another Edge Loop that goes through the same vertex, so to be sure of selecting the correct one it is better to click on the edges.



Definition of runtime

In programming, RUNTIME is the time interval from the moment the operating system starts executing the instructions of a given program until the end of its execution, either because the program completed successfully or because it was terminated by the operating system due to a runtime failure.

Runtime in Unity

When we are developing a game or application in Unity, the runtime of our program starts when we press the Play button and ends when we press Stop. Likewise, when we make a build for Windows, for example, the runtime goes from the moment we run the application until it is closed.

It is important to understand this concept of runtime in Unity because we have to be able to handle situations that occur during the execution of the program, for example enemies that appear in the middle of the game. These enemies will need to be provided with certain information that could not be given to them at development time, simply because they did not exist back then, so whoever is responsible for creating these enemies must also give them the information they need, for example the reference of the player they have to attack.
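As a purely hypothetical illustration of that idea (none of these class names come from a real project), a spawner could hand the newly created enemy the reference it needs:

using UnityEngine;

// Hypothetical enemy script: it needs a reference to the player it will attack.
public class EnemyAI : MonoBehaviour
{
    public Transform target; // assigned by whoever creates the enemy
}

// Hypothetical spawner: since the enemy does not exist at development time,
// whoever instantiates it at runtime must hand over the references it needs.
public class EnemySpawner : MonoBehaviour
{
    public GameObject enemyPrefab;
    public Transform player;

    public void SpawnEnemy()
    {
        GameObject enemy = Instantiate(enemyPrefab, transform.position, Quaternion.identity);
        enemy.GetComponent<EnemyAI>().target = player;
    }
}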

Introduction

The PlayerPrefs class in Unity does not have a specific method to store vectors; however, it has functions that allow us to store data of type int, float and string. The int and float types are known as primitive variables, and with them it is possible to reconstruct other, more complex data structures: a vector in R2 or R3, if we think about it, is nothing more than a set of two or three float variables, each of which occupies a position or has a meaning.



Unity package to download with implemented solution

Below you can download the Unity package with the example we are going to analyze in this article, so you can import it and test it directly on your own computer.

Fig. 1: Files that are added to your project when importing the download package.

Analysis on how to save a Vector3 with PlayerPrefs

The basic idea is to decompose the vector into its components, which are float data, and store those components individually with PlayerPrefs; then, when loading, retrieve each component from memory and create a new vector using those components.

In the scene that comes in the download package the solution is already assembled. In figure 2 we see how the hierarchy is composed: we have a GameObject called “SaveDataScript” which has the script we are going to analyze assigned to it (see figure 3) and is responsible for saving and loading data. Then we have another GameObject called “ObjectToSavePosition”, a cube whose position we will save so we can load it when the scene starts. Notice in the inspector in figure 3 that our script has a reference to this object; this allows it to read and modify its variables or execute functions on this GameObject.

I always insist that understanding the concept of a reference in programming is a very important pillar in the development of applications with Unity; in this link you will find a playlist from my channel about references in programming and different methods to find them.

Fig. 2: GameObject of the scene that will contain the script to save and load a Vector3 in Unity.
Fig. 3: Script to save and load a Vector3 in Unity seen from the inspector.



Script that is responsible for saving and loading the vector

In figure 4 we see part of the data-saving script that comes in the package. We can see the GameObject-type variable called “objectWithSavedPosition” that appears in the inspector in figure 3, as well as the Awake and Start methods, which are functions used for initialization that Unity executes automatically at different times within the life cycle of the application. Inside the Awake function a custom function called “LoadData” is executed, which is responsible for loading the information; this “LoadData” function is defined by ourselves, and we will see it later.

Loading data is something that normally happens when a scene starts. Sometimes problems can arise depending on where the data loading is done; remember that the Start functions are executed one after another for each GameObject in an order that we cannot predict, or that would be tedious to predict. Imagine a script whose Start function uses variables from another script that has not yet loaded its data!

Fig. 4: Script to save and load Vector3 in Unity, variables and initialization.

In games we usually have shortcuts for quick saving and loading; in figure 5 we have some instructions that do just this. Notice that pressing F5 executes a function called “SaveData” which is in charge of saving all the relevant information. It is convenient that all the necessary variables are saved inside that function, or that it calls other functions in charge of saving other data; that way, once SaveData is executed, we are sure that all the information has been saved. The same goes for the “LoadData” function, executed when pressing F8, which reads the data saved in memory and initializes the variables with that data.

Fig. 5: Functions for testing the saving and loading of a Vector3 in Unity.
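As a reference, a minimal sketch of what those shortcut checks might look like inside the script’s Update method (the key bindings match the article; the exact code in the package may differ):

void Update()
{
    // F5 performs a quick save, F8 a quick load, as described above.
    if (Input.GetKeyDown(KeyCode.F5))
    {
        SaveData();
    }

    if (Input.GetKeyDown(KeyCode.F8))
    {
        LoadData();
    }
}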

Example on how to save a Vector3 using PlayerPrefs

Figure 6 shows the content of the SaveData function, which is responsible for saving the vector data that will later allow us to reconstruct it. Note that the vector to be saved is first decomposed into its X, Y and Z components, which are stored in the temporary variables “positionX”, “positionY” and “positionZ”.

The data is saved with PlayerPrefs in the last three instructions, using the “SetFloat” function from PlayerPrefs. Note the name that is passed as a label for each of these saves; these names must be used when loading the data to retrieve them.

Fig. 6: Instructions to SAVE a Vector3 data type with PlayerPrefs in Unity.
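Since figure 6 is an image, here is a hedged sketch of what the SaveData function might look like; the label strings (“PositionX”, etc.) are assumptions, the package may use different names:

public void SaveData()
{
    // Decompose the vector into its three float components.
    Vector3 position = objectWithSavedPosition.transform.position;
    float positionX = position.x;
    float positionY = position.y;
    float positionZ = position.z;

    // Save each component under a label; the same labels are used when loading.
    PlayerPrefs.SetFloat("PositionX", positionX);
    PlayerPrefs.SetFloat("PositionY", positionY);
    PlayerPrefs.SetFloat("PositionZ", positionZ);
}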



Example on how to load a Vector3 using PlayerPrefs

Figure 7 shows the content of the LoadData function, which is responsible for loading the data stored in memory and reconstructing the vector. Loading is the reverse process: first we retrieve the data from memory with the “GetFloat” function from PlayerPrefs, passing the label that was used for each piece of data. In this example I include the value 0 as a default in case there is no previously stored information; this allows us to call the function directly in Start or Awake and makes sure there are no conflicts the first time the application is run.

The next instruction is in charge of creating the Vector3 from the data retrieved from memory. Vector3, like most classes, has constructors that allow us to create the data by giving it initial values; we use that constructor with the “new” keyword.

The process does not end here: we have read the information and created the vector, but we have not yet told our GameObject to position itself at that Vector3 coordinate; this is done in the last instruction in Figure 7.

Fig. 7: Instructions to LOAD a Vector3 data type with PlayerPrefs in Unity.
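Again as a sketch, using the same assumed labels as in the save example above:

public void LoadData()
{
    // Retrieve each component; the second argument is the default value
    // returned when nothing has been saved under that label yet.
    float positionX = PlayerPrefs.GetFloat("PositionX", 0f);
    float positionY = PlayerPrefs.GetFloat("PositionY", 0f);
    float positionZ = PlayerPrefs.GetFloat("PositionZ", 0f);

    // Reconstruct the vector with the Vector3 constructor...
    Vector3 loadedPosition = new Vector3(positionX, positionY, positionZ);

    // ...and apply it so the GameObject actually moves to the saved position.
    objectWithSavedPosition.transform.position = loadedPosition;
}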

Introduction

In this article we will see how to set up a Unity project to create builds for Meta Quest 2. For this we will use the “Oculus Integration SDK” package for Unity, which contains everything you need to start creating virtual reality apps for Oculus and also comes with 3D models and useful examples. Let’s go through each necessary configuration, create a build and run it on a Meta Quest device.



All the IMPORTANT information is summarized in the following TUTORIAL FROM MY YOUTUBE CHANNEL


Prerequisites

Android modules installed in Unity

In order to compile applications for Meta Quest you need to have the Unity engine installed with the Android modules, which consist of the following three applications:

Fig. 1: These modules installed in Unity HUB are required to compile for Meta Quest.

Oculus Quest SDK – Oculus Package for Unity

Let’s download and import the Oculus Quest SDK package for Unity which comes with many useful Assets that will help us to create VR applications for Oculus.

Create a developer account for Oculus

An Oculus developer account is required to be able to publish applications in the Oculus Store.

Oculus Developer HUB (Auxiliary software to develop and publish applications for Oculus)

Oculus Developer HUB allows you to recognize the Oculus device from your computer, configure it, access captured images and videos, and publish applications to the Oculus Store.

Android App

The Oculus mobile app allows us to configure our Oculus device and enable developer mode for our device.

Oculus App for PC (Optional)

The Oculus app allows us to use PC virtual reality applications with our Oculus, which can be done via cable or with the AirLink connection.

How to configure Unity to export applications for Meta Quest (Oculus Quest)

Now let’s see what parameters we have to configure in the Unity engine to be able to export for Oculus, for this we will create a new project and see the step-by-step.

Set Android platform as build target

Oculus Quest devices use Android as their operating system, so first let’s go to File > Build Settings and change the target platform to Android, as seen in Figure 2.

Fig. 2: Changing the target platform for compilation.

In this case I am going to use the Oculus Quest 2 device. With the device connected I open the Oculus Developer HUB application to check that the operating system recognizes it; if we are successful we should see our device as in figure 3.

Fig. 3: Device window of the Oculus Developer HUB program.

Also in Unity, in the Build Settings tab you should see the device in the “Run Device” field, as shown in Figure 4. If you do not see the device you have to go back to the prerequisites; you probably have not enabled developer mode, or you have to enable USB debugging inside the Oculus device.

Fig. 4: The connected device is displayed in the Build Settings window.



Import of the Oculus SDK package for Unity

Import the Oculus SDK package (download link at the top of the prerequisites). In this case we are going to add all the files that come in the package.

Fig. 5: Window to import the Oculus SDK package for Unity.

At this point, several dialogs appear asking us how we want to proceed.

In general, for all messages I choose the recommended and most up-to-date options; in the case of figure 6 we click on “Yes”, and in the case of figure 7 we click on “Use OpenXR”.

Fig. 6: Message that appears when importing the Oculus SDK package.
Fig. 7: Message that appears when importing the Oculus SDK package.

You may be prompted to restart the Unity editor; in that case, restart it.

Fig. 8: Message that appears when importing the Oculus SDK package.

Figures 9 and 10 are another example of messages that may appear; I apply the same criteria: choose the most up-to-date options and restart if necessary.

Fig. 9: Message that appears when importing the Oculus SDK package.
Fig. 10: Message that appears when importing the Oculus SDK package.

At the time of recording the video and writing this article, after going through the whole process of configuring the Oculus SDK package for Unity and restarting the editor for the last time, an example scene opens showing an avatar with LipSync (lip synchronization with microphone input, Figure 11). We are going to build this same scene, so we open the Build Settings tab (under the File menu) and click on the “Add Open Scenes” button to add the open scene to the build we are going to create.

Fig. 11: Example scene that opens when Unity is restarted.
Fig. 12: The open scene is added to the build.



Setup of the XR Management plug-in

The next step of the configuration is to go to the “Player Settings” window; we can open it from “Edit”, and we also have a shortcut from the Build Settings window, as shown in figure 13.

Fig. 13: Shortcut to the Player Settings window to configure compilation parameters.

In the Player Settings window go to the “XR Plugin Management” item and click on the install button shown in Figure 15.

Fig. 14: XR Plugin Management section in Unity.
Fig. 15: Installation of the XR Plugin Management in Unity.

Once the plugin is installed, click on the “Oculus” checkbox that appears; this causes a new element called Oculus to appear below the plugin (figure 17). Click on the Oculus element.

Fig. 16: Activate the Oculus checkbox of the Plugin.
Fig. 17: Go to the new Oculus tab of the Plugin.

We make sure that the Quest and Quest 2 checkboxes are checked so that the plugin is applied to these devices.

Fig. 18: Make sure that the Quest and Quest 2 boxes are checked.



Configure appropriate Color Space

Before building an application for Oculus Quest in Unity we have to make sure to change the “Color Space” parameter, found in the Project Settings > Player window inside the “Other Settings” drop-down menu, as shown in Figure 19. If this step is skipped we will get errors in the console when building the application.

Fig. 19: Changing the Color Space parameter to Linear in order to compile for Oculus.

Virtual reality application compilation and testing in Meta Quest 2

Once we have configured everything in the Unity engine we proceed to create a build of the virtual reality application and test it on a device such as the Oculus Quest 2. For that we go to the Build Settings window and, making sure that the “Run Device” parameter shows our device (check that the device is connected via USB or AirLink), we click on Build and Run, choose the destination folder of the APK file and give it a name, as shown in Figures 20 and 21.

IMPORTANT: If the device does not appear in the “Run Device” field you can click the Build button, export the APK file and then install that file via the Oculus Developer HUB software by dragging the APK file to the window shown in Figure 3 at the beginning of this article.

Fig. 20: We create the build for Oculus by clicking on Build and Run.
Fig. 21: We name the APK file for the build.

When the process finishes, the application runs automatically on the Oculus Quest 2 device; the result is shown in the following image:

Fig. 22: Test of the example scene in Oculus Quest 2.



Where is the Unity application installed on Oculus

Since we are doing a test, this virtual reality application was not validated by the Oculus Store, and therefore our device places it in a separate section for applications of “Unknown origins”. In that section we can find the application that was installed from Unity, see figures 23 and 24.

Fig. 23: The test applications built from Unity are located in the “Unknown Origins” section.
Fig. 24: Virtual reality application for Oculus Quest 2 compiled with Unity.

For the applications to appear in the main section we must upload them to the Oculus Store.



Introduction – What is Occlusion Culling in Unity?

The Unity engine has a system called “Occlusion Culling” that automatically hides objects that the camera is not directly seeing. In this article we are going to see how to prepare a project in Unity so that Occlusion Culling is applied, along with some details about how it works and its configuration parameters.

Important Detail – Occlusion Culling uses pre-calculated (BAKED) data

Occlusion Culling works with objects that are marked as “static” in the scene, and it is necessary to pre-calculate all the data, which is known as a “bake”. This implies that every time you introduce a new object that should be hidden when the camera does not see it, or you change the position of one of these objects, you must pre-calculate the Occlusion Culling data again. If you don’t, some objects that are behind the camera or behind other objects may not disappear or, even worse, objects may disappear in front of the camera when they should remain visible.

Where is the Occlusion Culling configuration window in Unity?

The Occlusion Culling configuration window is located in Window > Rendering > Occlusion Culling; in my case I usually place it next to the Inspector window for convenience, as shown in image 2.

Image 1: How to open the Occlusion Culling window in Unity.
Image 2: Occlusion Culling window next to the inspector in Unity.

How to add objects to the Occlusion Culling system in Unity

For an object to be part of the Occlusion Culling calculations, and therefore disappear when the camera is not looking at it, it must be marked with the “Static” parameter in the inspector, as shown in image 3. In general, the important objects to mark as static are those that have some kind of Renderer component assigned, such as a Mesh Renderer or a Sprite Renderer, since these are the objects that have a visual representation on the screen.

Image 3: “Static” property of any GameObject that allows to include it in the Occlusion Culling system.

IMPORTANT: Objects that are part of the Occlusion Culling system must not change position at any time and if they do, the data must be recalculated as we will see below.

How to apply Occlusion Culling in Unity

Once the previous step of marking all the static objects is done, we are going to “Bake” the information to apply Occlusion Culling. For this we go to the “Occlusion” window shown in image 4; if you do not see this window, consult the section above in this article.

In this window we are basically going to configure two parameters, “Smallest Occluder” and “Smallest Hole”. The first refers to the smallest object in the scene (in scene units) that can obstruct the camera and prevent everything behind it from being rendered; the smaller this parameter, the longer the calculations take, so some testing is needed to determine an appropriate value. As a starting point we can use the values in image 4.

TO APPLY THE OCCLUSION CULLING CALCULATIONS IN UNITY CLICK ON THE BAKE BUTTON TO THE RIGHT OF THE CLEAR BUTTON

Image 4: Occlusion window to apply Occlusion Culling in Unity.

The “Smallest Hole” parameter is used when a 3D model has open spaces; consider the example illustrated in image 5-A, in which Occlusion Culling has been applied with the parameters of image 4.

Image 5-A: Scene set up to illustrate Occlusion Culling.
Image 5-B: Perspective of the camera in image 5-A.

As we see in image 5-B the camera can see through the hole, yet the cube selected in image 5-A is not visible. This is because the Occlusion Culling system considers that this cube should be hidden: the “Smallest Hole” parameter is set to 0.5 Unity units (image 4), while the hole in image 5-B measures 0.1 by 0.1 Unity units.

Modifying the parameters a bit, taking into account these particular models and the fact that we should be able to see objects through that hole, we make a new Bake with the parameters of image 6; as seen in image 7, the Occlusion Culling system no longer hides that object, because the camera could see it through the hole.

Image 6: New parameters to correct the problem of objects not visible through holes in Unity.
Image 7: With these new parameters the camera can see the cube through the hole.

Anne & the Raccoons

Plot

Anne lives a peaceful life on her farm, tending to her crops and her garden. Suddenly the peace comes to an end as a large number of raccoons show up on her farm and mess up everything they find. Anne has worked very hard on her farm and does not plan to sit around and do nothing.

How to play

Control Anne by pressing the WASD keys or the arrow keys. Collect the objects with the E key. Get the farm up and running and, if you have time, find the vegetables that the raccoons messed up. Each station is restored to normal with a particular object, so you will need to have that object in order to scare away the raccoons and restore the station.

Developed for the Ludum Dare 51 Game Jam with theme "Every 10 seconds" by:

MARTÍN HEINTZ

MUSIC COMPOSER

GAMEDEVTRAUM

UNITY DEVELOPER

STEFF

UI/UX DESIGN

ÁLVARO

GAME DESIGN

Mc. Rooties

Plot

Mc. Rooties is looking for a kitchen assistant to replace their former employee, who is rumored to have fled in tears, screaming, out the back door of the restaurant. Your exceptional knife skills have made a name for you, and you are the right person for the job.

How to play

Control the knife by moving the mouse and cut the roots at the point indicated by a red line. The knife only cuts with a quick movement; be careful not to damage the vegetables.

Developed for Global Game Jam 2023 with the theme "Roots" by:

CIAN

ILLUSTRATOR

MANU

GAME DESIGN

VALEN

UNITY DEVELOPER

GAMEDEVTRAUM

UNITY DEVELOPER

MR. DORIAN

MUSIC COMPOSER

AGRAULIS.C

ILLUSTRATOR

Introduction

In this article we will see how to work with the Text Mesh Pro components from a script. In addition you will find a video from the channel in which we create a TEXT OBJECT to display in a Canvas and another Text Mesh for the 3D space, then create a script inside which we modify the text that these components show; as an extra we will also modify the color by code.

This is how you import TextMesh PRO tools in Unity and create a script to write Text Mesh PRO by code in Unity:




First Step: Create Text Mesh Pro objects for World Space or for the Canvas

You can use Text Mesh PRO to display text in the user interface or to display text in the world (like a 3D model). For the first option you need to create the Text from the UI menu; this kind of text should be placed inside a Canvas and it will be overlaid on the game view. You can find the Text for world space in the 3D Object menu.

Let’s analyze both cases

We start by creating the Text objects that we will later modify from a script. We are going to create two types of Text Mesh Pro objects: one to use in the user interface and another to use as a 3D object in the scene.

Creating Text Mesh Pro Text for the user interface

In Unity the Text Mesh Pro objects that are in the UI section must be placed as children of a Canvas object, so let’s assume that we already have one of these objects in the scene. To create a new Text Mesh Pro object we go to the hierarchy, right click on the Canvas (or any child object of the Canvas), go to the UI section and choose the “Text – Text Mesh Pro” option, as shown in figure 1.A.

Fig. 1.A: Option to create a new Text Mesh Pro text for the user interface.


Creating a Text Mesh Pro Text for World Space

The other option to write text on the screen is to use a Text Mesh Pro Text component as a 3D object, located at a position in the world; this object is found in the “3D Object” section of the creation menu, as shown in figure 1.B.

Fig. 1.B: Option to create a new Text Mesh Pro text as a 3D object in the scene.

First time using Text Mesh Pro

If we have not configured Text Mesh Pro yet, we will get the window shown in figure 2, which gives us the option to import the components necessary to use Text Mesh Pro; we click on “Import TMP Essentials”, as shown in figure 2. The second button, to import examples and extras, is optional.

Figure 2: Window for importing Text Mesh Pro package into Unity.

Result of the creation of objects

Once the objects were created, I made a few modifications in the inspector (font size, text) and the result is as follows:

Fig. 3.a: Text Mesh Pro objects in the hierarchy.
Fig. 3.b: Text Mesh Pro objects displayed in the scene.

Once the objects have been created and Text Mesh Pro imported, we can start using the Text Mesh Pro Text component from the inspector or write it through a script. In figure 4 we see the Text component in the Inspector window; it has many more configuration options compared to the old text solution.

IMPORTANT

In figure 4 we see the field that edits the text that appears on the screen; it currently has the value “Canvas Text” written in it. That is the field we want to edit by code, and to do it we will have to edit a variable called “text” that is defined in that component.

Fig. 4: Text Mesh Pro component in the inspector.


Script for writing text in Text Mesh Pro component

In order to write to a Text Mesh Pro component by code I will create a script and assign it to some GameObject in the hierarchy, as shown in figure 5. In this case my script is called “ModifyTextMeshPro”; inside this script I will modify the texts.

Fig. 5: We create a script and assign it to some object in the hierarchy.

Import TMPro namespace in our Script

To be able to use the Text Mesh Pro components comfortably, it is convenient to import the “TMPro” namespace by adding in the header of our script the line “using TMPro;” as shown in figure 6.

Fig. 6: We declare that we are going to use the namespace “TMPro” in the header of our script.

Declaration of the variables to be used

We are going to declare two variables of type “TMP_Text” to store the references of the Text components that we want to modify. In this case my variables will be named “canvasText” and “worldText”; in these variables I will place the Text Mesh Pro Text components of the canvas and the world space respectively.

IMPORTANT DETAIL

The names “canvasText” and “worldText” are the names I chose for these variables; you can use any other name as long as it contains allowed characters.

Fig. 7: Declaration of the variables to be used to modify the Text Mesh Pro text.
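Putting figures 6 and 7 together, the top of the script might look like this (the variable names follow the article; the figures are images, so the exact layout may differ):

using TMPro;
using UnityEngine;

public class ModifyTextMeshPro : MonoBehaviour
{
    // References to the Text Mesh Pro components to modify,
    // assigned by dragging the objects onto these fields in the inspector.
    public TMP_Text canvasText;
    public TMP_Text worldText;
}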

Initialization of variables (Assignment of references)

The initialization of this type of non-primitive variable is crucial: if we do not place in the variable the precise object we want to refer to, we will get a null reference exception.

There are many ways to initialize the variables; in this case I will do it in one of the simplest ways, which is by dragging the GameObjects that contain the Text components I want to modify onto the variable fields in the inspector.

Fig. 8: The appropriate GameObjects are dragged into the spaces in the inspector to initialize the variables.


The declared variable does not appear in the inspector

If a variable does not appear in the inspector it is usually because its visibility is private. This can be solved by declaring the variables as public, as shown in figure 7, adding the word “public”; they can also be declared as private but marked to be serialized by the inspector, as follows:

[SerializeField]
TMP_Text canvasText;

Or:

[SerializeField]
private TMP_Text canvasText;

Another reason why the variables may not appear in the inspector is when there are errors in the console and the changes made to the scripts are not applied. To solve this we have to fix all the errors in the console; once that is done Unity will compile and the new modifications will appear.

Code instructions for modifying Text Mesh Pro text via Script and tests

Once we have initialized the variables we can use them. In this case, to modify the text displayed by the Text Mesh Pro component, we must modify the variable “text” defined inside it; for this we use the dot operator, which gives us access to the public variables and functions defined inside an object.

Fig. 9.A: Writing a Text Mesh Pro text from a Script in Unity.
Fig. 9.B: Pressing play shows how the texts on the screen are modified.
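A minimal sketch of the instructions described above, continuing the script from figure 7 (the strings are arbitrary examples):

void Start()
{
    // The "text" variable holds the string that the component displays;
    // the dot operator accesses it through our references.
    canvasText.text = "Text written from the script";
    worldText.text = "World text written from the script";
}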

Extra: Change the color of a Text Mesh Pro text by code

Fig. 10.A: In lines 22 and 23 the color of the Text Mesh Pro texts is changed by code.
Fig. 10.B: When pressing play we can see how the colors of the texts on the screen change.
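As a hedged sketch, the equivalent of those two lines, added to the same Start method as before (the colors here are arbitrary choices, not necessarily those of the figure):

// The component's "color" variable works the same way as "text".
canvasText.color = Color.red;
worldText.color = Color.green;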


Introduction

In this article we are going to see how to know the state of any GameObject through code in Unity.

The GameObjects are the elements of the scene that are used to model everything that exists in our project in Unity. One of the basic properties of these GameObjects is their activation state; in the Inspector window it can be seen as a checkbox that, when checked, means the GameObject is ACTIVE, and when unchecked, the GameObject is INACTIVE.

All the IMPORTANT information is summarized in the following TUTORIAL FROM MY YOUTUBE CHANNEL


Procedure to know if a GameObject is ACTIVE or INACTIVE in the scene

Let’s assume that there is already a script created and assigned to a GameObject in Unity so that its instructions can be executed.

Step 1 – Define REFERENCE

To solve practically any problem in Unity we have to start with the variables that we are going to use. In this case we need the reference of the GameObject that we want to analyze; in other words, in our script we will define a global variable of type GameObject, for example in the following way:

public GameObject myGameObject;

Step 2 – Initialize REFERENCE

The variable we defined in the previous step will not automatically point to the GameObject whose ACTIVE or INACTIVE state we want to know; we have to make sure that happens.

There are several ways to initialize a variable; click here to see a playlist with several videos with methods and examples on this topic. In this case we are going to go to the inspector where the script is assigned and manually drag the GameObject we want to analyze onto the GameObject-type variable that appears in the inspector, in our case called “myGameObject”. Now that variable will point to the GameObject we are interested in.

Step 3 – How to READ the GameObject status

Having solved the previous two steps, we can now use the variable we defined to know whether the GameObject is ACTIVE or INACTIVE in the scene. For this we use the following instruction:

myGameObject.activeInHierarchy

The previous expression evaluates to a boolean value: if the GameObject that the variable “myGameObject” points to is active in the scene, the expression results in “true”, while if it is inactive it results in “false”. Therefore we can use this expression however we want, for example in an IF statement that prints one message to the console if the object is active and a different message if it is inactive:

if(myGameObject.activeInHierarchy){
    Debug.Log("The GameObject is ACTIVE");
}else{
    Debug.Log("The GameObject is INACTIVE");
}

Introduction

In this article we are going to see how to change the cursor image when the pointer hovers over a button in Unity. In addition, a download package is provided with the scripts, sprites and the scene with all the elements already configured.

Download the Unity Package

Here you can download the Unity Package with the example on how to change the cursor image in Unity.

In this video we see how to change the CURSOR IMAGE on HOVER in Unity.


Fig. 1: Scene from the Unity package, the cursor is on the screen and it’s an arrow.
Fig. 2: Scene from the Unity package, the cursor is over the button and now changes to a hand.

Default Cursor Settings in Unity

This is something that could be done in a Start function, but if our game or application will have a custom cursor, for example a custom arrow, it is convenient to define it in the project parameters. Going to Edit > Project Settings, the window shown in Figure 3 is displayed; there we can assign the default cursor sprite, as well as its Hotspot coordinates.

Fig. 3: Default cursor settings in Unity.

Determining the Hotspot of our customized cursor

The hotspot of the cursor is an offset vector measured from the upper left corner of the sprite, and it indicates the point of the image where the tip of the cursor is considered to be. The script that comes in the package has fields defined to assign the sprites of two cursors, as well as the vectors of their respective hotspots, as seen in the inspector in Figure 4.

To determine the hotspots you can enter play mode and try different values until the cursor matches; another way is to know exactly how many pixels, horizontally and vertically, the tip of the cursor is offset.

Fig. 4: Inspector of the Cursor Manager script included in the Unity package, Sprite assignment and Hotspot configuration.

How to change the cursor IMAGE in Unity

To change the image shown by the cursor, just execute the instructions shown in figure 5, lines 31 and 37: the SetCursor method of the Cursor class, passing as parameters the image you want to show, the Vector2 with the position of the hotspot and the cursor mode.

Fig. 5: Functions to change the cursor image in Unity.

When to change the cursor image in Unity

In this particular case we want to constantly show a default cursor until the pointer is positioned over a button; at that moment we want to show a different cursor to give feedback that the user can interact with that element. For this we need to detect exactly when these events occur, so I add to each button an “Event Trigger” component with the “Pointer Enter” and “Pointer Exit” events.

To both events we assign the GameObject that has the script with the functions we want to execute (these functions must be defined as public), and then using the drop-down menu we choose the function to execute for each event.

Fig. 6: Assignment of script functions to each event.
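As a hedged sketch of the approach (the field and function names here are illustrative, the script in the package may differ; note that Unity’s Cursor.SetCursor expects Texture2D assets, imported with the Texture Type set to Cursor):

using UnityEngine;

public class CursorManager : MonoBehaviour
{
    // Cursor images and their hotspots, assigned in the inspector.
    public Texture2D defaultCursor;
    public Texture2D hoverCursor;
    public Vector2 defaultHotspot;
    public Vector2 hoverHotspot;

    // Public, so they can be chosen in the Event Trigger drop-down menus.
    public void OnPointerEnterButton()
    {
        Cursor.SetCursor(hoverCursor, hoverHotspot, CursorMode.Auto);
    }

    public void OnPointerExitButton()
    {
        Cursor.SetCursor(defaultCursor, defaultHotspot, CursorMode.Auto);
    }
}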

Description

This solution consists of a scene with two buttons, one with the text “Play” and the other with “Exit”. The first button does nothing; the second one displays a confirmation dialog that asks if we are sure we want to exit, with two buttons, NO and YES. The NO button closes the confirmation dialog and returns to the menu; the YES button closes the application or game in Unity.
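A minimal sketch of such a script, under the assumption that the dialog is a GameObject that gets activated and deactivated (the names are illustrative; the script in the package may differ):

using UnityEngine;

public class ExitManager : MonoBehaviour
{
    public GameObject confirmationDialog; // the dialog with the YES and NO buttons

    // Hooked to the "Exit" button: show the confirmation dialog.
    public void ShowConfirmationDialog()
    {
        confirmationDialog.SetActive(true);
    }

    // Hooked to the "NO" button: hide the dialog and return to the menu.
    public void HideConfirmationDialog()
    {
        confirmationDialog.SetActive(false);
    }

    // Hooked to the "YES" button: close the application.
    // Application.Quit has no effect in the editor, only in builds.
    public void ExitGame()
    {
        Application.Quit();
    }
}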

Download Unity Package

You can download the Unity Package to import it in your own project.

In the following video I explain how to exit the game in Unity with confirmation dialog




Introduction

In this article we will see how to export an audio file in Reaper, specifically in MP3 and WAV format, which are two of the most popular formats; you will see that it is very simple to choose whatever format you need. For this it is important to be clear about the export format you need, as well as its sample rate and bit depth.

Before we move on, here’s a song we were commissioned to produce for an event; let us know what you think in the YouTube comments!

How to make a RENDER of the project in Reaper

With our Reaper project ready to export the first step is to go to File > Render or use the shortcut CTRL+ALT+R, as shown in Figure 1.

The “Render to file” window will be displayed, where we can configure all the export parameters. This window has everything we need, so let’s look at each section of the Reaper export window in detail.

Fig. 1: Exporting a project in Reaper.
Fig. 2: Reaper’s “Render to File” window.

Render entire project, separate tracks or selected time in Reaper

In the first part of the Reaper Render window we configure where we want to export our audio from. Inside “Source” the default option is “Master mix”, which refers to the output of the Master track in the DAW, that is, all the tracks that are not muted at that moment and are ultimately sending their signal to the Master.

Fig. 3: First part of the Reaper Render window.

We can export the audio tracks separately with the option “Selected tracks (stems)“, see the options listed in figure 4.

Fig. 4: Selecting which tracks of the project to export.

In “Bounds” we can configure which part of the timeline we want to export, whether it is just the first two bars, two minutes, the whole project, regions, etc.

Fig. 5: Selecting which time regions of the project to export.

Output section of the Reaper export window

In the “Output” section we write the name of the file(s) and choose the directory where they will be exported. The “Directory” field contains the folder where our audio files will be placed; we can change it by clicking on “Browse”. In “File name” we write the name for the files. Finally, the “Render to” field shows the final result of the configured parameters.

Fig. 6: Export directory and name of the final files.

Audio export options in Reaper

Within “Options” we can configure the “Sample rate”, how many channels we want, and other more advanced settings such as “Resample mode” or “Normalize/Limit”. In image 7 you can see that the “Sample rate” is set to 44100, which is the standard quality used on CDs. Depending on the project to be exported, these options must be configured correctly, since a wrong setting could spoil our export or lead to undesired results.

Fig. 7: Options section of the Reaper Render window.


In the “Primary output format” (POF) and “Secondary output format” (SOF) section we can choose two formats to export our project in Reaper, in this case our primary format will be WAV with a bit depth of 24 bits PCM.

Fig. 8: Choosing the primary project export format in Reaper.
Fig. 9: Bit depth for exporting project in Reaper in WAV format.


We click on “Secondary output format” (SOF), this option allows us to generate a second file with a different format in the same Render process of a Reaper project. For the secondary format we will choose MP3 (encoder by LAME project), as shown in figure 10.

Fig. 10: Selection of a second format for exporting the project in Reaper, MP3 format.

Start project export process in Reaper

Once all the Render parameters of the project are configured, we click on the “Render 2 files” button; it says two files in this case because we have chosen to export two different formats in the same process. Clicking this button starts the rendering process.

Fig. 11: Confirm project export in Reaper.

The window shown in Figure 12 appears, showing the rendering process of the project in Reaper. Once the process is finished, we will find our two files in the folder we chose. The final window shows the drawing of the audio waveform, where we can check for clipping; Reaper also provides this same information in the lower text box: Peak, Clip, RMS, LUFS, etc. Clicking on “Show in browser” takes us directly to the directory where the files are located.

Fig. 12: Audio export process in Reaper.

Tsunami

Plot

You are a worker dressed in a teddy bear suit who is advertising on the street when suddenly a tsunami is seen on the horizon and the city begins to be devastated. You fearfully seek shelter by entering a building, but this is only the beginning, you won’t be safe until you reach the top of the building.

How to play

Control the character with the WASD keys or the arrow keys and press SPACE to jump. Press E to pick up a box and move it. Collect the keys that unlock the doors to continue the ascent. Reach the roof.

Developed for the Ludum Dare 50 Game Jam with theme "Delay the inevitable" by:

MARTÍN HEINTZ

MUSIC COMPOSER

GAMEDEVTRAUM

UNITY DEVELOPER

STEFF

UI/UX DESIGN

ÁLVARO

GAME DESIGN

Introduction

In this article we are going to see how to connect animation bones to a 3D model so that they can deform it; this is useful for creating poses and animations using keyframes.

PREVIOUS ARTICLE: HOW TO CREATE THE ARMATURE THAT WE ARE GOING TO USE




Starting point – We already have an armature

We start with a 3D model and the Armature object we want to connect to it; you can check the previous article to see how to create the animation bones.

Fig. 1: We start from a 3D model together with an animation skeleton.
Fig. 2: The 3D model has subdivisions so that the animation bones can deform it.

Parent 3D model to animation bones

What we have to do now is link the 3D model to the Armature object; this is done by parenting them, and it is important to parent the 3D model to the animation bones, that is, the Armature object has to be the parent. Note that in figure 3 the objects have been selected incorrectly; the correct way to select them is as shown in figure 4, in which the active selection is the Armature object. For more details see the article on how to parent and un-parent objects in Blender.

Fig. 3: Both objects are selected, the active selection is the 3D model.
Fig. 4: Both objects are selected, the active selection is the animation skeleton.

In the parenting menu we select one of the options under “Armature Deform”; each one has different effects. In this case we are going to use the basic form, “With Empty Groups” (figure 5), in which vertex groups with the names of the bones are created within the 3D model.

Fig. 5: Window to parent objects in Blender.

Once the 3D model is linked to the animation bones, select the object and go to the window where the vertex groups are located by clicking on the icon shown in figure 7.

Fig. 6: Once the relationship has been established, the 3D model is selected.
Fig. 7: Vertex tab of the selected object.


In the “Vertex Groups” section we can see the vertex groups that were created by linking the model to the Armature with the “Empty Groups” option. At this point no vertices of the model have been assigned to these groups, so we are going to start assigning them.

Fig. 8: Vertex groups were created with the names of the animation bones.

Select the model and enter Edit Mode; notice in the “Vertex Groups” tab that the “Weight” field has appeared, which was not there before. The Weight value allows us to indicate what percentage of influence a particular bone will have on each vertex, where 1 indicates total control by the bone and 0 indicates that the bone has no control over the vertex.

Fig. 9: Enter the edit mode of the object related to the Armature.
Fig. 10: The vertex group window changes slightly in Edit Mode.

Before assigning the vertices to each group I am going to show the names of the bones; for that I select the Armature object, go to the Armature properties and, in the Viewport Display section, check the “Show Names” box as shown in figure 11.

Fig. 11: Activate this option to display the names of the animation bones in the Viewport.


Assigning vertices to bones

We are going to select the vertices at the top of the 3D model, those shown in Figure 12.

Fig. 12: A set of vertices is selected to link to an animation bone.

With the vertices selected we go to the “Vertex Groups” window and select the group to which the vertices will be assigned, in this case the group “Bone.001”, and then we press the “Assign” button shown in figure 14.

Fig. 13: Select the group corresponding to the bone to which you want to assign these vertices.
Fig. 14: The selected vertices are assigned to the selected group.


Test if the bone deforms the assigned vertices

Before assigning more vertices we can do a test to see if what we did worked; for that we select the Armature object and enter Pose Mode, as shown in figure 16.

Fig. 15: In object mode the animation skeleton is selected.
Fig. 16: Change to Pose mode to animate the bones.

In Pose Mode we select the bone and rotate it; if everything went well we should see that the bone deforms the vertices that were assigned to the group, as we can see in figure 17.

Fig. 17: When rotating a bone in Pose mode, the linked vertices rotate together.


Assigning the remaining vertices to the animation bones

Now we are going to select the vertices of the lower part of the model, which will be totally controlled by the bottom bone.

Fig. 18: The assignment process is repeated for the vertices below.

We go to the “Vertex Groups” window, select the corresponding group and click on the Assign button.

Fig. 19: Now the group corresponding to the lower bone is selected.
Fig. 20: The selected vertices are assigned to the selected group.


Make two animation bones control a group of vertices

For the remaining vertex loop, what we are going to do is let both bones control the vertices; for this we have to repeat the previous process but change the Weight parameter. We start by selecting the set of vertices; a quick way to select an Edge Loop is to hold ALT and click on one of the edges of the loop we want to select.

Fig. 21: The vertices in the middle that are at the same distance from both bones are selected.

Then go to the “Vertex Groups” window and change the Weight value to 0.5.

Fig. 22: Change the weight with which the vertices are assigned.
Fig. 23: The weight is set to half.

Then we select the first group and assign the selected vertices to it.

Fig. 24: The first group of vertices is selected.
Fig. 25: The selected vertices are assigned to the group with the previously selected weight.

Then we select the second group and also assign the selected vertices to that other group.

Fig. 26: The second group of vertices is selected.
Fig. 27: The selected vertices are assigned to the group with the previously selected weight.

Since the vertices were assigned to both groups, each with a weight of 0.5, we have established that each bone will have a 50% influence on these vertices. Figure 28 shows the result of all this.

Fig. 28: The vertices of the middle respond to both animation bones.

Another option would have been to assign the intermediate vertices weights of 0.25 and 0.75 in the corresponding groups.



Introduction

In this article we are going to see how to create an ARMATURE in Blender (animation bones) that can be used to animate a 3D model using keyframes. If you already know how to create bones in Blender and want to know how to connect them to a 3D model, check this other article instead.




Starting point

We start from the 3D model shown in figure 1: a cylinder to which subdivisions have been added, as seen in the article on adding and removing edge loops.

Fig. 1: We start with a 3D model of a cylinder with some subdivisions.

Creating the animation bones

In object mode we will place the 3D cursor over the origin of the object to which we want to add the skeleton.

Fig. 2: We make sure that the 3D cursor is over the origin of the model.

Press SHIFT+A and add the “Armature” object as shown in figure 3; with this we create the skeleton. It is likely that the animation bone is hidden, as in this case; to view it you can enter Wireframe mode as shown in figure 4.

Fig. 3: The animation skeleton is created.
Fig. 4: We enter Wireframe mode to see the Armature object that appeared inside the model.

To visualize the animation bones more comfortably, I will make them display in front of the 3D model, as shown in figure 5.

Fig. 5: We set up the Armature to be seen in front of the model.



Adding animation bones

We have created the Armature object; if we select it we can enter Edit Mode as shown in figure 6, which allows us to add new animation bones, subdivide them, rotate them, etc.

Fig. 6: Entering the Edit mode of the Armature object.
Fig. 7: The tip of an animation bone is selected.

To obtain the two animation bones seen in figure 8, what I do is select the tip of the first bone, as in figure 7, and then press the E key; the new bone that appears is analogous to the extrusion of a face. I place the tip of the second bone at the top of the 3D model, as shown in figure 9.

Fig. 8: A second animation bone appears when extruding.
Fig. 9: The second bone is positioned at one end of the 3D model.

Then I will take the tip of the first bone and place it approximately at the center of the 3D model as shown in Figure 10.

Fig. 10: The tip of the first bone is placed in the center of the 3D model.



Pose mode to animate bones

If the Armature object is selected, the “Pose Mode” option will appear when changing the working mode.

Fig. 11: With the Armature selected we go to Pose mode.

In Pose mode we can give the bones the rotation we need and make animations with keyframes and the timeline, the problem is that we have created the animation bones but we have not linked them to the 3D model nor have we established how those bones will deform our 3D model, as shown in figure 12, when we rotate the animation bone in pose mode, the 3D model remains unchanged.

Fig. 12: In Pose mode you can change the position, rotation and scale of the animation bones along with other properties.

PART 2: HOW TO CONNECT THIS ARMATURE TO A 3D MODEL


