A few weeks ago I had the pleasure of attending Unity’s Developer Day 2019 in Brighton.
The decision to attend was a sudden one, as the opportunity presented itself just a few days prior to the event. Nevertheless, I was very excited to see what was in store, as I'd never attended such an event before. I arrived at the venue in Brighton just in time for the doors to open. I got my ID tag, a Unity t-shirt and a few little gifts. Then I was let into the conference room in which I'd spend the next seven hours.
Throughout the day there were 6 lectures and multiple opportunities to mingle with indie devs presenting their games. Here are some short summaries, along with my thoughts!
Profile your game with Intel
Intel has released a free set of tools that can arm you with what you need to optimise your game. The suite is called Graphics Performance Analyzers, and it can profile a build of your game, showing you whether it's CPU or GPU bound. If it's the latter, you're in luck, because the program can give you insight into exactly what might be eating precious milliseconds in your rendering pipeline. In the presentation, all of the tweaks were done in the Unity Editor and its settings, bringing the power of optimisation not only to coders, but to artists and designers too.
Explore mixed reality
A company called Magic Leap presented its fresh product – so fresh it's not yet available in Europe. They want to get in here badly though, and they are starved for developers making content for their device. The Magic Leap One headset offers a small window into mixed reality and has a couple of tricks up its sleeve. Spatial recording builds the geometry of your surroundings within a few seconds of turning it on, which your application can use to place interactive content around the room. It's also capable of finger tracking, gesture recognition and eye tracking. It comes with a small Discman-sized attachment that contains the battery and CPU. I was able to have a go at the device, and after a few minutes of playing virtual Angry Birds on a real table I came to the conclusion that nobody is going to pay $2200 to play games on their coffee table. Instead of games, I see much more potential for tools. AR-powered tools could be very useful for on-site jobs like interior design or maintenance. Anyway, the product itself is not consumer ready yet, but maybe that's your chance to be one of the first suppliers of software on this brand new platform.
Generate animations at runtime
New tools in Unity allow you to create animations at runtime, saving time and expanding your options. By setting up angles and constraints on bones, you can blend between animation clips or create new ones. A few examples were given, but two stuck with me the most. The first was making the main character's head look at interactive objects within his reach. Instead of intrusive UI, upon walking near an object on a table, the hero turned his head toward it with interest, rotating within its constraints while his body carried on with its locomotion. It was said you could also use runtime animations to create grabbing motions, without the need to prepare lots of differently angled animation clips beforehand. The second nice example was a skeletal creature resembling a snake, with its bones set to lag behind the head. This created the serpent-like movement of an oceanic monster. I really liked that, and could only imagine the amount of work it would take to create by hand. The new tool also comes with the ability to bake your creation into an animation clip, if you want to reuse it, or better yet – iterate on it with more runtime generation.
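The core of a constrained look-at like the head-turning example boils down to a small bit of maths: find the angle toward the target, clamp it to the joint's limits, and blend toward it a little each frame. Here's a minimal Python sketch of that idea – the function name, the 80-degree limit and the blend factor are my own illustrative choices, not anything from Unity's actual rigging API:

```python
import math

def clamped_look_at(head_yaw, char_pos, target_pos,
                    max_yaw=math.radians(80), blend=0.2):
    """Turn the head's yaw toward a target on the (x, z) plane,
    respecting a joint constraint and easing in gradually."""
    dx = target_pos[0] - char_pos[0]
    dz = target_pos[1] - char_pos[1]
    desired = math.atan2(dx, dz)                    # yaw needed to face the target
    desired = max(-max_yaw, min(max_yaw, desired))  # clamp to the bone's limit
    # Move only a fraction of the way each frame, so the head turns
    # smoothly instead of snapping; repeated calls converge on the target.
    return head_yaw + (desired - head_yaw) * blend
```

Calling this every frame while the body plays its normal locomotion clip gives you the "looking with interest" effect layered on top of existing animation.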
Scale multiplayer servers
Multiplay graced us with something that was more of an advertisement than a lecture, but I appreciated some tech insights anyway. These were the guys that took care of server scaling for the popular Apex Legends. Well, they must be doing something right, because Apex's "mic drop" launch resulted in 50 million players in the span of 24 days. Multiplay helped EA and Respawn run servers, with both bare metal and cloud options. One of the cool features they offer is zero-downtime patching. They spin up additional servers running the updated build and gradually move players who finish a match on the old patch onto the new-patch servers, keeping them in game while shutting down old-version servers that are no longer needed. Nice.
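The zero-downtime patching flow described above can be sketched as a tiny state machine: route new matches only to latest-version servers, and retire old-version servers once they've drained. This is a toy Python illustration of that idea – the class and method names are invented for the example, not anything from Multiplay's actual platform:

```python
class Server:
    def __init__(self, version):
        self.version = version
        self.active_matches = 0

class Fleet:
    """Toy sketch of zero-downtime patching: new matches go to the
    newest version; outdated servers shut down once they drain."""
    def __init__(self):
        self.servers = []
        self.latest = None

    def roll_out(self, version, count):
        # bring up fresh servers on the new patch alongside the old ones
        self.latest = version
        self.servers += [Server(version) for _ in range(count)]

    def allocate_match(self):
        # new matches are only ever placed on latest-version servers
        candidates = [s for s in self.servers if s.version == self.latest]
        server = min(candidates, key=lambda s: s.active_matches)
        server.active_matches += 1
        return server

    def match_finished(self, server):
        server.active_matches -= 1
        self.drain()

    def drain(self):
        # retire servers that run an outdated version and sit empty
        self.servers = [s for s in self.servers
                        if s.version == self.latest or s.active_matches > 0]
```

Players mid-match never get kicked: an old-version server only disappears after its last match ends, while every new match already lands on the new patch.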
Prototype rapidly
This was my favourite talk of the day. An InnoGames representative explained why prototyping is important and how it should be approached. The company focuses on fast development cycles of 4 days to 4 weeks, with the goal of having a user-testable build at the end. During that time they experiment a lot, without having a clear goal in mind. This helps them explore ideas before fully committing to them – they said that out of 6 prototypes, only 2 became products, but those products were successful. The speaker offered quite a few tips for making that process as fast as possible, with the main theme being "forget all the rules that you enforce in production code". Here, getting things done fast is more important than the code being optimised or well structured. You're writing throwaway code, and if the concept you're testing is worth pursuing, you're gonna start over and do it properly.
Build Tiny projects
The last lecture was about Project Tiny, a highly experimental way of delivering games of minimal size. Building with Tiny gives you a build size of 2-5 MB, and your whole game is embedded in a JavaScript string, which means you can run it almost anywhere. These games have unused parts of the engine stripped out, but you can still have things like in-app purchases, accounts, multiple levels et cetera. There are a few cons. The whole system is still in preview and not production ready. Your game must use the new Entity Component System (which is a very good thing to learn!). The whole workflow feels weird, like you're working around the Unity engine: you have to create a project within your project, it used TypeScript (C# support was introduced only a month ago), and it basically looks like an external tool that they really wanted to put inside of Unity. But hey, it brings you new opportunities, and learning ECS is something I've wanted to do anyway.
On the subject of games…
Before I finish up, I wanted to mention two of the games I had the pleasure of playing during the day. The first earns a mention for its uniqueness. Created by GamingGarrison, Midi the Cat is a platformer in which you control the character with a MIDI keyboard by playing musical notes. You need to time your key presses to the rhythm on screen and perform different actions, from a simple single key press to move the cat, to complex multiple key presses for things like climbing, sneaking or talking. Thankfully, the game aids you by displaying all of the possible moves and instructions on screen at all times. I definitely had fun finishing this little demo, even without any prior music experience.
The second game that pulled me in was NeuroSlicers, a cyberpunk RTS without direct control of units, which ticks all the right boxes for me. It has a focus on deck building, capturing terrain and making quick tactical decisions. I’ve played 3 matches and still wanted more. NeuroSlicers will probably hit virtual store shelves in about 18 months. There’s gonna be an alpha and a beta in the meantime, which I will try to get into.
That was it for my first Unity Developer Day. I had great fun playing and learning. Definitely going next year if I have the chance!