Dashy block is a little sidescrolling mobile platformer game I made for when I'm on the
train. It is the first game I ever made for Android.
In Dashy Block you tap and hold the screen to jump; the longer you hold, the higher you jump.
You can also perform a dash once every 5 seconds by shaking the phone.
The goal is to collect coins, but each coin you pick up makes you go faster.
There are also obstacles that spawn every couple of platforms, which you should try to avoid.
Hitting an obstacle costs you a life; if you run out of lives, the game is over. If you fall
out of the map, the game is also over. This means you will sometimes have to choose between hitting
an obstacle and falling out of the map.
Tech
I made this little game in the Unity Engine.
From this project I learned how to build an infinite side-scrolling level, how to use physics to drive
the gameplay, and how to develop for other platforms in a game engine.
Date: September 2015
Formula Frosty
Group Project.
Overview
Duration:
8 weeks
Team:
~12 people
Role(s):
Generalist Programmer
Tech:
Unreal Engine, Perforce
Platform(s):
PC, Xbox
Trailer
The BUAS Game Marmalades
The BUAS Game Marmalades were described to me as "like a game jam, but chunkier".
All IGAD students were divided into groups of 3 people: a designer, an artist and a programmer.
We then had one week to come up with a concept and a first prototype to show to the panel of teachers.
The teachers then greenlit half of the projects and redlit the other half. The students from teams that
got redlit were merged into the greenlit teams. In total, we had about 6 work weeks to create a full game.
My first team's idea did not make it through the selection, and I was added to the Formula Frosty team.
The total size of the team was thirteen people, five of whom were programmers.
Concept
The pitched concept of Formula Frosty was a couch-party, last-man-standing racing game with a heavy emphasis
on the strategic use of player abilities. The top-down camera follows the player in first place; if your
kart goes off-screen, you explode. You get three lives; after that, you are out of the race.
My role
The first thing I worked on was the respawn mechanic. When a player dies, their kart respawns after
a couple of seconds just behind the player in first place. Before respawning, you see a "ghost"
of the kart at the position where you will respawn.
I was also responsible for hooking the UI elements up to their underlying functions, and I worked on an
intro cinematic and on getting the driving to feel right.
After that, most of the game's functionality was in place, but there were a LOT of bugs that needed fixing,
so I moved to full-time quality assurance.
Reception
The teaching team had urged us many times to get rid of the attacking abilities and make it a sudden-death
racing game. In the end we decided to simplify the abilities instead, and most teachers agreed that this was a good
compromise.
Students who played our game at the PlayDay all reacted very positively. People got very competitive,
which is exactly what we were trying to achieve.
Retrospective
This was my first group project, and I still had a lot to learn about working in a multi-disciplinary team.
Communication got worse as the team grew bigger. In the end we managed to make a fun little game,
and a lot was learned from the experience.
Date: February 2019
Breda University, Year 1 Block D
Plutocrat Renderer
Physically Based Renderer.
Overview
Duration:
3 months
Team:
Just me
Role(s):
Programmer
Tech:
Visual Studio, Git, RenderDoc
Platform(s):
PC, Linux
Goal
My goal was to write a graphics-API-agnostic, physically based renderer.
The application should have a graphical user interface through which the user
can create a scene by loading in models and textures. The application should be able
to switch graphics APIs on the fly.
Features implemented
Physically Based Rendering
Image Based Lighting
Graphics API Agnostic
Graphical User Interface
Multisample Anti-Aliasing
Features planned
Vulkan implementation
glTF file loading
Post-processing
Scene saving
Screenshots
What is Physically Based Rendering
According to Wikipedia: "Physically based rendering (PBR) is an approach in computer graphics that seeks to render graphics in a way that more accurately models the flow of light in the real world."
Well, that sounds great, but what does it actually entail?
In the real world, objects look shiny or matte because of the roughness of the object's surface.
Tiny imperfections in the surface of a material change the direction in which light is reflected,
so a smoother surface gives a clearer reflection.
The Fresnel effect
When light hits a surface, it is split in two. One part of it is reflected off of the surface;
the other part enters the surface. The light that enters is either absorbed, or bounces around and comes
back out at another location and in another direction.
How much light enters a surface and how much is reflected directly off of it depends on both the material of the surface
and the angle at which the light hits it. The shallower the angle, the more light is reflected directly off
of the surface. This is called the Fresnel effect. You can see this effect when looking at, for instance, water:
when standing in clear water and looking straight down you can see the bottom, but when looking further ahead you see
the reflection of the sky.
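Most PBR implementations approximate the Fresnel term with Schlick's approximation. Below is a rough, standalone sketch of it; this is not taken from the Plutocrat code, and the Vec3 type is just a stand-in.

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

// Schlick's approximation of the Fresnel term.
// f0 is the reflectance when looking straight at the surface (normal incidence);
// cosTheta is the cosine of the angle between the view direction and the surface normal.
Vec3 fresnelSchlick(float cosTheta, const Vec3& f0)
{
    float factor = std::pow(1.0f - std::clamp(cosTheta, 0.0f, 1.0f), 5.0f);
    return {
        f0.x + (1.0f - f0.x) * factor,
        f0.y + (1.0f - f0.y) * factor,
        f0.z + (1.0f - f0.z) * factor
    };
}

At a shallow angle cosTheta approaches zero and the result approaches 1, which is exactly the water example above: more and more of the light is reflected directly off of the surface.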
Metal vs Non-metal
Another factor is whether a material is a metal or not. In real life, metals absorb all light that enters their surface;
you can only see the light that is reflected off of the surface. This is called specular reflection. On non-metal surfaces,
light can penetrate the surface and bounce back out. This is called diffuse reflection. Only a small portion of the light
reflected by non-metals is specular.
Smooth plastic
Rough metal
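In shader code, this metal/non-metal distinction usually boils down to how the base color is split into a diffuse and a specular part. A hedged sketch of the common metallic workflow follows; the 4% dielectric reflectance is the usual convention, and the names are illustrative rather than taken from my renderer.

struct Vec3 { float x, y, z; };

Vec3 lerp(const Vec3& a, const Vec3& b, float t)
{
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t };
}

// Metallic workflow: dielectrics reflect roughly 4% of light specularly and use the
// albedo for diffuse; metals use the albedo as their specular color and have no diffuse.
void splitAlbedo(const Vec3& albedo, float metalness, Vec3& diffuseColor, Vec3& f0)
{
    const Vec3 dielectricF0 { 0.04f, 0.04f, 0.04f };
    f0 = lerp(dielectricF0, albedo, metalness);
    diffuseColor = lerp(albedo, Vec3{ 0.0f, 0.0f, 0.0f }, metalness);
}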
Lessons learned
If I were to write an API-agnostic renderer again, I would start with the DirectX or Vulkan implementation
first, rather than OpenGL. This is because DirectX and Vulkan have a lot of concepts and data structures that
OpenGL doesn't have. If you base your architecture on DirectX, you can easily emulate the missing structures
in OpenGL, but if those structures are not present in your architecture, it is a lot harder to add them later.
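To make that more concrete, this is roughly the kind of abstraction I mean; the interfaces and names below are made up for this example rather than taken from the Plutocrat code.

#include <cstdint>
#include <memory>

// DirectX 12 and Vulkan force you to model things like pipeline state objects and
// command lists explicitly. Under OpenGL these can be emulated, but they are hard
// to bolt on later if the architecture never had them.
struct PipelineStateDesc { /* shaders, blend/depth state, vertex layout, ... */ };

class ICommandList {
public:
    virtual ~ICommandList() = default;
    virtual void SetPipelineState(uint32_t pipelineId) = 0;
    virtual void DrawIndexed(uint32_t indexCount) = 0;
};

class IRenderDevice {
public:
    virtual ~IRenderDevice() = default;
    virtual uint32_t CreatePipelineState(const PipelineStateDesc& desc) = 0;
    virtual std::unique_ptr<ICommandList> CreateCommandList() = 0;
    virtual void Submit(ICommandList& commandList) = 0;
};

The rest of the application only ever talks to IRenderDevice, so an OpenGL, DirectX or Vulkan implementation can be swapped in without touching the scene or GUI code.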
Date: June 2020
C++ Raytracer
Solo School Assignment.
Overview
Duration:
8 weeks
Team:
Just me
Role(s):
Programmer
Tech:
Visual Studio, Perforce
Platform(s):
PC
The Assignment
The goal of our assignment was to make a CPU raytracer from scratch in C++ and make it
render as fast as possible while also keeping it as true to nature as possible. We were
allowed to use a library to create the window context, but had to write our own math libraries.
What is a raytracer
A raytracer, or ray tracing renderer, is a type of renderer that traces the path of light to
create realistic looking images. This method of rendering is a lot heavier on the hardware than
conventional rasterization rendering, but can produce more advanced reflections and refractions.
Recursion
A raytracer works by shooting a "ray" from the camera through every pixel.
When this ray hits something, that object's color is stored and two new rays are created:
one reflecting off of the surface of the object, and one going through the object for transparency.
When these two rays hit something, they in turn create two new rays, and so the cycle continues.
This process is called recursion. The cycle is broken when a ray doesn't hit anything, or when a certain
depth limit is reached.
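A heavily simplified sketch of that recursion follows; the types are placeholders and the scene query is stubbed, so this is not my actual raytracer code.

#include <optional>

struct Vec3 { float x, y, z; };
struct Ray { Vec3 origin, direction; };

struct Hit {
    Vec3 color;          // surface color at the hit point
    float reflectivity;  // weight of the reflected ray's contribution
    float transparency;  // weight of the ray continuing through the object
    Ray reflected;
    Ray refracted;
};

// Assumed to exist elsewhere: returns the closest intersection, if any.
std::optional<Hit> intersectScene(const Ray& ray);

Vec3 add(Vec3 a, Vec3 b)    { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
Vec3 scale(Vec3 v, float s) { return { v.x * s, v.y * s, v.z * s }; }

// Trace a ray into the scene; the recursion stops on a miss or at a fixed depth.
Vec3 trace(const Ray& ray, int depth)
{
    const Vec3 background { 0.0f, 0.0f, 0.0f };
    if (depth <= 0) return background;

    std::optional<Hit> hit = intersectScene(ray);
    if (!hit) return background;

    Vec3 color = hit->color;
    color = add(color, scale(trace(hit->reflected, depth - 1), hit->reflectivity));
    color = add(color, scale(trace(hit->refracted, depth - 1), hit->transparency));
    return color;
}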
Refraction
One thing a raytracer can do very well is simulate the refraction of light going through a transparent object.
In the image above, you can see a simulation of light going through a glass ball; as you can see, this gives a
magnifying-glass effect.
Bounding Volume Hierarchy
Testing whether a ray hits a single geometric shape is very fast on modern computers, but testing whether one ray hits any of
ten thousand geometric shapes starts taking a while. Now imagine you have to do this for every pixel on your screen:
on a typical monitor that means 1920 x 1080 x 10,000 intersection tests, or 20,736,000,000, at the very least.
To fix this issue, I used a Bounding Volume Hierarchy (BVH). This structure splits your scene into two volumes, then splits those volumes into
two, and so on, so you end up with a binary tree. When tracing a ray, you can then eliminate the volumes the ray doesn't pass through.
Without the BVH, tracing a ray is O(n), where n is the number of objects to check; with the BVH, this turns into O(log n).
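A rough sketch of how traversing such a tree skips whole branches; the node layout is simplified and the low-level intersection tests are assumed to exist elsewhere, so this is not my exact implementation.

#include <cstdint>
#include <vector>

struct AABB { /* min and max corners */ };
struct Ray  { /* origin and direction */ };

bool intersects(const Ray& ray, const AABB& box);          // slab test, assumed elsewhere
bool intersectShape(const Ray& ray, uint32_t shapeIndex);  // assumed elsewhere

struct BVHNode {
    AABB bounds;
    int left = -1, right = -1;       // child node indices, -1 when this is a leaf
    std::vector<uint32_t> shapes;    // only filled for leaf nodes
};

// If the ray misses a node's bounding box, the entire subtree below it is skipped,
// which is what brings the per-ray cost from O(n) down to roughly O(log n).
bool traverse(const std::vector<BVHNode>& nodes, int nodeIndex, const Ray& ray)
{
    const BVHNode& node = nodes[nodeIndex];
    if (!intersects(ray, node.bounds))
        return false;

    if (node.left == -1 && node.right == -1) {
        for (uint32_t shape : node.shapes)
            if (intersectShape(ray, shape))
                return true;
        return false;
    }
    return traverse(nodes, node.left, ray) || traverse(nodes, node.right, ray);
}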
Date: June 2019
Breda University, Year 1 Block A
Temple of Giants
Group Project.
Overview
Duration:
8 weeks
Team:
5 designers, 6 artists, 5 programmers
Role(s):
Gameplay Programmer
Tech:
Unreal Engine, Perforce, Jenkins
Platform(s):
PC
Trailer
Concept
Temple of Giants is a 3D puzzle platformer set in ancient Greece. The player is able
to create stone platforms, lift and shoot rocks, and push big stone objects to create
a path through the temple. At the end of the puzzle the player has to defeat Medusa by
activating all the crystals to destroy her.
The team
Our team was made up of five designers, six artists and five programmers.
My role
My role on the project was gameplay programmer, playtester and quality assurance. I worked on
traps, the pushable blocks, character movement and bug fixing.
Reception
The teaching team was very enthusiastic about the game throughout the development process.
Unfortunately, due to the coronavirus pandemic, we couldn't hold a PlayDay as we usually do, but the students who did
play the game were very positive about it.
Awards:
- Best Art Year 2
- Best Game Year 2 Teacher's Pick
- Best Game Year 2 Audience Pick
Play the game
Date: June 2020
Breda University, Year 2 Block D
The High Command
RTS Game.
Overview
Duration:
1 month
Team:
Just me
Role(s):
Designer, Programmer
Tech:
Unity Engine, Visual Studio, Blender
Platform(s):
PC, Mac, Linux
Concept
The High Command is a real-time strategy game in which you control a small squad of mixed special forces.
The most important aspect of the game is unit placement: the player must try to put their units in places
that provide both cover and a good line of sight.
The game features LAN and online multiplayer 1-versus-1 battles on three different maps.
In the prep phase, players choose which units they want. Each unit type has
its own advantages and disadvantages, such as range, run speed, rate of fire, cost, etc. When both
players are ready, they have to run their units to strategic positions and the battle begins.
Technical Challenges
Multiplayer
This is the first multiplayer game I ever made. Unity had just released their new multiplayer system,
which is what I used for this project. The multiplayer is peer-to-peer, with a master server for
the lobby.
Visibility system
The biggest challenges of this project were the visibility system and the cover system. For the
player to see an enemy unit, they must have a unit in range with line of sight. My first implementation of
this system raycast from each friendly unit to each enemy unit every frame to determine who sees whom.
While this worked fine, it was not a very optimized solution.
My first optimization was to check which
units are in range of each other first and only then raycast for line of sight. This was already a lot more efficient,
but if all units were in range it was still costly. The next optimization was to only update the
visibility when units move: whenever a unit moves, it makes a call to the visibility
system to update.
Looking back at this now, I realize that this made the system redo the visibility checks multiple
times per frame. If I had to write such a system again, I would instead make the visibility system itself check whether any
units have moved and update accordingly, once per frame.
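The project itself was written in C# for Unity, but the idea translates to any language. A small sketch of that once-per-frame approach (all names are illustrative, and the actual line-of-sight raycast is assumed to exist elsewhere):

#include <vector>

struct Unit {
    float x = 0, z = 0;
    float lastX = 0, lastZ = 0;
    bool visible = false;
};

// Assumed to exist: a raycast against the level geometry.
bool hasLineOfSight(const Unit& from, const Unit& to);

// Called once per frame instead of once per moved unit, so a frame in which
// ten units move still triggers only a single visibility pass.
void updateVisibility(std::vector<Unit>& friendlies, std::vector<Unit>& enemies, float sightRange)
{
    bool anyoneMoved = false;
    for (auto* group : { &friendlies, &enemies })
        for (Unit& u : *group)
            if (u.x != u.lastX || u.z != u.lastZ) {
                anyoneMoved = true;
                u.lastX = u.x;
                u.lastZ = u.z;
            }
    if (!anyoneMoved)
        return;

    for (Unit& enemy : enemies) {
        enemy.visible = false;
        for (const Unit& friendly : friendlies) {
            float dx = enemy.x - friendly.x, dz = enemy.z - friendly.z;
            // Cheap range check first; raycast only for units that are close enough.
            if (dx * dx + dz * dz <= sightRange * sightRange && hasLineOfSight(friendly, enemy)) {
                enemy.visible = true;
                break;
            }
        }
    }
}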
Cover
Because I didn't really know any vector math at the time, the best cover system I could come up with was to have trigger boxes
that, when a unit is inside them, influence its chance of getting hit. This also meant that with, say, a crate to
hide behind, a unit would also receive the cover bonus when being fired upon from the back. Therefore the system only
really worked for buildings.
Arid
Arid is a gritty exploration-survival experience that challenges players to survive in the most arid place in the world. Using your skills and adaptability, you must face the loneliness, the extreme temperatures, and the mysteries of the Atacama Desert.
My role
I was the main gameplay programmer on Arid. Our team's three other programmers helped with gameplay as well, but I was the only dedicated gameplay programmer.
As such, I was responsible for a lot of the big gameplay systems. The biggest systems I worked on were the attribute system, the objective system and the 3C's.
Attribute system
The attribute system was the underlying system for health, stamina, hunger, thirst, etc. Our designers needed to be able to work on these attributes (in Blueprint)
without having to write (or rewrite) any complex logic. We wanted to minimize the number of systems running on tick, so the whole system is event-based. Attributes had
to react to changes in value and to reaching their min and max values. What happened when a certain value was reached was, of course, different per attribute.
Attribute base properties
Attribute base events
I decided to write an AttributeBase class in C++ with the common properties of all attributes and six events exposed to Blueprints: OnBeginPlay to initialize some values, OnMaxValueReached
(and OnMinValueReached) called once when the maximum (or minimum) value is reached, OnMaxValueTick (and OnMinValueTick) called every tick while the attribute is at its maximum (or minimum) value,
and lastly OnTick for the attributes that really do need to run every tick (this one was rarely used).
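A trimmed-down sketch of what such a class can look like; the six event names come straight from the description above, but the base class, properties and macro choices are assumptions for this example rather than the actual Arid code.

// AttributeBase.h - minimal sketch of an event-driven attribute component.
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "AttributeBase.generated.h"

UCLASS(Blueprintable, ClassGroup = (Attributes), meta = (BlueprintSpawnableComponent))
class UAttributeBase : public UActorComponent
{
    GENERATED_BODY()

public:
    // Common properties shared by all attributes (names are illustrative).
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Attribute")
    float MinValue = 0.0f;

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Attribute")
    float MaxValue = 100.0f;

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Attribute")
    float CurrentValue = 100.0f;

    // The six events designers implement in Blueprint, so each attribute
    // (health, stamina, thirst, ...) can react differently without touching C++.
    UFUNCTION(BlueprintImplementableEvent, Category = "Attribute")
    void OnBeginPlay();

    UFUNCTION(BlueprintImplementableEvent, Category = "Attribute")
    void OnMaxValueReached();

    UFUNCTION(BlueprintImplementableEvent, Category = "Attribute")
    void OnMinValueReached();

    UFUNCTION(BlueprintImplementableEvent, Category = "Attribute")
    void OnMaxValueTick(float DeltaTime);

    UFUNCTION(BlueprintImplementableEvent, Category = "Attribute")
    void OnMinValueTick(float DeltaTime);

    UFUNCTION(BlueprintImplementableEvent, Category = "Attribute")
    void OnTick(float DeltaTime);
};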
Objective system
The objective system was made to guide the player. It started out as a very simple system where the player would always have a single objective, but grew to support
multiple subtasks per objective and waypoints on the screen to show the way. The whole system was written in a data-driven way, and the objectives were loaded
from a Google Sheet using a tool made by our tools programmer Tim van den Bosch.
Objectives can be completed by picking up items, walking into a certain area, getting your health to a certain value, or any number of other things. Because I wanted
to keep systems loosely coupled, I decided to implement the Publish-Subscribe pattern. In this pattern, any system can publish a message on
one or more channels of the Message Broker, and any system subscribed to such a channel will receive the message. What makes this pattern so useful is that
the publisher and the subscriber don't need to know about each other.
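The broker itself can stay very small. A condensed sketch of the pattern in plain C++ (the real project used Unreal types; the names here are illustrative):

#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

struct Message {
    std::string payload;   // in the real system this carried item ids, areas entered, etc.
};

class MessageBroker {
public:
    using Handler = std::function<void(const Message&)>;

    // A subscriber registers interest in a channel without knowing who publishes to it.
    void Subscribe(const std::string& channel, Handler handler)
    {
        subscribers[channel].push_back(std::move(handler));
    }

    // A publisher pushes a message without knowing who is listening.
    void Publish(const std::string& channel, const Message& message)
    {
        for (const Handler& handler : subscribers[channel])
            handler(message);
    }

private:
    std::unordered_map<std::string, std::vector<Handler>> subscribers;
};

int main()
{
    MessageBroker broker;
    // The objective system subscribes to pickup events...
    broker.Subscribe("ItemPickedUp", [](const Message& m) {
        std::cout << "Objective progress: picked up " << m.payload << "\n";
    });
    // ...and the inventory system publishes them, without knowing about objectives.
    broker.Publish("ItemPickedUp", { "Water Bottle" });
}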
3C's
The Three C's are Character, Controls and Camera. My job was to help our designers achieve the game feel they were looking for. This included slowing the player down on
ramps, slower sideways walking and even slower backwards walking, smooth crouching, and camera sway.
I also worked on the rope climbing mechanic. Getting the hands to line up with the rope and adding smooth transitions when dismounting the rope were the two most challenging
parts of this mechanic.
Retrospective
Arid was an awesome experiment that taught me a lot about long-term development, planning and teamwork. I had never before worked on a project of this scope and didn't
realize how much of a difference it would make.
Setting up work pipelines is incredibly important when working with a big team for a long time. We had pipelines for submitting code, requesting features, reporting bugs,
fixing bugs and more.
Releasing a game is exciting, but also a big responsibility. I always felt very guilty whenever a bug I introduced negatively impacted the experience of one of our players,
even though it was Early Access. Some bug fixes we made rendered old saves unusable, which was also very unfortunate.
I learned a lot from working on Arid and I am very proud of the result.
Age of Wonders 4
Rule a fantasy realm of your own design! Explore new magical realms in Age of Wonders’ signature blend of 4X strategy and turn-based tactical combat.
Control a faction that grows and changes as you expand your empire with each turn!
My role
I started at Triumph Studios as an intern gameplay programmer assigned to Age of Wonders 4 (then an unannounced title). After one year of internship I graduated from BUAS
and was hired as a full-time gameplay programmer. Some of the key features I have worked on are the victory conditions, sieges, audio implementations, and many modifiers and effects.
Victory conditions
Victory conditions is the term we use for every method of winning the game. There are standard conditions, like defeating every other player or having the highest score at the end of the game,
but there are also more complex victory conditions, like having your empire reach a certain size or building and defending the 'Beacons of Unity'.
Victory conditions overview
The victory conditions are set up in a data-driven and event-based way. We have a central victory condition manager that receives events from all over the game and
forwards them to each victory condition. The victory condition then updates each player's data accordingly and sends notifications to the players.
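I can't show the actual Age of Wonders code here, but the shape of the system is roughly this (a simplified sketch with made-up names):

#include <memory>
#include <vector>

struct GameEvent { /* e.g. player eliminated, structure completed, turn ended, ... */ };

// Each victory condition only knows how to interpret events and track its own progress.
class IVictoryCondition {
public:
    virtual ~IVictoryCondition() = default;
    virtual void OnGameEvent(const GameEvent& event) = 0;
    virtual bool HasPlayerWon(int playerIndex) const = 0;
};

// The manager is the single entry point: gameplay systems raise events here,
// and the manager forwards them to every active victory condition.
class VictoryConditionManager {
public:
    void Register(std::unique_ptr<IVictoryCondition> condition)
    {
        conditions.push_back(std::move(condition));
    }

    void RaiseEvent(const GameEvent& event)
    {
        for (auto& condition : conditions)
            condition->OnGameEvent(event);
    }

private:
    std::vector<std::unique_ptr<IVictoryCondition>> conditions;
};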
Sieges
When an army attacks a walled city, a siege begins. Each turn of the siege, some damage is done to the city's defenses. How many turns this process takes depends on the
upgrades built in the city and on the Siege Projects the attacker chooses. There are also Siege Breaker units that can be used to speed up the process. Siege Projects can also
add effects, such as demoralizing the defenders in combat.
Audio
When I joined the team, the game didn't have many sound effects. Units could play SFX through the animation system and we had a couple of audio events set up from previous titles,
but that was it. Triumph had also recently switched to NoesisGUI, for which we did not have an SFX implementation yet.
I came up with a solution to link SFX to NoesisGUI controls directly in the XAML file. All the UI designer had to do was add an "SFX.Click" or "SFX.ValueChange" tag and fill it in
with the appropriate SFX Tag.
After that I became something of a go-to audio programmer. I worked closely with Paradox's audio department to help bring our sound to the next level. I implemented parts of
FMOD's API, such as Snapshots and VCAs, and set up designer-friendly ways to link sound effects to in-game events.
Play on Steam
Date: August 2022 - Now
Triumph Studios
About
A little about myself
Ever since I was young, computers and video games have fascinated me. I started experimenting in GameMaker when
I was 13 years old and then moved on to Unity, which is how I learned C#.
I studied Media Technology at the GLR, a broader programme covering websites, applications and games, and then Game Architecture
at Breda University to specialize further in game programming.
I am a very inquisitive person, always looking to learn and improve.
My perfectionism is both my greatest strength and weakness.
If it's worth doing, it's worth doing well. I like working together to find the best possible solution.
I am currently working at Triumph Studios as a Gameplay Programmer on Age of Wonders 4!