Hello! I am reading Computer Science at the University of Southampton. I write code, design games, and occasionally tweet. Why not check out my CV, connect on LinkedIn, subscribe to my RSS feed, or send me an email?

# Colourful Consoles with Bash

Posted August 14, 2017. 228 words.

With bash it is trivially easy to produce nice, colourful console output with the code below. Simply paste it into the top of your script, and then you can colour your text by just printing the variables.
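The original snippet is not reproduced here; a minimal sketch using standard ANSI SGR escape codes, matching the variable names the post refers to, might look like this (the exact sequences and colour choices are my assumptions):

```shell
#!/usr/bin/env bash
# Text attributes
BOLD=$'\e[1m'
ITALIC=$'\e[3m'
UNDERLINE=$'\e[4m'
INVERT=$'\e[7m'
STRIKE=$'\e[9m'
CLEAR=$'\e[0m'   # clear all formatting

# Foreground colours
RED=$'\e[31m'; GREEN=$'\e[32m'; YELLOW=$'\e[33m'; BLUE=$'\e[34m'
# Background colours (B prefix)
BRED=$'\e[41m'; BGREEN=$'\e[42m'; BYELLOW=$'\e[43m'; BBLUE=$'\e[44m'

RESET=$'\ec'     # full terminal reset

# Horizontal rule the width of the terminal (fall back to 80 columns)
COLS=$(tput cols 2>/dev/null || echo 80)
RULE=$(printf '%*s' "$COLS" '' | tr ' ' '-')

echo "${BOLD}${YELLOW}${BRED}Critical Warning!${CLEAR}"
```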

For example, if you want bold yellow text with a red background, use echo "${BOLD}${YELLOW}${BRED}Critical Warning!${CLEAR}". Additionally, you can ${ITALIC}, ${UNDERLINE}, ${INVERT}, or ${STRIKE} text as you see fit. Once you are done with formatted text, use ${CLEAR} to clear all formatting. Lastly, use ${RESET} and ${RULE} to reset the screen and draw a horizontal rule. Vertical rules are left as an exercise for the reader.

# Visiting Paris

Posted July 15, 2017. 87 words.

Magnifique!

# How to play ███

Posted July 9, 2017. 208 words.

███ is a great game. ███ demands that you play it. ███ can be played with any deck of cards from any game, assuming it has suits and values. For example, ███ works well when played with Star Realms cards, and ███ could work well with Magic, or even scraps of paper.

### Rules

• Deal out a number of cards to each player, the number doesn’t really matter in ███.
• Players take turns placing down a card in the centre that is either of higher or lower value and the same suit, or the same value and any suit. This pleases ███.
• If a player cannot play they must pick up. They have failed ███.
• Shuffle the deck when it runs out. ███ must continue.
• When a player’s hand empties, it is their turn to think of a new simple, secret rule. Contradictions please ███.
• Do not explain this rule.
• Enforce the rule viciously.
• Do not mention the great ███.
• Do not criticise ███.
• Do not explain ███.
• Do not argue about ███.
• Do not make a mistake playing ███.
• Do not fail ███.

Any mistake while playing ███ requires punishment: that player must pick up an extra card.

# The Village Fete has Arrived

Posted June 25, 2017. 5 words.

# Third Year Project

Posted May 20, 2017. 40 words.

It’s done, it’s over! Months in the making, my dissertation is finished and available from lect.me. My advice for future students is to start early. Projects like these always take longer than you expect.

# Designing Games with Unity

Posted May 19, 2017. 481 words.

Having previously created games in my spare time and in competitions, I chose to team up with three different partners to create games focusing on gameplay, narrative experiences, and innovative technology using Unity. It was hard and took a lot of work, but in the end it was one of the most satisfying modules I ever took at university. Shout out to Rikki Prince, Dave Millard, and Tom for running such an excellent module.

#### Planet Deathmatch

A fast-paced, Quake-inspired, local multiplayer, little-planet deathmatch infinite arena shooter. Hone your skills, then compete against your friends to see who can dominate the playing field. Supports up to 4-player split-screen; bring an Xbox controller. A student game created at the University of Southampton by Matthew Consterdine and Ollie Steptoe.

Featuring a number of classic weapons:

• Shotgun: The short-to-medium-range wild card, capable of one-shotting your target, or missing entirely.
• Launcher: Fires explosive rockets, knocking back all the enemies in your way. Just be careful not to get caught in the blast.
• Pistol: Are your opponents not on fire? Well, that’s where the pistol comes in; it fires incendiary rounds, igniting targets.
• Axe: A visceral weapon that can end your opponent in a couple of hits.

Well, what are you waiting for? Play today!

#### Littlest Billy-Goat

A fully narrated re-telling of the fairy tale classic. Single player, play with a mouse/keyboard or Xbox 360 controller. A student game created at the University of Southampton by Matthew Consterdine and Jeff Tomband. Download and play.

#### Let it burn!

Using your flame-thrower, rack up points and burn the forest down. Single player, play with a mouse/keyboard or Xbox 360 controller. A student game created at the University of Southampton during the Southampton Code Dojo. Burn down everything!

#### Last the Night

Last The Night is a procedurally generated first person survival game in which the player fights for their life after having crash landed on a mysterious, unknown planet. Armed with only a pistol, the player must fight off the various monsters inhabiting the planet, and only once the sun rises will they be safe.

With seed based world generation, there are literally millions of planets to explore with no two being the same, and with the addition of Easy, Medium and Hard difficulties, advanced players can challenge themselves whilst beginners can get a feel for the game. Last The Night features 17 different types of monsters, keeping the player guessing at all times.

A student game created at the University of Southampton by Matthew Consterdine and Ed Baker. Do you think you’re brave enough to last the night?

# Machine Learning with MATLAB

Posted November 24, 2016. 4094 words.

I decided to investigate Machine Learning using MATLAB.

#### Posterior Probability

To compute the posterior probability, I started by defining the following two Gaussian distributions; they have different means and covariance matrices.

Using the definitions, I iterated over an N×N matrix, calculating the posterior probability of being in each class with the function mvnpdf(x, m, C). To display it I chose a mesh because, with a high enough resolution, a mesh lets you see the pattern in the plane and also looks visually interesting.
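The post’s MATLAB code is not shown; an equivalent sketch in Python with SciPy (the means, covariances, grid range, and equal class priors here are all assumptions, since the post does not list its definitions) looks like this:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Two example class-conditional Gaussians with different means and
# covariances (assumed values; the post's actual definitions are not given).
m1, C1 = np.array([0.0, 3.0]), np.array([[2.0, 1.0], [1.0, 2.0]])
m2, C2 = np.array([3.0, 2.5]), np.array([[1.0, 0.0], [0.0, 1.0]])

# Evaluate the posterior P(class 1 | x) over an N-by-N grid;
# multivariate_normal(...).pdf is the equivalent of MATLAB's mvnpdf(x, m, C).
N = 100
xs = np.linspace(-5.0, 8.0, N)
X, Y = np.meshgrid(xs, xs)
grid = np.dstack([X, Y])

p1 = multivariate_normal(m1, C1).pdf(grid)
p2 = multivariate_normal(m2, C2).pdf(grid)
posterior = p1 / (p1 + p2)  # equal priors assumed
```

Because the two covariance matrices differ, the 0.5 contour of this posterior is a quadratic curve, which matches the boundary described below.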

Finally, I plotted the mesh and rotated it to help visualize the class boundary. You can clearly see that the boundary is quadratic, with a sigmoidal gradient.

#### Classification using a Feedforward Neural Network

Next, I generated 200 samples from the definitions with the function mvnrnd(m, C, N), finally partitioning them in half into training and testing sets. With the first set, I trained a feedforward neural network with 10 hidden nodes; with the second, I tested the trained neural net and got the following errors:

• Normalized mean training error: $0.0074$
• Normalized mean testing error: $0.0121$

These values are both small, and, as expected, the testing error is marginally larger than the training error. This shows that the neural network has accurately classified the data.
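The same train/test workflow can be sketched in Python with scikit-learn standing in for MATLAB’s feedforward net (the sample counts and 10 hidden nodes come from the post; the distributions and every other parameter are assumptions):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Assumed class-conditional Gaussians; the post draws its 200 samples
# with mvnrnd(m, C, N) from its own definitions.
m1, C1 = [0.0, 3.0], [[2.0, 1.0], [1.0, 2.0]]
m2, C2 = [3.0, 2.5], [[1.0, 0.0], [0.0, 1.0]]

X = np.vstack([rng.multivariate_normal(m1, C1, 100),
               rng.multivariate_normal(m2, C2, 100)])
y = np.r_[np.zeros(100), np.ones(100)]

# Shuffle, then partition in half into training and testing sets.
idx = rng.permutation(200)
Xtr, ytr = X[idx[:100]], y[idx[:100]]
Xte, yte = X[idx[100:]], y[idx[100:]]

# Feedforward network with 10 hidden nodes, as in the post.
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                    random_state=0).fit(Xtr, ytr)

train_err = 1 - net.score(Xtr, ytr)
test_err = 1 - net.score(Xte, yte)
```

Note these are misclassification rates rather than the post’s normalized mean errors, so the numbers are not directly comparable.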

I compared the neural net contour (at 0.5) to both a linear and a quadratic Bayes’ optimal class boundary. It is remarkable how significantly better Bayes’ quadratic boundary is. I blame both the low sample size and the low number of hidden nodes. For comparison, I have also included Bayes’ linear boundary; it isn’t that bad, but it still pales in comparison to the quadratic boundary.

To visualize, I plotted the neural net probability mesh. It is interesting how noisy the mesh is, when compared to the Bayesian boundary.

Next, I increased the number of hidden nodes from 10 to 20, and then to 50. As I increased the number of nodes, I noticed that the boundary became more complex and the error rate increased. This is because the more nodes I added, the more I over-fitted the network. This shows that it’s incredibly important to choose the network size wisely; it’s easy to go too big!

After looking at the results, I would want to pick somewhere around 5-20 nodes for this problem. I might also train it for longer.

| | Training Error | Testing Error |
|---|---|---|
| 10 Nodes | $0.0074$ | $0.0121$ |
| 20 Nodes | $0.0140$ | $0.0181$ |
| 50 Nodes | $0.0153$ | $0.0206$ |

#### Mackey-Glass Predictions

I was set the task of first generating a number of samples from the Mackey-Glass chaotic time series, then using these to train a neural net and try to predict future values.

Mackey-Glass is calculated with the equation:
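The equation itself has not survived; in its standard form it is the delay differential equation below (the parameter values shown are the conventional defaults of the generator linked next, assumed rather than taken from the post):

$$\frac{dx}{dt} = \frac{\beta\, x(t-\tau)}{1 + x(t-\tau)^{n}} - \gamma\, x(t), \qquad \beta = 0.2,\ \gamma = 0.1,\ n = 10,\ \tau = 17$$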

For the samples, I visited the MathWorks File Exchange and downloaded a copy of Marco Cococcioni’s Mackey-Glass time series generator: https://mathworks.com/matlabcentral/fileexchange/24390

I took the code and adjusted it to generate $N=2000$ samples, changing the delta from 0.1 to 1. If I left the delta at 0.1, the neural network predicted what was essentially random noise between -5 and +5. I suspect this was because the network was not getting enough information about the curve; the values given were too similar. You can see how crazy the output is in the bottom graph.

Next, I split the samples into a training set of 1500 samples and a testing set of 500 samples. This was done with $p=20$. I created a linear predictor and a feedforward neural network to look at how accurate the predictions were one step ahead.

• Normalized mean linear error: $6.6926×10^{-4}$
• Normalized mean neural error: $4.6980×10^{-5}$

This shows that the neural network is already more accurate a single point ahead. If you continue, feeding predicted outputs back in, sustained oscillations are not only possible: the neural net accurately predicts values at least 1500 steps into the future.

In the second and third graphs, you can see the error growing very slowly; however, even at 3000, the error is only 0.138.

#### Financial Time Series Prediction

Using the FTSE index from finance.yahoo.com, I created a neural net predictor capable of predicting tomorrow’s FTSE index value from the last 20 days of data. To keep my model simple and avoid overfitting, I decided to use just the closing value, as the other columns wouldn’t really affect the predictions and would just serve to overcomplicate the model.
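The windowing step can be sketched in Python (a synthetic random-walk series stands in for the real FTSE closing prices, and a linear model stands in for the post’s neural net; both are assumptions for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def make_windows(series, p=20):
    """Turn a 1-D series into (last p values, next value) pairs."""
    X = np.array([series[i:i + p] for i in range(len(series) - p)])
    y = series[p:]
    return X, y

# Synthetic stand-in for the FTSE closing prices
# (the post's real data came from finance.yahoo.com).
rng = np.random.default_rng(1)
close = 6000 + np.cumsum(rng.normal(0, 20, 1000))

X, y = make_windows(close, p=20)
Xtr, ytr = X[:800], y[:800]
Xte, yte = X[800:], y[800:]

model = LinearRegression().fit(Xtr, ytr)
pred = model.predict(Xte)  # one-step-ahead predictions
```

Adding the closing volume, as the post does later, would simply mean appending a second column of 20 lagged volumes to each window.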

Feeding the last 20 days into the neural net produces relatively accurate predictions; however, on some days there is a significant difference. This is likely due to the limited amount of data and the simplicity of the model. It’s worth taking into account that the stock market is much more random and unpredictable than Mackey-Glass.

Next, I added the closing volume to the neural net inputs and plotted the predictions it made. Looking at the second graph, it’s making different predictions, which from a cursory glance look a little more in line.

However, I wasn’t sure, so I plotted them on the same axes, and... nothing really. It just looks a mess. Plotting the different errors again gives nothing but a similar noisy mess. Finally, I calculated the total area under each graph and got:

• Normalized close error: $9.1066×10^5$
• Normalized close+volume error: $9.1180×10^5$

This is nothing: a difference of 0.011×10^5 is nothing when you are sampling 1000 points. It works out to an average difference of 1.131, or 0.059%.

From this, I can conclude that the volume of trades has little to no effect on the closing price, at least as far as my neural network is concerned. All that really matters is the previous closing values.

Overall, there is certainly an opportunity to make money in the stock market; however, using the model above, I wouldn’t really want to make big bets. With better models and more data, you could produce more accurate predictions, but you still must contend with the randomness of the market.

I suggest further research before betting big.

# Hackard Dell Management Essay

Posted November 8, 2016. 1632 words.

As a preface, this essay was coursework for one of my less favourite modules. I figured I would include it here for completeness’ sake, if nothing else.

Hackard Dell, a fictional hyper-conglomerate, had been the market leader in desktops for well over a decade. However, with worldwide shipments on the decline for the last 7 months [1], to remain relevant a paradigm shift is required. Gone are the days of the desktop; it is time to embrace the new reality.

1. Gartner, “Gartner Says Worldwide PC Shipments Declined 5.2 Percent in Second Quarter of 2016,” 11 July 2016. Available: http://gartner.com/newsroom/id/3373617. [Accessed 3 December 2016].

# Oh Dear

Posted June 24, 2016. 0 words.

# Aqua, an imperative language for manipulating infinite streams

Posted April 28, 2016. 1503 words.

This is the user manual for the Aqua programming language created as part of Programming Languages and Concepts. Visit the project on Github.

Aqua is a C-like imperative language for manipulating infinite streams. Statements are (somewhat) optionally terminated with semicolons, and the language supports both block (/* ... */) and line (// ...) comments. Curly brackets are optionally used to extend scope. Example code can be found in the Appendices.

Before continuing, it’s helpful to familiarise yourself with Extended BNF. Special sequences are used to escape.