The title of this paper follows a classic structure: a technical main title followed by a descriptive subtitle, so everyone understands exactly what it does. The best part is that it 'just works', even on photos you would expect to break it. A texture artist can skip the tedious, time-consuming work of preparing tileable textures and focus on actually texturing.

To explain the technical part of the title: the 'obvious' solution to making tileable textures is to run a texture synthesis algorithm (in this case PatchMatch) with the constraints wrapping around the edges, so the output tiles. Sometimes this works, but more often than not it fails, because humans are really good at spotting the odd one out. This is where 'stationarization' comes in: a stationary detail is one repeated throughout a texture, while a non-stationary detail is the odd one out that humans notice. So this algorithm adjusts the texture synthesis to avoid including non-stationary details. And that just works!
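To make the wrap-around constraint concrete, here is a minimal sketch (my own illustration, not code from the paper): if every pixel lookup is taken modulo the output size, patches that cross an edge continue on the opposite side, so the seam gets synthesised like any other region.

```python
def wrap_get(img, x, y):
    """Read a pixel with toroidal wrap-around, so patches that overlap
    an edge continue seamlessly on the opposite side."""
    h, w = len(img), len(img[0])
    return img[y % h][x % w]

def patch(img, cx, cy, radius=1):
    """Extract a square patch centred at (cx, cy), with wrapping."""
    return [[wrap_get(img, cx + dx, cy + dy)
             for dx in range(-radius, radius + 1)]
            for dy in range(-radius, radius + 1)]

# A patch centred on the right edge of a 4x4 image wraps to column 0.
img = [[10 * y + x for x in range(4)] for y in range(4)]
p = patch(img, 3, 1, radius=1)
```

Comparing such patches during synthesis means the left/right and top/bottom borders are matched against each other, which is what makes the result tile.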

During production we didn't create a video, as Joep made this great online supplementary results navigator, so we decided to make up for it with a particularly crazy spotlight video:

(Extended edition. Original had to be 30 seconds long, so this adds back the stuff we had to drop, plus includes a title slide.)

Stuart came up with the initial storyboard, and Joep was our 'actor', whilst I did all the VFX and 3D graphics. I probably went too far, but they were my weekends, and I like a good dose of absurdity:-)

You may find everything on the project page, but here is the paper:

"Texture Stationarization: Turning Photos into Tileable Textures." by Joep Moritz, Stuart James, Tom S. F. Haines, Tobias Ritschel & Tim Weyrich. Computer Graphics Forum (Proc. Eurographics), 36(2), 2017 (if 75 megabytes is too much there is also a low resolution version)

You can also download the supplementary material, though I would strongly recommend browsing it online instead.
Five Years of 3Dami
As tradition dictates, here is my annual 3Dami post, about three months late. However, the lateness is advantageous, as there are now three things worth mentioning:

1. This year's event was smaller than last year's - we could not get the funding, unfortunately. We still ran three teams at UCL, and whilst the budget was tight the event ran the same way it usually does. Further improvements were made to the process of running the event - we may not have mastered funding, but I think it's safe to say that after five years we have got pretty good at making films with college students. This was also the first year we went all out on Cycles - after last year's experiment using the CS cluster we had the confidence to dial everything up to 11, with one team rendering frames that took as long as four hours. The final films are available on the 3Dami website.

2. I, alongside Peter and Monique, gave another talk on education at the Blender Conference 2016, this time on teaching younger students, or pre-3Dami students as I like to think of them;-) Here it is:

3. I finally got around to moving the custom render farm and asset manager we use at 3Dami from a zip file on the 3Dami website into a proper repository on GitHub: Render Farm Asset Manager. It's a little unusual, but hopefully others may start using it, and even contribute new code back to it!
A few days before the disaster that was the brexit referendum, in frustration at the narrative in the media and on twitter, I released a short - a Monty Python-lite stab at two of the politicians involved. It's pretty rough, but then I made the whole thing in a single week, alongside a (more than) full time job:

Below is a description of how I made this short, plus various ramblings. In case anyone is wondering: yes, my experience of running 3Dami proved invaluable - it may exist for teaching students, but it's also one hell of a lesson in making animated films quickly for the teachers, particularly when you're responsible for finding and eliminating bottlenecks from the process. In fact, the below is inspired by the extensive notes I write during each 3Dami, which I use to identify how to improve the next year's event - this is me doing the same for my own process. The below is extended from notes made whilst making the film, so it should be accurate. I actually wrote this post a while back, a few days after the referendum, but delayed publication as I did not want it to be the first thing on my blog during SIGGRAPH, to make life easier for anyone coming to my website to find the handwriting project.

Friday 10th June:
I had been meaning to try a 'digital cut out' approach for a while (using Blender, of course), and had nothing in particular planned for that evening, so decided to make a character. Cut-out to me means Monty Python and South Park - selecting David Cameron was an obvious choice, as he is a politician richly deserving of being squished by an oversized foot, or of uttering a catchphrase such as 'hunt the poor' before murdering Kenny (pre-brexit vote - now any of the above would feel too kind). Plus there are plenty of photos of him to be stolen from the internet.

I actually created a front view of him - only one head and one hand type, plus a suit I drew. This proved to me that the idea would work, though hand drawn didn't work for the suit, even if it meant that the tie could be easily animated. Not that I rigged it - in fact none of the work I did on Friday made it into the film. Its value was that it got me thinking: with the referendum coming up and the increasing dominance of the racism-driven brexit message, I wanted to add my voice to the chorus.

The need for Boris was immediately obvious - he and Cameron were past allies, now at each other's throats. Boris pulling off a murder-suicide also made sense - as an analogy for what Boris is doing to their careers it was a perfect fit. This got mulled over for a while, but it just wasn't funny. Then the idea of having Boris as a pig, being ridden by Cameron, popped into my head. This occurred late in the evening whilst I was reading Sandman (which I had started that evening, so right at the beginning!). With this idea it all fit together - I stopped reading Sandman, went straight to my computer and blocked out the entire film in 3D, with ellipsoids to represent the two characters. Then I went to sleep - I was hosting a party the next day.

For completeness, the original script included them hanging off the edge of the cliff due to Cameron trying a last minute save. With Boris holding on with his mouth. In Cameron's groin region. It was actually a solution to my desire for Cameron to hit the ground before Boris - a severed penis blood rocket driving him into the ground would at least visually explain why he hit first. The intention was to have a literal crown of money, worn by Cameron and then by Boris (bounced onto his head when Cameron hit), with Boris happy, if only for the second before he went splat. Dropped because it distracted from the whole point, was a little crude, and because I don't think normal people care about the rules of physics as much as I do.

Sunday 12th June:
Nothing happened until the afternoon. The party had eaten Saturday, and was a total washout - I was rather mopey that morning. Plus I was feeling ill after eating most of the leftover cheese. But after lunch (the last of the cheese, despite already feeling ill. Can't help myself.) I sat down and made Cameron from the side - had a break for supper but went all the way to midnight. Cut out 9 different hands and 13 different heads, most of which can't be seen in the short, plus a suit to use. This was all done using the masking tool in Blender - the best tool I know for cutting out part of an image, even though it's meant for video! Then I used Krita to clone/airbrush gaps created by layering, most extensively behind the arms of the suit. The 'Images as Planes' plugin pulled all of the parts back into Blender (all saved as 16-bit PNGs with alpha, at 1K-4K resolutions. Sod knows why.), and then I rigged it.

Rigging took a bit of experimentation - rigging may be my strong suit, but 2D was new to me. I ultimately made the rigs IK only, and only used skinning on the limbs; the body, head, feet and hands are just parented to bones. Didn't do any weight painting - automatic was good enough, but I had to switch deform off for all bones except the two for the current limb, as the thing is layered with about a cm between each layer, which would otherwise confuse the automatic weight painting. Probably the most useful thing I learned is using mesh shapes for the bones, with left/right indicated by letters (that don't overlap), so you can actually tell which one you have selected/select the one you want. It's impossible to tell otherwise, and hence immensely frustrating to use.

As an additional note, creating a switching texture in Cycles is much harder than it ought to be - the node group for Cameron's head is nightmare inducing (obviously it is driven via a bone for animation), and full of sine curves and greater-than nodes to extract the binary digits from a continuous input.
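For the curious, the maths such a node group has to encode is simple when written out directly - a Python reconstruction of the idea (mine, not the actual node tree): selecting head n means switching on the binary digits of n, and each bit has to be built from periodic functions plus a greater-than threshold.

```python
import math

def head_bit(n, k):
    """Bit k of head index n, built the way a node graph has to: a
    cosine gives (-1)^m for integer m, so (1 - cos(pi*m)) / 2 is
    m mod 2, and a greater-than threshold hardens it to 0/1."""
    m = math.floor(n / 2 ** k)                   # shift right by k digits
    soft = (1.0 - math.cos(math.pi * m)) / 2.0   # m mod 2, for integer m
    return 1 if soft > 0.5 else 0

# Head 13 = 0b1101, so the four mix switches receive bits 1, 0, 1, 1.
bits = [head_bit(13, k) for k in range(4)]
```

Each bit then drives a binary mix between two sub-trees of head textures, which is why the node group balloons so quickly.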

Monday 13th June:
Had one of those 'finding out about an 11pm deadline at 5pm' type days at university - thank you SIGGRAPH for hiding that fact and failing to send out promised reminder emails. By the time I got back I had no energy and needed sleep, but went and found images of heads for Boris, which ultimately got filtered down to three, as I had a very good idea of what was required for him: specifically, neutral, happy and confused. Apparently 'kissing a lizard' and 'confused' look the same when done by Boris. With a burst of energy I cut them out and airbrushed out the lips of a lizard that overlapped Boris's lower lip. Making this film may have actually been more surreal than watching it.

Tuesday 14th June:
Finished Boris. Finding a good pig photographed from the side at a high enough resolution proved hard, until I found out that breeders post such photos, often raw from the camera. Cut the pig out using Blender, then filled in gaps (behind the legs) and airbrushed in some alpha blending in Krita. Rigging was trivial - got to bed at a reasonable time as the whole thing only took three hours. In hindsight I really should have tinted the pig to be pinker, and removed its expanded buttocks or used a female pig - I'm fairly sure that people only know it's a pig because the title tells them it is.

Wednesday 15th June:
Until this point the entire enterprise had been half-serious, but now I had both characters it was time to go all in with some planning. With the referendum a week and a day away, and a desire to get the film out in time for people to see it before voting, I set a deadline of Sunday, so I could share it first thing Monday. With only evenings and a weekend, time was extremely tight - I knew I would have to be ruthless, giving everything a fixed amount of time and then using whatever I had once the time elapsed. The plan was simple - set today, then Thursday, Friday and Saturday for animation, with rendering starting as soon as each shot was done. Sunday to finish rendering and do audio/credits.

I got home a little early, so had about 4 hours to work on the set, after subtracting time for supper. Still, it had to be simple, so I started by setting up the simple sky that is built into Blender. I then used a dot product to define a small circle of the sky to be the sun, and manually aligned its angle with that of the built-in sky - I really wish there was a (sensible) way to get/set the angle of the sky texture from the normal node. I kept the falloff of the dot product, but scaled the part above the threshold back up to 1, with clamping to get rid of the negative values; this creates the correct falloff of the real sun. Then I multiplied it with a black body node set to 5800K, so it was the correct colour, and scaled it to a value of 800, to get a sharp midday sun when added to the built-in sky. It's a simple technique, but surprisingly effective - it looks like a real cloudless day, and casts good sharp shadows that fade out realistically and have a blue tint. In hindsight maybe 500-700 would have been better, however - the shadows were a little too sharp.
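Put as code, the sun term works out roughly as below (a sketch of my reading of the node setup - the threshold and the RGB approximation of the 5800K black body colour are illustrative, not the exact node values):

```python
def sun_strength(d, threshold=0.999,
                 blackbody_rgb=(1.0, 0.84, 0.67), power=800.0):
    """Sun contribution for a sky direction whose dot product with the
    sun direction is d. The falloff above the threshold is kept but
    rescaled to peak at 1, negatives are clamped off, then the result
    is tinted by an approximate 5800K black body colour and scaled."""
    mask = max(0.0, (d - threshold) / (1.0 - threshold))
    return tuple(power * mask * c for c in blackbody_rgb)
```

Adding this on top of the built-in sky gives a tiny, very bright disc with a natural edge falloff, which is what produces the sharp shadows.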

The terrain has two parts: the running track/cliff/splatting area (the last never shown), and the background. For the main area I started with three polygons - a 200m x 50m section for the running bit, a 100m x 50m drop for the cliff, then a 100m x 50m stretch for splatting. I then grabbed my tablet, went into sculpt mode with dyntopo on and just smashed it, mostly with clay strips, until I had a suitable shape. I also used the grab tool to bend it around on the side facing the camera, so you wouldn't see the edge of the set. Then detail was added - I switched off dyntopo and added the multires modifier, before working it until it ended up just south of 1 million polygons.

For the background I happened to have a 2m resolution height map of all of England from the Environment Agency floating around, so I just chopped out a 3km x 3km square and used it. Everything was made to scale btw - makes life so much easier. Had to remember to set the maximum clipping distance of the camera to 5km in every file though! The square was from the top of the Peak District - apparently you run away from the EU towards the Atlantic, whilst just south of the Scottish border. Makes a kind of sense I guess:-P
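The crop itself is nothing more than index arithmetic - a sketch with made-up offsets (the real data being the Environment Agency height map):

```python
def crop_heightmap(dem, origin_px, size_m, resolution_m=2.0):
    """Cut a square of side size_m metres out of a height map stored
    as a list of rows, starting at pixel offset origin_px = (row, col)."""
    n = int(size_m / resolution_m)   # 3000 m at 2 m/pixel = 1500 pixels
    r0, c0 = origin_px
    return [row[c0:c0 + n] for row in dem[r0:r0 + n]]

# A 3 km x 3 km square at 2 m resolution is 1500 x 1500 samples.
dem = [[0.0] * 2000 for _ in range(2000)]
square = crop_heightmap(dem, (100, 200), 3000.0)
```

Keeping the data in metres throughout is what makes the 'everything to scale' approach painless.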

The material of the background is a simple mix of diffuse/glossy. However, the colour fades towards the horizon colour depending on the distance the ray has travelled, as a cheap mist effect. Additionally, a voronoi texture was fed into a constant colour ramp to give some variability to the colour - not sure this really worked, but I was out of time, so it ended up at its current blandness. I shouldn't have included the glossy - the idea was to grab the colour of the sky to help with the mist, but with the sun so sharp it added a lot of noise. Renders would have been cleaner had I stuck to diffuse only.
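The mist fade amounts to a single mix driven by ray length - a sketch of the idea, with an illustrative depth constant rather than the value from my node tree:

```python
def mist_colour(base, horizon, ray_length, mist_depth=3000.0):
    """Blend the surface colour towards the horizon colour as the ray
    length grows - a cheap stand-in for atmospheric scattering."""
    t = min(1.0, max(0.0, ray_length / mist_depth))
    return tuple(b + t * (h - b) for b, h in zip(base, horizon))
```

Anything at or beyond the mist depth simply takes on the horizon colour, which is why distant terrain melts into the sky for free.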

The cliff material is rather more complicated, and far beyond what it makes sense to describe here. A complex set of procedural textures was used to add fine shape detail (bump mapping), and also to drive colour/specularity variation. Additionally, the material blends to a cliff colour based on the z component of the surface normal, run through a colour curve to control where the transition is and how sharp it is. These are all techniques I have a lot of experience with, so I dialled it in really quickly. Used my favourite trick of feeding a blend of a noise texture's colour and the object coordinates into a voronoi cells texture. The problem is that render speed took quite a hit from evaluating all of those procedural nodes.
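That favourite trick can be sketched in a few lines - a toy reconstruction of the idea, not Blender's actual voronoi node: perturbing the lookup coordinates with noise before the nearest-feature-point search breaks up the straight cell edges into something far more organic.

```python
import math

def _hash2(ix, iy):
    """Deterministic pseudo-random value in [0, 1) per integer cell -
    a stand-in for the hashing inside a voronoi texture node."""
    h = math.sin(ix * 127.1 + iy * 311.7) * 43758.5453
    return h - math.floor(h)

def voronoi_cell(x, y):
    """Return the id of the nearest feature point - a toy 'voronoi
    cells' texture with one jittered feature point per grid cell."""
    ix, iy = math.floor(x), math.floor(y)
    best, best_d = None, float('inf')
    for ox in (-1, 0, 1):
        for oy in (-1, 0, 1):
            cx, cy = ix + ox, iy + oy
            fx = cx + _hash2(cx, cy)   # jittered feature point
            fy = cy + _hash2(cy, cx)
            d = (x - fx) ** 2 + (y - fy) ** 2
            if d < best_d:
                best_d, best = d, (cx, cy)
    return best

def distorted_cell(x, y, noise, strength=0.5):
    """The trick from the text: perturb the coordinates with a noise
    value before the voronoi lookup, distorting the cell boundaries."""
    n = noise(x, y)
    return voronoi_cell(x + strength * n, y + strength * n)
```

Every extra layer of this kind of node work is evaluated per sample, which is where the render-time hit comes from.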

There are two particle systems, one distributing grass, the other rocks. In both cases location was controlled with weight painting - it was always my intention to show a transition from lush to dead when going past the 'Leave EU' sign, so grass fades out at the sign, and rocks fade in. Most of the rocks are in the unseen splatting area. I stole the rocks from another project - they were designed to be much further away, in a desert, so they are a bit out of place, but meh. The grass was modelled super quickly - simple geometry, with 9 variants (same as for the rocks), and then a procedural material that makes them stripy and adds colour variation between individual strands. Being me, I used PBR techniques, transparency and translucency. They utterly murdered my render time - I should have used something simpler, as the ability to render each shot more than once would have been nice.

Only needed one prop - the 'Leave EU' sign. I think it was my biggest mistake in this project - clean, neat, and with a futuristic material on the text. It should have been as rotten as their racism, falling to bits and propped up desperately by Farage. The wood is a horrendously complex procedural from my mannequin teaching characters. The text looks futuristic because of the inclusion of a retro-reflective term (like real road signs) to make it brighter - that is what causes those sexy red highlights in the close up.

Thursday 16th June:
I had tweeted a render of the set (with jumpers!), and got a complaint about it lacking clouds, so first thing that evening I added just one. Somewhere between taking the piss/being lazy/not having the time to mess around:-P Take one cube with a subdivision modifier dialled to 11, then a displace modifier with the cloud texture, using gradient weight painting to make it bumpier on the top/flatter on the bottom. Simple Cycles scatter material with one really powerful light, rendered to an image with transparency and loaded into the set using the 'Images as Planes' plugin with emission. It's actually positioned 3km away from the main area of the set, much like a real cloud would be - there would be parallax when they are falling off the cliff, except, being the most distant object, you don't see it. Much like you don't see cloud parallax when jumping off a cliff in real life. I also quickly added a dead tree (sapling plugin) near the sign, with a procedural texture. Really should have been more dead.

I was flagging a bit after the above, but animated the first two shots - it was actually one animation used for both. The walk cycle for the pig took very little playing around to get right - after watching a few YouTube videos it was trivial. It's not like I was after realism, and the visual style meant I could get away with absolutely no polish! A quick pass over Cameron added a little bit of swaying backwards and forwards and some slipping to his hand/foot positions. I just did 10 seconds of walking (no NLA - the surface is not flat, so each step had to be hand tuned), then created two separate shot files that linked this scene and set it as background. The camera in one file tracked Cameron, hinting at the existence of Boris; the other provided the visual punchline and tracked them both. I made the decision to stick to an 80mm F1.2 camera throughout when creating these shots (everything was to scale, so using a real lens made sense - even if you would need a stack of ND filters to shoot with that lens wide open in that much light!), and spent some time playing with render settings - settled on 200 samples as the lowest I was willing to go, even though my maths said this was a bad idea if I wanted to be done on Sunday. I guessed that some of the later shots would be quicker to render as they wouldn't have the grass, which proved to be right, but I was concerned for a while. Unusually, I tightened up the width of the point spread function, to Blackman-Harris with a width of 1, to keep the character images a little sharper. It's subtle, but I think it helped sell the cut out style.

Friday 17th June:
I set the first two shots rendering on my two home computers (laptop and desktop), before leaving for uni. It may sound like overkill, but I used exr files. Even for silly little things such as this I would argue they are the only real choice, especially as with the Pixar compression options and just colour they end up smaller than png files, but with a greater dynamic range. I of course added a bunch of extra channels to give myself the freedom to make tweaks in comp, though ultimately didn't use them. Both shots were done when I got home.

I got back late, so had little time to work on it before bed. I composited the first two shots - just a filter to clean up some of the rendering noise, glare, a vignette and a bit of grading. Then I worked on the pig gallop, as I was worried about how it would work, and the transition from trotting in particular. It took me to bedtime, but I ended up with something that I felt was appropriate. Utterly ridiculous, of course - I imagine if a biologist were to do the calculations they would conclude that a pig that size throwing itself around like that would die of heatstroke. But Boris feels like a creature that lives for suicidal absurdities. The last thing I did was spin up an Amazon Web Services instance and set the rearing up shot rendering - a safety move to give myself a bit of breathing space.

Saturday 18th June:
Saturday was all about the animation - I just hammered my way through, but then it wasn't that hard, as I now had the walk, run and transition all worked out. This was good, as I suck at animation, and usually rely on reference - hard to find in this case! Left my laptop rendering all day whilst using my desktop, though it's rather old and useless these days, so it didn't get much done. Had my desktop rendering in the background as well, with the nice value set to 20 so I could continue to work. Also comped everything. I changed the grading from the point of them running past the 'Leave EU' sign to something more depressing (warm tint to cool tint), and then had to fade the grade with a mix node between the two grading paths for the shot where they spot the sign. Should have also done it for the shot of them running past.

The assassination of Jo Cox occurred on the Thursday. I had seen the news on twitter in the afternoon, and was on a crowded Thameslink train home when her death was reported. The crowd might as well have not been there at that moment, as it was when I realised how far off the rails the brexit campaign had gone. I remember thinking that, regardless of the result, the country had been broken - that the hatred had taken over. When I regained my focus I was looking at the other people in the train, wondering who else had been infected by that hate. At that moment they may as well have all been aliens in people suits. I also remember trying to remember the last time a politician in this country had been assassinated. I couldn't (Google tells me I was 7 the last time it happened, and that it has only happened 8 times in over a century).

By Saturday I was feeling uncomfortable about showing the murder-suicide of a pair of MPs, however despicable they may be. Even posted to Facebook asking people's opinions (they were leaning towards not showing it). The ending got cut. I had got as far as the impact and the blood spray, but not the larger chunks and resulting mess. Going on the internet to find material for that would not have been a pleasant experience, and truth be told that may have factored into my decision. But before Thursday I was definitely going to show that splat. Given more time I would have done some kind of comedic non-death - swallowed by an Angela Merkel whale and squeezed through her gut covered in proto-shit, for instance. But time was not on my side, so a fade to black was the only move.

Sunday 19th June:
All I did on Sunday was render and edit. Mostly render. Included going on a long walk to break up the waiting, and reading some more Sandman. I can't claim to really know how to edit, and certainly not for comedic timing, so that was just me throwing what I had together and adding a little audio. I have a phobia about using music on YouTube, so did not have any (with CC music you have a 50:50 chance of the video being marked as violating copyright - YouTube, like most big media companies, does not give a shit about content creators - and I have the musical ability of bubble wrap, so creating my own was not an option). Grabbed some CC0 audio of pig grunts, birds chirping, and wind rushing by. The only new content I created was the credit roll - it took a lot of editing to get the words of the 'why I cut the ending' bit right, but nothing technically interesting. Uploaded to YouTube at the end of the day, but did not share the link - I left it there ready for Monday morning.

Monday 20th June:
It was already up on YouTube, so all I did was tweet it at 9am, in the vague hope of maximum impact. A few friends watched it, some shared it, but unsurprisingly it did not go very far. I then shared it on twitter each day in the run-up, at a different time, with various hashtags. I did get someone from the brexit camp retweeting it and insulting me, and someone sent a fairly rude private message that I just deleted. Guess I am proud that at least two brexit people watched it, though it's clear that it didn't change either of their minds. Probably reinforced their world view, in fact - and after decades of everyone and their dog blaming everything bad on the EU, however absurd, that's maybe not that surprising.

In the end it only got 200 views before the referendum. Most would have been university colleagues, many of whom could not vote anyway, as they are not British citizens. It should be noted that universities will suffer badly from this decision, yet the people at them are mostly unable to vote against it - Gove's anti-intellectualism was built into the vote. But I am happy I did something, however ineffective. It's much the same reasoning as for why I went to the pro-EU march in London that occurred two weeks later. The symbolism of trying, even though you know it won't make the slightest bit of difference, is good for your conscience.

In my continuing quest to upload all of my handwriting project code I have now got the utility tools up, including the required support modules:

hg: My homography module - pretty simple, though it did grow a bit beyond its original purpose. Includes the obvious code for constructing homographies and applying them (2D case only). Also includes some basic image querying stuff, as that also needs access to the b-spline code. Plus an nD Gaussian blur, for no good reason.
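For reference, applying a 2D homography is just a matrix multiply in homogeneous coordinates followed by a divide - a generic sketch of the maths, not the hg module's actual API:

```python
def apply_homography(H, x, y):
    """Map the 2D point (x, y) through the 3x3 homography H, given as
    a list of rows. The homogeneous divide is what lets one matrix
    represent perspective warps as well as affine ones."""
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xs / w, ys / w

# A pure translation by (2, 3):
H = [[1, 0, 2], [0, 1, 3], [0, 0, 1]]
```

With a non-trivial bottom row the divide kicks in, giving the perspective foreshortening used when rectifying scanned pages.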

ply2: I usually stick to json and hdf5 files, but hit a problem with the handwriting project, as neither was a good fit: json does not really handle large amounts of data, whilst hdf5 is not human readable and has poor text support. Instead of creating an entirely new file format I decided to extend the ply format, as it could almost do what was required. I called it ply2, but have done so without the permission of the original developers - my apologies to them if they don't like this! The main changes are an additional type line in the header, to support its new role of containing stuff that isn't a mesh, plus support for typed meta lines in the header, as the comment system is crap. More importantly, I added support for elements/arrays with an arbitrary number of dimensions, string support (utf8), and cleaned it up. The module includes a specification that details all of this properly.

handwriting/corpus: Builds a corpus from a load of books downloaded from Project Gutenberg, for the purpose of generating a text sample for an author to write out, so the system can learn their handwriting. To use it you will need to download the documents.

handwriting/calibrate_printer: Does a closed loop colour calibration of a scanner-printer pair (I am taking closed loop here to mean the calibration is relative to the devices, not a specified standard). You print out a calibration target then scan it in. A GUI then allows you to learn a colour transform that, if applied to an image in the colour space of the scanner, will adjust it to be as close as possible when printed with the printer. This works only if you use the same scanner to obtain your handwriting samples as to scan in the calibration target. Uses thin plate splines internally.
