Lego
I arrived home on Friday night to find my sister playing with the Lego. Pieces everywhere, building in progress. It wasn't long before we were all involved, not one of us below the age of 23. I think I'd forgotten how great a toy it is and how well the stuff ages, or doesn't. I mean sure, some of the white pieces had yellowed, but they were my Dad's and I'm pretty sure they were yellowing when we acquired them from him. The 90s pieces look just like they always did.
Now there was an aim to build the original sets, but this meant breaking up some quite well-remembered models. The most memorable, and yet possibly the least impressive, were these little guys:
Being the 3D geek that I am, I had to capture these fellas. So, using a mixture of LeoCAD and Blender, I reconstructed the robots that had so much life when we were young.
Threads
For a while now I've wanted to write a thread or string simulation, and I never realised how easy it would be once you get Verlet integration down. Over the past few days, commuting in and out of London, I wrote this little program. The actual code is pretty simple, less than 200 lines of Python.
These are some of the results:
The important bit of the main loop is this:
```python
for thread in self._threads:
    # Relax the distance constraints a few times per frame,
    # pinning the first particle in place on each pass.
    for strength in range(10):
        thread[0].x = 0
        thread[0].y = 0
        for i in range(len(thread) - 1):
            dx = thread[i + 1].x - thread[i].x
            dy = thread[i + 1].y - thread[i].y
            d = math.sqrt(dx * dx + dy * dy)
            if d == 0:
                continue
            diff = (d - self._preferred) * 0.5
            thread[i].x += diff * dx / d
            thread[i].y += diff * dy / d
            thread[i + 1].x -= diff * dx / d
            thread[i + 1].y -= diff * dy / d
    # Verlet integration: the previous position implies the velocity.
    for particle in thread:
        x0 = particle.x
        y0 = particle.y
        particle.x += (particle.x - particle.x0)
        particle.y += (particle.y - particle.y0)
        particle.x0 = x0
        particle.y0 = y0
```
With a simple particle class:
```python
class Particle(object):
    def __init__(self, x=0, y=0):
        self.x = x
        self.y = y
        self.x0 = x
        self.y0 = y

    def setPosition(self, x, y):
        # x0/y0 are deliberately left untouched, so moving a
        # particle this way imparts a velocity on the next step.
        self.x = x
        self.y = y
```
I've simplified some of the calculations. As you may have noticed, there is no time step. It still runs in real time, though, even on my netbook, and is quite fun to play with.
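Pulling the two snippets together, here is a minimal self-contained sketch of the simulation. The rest length, particle count, and the gravity term are made-up values of my own, not from the original program; the pin-then-relax structure and ten constraint iterations per frame follow the code above.

```python
import math

class Particle(object):
    def __init__(self, x=0.0, y=0.0):
        self.x = x
        self.y = y
        self.x0 = x  # previous position; the offset from it is the velocity
        self.y0 = y

REST_LENGTH = 5.0  # stand-in for self._preferred
GRAVITY = 0.5      # my addition; the original snippet has no gravity term

def step(thread, iterations=10):
    # Relax the distance constraints, pinning the first particle each pass.
    for _ in range(iterations):
        thread[0].x = 0.0
        thread[0].y = 0.0
        for i in range(len(thread) - 1):
            a, b = thread[i], thread[i + 1]
            dx, dy = b.x - a.x, b.y - a.y
            d = math.sqrt(dx * dx + dy * dy)
            if d == 0:
                continue
            diff = (d - REST_LENGTH) * 0.5
            a.x += diff * dx / d
            a.y += diff * dy / d
            b.x -= diff * dx / d
            b.y -= diff * dy / d
    # Verlet integration: carry each particle's implied velocity forward.
    for p in thread:
        x0, y0 = p.x, p.y
        p.x += p.x - p.x0
        p.y += p.y - p.y0 + GRAVITY
        p.x0, p.y0 = x0, y0

# A horizontal thread of ten particles, left to swing from the pinned end.
thread = [Particle(i * REST_LENGTH, 0.0) for i in range(10)]
for _ in range(100):
    step(thread)
```

The ordering matters: each constraint pass drags the pinned end back first, so the correction propagates down the chain before the next integration step.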
Inspiration
I haven't got a whole lot to show you at the moment as I have a few more long-term projects kicking off but I wanted to share with you some inspiration that has kept the creative side of me alive when I've needed it most.
I'm sure many will have heard of Coheed & Cambria, but if not I urge you to check them out. Their sometimes childlike, sometimes monstrous vocals, combined with dramatic and thrilling orchestration, have taken my mind in some amazing directions when working on a project. What can I say, I am a fan.
They were superb live.
In other inspiration-related news I found a great book, "Botany For The Artist" by Sarah Simblet. I have a fascination for plants and this just jumped out and screamed, "buy me!" And guess what, Amazon had it on sale... win! I eagerly await its arrival.
All in all I may not have completed much recently, but I've certainly been branching out and exploring new ideas. There's the potential for another game on the horizon, and maybe, one day, a short film might surface. For now I shall indulge, experiment, and share my discoveries with you. It's all progress in the end.
Rosegarden
Had a bit of trouble getting Rosegarden to work before, but I seem to have it working now. Hopefully this helps anyone else suffering similar issues. (This is for installing on Ubuntu 9.10, "Karmic Koala".)
Open Synaptic (System > Administration > Synaptic Package Manager)
- Install Rosegarden (and dependencies, including qjackctl)
- Install qsynth
- Install fluid-soundfont-gs
- Install fluid-soundfont-gm
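If you'd rather skip Synaptic, the same packages can be installed from a terminal (package names as found in the Karmic repositories; qjackctl is listed explicitly even though it comes in as a dependency):

```shell
sudo apt-get install rosegarden qjackctl qsynth \
    fluid-soundfont-gm fluid-soundfont-gs
```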
Open qjackctl (Applications > Sound & Video > JACK Control)
- Click Setup:
- - Disable realtime (was getting "cannot use real-time scheduling" error)
- - Enable: Soft Mode, Force 16bit, H/W Monitor
- - Close Setup (OK)
- Click Start
Open qsynth (Applications > Sound & Video > QSynth)
- Go to Soundfonts tab
- Open... "/usr/share/sounds/sf2/FluidR3_GM.sf2"
- Open... "/usr/share/sounds/sf2/FluidR3_GS.sf2"
- Click Restart
Open Rosegarden (Applications > Sound & Video > Rosegarden)
- Ignore warnings:
- - "System timer resolution is too low" (or try what it says, I haven't yet)
- - "Newer version available"
- - "Helper programs not found"
- File > Open... "/usr/share/apps/rosegarden/examples/aveverum.rg"
- Studio > Manage MIDI Devices...
- - Set each device to use "Connection": "Synth input port"
Alex's Head
Unlike in the UK, it seems brain scan data belongs entirely to the patient, and not to the medical practitioners, when the scan is done in Switzerland. This is great news for anyone who likes to play with images.
The program took a day and a bit to write (that's where the weekend went), and here are some of the results:
I took the original images (512x512 JPEGs), converted them to 100x100 PNGs, and loaded a series of about 100 slices. Rendering the resultant 1,000,000 points still gave a responsive application, but any more and things began to slow down.
The voxels are 1-byte greyscale values. The colours are achieved by passing each value through a colour ramp (of hue cycles). By showing only certain ranges of the data set, different parts of the head can be displayed.
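A hue-cycling ramp of this kind can be sketched in a few lines of Python. The cycle count and visibility thresholds below are illustrative guesses, not the values the actual program used:

```python
import colorsys

def hue_ramp(value, cycles=3.0):
    """Map an 8-bit greyscale voxel value to an RGB tuple by cycling hue."""
    hue = (value / 255.0 * cycles) % 1.0
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

def visible(value, lo=40, hi=200):
    """Show only voxels in [lo, hi] -- e.g. soft tissue but not air or bone."""
    return lo <= value <= hi

# Colour one slice's pixels, skipping values outside the range of interest.
slice_pixels = [0, 64, 128, 192, 255]
coloured = [hue_ramp(v) for v in slice_pixels if visible(v)]
```

Because the hue wraps several times across the 0-255 range, nearby densities get strongly contrasting colours, which is what makes the structures pop out.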
Thanks go to my brother Alex (and his parents) for allowing me to use these scans.
Attractive Attractor
These images were produced by constructing a Clifford Attractor and iterating a point several million times over it. The resultant positions of this point are rendered as small, semi-transparent particles. Where the points overlap they add up, increasing the brightness of the final render.
A snippet of the code used to produce the result above:
```python
from math import sin, cos

A, B, C, D, E, F, G, H, I = (1.4, 1.1, -1.2, 1.6, 0.9, -1.3, -0.9, 0.9, 1.1)
x = y = z = 0
points = []
pointcount = 3000000
for i in range(pointcount):
    x2 = sin(A * y) + D * cos(G * x)
    y2 = sin(B * z) + E * cos(H * y)
    z2 = sin(C * x) + F * cos(I * z)
    x = x2; y = y2; z = z2
    points.append((x, y, z))
```
This particular render was done using Aqsis. I have recently found it very difficult to achieve a similar effect with Pixie so it is certainly great to have the choice. Pixar®'s PRMan® renderer does the job just fine also, but it does cost a fair amount more.
The core RenderMan® command is very simple:
```
Points "P" [
    1.600000 0.900000 -1.300000
    ...
    0.975489 -0.138367 0.255526
] "constantwidth" 0.01
```
Surround this by a few translates and rotates to make it look good (and spin on a nice axis), and you're done.
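Generating that block from the Python point list is straightforward; a sketch (the function name and formatting choices are mine, not from the original script):

```python
def points_rib(points, width=0.01):
    """Emit a RenderMan Points primitive for a list of (x, y, z) tuples."""
    lines = ['Points "P" [']
    for x, y, z in points:
        lines.append("    %f %f %f" % (x, y, z))
    lines.append('] "constantwidth" %g' % width)
    return "\n".join(lines)

# Two of the positions from the block above, just to show the shape.
rib = points_rib([(1.6, 0.9, -1.3), (0.975489, -0.138367, 0.255526)])
```

Write the result into a RIB file between the usual WorldBegin/WorldEnd, wrap it in the translates and rotates mentioned above, and hand it to the renderer.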
You can pick up the Aqsis renderer at aqsis.org→. The code was written in Python with no additional libraries. The idea originated after seeing
this→ and deciding to try it for myself.
Site Design Updated
Well, I hope you like the new site design. I thought the dark theme better suited to the display of images. Now I just need to show you some images...
New Year, New Ambitions
Working in VFX has been fantastic so far, but I still have a deep hunger to learn and experiment with my own projects. This year I plan to feed that hunger.
My time at Bournemouth really showed what can be achieved when I find something I love doing, and it has opened up even more avenues to explore in the VFX world.
With my housemates also in the CG field there are lots of projects flying around, so I've taken the opportunity to play with RenderMan (via the open source Pixie renderer) to show off some of their work, and to make some pretty pictures to post up here.
Credit to Sophie Shaw for the models.
The wireframes are a little off, with the RiCurves intersecting the geometry, but it's a good start I feel. No doubt more tests will follow as I experiment with what Pixie can do...
A New Era
A bit dramatic, but it is certainly a big change. The masters course at Bournemouth has now come to an end, with our final piece handed in – phew! To say the least, it is taking some getting used to, but London awaits.
In between looking for accommodation I thought I'd update a little of my site, ready for the year ahead.
Masters Showreel (Preview)
I have been putting together a reel to show off some of the more interactive things I've been working on over the past year. There are still a few things to add but you can enjoy the preview here:
I've uploaded a few images from my sketchbook and some screen grabs of the tools I've been working on this year. Hope you like... there is more to come.
Updates: Portfolio and CV
As we near the end of the course here in Bournemouth I decided, despite being busy with the Masters project, to update my site. More updates will follow soon but for now you can follow the progress of this project over at:
bandwagongame.blogspot.com→
IE, Safari, Firefox, Linux, Mac, Windows
It would appear the days of equal standards are still to come where browsers are concerned. I knew I should have used a template, but I like to stay up to date, and web development is not something I have done for a while.
Developing on Ubuntu, testing with Firefox in Linux, and Firefox and Safari on the Mac, I thought I was fairly safe. A few tweaks to get IE sorted and we're away... alas, no.
It seems even Firefox is different for Windows users, but after a few fiddly compromises, the site is working and looking almost the same across all major browser / operating system combinations. Happy times.
If there are any obvious errors, please do drop me an email.
RenderMan® API and shading language
Some stills from an animation made using the RenderMan® API and shading language (SL). Everything is driven by Python scripts: the animation and rendering process, dynamic tree generation, comet particles, and a sun particle noise effect.
Shaders were written for the various planet surfaces, the most detailed being Earth, which uses a noise-based colour ramp. A similar map drives the normal map and specularity.
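The actual shaders were written in SL, but the colour-ramp idea translates directly. Here is a Python sketch with made-up colour stops; the real Earth shader indexes a similar ramp with a noise value:

```python
def lerp(a, b, t):
    """Component-wise linear interpolation between two colour tuples."""
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def colour_ramp(t, stops):
    """Interpolate a colour from sorted (position, (r, g, b)) stops, t in [0, 1]."""
    for (p0, c0), (p1, c1) in zip(stops, stops[1:]):
        if p0 <= t <= p1:
            return lerp(c0, c1, (t - p0) / (p1 - p0))
    return stops[-1][1]

# Hypothetical ocean -> lowland -> mountain ramp, indexed by a noise value.
EARTH = [(0.0, (0.0, 0.1, 0.4)),   # deep ocean
         (0.5, (0.1, 0.5, 0.1)),   # lowlands
         (1.0, (0.5, 0.4, 0.3))]   # mountains

sea = colour_ramp(0.0, EARTH)
peak = colour_ramp(1.0, EARTH)
```

In the shader the same lookup runs per shading point, with a noise call supplying t, and a second ramp over the same noise driving bump and specularity.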