With every new project, I attempt to push my knowledge of how to accomplish the desired result using the software tools at my disposal.

So this, one of my latest designs, was no exception.

The ‘Lucky AT-AT Foot’ is a play on the idea that someone might carry a lucky rabbit’s foot (actually, nobody does this IRL), transposed to the not-really-rabbit-like, more-a-giant-killing-machine-on-legs vehicle from Star Wars.

The execution of this model looked deceptively easy, and to challenge myself that bit more I picked Blender as the tool to carry out the design.

If you haven’t tried Blender, well, let’s just say it has a fairly steep learning curve. It is also one of the most complete 3D rendering and animation tools available for free, with incredibly deep and complex processes for creating 3D scenes and movies.

3D Printed OSCARd

OSCARd: a credit-card-sized, 3D-printed, posable action figure by Steven.

But enough of that; I’m using Blender for the creation of something that will be 3D printed, and because of that, I can play a bit fast and loose with what makes a ‘good’ Blender model.

So the overall shape of the AT-AT foot is relatively simple. The details – simple shapes combined to form a whole – were straightforward.

The hardest part – for me at least – was getting the plate detail around the foot to follow the cone profile that the foot consists of. I found a great tutorial on YouTube demonstrating the technique and eventually adopted it with a few modifications.
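For anyone curious about the geometry involved, here is a rough sketch (in Python rather than Blender, and with made-up radii and counts) of how points spaced evenly around a truncated cone can be computed, so that detail placed at a given height follows the taper:

```python
import math

def cone_radius(z, h, r_bottom, r_top):
    """Radius of a truncated cone at height z, assuming a linear taper."""
    return r_bottom + (r_top - r_bottom) * (z / h)

def plate_positions(n, z, h, r_bottom, r_top):
    """Place n plate centres evenly around the cone at height z."""
    r = cone_radius(z, h, r_bottom, r_top)
    return [(r * math.cos(2 * math.pi * i / n),
             r * math.sin(2 * math.pi * i / n),
             z)
            for i in range(n)]
```

In Blender itself this kind of wrapping is usually done with modifiers rather than by hand, but the underlying idea is the same: the radius shrinks linearly with height, and everything placed around the surface has to follow it.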

With a few tweaks to fix thickness issues for my target material of 3D printed steel, the model was ready to be uploaded to my preferred 3D print bureau, Shapeways. Of course other 3D print services are available depending on your requirements.

The Shapeways servers then perform some automated integrity checks on the model based on the chosen material.

With the model green lit for printing, I can have the model printed and shipped in a couple of weeks.

If I’m happy with the result, I’m able to offer it for sale to others through my Shapeways shop.

By Steven Gray.

 

I am really looking forward to the new Glasgow Mini Maker Faire this year, and am planning on being there with my good pal Kyle McAslan, helping him help other people make his synths – sure to be noisy and creative!

The projects I have been working on of late have sadly all been rather too large for Mini Maker Faires, so difficult to share (i.e. expensive to transport and run), but great to build! I’ve been involved in exhibitions at the Edinburgh International Science Festival for two years running – last year with Be The Goalie (www.cargocollective.com/bethegoalie), and this year with An1mal.

Video: Be the Goalie, by Geraldine Heaney, on Vimeo.

An1mal is an attempt at recreating animal behaviour in an interactive exhibition, using a camera, proximity sensor, microphone, plus animatronics, puppetry and theatrical techniques. It is basically a fantastical animal (an Elevark to be precise) that responds to the viewer’s facial movements, body movements and, in theory, sounds (although that didn’t make it into the exhibition piece as it turned out). It responds with a range of emotional outbursts – various flavours of uncertainty, anger, and happiness – interspersed with bouts of feeding, grooming and sleeping.
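As a purely illustrative sketch (this is not the actual An1mal code – the state names come from the description above, while the weights and threshold are invented), behaviour like this can be modelled as a weighted random state machine driven by a stimulus level:

```python
import random

# Hypothetical behaviour states, taken from the description of An1mal.
STATES = ["uncertain", "angry", "happy", "feeding", "grooming", "sleeping"]

def next_state(current, stimulus_level, rng=random):
    """Pick the next behaviour.

    High stimulus (a viewer moving close by) favours emotional outbursts;
    low stimulus lets the animal drift into idle behaviours. The 0.5
    threshold and the weights are made-up numbers for illustration.
    """
    if stimulus_level > 0.5:
        choices, weights = ["uncertain", "angry", "happy"], [2, 1, 2]
    else:
        choices, weights = ["feeding", "grooming", "sleeping"], [1, 1, 2]
    return rng.choices(choices, weights=weights)[0]
```

A real exhibition piece would layer timing, interruptions and sensor fusion on top, but the basic loop of “read stimulus, pick a weighted behaviour, act it out” is a reasonable mental model.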

It is currently on show as part of EISF’s ‘Existence’ exhibition at the National Museum of Scotland, until 5pm this Sunday, the 15th April. It is a collaboration between puppet maker Fergus Dunnet (http://fergus-dunnet.weebly.com), animal behaviourist Elaine Henley (http://www.dogbehaviour.org.uk) and myself (www.000111.co.uk).


I won’t lie, this wasn’t the quickest or easiest process of design and making I’ve been through. There were times when things progressed really rapidly, and others where things slowed to a stall. And then went into reverse for a bit. The puppet contains 9 servo motors, and the days we got up and running with those and the Mini Maestro servo controller (https://www.pololu.com/product/1352), and things started to move, were great.
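For the curious, the Maestro family speaks a simple serial protocol. This is a hedged sketch (not the project’s actual code) of building its ‘Set Target’ command in Python; the port name and baud rate in the comment are assumptions for illustration:

```python
def maestro_set_target(channel, pulse_us):
    """Build a Pololu Maestro 'Set Target' command (compact protocol).

    Targets are sent in quarter-microsecond units, split into two
    7-bit bytes. Write the result to the Maestro's serial port, e.g.
    with pyserial:

        serial.Serial("/dev/ttyACM0", 9600).write(maestro_set_target(0, 1500))

    (port name and baud rate will vary with your setup).
    """
    target = int(pulse_us) * 4                # quarter-microseconds
    return bytes([0x84,                       # 'Set Target' command byte
                  channel,                    # servo channel number
                  target & 0x7F,              # low 7 bits
                  (target >> 7) & 0x7F])      # high 7 bits
```

Nine servos is just nine channels, so sweeping the whole puppet is a loop over `maestro_set_target(ch, pulse)` writes.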


Other days, when I was trying to wrap my head around getting face tracking working with OpenCV on a Raspberry Pi at a usable speed for an exhibition, I could maybe have done without. But then I’ve learned a heck of a lot from it that will hopefully be useful for something else in the future! In the video below, the wee black spot between the eyes is the Raspberry Pi camera, which is on the end of a long ribbon cable running down into the perch that the animal sits upon, containing all the computers, audio equipment, power supplies and many of the servos.
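One common trick for getting OpenCV face detection fast enough on a Pi (not necessarily the one used here) is to run the detector on a downscaled greyscale frame, scale the result back up, and then map the face position onto a servo target. A small illustrative sketch, with made-up servo pulse limits:

```python
def scale_detection(rect, scale):
    """Map a face rectangle found in a downscaled frame back to full size.

    Detecting on, say, a quarter-resolution grey frame and multiplying
    the coordinates back up is a standard way to speed up Haar-cascade
    tracking on low-powered hardware.
    """
    x, y, w, h = rect
    return (x * scale, y * scale, w * scale, h * scale)

def face_to_servo(cx, frame_width, servo_min=1000, servo_max=2000):
    """Map the face centre's x position to a servo pulse width in us.

    servo_min/servo_max are illustrative defaults; real limits depend
    on the servo and the mechanism it drives.
    """
    t = max(0.0, min(1.0, cx / frame_width))
    return int(servo_min + t * (servo_max - servo_min))
```

With something like this, a face drifting left or right in the frame translates directly into the head servo tracking it.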

Speaking of the workings, here is a peek inside at the mechanics:

Photos by Fergus Dunnet

It was an interesting change for me to work on a project where we were actively trying to hide the inner workings of things, as I am more used to making a feature of the innards of machines. So Fergus took the lead on that, as being a theatrical designer, he is more familiar with the art of misdirection (I think it’s fair to say?).

Hopefully that is a small insight into this project – grab a chance to see it at the National Museum of Scotland until Sunday, and I hope to see you at the Glasgow Mini Maker Faire!

Roy Mohan Shearer
www.000111.co.uk

I’m a Computer Scientist by training and profession, but in my spare time I like playing music and writing poetry. So, when making digital things, a fair bit of both music and poetry tends to feature. Here’s a quick overview of a few recent projects.

Sheep Dreams

In this project a cuddly sheep is connected to an “EEG” device that displays its dreams on screen. People can also listen to the sounds of the sheep’s dreams, which incorporate randomly chosen tunes like “Sweet Dreams Are Made of This”, “Together in Electric Dreams” and “Baa Baa Black Sheep”. The visuals are provided by Electric Sheep’s fractal animations, and the sounds by a Sonic Pi script that incorporates outside sounds “overheard” by the sheep into the music. You can read more about the project here.

As it was put together in a bit of a hurry for a Raspberry Jam, it’s a bit minimal at the moment. Future plans include adding the ability for observers to alter the sheep’s dreams by interacting with it using touch, rather than just sound. It would also be interesting to add a fractal element to the sounds of the dreams, matching the fractal visuals.

Sonic Pi Goes Bananas

I’m a big fan of the music-coding program Sonic Pi, and also of MaKey MaKey, a device that allows people to make music on all sorts of unlikely objects, including bananas and Labradors. At the moment, Sonic Pi programs, unlike Scratch programs, can’t be directly controlled by a MaKey MaKey, which seemed a shame.

I’d recently been working with Pygame Zero, a simplified version of Pygame that makes it easier for new programmers to make games (for this book, which I co-wrote with some friends). I wrote a small program that sent messages to Sonic Pi whenever an object attached to the MaKey MaKey was touched. More information and the code can be found here.
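As an illustration of the idea (not Claire’s actual code): since the MaKey MaKey shows up as an ordinary keyboard, a key-press handler only needs a way to fire a cue at Sonic Pi. Recent Sonic Pi versions listen for OSC on localhost port 4560 (older 3.x builds used 4559), and a minimal OSC message can be built with nothing but the standard library; a running Sonic Pi script could then pick the cue up with something like `sync "/osc*/trigger/banana"`. The address name here is made up:

```python
import socket
import struct

def osc_message(address, *ints):
    """Encode a minimal OSC message with integer arguments only."""
    def pad(b):
        # OSC strings are null-terminated, then padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode()) + pad(("," + "i" * len(ints)).encode())
    for n in ints:
        msg += struct.pack(">i", n)   # 32-bit big-endian integer argument
    return msg

def send_to_sonic_pi(address, *ints, host="127.0.0.1", port=4560):
    """Fire an OSC cue at a local Sonic Pi (port is version-dependent)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message(address, *ints), (host, port))
```

Hooked up to a key event (e.g. in a Pygame Zero `on_key_down` handler), each banana press becomes a `send_to_sonic_pi("/trigger/banana", 1)` and a note on the Sonic Pi side.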

Poems in the Gaps


Blackout poetry: not something you write to occupy yourself when there’s a power-cut, but poems created from other texts by obliterating most of the original words. Looking at some blackout poetry recently, I thought it would be interesting to code a digital blackout-poetry-making tool. This avoids having to destroy your books, and is handy if, like me, you can never find a pen when you need one.
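The core of such a tool is tiny. Here is a toy sketch (my own illustration, not the actual tool) that keeps a chosen set of words and blacks out everything else, preserving the layout of the original text:

```python
import re

def blackout(text, keep):
    """Obliterate every word except those whose index is in `keep`.

    Splitting on whitespace but keeping the separators means the poem
    retains the spacing and line breaks of the source text, which is
    what gives blackout poetry its look.
    """
    pieces = re.split(r"(\s+)", text)
    out, word_index = [], 0
    for piece in pieces:
        if piece.strip():  # a word, not whitespace
            out.append(piece if word_index in keep else "\u2588" * len(piece))
            word_index += 1
        else:              # whitespace: pass through untouched
            out.append(piece)
    return "".join(out)
```

For example, `blackout("the quick brown fox", {1, 3})` keeps only “quick” and “fox”, replacing the other words with solid blocks of the same length.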

The current version still needs a few bugs sorted out, and I’m planning to add buttons for sharing on Twitter and Facebook (currently you have to screenshot your poem and post the resulting pic). It would also be interesting to have a way of making it more physically interactive, maybe involving projecting it onto a wall. In the meantime you can have a go at the beta version here.

Read more from Claire Quigley here.