The Wonders of Internet Collaboration

When I was a child I dreamt of many things. From acting and singing to drawing and dancing, I wanted to do it all! Just like most children, the world was my oyster and I could see no reason why any of it should be unattainable.
But then I grew older and started to become more aware of how others perceived me and what made me happy in life. I started pushing my wants and needs aside to focus on a more 'realistic' way of life and, just like the grades that defined my education, I became so self-critical that I was too crippled by anxiety to even attempt to chase the dreams I had once seen as such a positive goal for my life.

As the years passed, my innocent and simple approach to living a life filled with creative ambition seemed to fade into a distant memory. I couldn't see the point in spending time on something that offered no tangible reward, and I had forgotten the simple pleasure of doing something for no reason other than to feed my own creativity and happiness. Life got in the way; there was hardly enough time in the day to do what society needed of me, let alone squeeze anything else in between. Work, school, friends and family. Simple in its own right, but none of it was solely for me. None of it was fulfilling the dreams and hopes that remained locked away in the back of my mind, next to childhood memories of happier times.

It wasn't until I forced myself to leave these comforts, to stop living the life that felt almost too perfectly laid out for me, that I realised what I had been missing since becoming an 'adult'. By locking away these passions I had forgotten who I was; I had locked my own uniqueness away. By this time I had packed up my belongings and moved hours away from my home town, distancing myself from loved ones and the life I had become so accustomed to living. From that distance I noticed the small child inside of me, screaming for release in every birthday present that I meticulously handmade and every school assignment that I found an excuse to put extra work into, turning it into an elaborate video, drawing or piece of music. These were the parts of me that defined who I was, but because I could find no purpose for them I had laid them to rest for all these years.

So, without my old life to distract me, without worrying about judgment from the people I admired around me, I finally opened up that dusty box of hopes and dreams and spread them all over the internet. It sounds ironic that a few physical people could make me feel so anxious about my creations, yet I could share them with a global audience online, but the Internet hosts a completely different environment. People here can see your creations, be it drawing, singing, animation or writing, and sure, they can judge, but they are miles away. They don't know who you are personally, and that makes their judgment less personal and more constructive, which only pushed my creativity even further. On the Internet you cannot help but find inspiration around every corner, through videos and blog posts: people from all around the world doing what they love for no purpose apart from the fact that it simply makes them happy and they want to share this with others. They're proud of who they are and what they can achieve. They understand that their 'work' might not please everyone; it might not meet the expectations of an 'A*' grade. But that's not the point. The point is... nothing, really. That's the point. Why should we feel the need to constantly impress others if we are already satisfied?

I digress; there is more that the Internet has to offer than just a place to host our uniqueness and creativity. The Internet can bring inspiration to others and make us all realise that it's okay to let our inner child out. But even better, the Internet makes it easier to achieve our goals through the wonders of global collaboration. Do you want to animate but can only draw? Do you want to make a film but can't act? Do you want to sing songs but can't play an instrument? If you're brave enough to show your dedication to your dreams online then, through the wonders of the Internet, there's someone out there, maybe multiple people and amazing communities, who can help you achieve your goals.

TLDR: I have been singing ever since I was a child, writing songs and poems in every notebook I could find and putting on performances in front of my friends and family at every chance I got. This filled my life with joy and passion until I was told that this way of life could not last: I could not make a living from it, and I would never be able to surpass the competition that would surround me in the industry. My mindset around these passions changed from naive happiness to a sudden awareness that however I spent my time, it should be supplying me with money and a means of living. But that simply isn't so. I need to supply myself with happiness over anything else (of course money is a factor that we all have to be aware of, but it doesn't always have to be the driving force).

Through the internet I have found a community of so many wonderful, tight-knit and like-minded people who don't judge me for spending my time doing what I love, simply because I love it. In fact, that's what brought them together, that's what makes them so unique to me, and that's what helps me to achieve the hopes, dreams and goals that have been ingrained in me ever since I was a child. By sharing what I love and what makes me happy I've attracted others who enjoy what I do and offer their talents and passions for me to collaborate with, for no other reason than that it makes them happy. That's beautiful.

The result of this? My first fully produced and edited music video. Yes, it might look silly to some, being dedicated to my favourite childhood game, but that's irrelevant. I did what I love: I sang and I wrote lyrics. But I also collaborated with a hugely talented and passionate music producer, who I am now working on an EP with. I edited and created VFX. But I also got to film with an equally talented and passionate videographer who made the video quality so much crisper than I ever could have achieved on my own. Even better, I reached out to a community online who sent me audio and video files of themselves to flesh out the song even more, becoming a harmony within the final track and appearing in the music video itself.

Not a penny spent, not a penny gained, but I’ve never been happier to set the child within me free.


‘Who is Terror?’ – Interactive Video using Raspberry Pi

This past year I was chosen as one of the first 9 Raspberry Pi Creative Technologists in the UK, each of whom has come from a different creative background, such as animation, photography and even magic! Throughout the programme we were shown various uses of the Raspberry Pi, a miniature and affordable computer, as well as the basics of how to program in Python, create functioning circuit boards and use hardware from the Raspberry Pi's GPIO pins. We learned all of this with the ultimate aim of using our new technical knowledge to enhance the creative skills we already had, helping us evolve from 'Creatives' to 'Creative Technologists'! We then showed this newfound knowledge off with a project of our very own, exhibited at a Raspberry Pi Creative Technologist exhibition held and run by ourselves!
The exhibition, held in Cambridge at Raspberry Pi Headquarters, went extremely well, and I couldn't help but be inspired by all of the creative projects surrounding me, from a projection-mapped pop-up book to an HTML game that lit up a sculpture the more it was played. Every project was completely unique and equally amazing!

So what was my exhibition piece? AN INTERACTIVE VIDEO!

Before I go into details, watch this video to get a good flavour of the project and also see some fun clips from the exhibition 🙂

The Hardware

I used:

  • Raspberry Pi 2
  • Motion Sensor
  • Buttons
  • Breadboard
  • PiCamera
  • HDMI Cable (Connected to a monitor or, in my case, projector to play the video)
  • Speakers for local audio output, attached to the Pi

Before starting this programme I knew nothing about circuit boards or physical hardware engineering, but for this interactive video I knew that I needed some physical interaction to change the state of the video, which meant using some hardware. I needed two buttons and a motion sensor hooked up to the Pi at the same time, using the GPIO pins on the Pi itself.

The motion sensor could be connected straight onto the Pi using a 5V pin, a ground pin and whichever numbered GPIO pin I wanted (I would then have to refer to this number in my Python script). The wires should be connected according to what is stated on the underside of the sensor board and to the correct pins; you can check this against a Raspberry Pi 2 GPIO pin chart online. Luckily there is a lot of documentation out there on how to set this up, such as this blog post on the Raspberry Pi blog itself.

The buttons, however, needed to be attached to a breadboard, which worked well for me as I could then decorate the breadboard afterwards and stick it onto the outside of my decorated dolls house. For each button I needed one cable connected to a ground pin and another connected to a numbered pin of my choosing (like the motion sensor, I would refer to this pin number in my Python script). Again, there's plenty of documentation out there about how to set this up easily, such as this blog post on the Raspberry Pi website.

To double-check that you have everything wired up okay you can simply open up the Python editor on the Raspberry Pi and, looking through the GPIO Zero documentation, use some simple test code to check whether the inputs are working as expected. It's worth noting here that the motion sensor has two potentiometers (little orange screws) that allow you to adjust the sensitivity of the sensor and the detection time.
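As a rough idea of what that test code can look like (the GPIO pin numbers here are just examples; swap in whichever pins you actually wired the components to):

```python
from gpiozero import Button, MotionSensor
from signal import pause

# Example pin numbers only; match these to your own wiring
leftbutton = Button(17)
rightbutton = Button(27)
pir = MotionSensor(4)

leftbutton.when_pressed = lambda: print("left button pressed")
rightbutton.when_pressed = lambda: print("right button pressed")
pir.when_motion = lambda: print("motion detected")

pause()  # keep the script alive so the callbacks can fire
```

If pressing a button or waving at the sensor prints the right message, the wiring is good.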

As well as this I attached the PiCamera straight onto the Pi by inserting it into the camera port, labelled on the Pi board, with the blue strip facing towards the USB ports. To ensure the camera was enabled on the Pi I opened up the terminal, typed sudo raspi-config to bring up the configuration settings, and made sure that the camera was enabled.

It's worth noting that when running videos off the Raspberry Pi it's wise to adjust the GPU memory split! I set mine to 128MB, but it may need to be higher for larger videos.
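The split can be changed through raspi-config or, as far as I'm aware, with a single line in /boot/config.txt along these lines (128 is the value I used; bigger videos may want more):

```
# /boot/config.txt: give the GPU 128MB of memory (reboot for it to take effect)
gpu_mem=128
```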

This may all sound like a mess of wires and cables but here’s what the project looked like in the end:


Hidden within that dolls house are the Raspberry Pi, the speakers, the motion sensor and the PiCamera, which sits behind a physical cut-out of 'Terror'. The HDMI cable connecting to the projector, through which the video plays, comes out of a hole at the side of the house. As you can see, I've been able to disguise the breadboard well by turning it into a sign telling the audience which button does what when interacting with the video.

The Code: Python

Packages and libraries used:

To ensure that I had everything I needed installed on the Raspberry Pi, ready to use in my code, I ran 'sudo apt-get update' and 'sudo apt-get upgrade' from the terminal (the sudo is needed so that the computer knows you're a super user... Yes, kinda like a superhero!... But not really).

You should then be able to run 'sudo pip install [package]' from the terminal to install the libraries and packages you need and put them to good use!
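As a rough sketch of that whole install step (the package names here are just examples; install whichever libraries your own script actually imports):

```
sudo apt-get update
sudo apt-get upgrade
# for example, the camera and GPIO libraries if they aren't already on your image:
sudo pip install picamera gpiozero
```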


Setting up the imports and variables is the most important part when coding, so we know what to refer to! My imports were the ones I have mentioned above, plus a couple of extras. Imports such as 'from time import sleep' are very handy, as sleep lets you pause your programme while it runs, giving you more control over debugging and over the timing at which different parts of the script run. I also included 'from subprocess import call' and 'import os' because I was finding that omxplayer could be a little buggy when playing back audio files: it would play them from the terminal but not from the Python script. Using these, I could call a process through the terminal from within my Python script.

As explained earlier, I set variable names for each of my hardware components so that they were easily identifiable as I worked my way through the code, and I stated which GPIO pin each component is connected to so that the script can access them.
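Pieced together, the top of the script looked something along these lines. This is a sketch rather than my exact code: the pin numbers and file paths are placeholders, and I'm assuming the omxplayer-wrapper library for controlling omxplayer from Python.

```python
from time import sleep
from subprocess import call
import os

from gpiozero import Button, MotionSensor
from picamera import PiCamera
from omxplayer.player import OMXPlayer   # assumption: the omxplayer-wrapper library

# Hardware components, named so they're easy to spot throughout the script.
# The GPIO pin numbers are examples; use the pins you actually wired to.
leftbutton = Button(17)
rightbutton = Button(27)                 # pressing this triggers terrorcam()
pir = MotionSensor(4)                    # spots someone peeking into the dolls house
camera = PiCamera(resolution=(1280, 720))

# The main video file, assigned to a variable so the rest of the script can refer to it
VIDEO_FILE = '/home/pi/whoisterror/terror.mp4'
```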

After this I added simple functions such as the terrorcam() function, which was called whenever the audience pressed the right button. It would use OMXPlayer to pause the video, then display a camera preview from the PiCamera with an overlay image of Terror's haunting eyes staring out of the cracks of the dolls house (don't forget the camera is inside the dolls house, hidden behind a cut-out of Terror). As well as this it uses the terminal to play a sound clip of some unnerving breathing, sleeps for a second, then stops the camera preview and uses OMXPlayer to play the video again. To create the image of Terror's eyes to overlay on the camera preview I simply zoomed into my existing image of Terror and made sure that the image size matched the output size of the camera preview: 1280 x 720.
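A rough sketch of that function might look like this. It assumes the player, camera and pin setup from the snippet above, and that the overlay PNG and the sound clip file names are stand-ins for whatever you actually use:

```python
from time import sleep
from subprocess import call
from PIL import Image

def terrorcam():
    """Pause the video, show Terror peering out through a live camera preview,
    play the breathing clip, then carry on with the video."""
    player.pause()                                # pause the main video (OMXPlayer instance)
    eyes = Image.open('terror_eyes.png')          # overlay image, 1280 x 720 like the preview
    camera.start_preview()
    overlay = camera.add_overlay(eyes.tobytes(), size=eyes.size, layer=3)
    call(['omxplayer', 'breathing.wav'])          # play the sound clip via the terminal
    sleep(1)
    camera.remove_overlay(overlay)
    camera.stop_preview()
    player.play()                                 # resume the main video
```

Hooking it up to the button is then as simple as something like rightbutton.when_pressed = terrorcam.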

Top Tip: If you're struggling to debug, add print statements for every action the script has to take. That way you can see how far the script gets before it breaks and what needs your attention!

Tweeting an image to Twitter with a customised overlay

The internet is a wonderful place, full of open-source code and well-documented tutorials, such as RaspiTV's tutorial on how to tweet an image with overlay text and a logo. I was very happy about this, as I really wanted the image of the person who peeked into the house not only to be tweeted, but to show them within the 'Terror World' they had just seen in the video, as if I had brought them into the narrative itself!

After creating my own @WhoIsTerror Twitter page and setting up an app through http://apps.twitter.com, I was able to retrieve my own Twitter consumer keys and access tokens to use with the code presented in the RaspiTV blog post. I then looked into ImageMagick to understand more about positioning an image on top of the picture taken by the PiCamera. Reading that documentation, I saw that by changing RaspiTV's code to '-gravity center -composite' I could have my image overlaid directly in the centre of the image that had been taken by the PiCamera. All I then needed to make sure of was that my image was the same size as the picture taken by the PiCamera, so that it would match up perfectly.
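Stripped down to just the ImageMagick step, the compositing command is along these lines (the file names here are placeholders, not my actual ones):

```
# Overlay the Terror scene dead-centre on the photo the PiCamera has just taken
convert picam_photo.jpg terror_scene.png -gravity center -composite tweet_me.jpg
```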

The end result turned out looking great! A picture was taken and my scene, a .png image of the street of ‘Terror’ was placed over their faces, putting them right in the centre of the action!


Looping the script

Surprisingly (or unsurprisingly, to those who know better), the most difficult part of the coding process was the part I had assumed would be the easiest: getting the script to loop throughout the day so that I didn't have to keep re-running it each time a new person came to experience the interactive video.

There seemed to be a bug with OMXPlayer being run through Python: when the video finished playing it would kill OMXPlayer entirely, disregarding the variable I had originally set at the top of the script for my video file.

So what the genius Ben Nuttall did was put the variable within the while loop, and change the experience slightly so that the Pi waited for motion and, when triggered, called the tweetpic() function before looping again. This way the video file is always assigned and ready to play!
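In outline, the loop ended up looking something like this (a sketch built on the names used in the snippets above, so treat the details as illustrative rather than the exact code):

```python
while True:
    # Assign the video inside the loop so OMXPlayer always has a fresh file to play
    player = OMXPlayer(VIDEO_FILE)

    pir.wait_for_motion()   # block until someone peeks into the dolls house
    tweetpic()              # photograph them, overlay the Terror scene and tweet it

    player.quit()           # tidy up the old player before going round again
```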

Conclusion: Having a physical object made the whole narrative a lot more immersive and fun for the audience involved! 

If you would like to see how I created the illustrations and how I used projection to bring those illustrations to life around me then check out this video:

Feel free to contact me if you would like to chat about interactive videos, immersive experiences or anything of that ilk! All ideas, questions or proposals welcome 🙂


Creating a 360 interactive music video for YouTube (on a budget)

After days of getting my head around which video settings to use for these super stretched-out 360 videos, how to create one from my digital assets, and what in the world FFmpeg was, all while battling with my computer's constantly overloaded hard drive, I have finally created my very own 4K resolution 360 interactive music video!

Now I'm writing down all of the steps I took to reach this point, for my own benefit as well as yours, as I intend to be playing around with 360 video a LOT more! This is only the beginning of a beautiful journey...

First things first, I don't have the money for a beautiful 360 camera to record my videos on; I know, boo-hoo, poor me! Instead I can use my knowledge of the digital-verse to my advantage and create all the assets I need from there! Ultimately I'd love to play around with creating 360 videos from animations specifically tailored to the medium, but let me not get ahead of myself; first of all I need to know the basics of how to even get a video to render in 360 in the first place!

My weapon of choice was Maya, to create all of my 3D assets and build the little scene that my audience will be able to look around in! Originally I wanted to also use a 360 camera within Maya to render out the 360 video, but I really like what can be done with animating video and adding effects like smoke easily within Unity3D. So instead I decided to export my scene into Unity3D, replace all of the textures and add the necessary tweaks, such as rippling water on top of my underwater scene and lighting, and then animate the main camera slowly around so that it pans the area while the audience has a good look around.


After creating all of this I found a wonderful little tool in the Unity Asset Store called VR Panorama 360 Renderer, which made my life 10x easier! It's not free, but I felt it was worth the £30 or so that I spent. This tool allows you to render out your camera animation as stereo 360 panoramas and 4K videos. I tweaked the settings for high quality and for YouTube (how to set these is all in the Readme file of the VR Panorama 360 Renderer, with built-in settings for the 4K YouTube format), saving everything out as an image sequence. Then came the dreaded wait while my laptop went berserk, making all types of strange noises! I must have done this at least 5 times until I was happy with the final video; some of the images would end up corrupt because of my poor lil' computer, but if you have a much better one this shouldn't be a problem. I just had to make my video a bit shorter than I wanted, but I'll work on that next time!

Anyway, once the image sequence has been rendered it's time to use FFmpeg, and wow is this useful once you get your head around how to use it! If you don't already know how, look up a little on navigating around your computer using the command line; there are loads of useful videos and documentation online.

Here’s the command that I used to do the magic and create my video ‘test.mp4’ (later renamed) from my folder of images that I just made.

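Reconstructed from the explanation below, it looked something along these lines (the start number and the H.264 output flags are illustrative rather than my exact values):

```
# Stitch the numbered frames into a 25fps video called test.mp4
~/ffmpeg -r 25 -f image2 -start_number 645 -i img_%05d.jpg -c:v libx264 -pix_fmt yuv420p test.mp4
```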

Let's start from the beginning: I didn't use MacPorts, as a lot of websites suggest, to install FFmpeg; my computer didn't have enough memory for all that jazz, so I simply got the FFmpeg file, moved it somewhere I could easily find it (my home directory), and then referenced it from there when using it to make the new video.

With this in mind, watch this video, which helped me greatly with how to do this. The number 25 relates to how many frames per second, and image2 simply means that it's looking for image files. The start number is the point within the image sequence that I wanted my video to start from; as I said, my computer corrupted a bunch of frames, so I had to start a little way in and count up through the images from there to create the file. These image files were all named like this: 'img_00645.jpg', 'img_00646.jpg', 'img_00647.jpg' and so on. Because of this we need to tell the computer that the image numbers are 5 digits wide, counting up as integers; we do this by putting '%05d' in the image name, so overall it is 'img_%05d.jpg'. The last part simply names the output video file.

This would have been easy peasy if I hadn't wanted to add audio along with it. Now, all I have to edit video with is iMovie (have I mentioned that I have little money yet?), so my initial thinking was to simply load the new video into iMovie, add my song over the top and then share it out as a new video. NOPE, NOPE, NOPE! iMovie automatically crops the video, meaning that it will still work as a 360 video, but there will be a huge line where the video can't stitch together correctly because it's been cropped. Now we have a problem...

So, back to my learning of FFmpeg: as it turns out, it can also add audio to a video. Perfect! Below is what I used to add the .mp3 file to the video file. For some odd reason this only seemed to work when I output a .mov file, but for the YouTube 360 Python script to work (which is the LAST STEP, don't worry, you haven't missed anything) we need a .mp4 format. *Sigh* more work... It's fine, we'll get there eventually. This command didn't work without using '-map', which states which source is the video and which is the audio. More on that here.

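In shape it was something like this (the file names are placeholders; the important parts are the two inputs and the -map options picking the video from one and the audio from the other):

```
# Combine the video from test.mp4 with the audio from song.mp3 into a .mov
~/ffmpeg -i test.mp4 -i song.mp3 -map 0:v -map 1:a -c:v copy -c:a aac video_with_audio.mov
```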

I'd like to stress that I know there are some programs I could have used to convert these, and I did try several of them in an attempt to make my life easier, but all they did was lower the quality considerably, which actually made my life harder. By getting my head around FFmpeg I was able to keep the quality high, which is very important when uploading 360 videos: because the footage is so wide and gets wrapped around you, it will look lower quality anyway, so we want to upload the video in as high a quality as possible so that it's still enjoyable to watch!

With the audio and video merged into a .mov file, all that was left to do was to export it again as an .mp4 file, using FFmpeg once more:

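As far as I can tell this can be a straight re-wrap with no re-encoding, which is why the quality survives; something along the lines of:

```
# Re-wrap the .mov into an .mp4 container without re-encoding either stream
~/ffmpeg -i video_with_audio.mov -c:v copy -c:a copy final.mp4
```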

Gosh, nearly there! And the quality is still amazing, which baffles me! Now I had to actually inject the metadata, via a Python script, that would make YouTube see this video as a 360 one. YouTube have a video on how to do this here. Watch the video and then follow the GitHub link; you'll see the layout is completely different from the one in the original video that's meant to help you understand it. Oh, the joys of the internet. So instead of downloading the latest version, go into the releases tab and download the previous version from before it got all fancy and updated; it still works fine.

Once this is downloaded you can follow the video step by step, putting the Python script in your home directory and running it on the video that you want YouTube to treat as 360:

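For the script version of the tool, the command is roughly this (the file names are placeholders; -i tells it to inject the 360 metadata into a new copy of the video):

```
python spatialmedia -i final.mp4 final_360.mp4
```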

Then upload and voila! It should work! If it doesn't, be patient; mine took about 10 minutes after processing to actually become 360. All that time I was scratching my head and ready to cry, but all I had to do was be patient! The video even came out in 4K resolution, after all those changes!


EDIT

Okay, so I noticed that my song was a tad longer than the 360 video, which was fine because the video just stopped playing while my song trailed off to an end, so it didn't seem too abrupt or weird... But, being the silly perfectionist I am, I just needed to edit the video. I wanted a fancy fade-in and a nice cut away at the end to round off the song. So I opened it back up in iMovie hoping to find a solution and BOOM! I realised that I could adjust the cropping... WHY DID I NOT THINK OF THAT IN THE FIRST PLACE?! The whole reason I didn't use iMovie was that it cropped the video, but I could just manually fit the whole video in the frame. What a numpty.

So after editing it how I liked, I exported it at the best quality I could (high quality and 4K), took the same steps to apply the Python script to the .mp4 file and uploaded it to YouTube to see which video came out more crisp and clear.

And guess what: although the iMovie video did come out as 4K as well, it didn't look anywhere near as crisp as my original video; even when I watched them in QuickTime Player before uploading, I could tell the difference. You can have a look at the iMovie version, but I have decided not to make it public. On top of this you can see a black circle where I fitted the video into the iMovie frame, so I guess it wasn't so simple after all!

Overall I'm very pleased with the outcome of my first 360 video. It's well put together, showing no lines or seam edges, and it's probably the highest quality video on my entire YouTube channel! Now I just need to find a way to edit the video and have full control over the audio within the 360 video. I also want to experiment with putting videos inside my 3D scenes, so it looks like someone's perspective of watching a television screen.

It hasn't been easy, but it sure has been fun and I can't wait to make more 360 videos! If you would like me to make one for you then do get in touch, or if you want to come up with ideas about what we can create with this awesome medium then let's chat!


Open Frameworks Workshop with Joel Lewis [RPCT]

One of the major perks of being a Raspberry Pi Creative Technologist is the workshop weekends we attend, where we get to go to a totally different city and meet inspiring and knowledgeable individuals who can help us on our journey towards our final exhibition and showing off our digital projects!

Last workshop we visited London to spend the weekend with Joel Lewis at Hellicar & Lewis, a craft, design and technology studio that specialises in engagement. Here, Joel opened our minds to the wonder that is openFrameworks!


openFrameworks is 'an open source C++ toolkit for creative coding'. Yes, creative coding, that's a thing! At first glance oF can seem quite intimidating with its countless libraries, add-ons and pages of documentation. For us, taking a look through it all, we couldn't help but get immediately excited about everything that oF had to offer, from projection mapping and facial recognition to graphic rendering, animation and so much more! But without some guidance it's easy to become overwhelmed by it all.

Lucky for us, Joel Lewis is an openFrameworks wizard and quickly squashed any negative or fearful thoughts we may initially have had by showing us some of the inspiring work that he and his team at Hellicar & Lewis have produced using this framework. They have created work for organisations such as Greenpeace, with an interactive arctic dome installation, and commercial pieces for brands such as Nike, with an interactive live broadcast for Nike's 'Festival of Feel'. However, what impressed me the most was how they had used what the framework has to offer to create technology that makes people's lives better; one major piece of work being Somability, a series of applications built around interactions such as visual amplification and rhythmic response, which together promote expressive movement and collaboration among people with profound and multiple learning difficulties.

Joel explained how using openFrameworks for his projects is like putting a puzzle together. Instead of getting bogged down in the long-winded 'codey' and mathematical parts of the problem, all that has to be done is to search for the different pieces of functionality you wish to use, be that particle manipulation, beat detection, whatever you like! openFrameworks is likely to have the code already there for you to use, or at least something close to what you need. Then all that's left for you to do is the creative process: fitting the puzzle pieces together to create something new and utterly awesome! Of course, these puzzle pieces might not always fit snugly; the code may need to be tweaked to suit your project's needs, so Joel led us through the simple steps of changing an application that had already been made. From importing libraries to manipulating the design and display of the application relative to the mouse movement, we quickly realised that there was nothing to be intimidated by with this framework; it was a matter of tweaking code that was already readily available to us and, above all, having fun!

Joel also emphasised his love for the open source community during the workshop. Gone are the days when people wanted to hide their work and keep their findings to themselves so as to get ahead of their peers; today's world is all about being open and sharing with the community! Every library and add-on within oF has been created by somebody and shared freely, asking for nothing in return. That might sound crazy, but in reality it's actually very clever! Not only do you help others (such as myself) learn how to code by letting them look at examples and tweak what's already there to suit their own needs, you also get the benefit of the community building upon your initial piece of code, fixing bugs or even making it better than you could have yourself. Heck, somebody might even see your open source code and offer you a job off the back of it! An open source community is also a friendly one, where people actually want to help others instead of simply focusing on their own projects, and so the openFrameworks forum is always full of people willing to pass on their knowledge and help wherever they can; which is great news for us newbs!


After the weekend at Hellicar & Lewis I'm left feeling very excited about what openFrameworks and the open source community surrounding it have to offer, and I can't wait to start piecing together my own puzzle!


Brighton Mini Maker Faire – Emoti

Since developing Emoti at the Art Hackathon I couldn't help but look around for events that I believed could help get our little project out into the public eye. What better event than a Maker Faire?! (Especially one in Brighton, one of my favourite seaside cities in the UK.)

ASSEMBLE EMOTI DREAM TEAM!

(Photo: the Emoti team)

Only a few of our original team could attend the Maker Faire: (from left to right) Bawar Jalal, Milton De Paula, myself and Katherine Hudson. Together we started brainstorming how we could show Emoti in all of its glory at Brighton Mini Maker Faire! Starting with a web page and a concept of using Emoti as a way to showcase hackathons and the power of what can be created by merging techies and designers, we signed up as Makers and awaited our confirmation.

The email came back a week or so later: EMOTI WAS IN! We were going to be Makers with our own table space to show off our project and do as we pleased; this was to be my very first showcase of Emoti, or of anything, as a matter of fact!

As the Maker Faire came closer we discussed how best to display our visual installation, and all agreed that a dark environment was needed for the audience to experience Emoti's message and colourful beauty fully. Because of the audio we have incorporated into the installation, it also works best with headphones and in an intimate, enclosed space where the viewer can't be disturbed by their surroundings. We finally decided that the most feasible and effective solution would be to place Emoti in a large rectangular box draped in black cloth for the audience to peer into; this would also create some curiosity about what is inside the box, which would hopefully attract even more people to come and check Emoti out. (We're all suckers for the unknown!) Luckily, Katherine's housemate, James Sargent, is a carpenter who very kindly offered to help us with this.

Weeks of perfecting the site, tweaking Emoti and creating personalised business cards went by until suddenly the Brighton Mini Maker Faire was only a couple of days away and I was on my way to London to get together with our dream team and start the set-up for our installation.

The first day was spent collecting all of the equipment needed for the creation of the Emoti box: wood, black cloth, glue, nails, etc. We then tried our hand at carpentry, under the watchful eye of a professional of course (James Sargent). Luckily we finished the box with all fingers intact and smiles still on our faces. Atop the box we fitted Katherine's newly made Emoti logo, which gave the whole structure a much more polished finish that we could all be proud of.

Enough rambling, how was the event?!

Being in Brighton was amazing. Another Emoti member, Milton, and I arrived the day before the main event to set up the Raspberry Pi, placing it within the HDMIPi and putting it all together into our brand new Emoti box. This was also a great opportunity to have a sneak peek at what was in store for the main event! Looking around at all of the tech, twinkling lights and crafts I couldn't help but feel excitement roaring inside of me, especially when free pizza and beer were announced!

Sleeping in a hostel that night led to us meeting even more interesting people, who had travelled from around the world and fallen in love with the small seaside city of Brighton. I can't blame them, it really is a beautiful place to be! After a night of talking about their adventurous travels and explaining Maker Faires, technology and the Raspberry Pi to them, we fell asleep, ready to face the fun-packed day of Brighton Mini Maker Faire!

However, the next morning, DISASTER STRIKES! Bugs have appeared from seemingly nowhere, yet the doors are opening to the public! In a frantic frenzy we code like mad men (and women), hoping to get Emoti up and ready for the event as soon as possible. But of course we don't want to confuse people, so what should we do? Reference Se7en, of course! With a "What's in the box? – Come back at 11" sign on top of the box we started to spark some curiosity and (hopefully) distracted the public from the real problems that we were encountering behind the scenes.

It wasn't long until Emoti was back up and running, ready for people to marvel at the wonderful visualisation of the emotional state of the twitter-verse! It was so exciting and exhilarating watching people become drawn to our black box out of curiosity and take a peek inside. Those who hadn't read what it was about became confused very quickly by the blocks of colour dancing around the screen and the different audio clips clashing together, understandably so, and I was more than happy to explain to them exactly what our sculpture was showing them: how it was using real-time tweets to represent the overall emotions of Twitter through colour and audio. Watching them understand the concept and become excited about it themselves was the best part; each person who came out of the experience explained how overwhelming it could be to become immersed within the box, especially once they understood what was being shown to them.

Many people had ideas of their own for Emoti, such as using it in the news and only pulling data from people tweeting about a certain topic, to try to gauge how the majority of the public felt about that topic or event. The Emoti web page could replace the background behind the newsreader as a visual representation of the emotional state of the world around the most recent news stories. What a genius idea that is!

I now completely understand why people love to bang on about how amazing open source is! When you open your creations and ideas up to the world it can spark other people's creativity, having them expand upon the initial idea or project, leading to people working together to make something even more incredible than the original creator could ever have imagined!


Overall my experience of Brighton Mini Maker Faire was an overwhelmingly positive one: not only did I get to showcase Emoti for the first time, I also got to check out so many diverse Maker projects and workshops, from technology to hand-made arts and crafts! Everyone I encountered there was full of enthusiasm to share their knowledge, and excited to learn from those around them as well. The Maker community is simply a bunch of friendly, creative and intelligent people, most of whom are essentially big kids who can't help but tinker with things and make cool stuff, even if it has no purpose, and that's what makes them all so much fun to be around.

Watch my video on Brighton Maker Faire for more information:


Raspberry Pi Creative Technologist: Origins!


For years I've felt trapped in an endless battle over where I want to be in my 'career', conflicted by my twin passions for narrative and for using technology to be endlessly creative with the stories I come up with. It seemed the war between what society deems 'creativity' and what I perceived it to be would never end.

In education I used my initiative to adapt my 'media course' to include games development and web development units along with the usual film and radio ones, and left hoping to mould my own future the way that I foresaw it. Instead, I fell into a trap of corporate ideals within the working world, forced to choose between the forward-thinking robots of the 'tech' world and the pitches-and-brainstorming world of the 'creatives', with little room for anything in between.


This is when I stumbled across the Creative Technologist programme by Raspberry Pi. The programme advertised itself as being 'focused on supporting and inspiring young people who are interested in creative uses of technology.' My jaw dropped, excitement surging as I frantically applied to the best of my ability (a total of 3 times by accident; perhaps a bit too eager).

Tadaarrr!

Thus concludes the origins of how I became a Creative Technologist!


As a Creative Technologist I'm able to meet inspiring people from the Raspberry Pi community, from those within the Foundation to those attending Raspberry Jams and other events and meetups; draw knowledge from successful individuals at companies such as Hellicar & Lewis, FutureEverything, Pimoroni and more; and gain experience by getting involved in hackathons, as well as independently learning about technology and what can be achieved with it.

All the while I’ll be working towards a top secret project which will be exhibited by the end of the programme, around April of next year.

Throughout this time I’ll be updating this blog with my experiences throughout the Creative Technologist programme; where I go, who I meet and what I learn. Everything! Apart from my project…

My project remains TOP SECRET!


Birthday Live Stream and Twitter Wall


May the 15th is a very special day: my birthday! And like most people I wanted to do something pretty awesome for it. But I rejected the usual traditions of booze, parties and even cake to try something totally new on my YouTube channel and integrate it with my new role as a Raspberry Pi Creative Technologist.

I wanted to spend my birthday with the Internet! (Yes, I know how lame that sounds but no, I have no regrets)

The Idea

I came up with this idea a couple of days before my actual birthday during one of my 'creative naps' (trust me, napping is extremely useful for helping creative juices flow; it's not just laziness, I swear). I realised that the people closest to me were scattered all around the world, from different parts of the UK all the way to the USA, and wished there was a way to share my birthday with them all. But of course there is: this is the digital age, after all!

So I set about investigating how to set up a live stream through YouTube. My idea seemed simple: live stream for as long as I could, giving my friends and viewers from around the world a chance to take part in the online event in their own time zones. The only problem was THE LAG! It was horrendous; I would say there was about a 10 second delay at the best of times. Granted, this probably had a lot to do with the fact that I live in a house with 3 other equally Internet-obsessed individuals, but still, there was no way that I could interact with people like this. That's where the Raspberry Pi comes in! I could set up a live Twitter feed, pulling tweets with #YagmanXBDay and seeing them in real time, using these tweets as a way to interact with my viewers more efficiently, as well as having them fill gaps in my very long stream so that I could take small breaks. Perfect!

The Tech

The Twitter feed was coded based on a BitPi.co tutorial, which uses NodeJS and ttezel's twit Node Twitter API to pull tweets containing certain keywords into the terminal in real time.

I then made my own tweaks to the code, using the node module colors.js to decorate the terminal and make the tweets appear in a more presentable way. I randomised the colour of each tweet, kept the Twitter handles in a consistent colour to emphasise who each tweet was sent from, and added break points between tweets to make them easy to tell apart. The code can be found on my GitHub in the Twitter-Feed git repo.

By running this on the Raspberry Pi I was then able to SSH into it from my Mac and dedicate an external screen to the terminal window hosting the Twitter feed. This way, when using Google Hangouts to live stream the event, I could simply use the screen share option to switch to the open terminal window and show viewers their tweets.

The Outcome

The response and engagement that I received from the live stream itself far exceeded my expectations!

The Twitter feed stayed active throughout the full 6+ hours of my live stream; even those who couldn't watch the video stream were getting involved by tweeting with the hashtag #YagmanXBDay, some simply wishing me a happy birthday, others helping to spread the live stream further or commending my achievement of what is possibly the 'Nerdiest Birthday Ever'.

Viewers of the stream used the twitter feed to interact more directly with me while I entertained them through singing, attempting to rap, answering questions and even reading them a bed time story before finally ending the stream.

This birthday I came away with a wider knowledge of technology and of how much the Raspberry Pi can handle, greatly improved improvisation skills from coming up with creative ideas on the spot to keep my viewers engaged for over 6 hours, and ultimately proof to myself that, even with limited time, if I put my mind to something I really can make it happen. (Anyone can!)

CHEESE ALERT

Thank you to everyone who got involved in my Birthday Live Stream, either by watching, tweeting or even showing interest in it now! I’m no fool and I know that without the kind and awesome people who show interest in my content and projects, they simply wouldn’t amount to anything. It’s with engagement and interaction that spontaneous ideas such as this one become a reality and I couldn’t do it without the wonderful online community that I am so grateful to be a part of.

The full stream is on my YouTube for all to enjoy!


Art Hackathon – Emoti


Update: Check out the Emoti web page!

I've always heard about how awesome hackathons can be; they're a chance to surround yourself with intelligent people who share the same interests, come up with inspiring ideas together and become engrossed in a project, with everyone chipping in to turn a concept into reality over one weekend.


But I’m going to be perfectly honest with you here; however awesome this sounds, I can’t help but feel a tad intimidated by it all. There’s still so much I feel I have yet to learn and I always worry about how much of an asset (or a nuisance) I would be.

So when I saw an opportunity to go to an Art Hackathon, which aspired to mix teams of different skill sets and types, I knew that I had to attend, and I'm so glad I did! With the Hackathon holding presentations by many talented people, including Joel Lewis, Di Mainstone and Nick Rothwell, as well as tables full of various tech and art supplies, there were no limits to the amount of creativity we could muster!

All of this inspired a creation that managed to win 2nd place in the People's Choice award, and I can proudly say that I was a part of its development:

Emoti – Visualising our Emotions

Emoti shows the emotional state of the world through combined colour and audio, resulting in a beautifully chaotic representation of how the world, or at least the twitterverse, is feeling.

Using Twitter Widgets, our team was able to pull certain keywords from tweets being posted in real time and assign them to different emotion types, which meant we had constantly updated data on how people on Twitter were feeling through these emotion-related keywords. The emotions we assigned them to were: Happy, Sad, Surprised, Afraid and Angry.

From this data we then created a simple HTML web page with 5 divs, or blocks, of colour relating to the different emotion states. These would constantly change width depending on the data being collected from the tweets, giving a visual representation of how many people were tweeting under each emotion:

  • Green = Happy
  • Blue = Sad
  • Yellow = Surprised
  • Pink = Afraid
  • Red = Angry

To make the experience of witnessing how the world feels, and how frequently these emotions change, more immersive, the visual representation is also accompanied by audio. We chose five audio tracks, one to depict each emotion, added them into the web page using the HTML5 audio tag, and adjusted their volume depending on the emotion-based Twitter data with some JavaScript wizardry. This ended up with the clashing music types seemingly battling against each other, reflecting how hectic the live emotional states were and how rapidly they would change at random: one moment showing pure happiness, the next utter anger.

This was an Art Hackathon, don't forget, so of course we had to present this data in a beautiful and intriguing way. And what's more intriguing than creating the illusion of 3D colour-changing ripples?!

For this effect, the designers in our team laser-cut clear plastic to create the individual ripples and slotted them into a black board. I decided this would be the perfect opportunity to whip out my Raspberry Pi! We ran the web page through the Pi and hooked it up to the HDMIPi, providing a bright screen for our structure to be placed onto, so that the moving coloured blocks below would shine up onto the clear plastic and give the illusion of a 3D object.

Finally, the structure was put together in a dark, enclosed space, and the end product came to life, completely exceeding my expectations! Colours danced gracefully across the ripples, making us forget that there was even a web page below; it was easy to get lost in the entrancing patterns of movement that the object seemed to create. As soon as you immerse yourself in the full experience, with the audio as well as the visuals, it becomes a little overwhelming. Watching the colours is one thing, but hearing the clashes of audio really brings home the message that this is how people from around the world are feeling right now.

It’s both a marvel and a mess all at the same time; both beautiful and chaotic. Just like the emotions we feel and the complexity behind them. 

Yes, it’s open source! Find the (somewhat messy) code here: https://github.com/itomblack/emotion-twitter

What I Gained…

Aside from the obvious (an awesome project, a better understanding of how to work in a team and improved coding skills), I came away from the Hackathon feeling much more positive about what I, as an individual, can achieve. I may not have been the most skilled coder in the room, but I was still able to have meaningful input on the project, both creatively and through my development skills, which leaves me wondering what I was so worried about in the first place!

As well as this I’m so grateful to have had the pleasure of meeting many creative and genuinely lovely people. It was so interesting to see all of the various projects that everyone had made, each one entirely unique and fascinating in its own right.

Thank you to the people behind the Art Hackathon event and those intelligent folk within the Emoti Dream Team who helped bring it to life:

P.S. This is my very first blog, how am I doing? Let me know! (If you want to… No pressure…)
