Last year I decided that I wasn’t doing anything interesting with my collection of littleBits components and modules, so I finally gave them up and sold them. I hope that they’ve all gone to good homes.
They were fun, and very useful at times. I did enjoy them, but I think I had got to a point where I’d stopped doing anything creative with them.
Then there's the littleBits bitLab, which has apparently closed. I'd always wondered how sustainable it would be, and now there's an answer to that question. It's a shame: there were some very nice projects in there, but obviously not enough money to make it work.
For a long time I’ve been thinking about the idea of building a hardware synth that reacts to the virtual world, more specifically, to being tweeted at. To that end I got myself a littleBits CloudBit, which works quite well, although it doesn’t provide as much data as I’d like. That’s something I need to overcome with a little bit of ingenuity rather than just complain about.
So I’m starting to build the synth and get it to respond to external stimuli. The goal is to create a little system that reacts to a Twitter message, creates a sequence based on that message, and somehow records and uploads the result somewhere. That’s the plan, anyway. Who knows how it’ll work out.
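As a rough illustration of the sequence-generation step, here's a minimal sketch of how a tweet's text could be turned into a note sequence. The pentatonic scale and the character-to-note mapping are purely my own illustrative assumptions, not anything from littleBits; a real build would still need a controller to push values out to the CloudBit.

```python
# Hypothetical sketch: derive a short note sequence from a tweet's text.
# The scale and mapping are illustrative assumptions, not littleBits API calls.

PENTATONIC = [60, 62, 64, 67, 69]  # C major pentatonic, as MIDI note numbers


def tweet_to_sequence(text, length=8):
    """Map each alphanumeric character of the tweet onto a pentatonic note,
    cycling through the derived notes to fill a fixed number of steps."""
    chars = [c for c in text.lower() if c.isalnum()]
    if not chars:
        return []
    notes = [PENTATONIC[ord(c) % len(PENTATONIC)] for c in chars]
    # Repeat the derived notes until we have `length` steps.
    return [notes[i % len(notes)] for i in range(length)]


print(tweet_to_sequence("Hello synth!"))
```

The nice thing about keying off character codes is that the same tweet always produces the same little melody, so each message gets its own recognisable sequence.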
I think this could be amazing, and that littleBits should really be doing it themselves. They should also make it part of their iOS app. Anyway, if you’re interested, take a look at the GitHub page and check the roadmap.