This post is coming to you live from Maker Faire Trondheim being held in the town square here in Trondheim, Norway, all weekend.
Jon Haavie and the “Google Cardboard”-style viewer in action
What happens when you mix a “Google Cardboard”-style viewer—called LINS, Swedish for lens—with a GoPro spherical camera mount? Well, something like this. I talked to Erik Thorstensson—from Creatables in Göteborg—who was here showing off a prototype viewer created by student Alexander Osika.
Erik Thorstensson talking about the story behind the “Google Cardboard”-style viewer
Afterwards, I followed up with Alexander about the project.
How did you get started?
My brother Anton Osika and I started this project earlier this summer by first 3D-printing some of the cool Open Source designs for a virtual reality headset from OpenDive. After seeing the potential the technology had, we started making prototypes out of cardboard, and we felt we were on the right track when Google announced their Cardboard project, which was very similar to what we had in mind. But we also saw that a lot of people regarded the Cardboard more as a joke than a real product, and realized that for this to become commercial, we would probably have to make it in plastic.
What was the driver behind the project?
We also wanted to take advantage of the fact that you always have your smartphone with you, and that a virtual reality headset should not be so bulky and fragile that you just can’t take it with you. We had just started prototyping in polypropylene plastic sheets, with die cutting as the expected way of manufacturing, when we came in contact with Creatables and got a lot of great support and advice.
How far have you got with the prototype?
The prototypes we have now look very promising; the design is foldable and takes almost no space at all, takes only seconds to set up with your smartphone, is very durable, and will have a manufacturing cost almost as low as a cardboard version.
Where are you heading with the project?
We really hope that this product will help virtual reality technology become both cheaper and more accessible, as we strongly believe that both virtual and augmented reality will play a big role in the future of technology. This is also why there has been no question that this will be an Open Source design, and we hope that the first distributors will be science centers holding “build your own” workshops.
The Trondheim Maker Faire is a two-day faire held in the Trondheim town square. It opened yesterday, and is open again today between 10am and 4pm. It is free to attend.
Crux, knitted wool mounted on board, 1994
Eleanor Kent, an artist who innovated methods of making art from new technologies, died recently at the age of 83. The San Francisco native started drawing and painting seriously in the 1950s and continued to branch out into other media over the years, such as color xerox, computer graphics, and even EL wire, as these technologies emerged.
From the artist’s website:
Working with Bay Area Figurative masters helped Kent form a solid art foundation, which she used to explore other mediums and forms of expression in the following decades. In the 1970s, she painted on fabric and t-shirts and used color copiers to create prints. Throughout the 1980s, Kent explored developing computer technology and graphic systems as art tools and helped found Ylem, a tech art group. During the ‘90s, Kent started knitting the fractals and other mathematical images she saw on computers, and today crochets body jewelry using electro-luminescent wire, which surrounds the wearer with light. She paints and continues to work for the creative use of technology and a sharing of information as a way of peacefully exploring our existence.
Not only did she make many extraordinary works, she also helped develop the concept of exploring new technologies as a means of artistic expression, which has come to define so much contemporary art production.
Tahoe Water, color xerox, 1981
New Suns, cibachrome print from Apple IIe, 1983
Spiral Fractal, knitted wool mounted on board, 1988
Magic Carpet, knitted wool, 1993
Rose Coral, hyperbolic crochet e-l wire, 2007
A public celebration of her life will be held at 5 p.m. Aug. 7 at SOMArts in San Francisco.
[via Prosthetic Knowledge]
Looking for creative ways to capture some action with your GoPro camera? Take a gander at what these industrial design students came up with during a workshop run by Dutch designer Roel Wouters of Moniker at ECAL University of Art and Design in Switzerland, in this riveting video. The brief for the workshop was simply to “build an apparatus that produces videos the world has never seen before.”
Seeing this video really reminds me of how important good documentation is when making a project to share on the internet. Sometimes just thinking about how to document your project can be more important than the results of the project itself.
[via Man Bartlett]
If you or your family and/or roommates are annoyed by a constantly humming GPU fan, why not consider liquid cooling your computer? Sure, simple air cooling is (was?) good enough to cool some small cars, but if you want something truly unique for your dual-GPU rig, few things will set it apart like a custom cooling system.
If you don’t know where to start, Carlos wrote in with his excellent set of instructions, linked above. Although there is definitely some ingenuity involved in this type of setup, it may not be quite as hard as you think. Most of the cooling components used are available off-the-shelf, so you won’t need “exotic” tools like a 3D printer or CNC router.
One of the more delicate parts of the operation is stripping the GPUs of their cooling fans and heat sinks. You’ll need to carefully clean the old thermal paste off each GPU in order to properly seat the water blocks, which carry cooling water in and out, taking the excess thermal energy with it.
According to the project’s home page, the build took 12 hours to complete, plus a couple of months of research. Reportedly, temperatures dropped dramatically, and the formerly constantly-running fans are now nearly silent.
The Commodore 64 may be long out of production, but it still lives in the hearts of many enthusiasts. Some people might write new games for this computer, but YouTuber “Staring Lizzard” decided to instead design and build a stand-alone emulator.
Physically speaking, this is really a work of art. The board was designed as a 6-layer PCB, about which more information can be found here. It has the same general dimensions as a Raspberry Pi in order to take advantage of its small size and readily-available cases. The display is a 7-inch TFT screen with a 800×480 resolution. The whole thing is encased in a beautiful clear box that looks like it could have come from an industrial design firm.
Naturally, a large amount of software work also went into having this board emulate the C64. You can find the details here. According to the write-up, emulation isn’t entirely perfect and took some trial and error to get working acceptably. Regardless, this is an amazing build for a hobby project.
As seen in the video, it’s able to run quite a few original Commodore games! I hope that someone else will get inspiration from this project. Since it should dimensionally be able to accommodate a Raspberry Pi, it would make a great chassis for projects involving that board.
Now that I’ve gotten your attention with these hypnotically looped GIFs, I can tell you how to make your own seamless GIFs with a tool that automatically finds perfect loops in videos. It’s called Loop Findr, and it was created by Electrical and Computer Engineering student Collin Burger in openFrameworks.
Since their creation in 1987, animated GIFs have become one of the most popular means of expression on the Internet. They have evolved into their own artistic medium due to their ability to capture a particular feeling and the format’s portable nature. Loop Findr seeks to usher in a new era of seamless GIFs created from loops found in the videos that populate the Internet. Loop Findr is a tool that automatically finds these loops so users can turn them into GIFs that can then be shared all over the Web.
Download Loop Findr and see more examples of seamless GIFs here.
[via Prosthetic Knowledge]
What do you do when you have a small spare organic LED display? If you’re computer engineering student Jared Sanson, you make a watch out of it, designing nearly every aspect of it from scratch.
The design started at the circuit level; after that, a PCB was laid out in Altium, with which Jared had only limited experience. Once the PCB and components arrived there were a few initial issues, but correcting your mistakes is always part of engineering something new.
Once the hardware was functional, quite a bit of software work had to be done. In order to get everything running, including the firmware, graphics engine, and several other “details,” Jared used the C, C#, and Python languages. He humbly remarks on his blog that “it’s taken me a lot of work to get this far,” and I can only imagine how many hours were put into this project.
To finish things off, he considered using a 3D printed case, but decided to go with an aluminum case designed for the iPod Nano. After a little modification, it looks fantastic, but given the amount of detail put into the electronics and programming, I’m almost surprised that he didn’t make a mould and cast his own!
Slideshow pictures from the above-linked page: OLED watch (1), OLED watch (2)
This week, July 14-19 2014, we’re exploring wearable electronics of all kinds on Make! If it is electronic and belongs on your body, we’d love to hear about it! You can find all of our wearable articles by going here.
If you like taking selfies that render your likeness completely unrecognizable, then you may be just the assimilationist contrarian who will enjoy making hypnotically glitchy webcam selfies with this experimental project by Adam Ferriss called Gush.
The effect works by creating a ‘flow’ image from a comparison of the current frame and previous frame. The flow looks like a colored contour image when there is movement in the scene. The flow output is plugged into another shader that runs in a feedback loop, continually blending new frames on top of old ones.
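The heart of the effect is that feedback loop: each output frame is a blend of the newest camera frame with the previously accumulated output, so old frames fade gradually instead of vanishing. A minimal pure-Python sketch of just that blending step (the real project runs this per-pixel on the GPU as a shader; the function name, blend factor, and one-pixel “frames” here are illustrative assumptions, not the project’s code):

```python
def blend(new_frame, prev_output, alpha=0.3):
    """Blend the newest frame on top of the accumulated output.

    Each pixel of the result is alpha * new + (1 - alpha) * old,
    so earlier frames decay exponentially rather than cutting off.
    """
    return [alpha * n + (1 - alpha) * o
            for n, o in zip(new_frame, prev_output)]

# Feed a short sequence of single-pixel "frames" through the loop:
# one bright flash, then darkness.
frames = [1.0, 0.0, 0.0, 0.0]
output = 0.0
trail = []
for f in frames:
    output = blend([f], [output])[0]
    trail.append(round(output, 3))

print(trail)  # the flash lingers and fades over several frames
```

The `alpha` parameter controls how quickly the trail dies away: values near 1 show mostly the live feed, while small values let ghosts of past motion pile up, which is where the glitchy, smeared look comes from.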
[via the creator’s project]
Ron Evans and Adrian Zankich talking about Cylon.js on the Make: Electronics Stage at the 2014 Bay Area Maker Faire
There was a time when turning an LED on and off using a microcontroller took a week, and detailed knowledge of the microcontroller. But that was before Arduino. Yet even with the Arduino, people sometimes found it hard to hack together the things they wanted to do, especially anything involving networks, something traditionally seen as hard on an Arduino.
Despite that, the Arduino, and later the Raspberry Pi, made building things—robots, for instance—much easier, primarily due to the huge communities that grew up around them. It is those communities that have led the Arduino and the Raspberry Pi to dominate the landscape. If you had a problem, someone had probably already had the same problem and solved it for you.
So why build Cylon.js in JavaScript?
There are a couple different reasons. One is that the JS community are very much trail-blazers in terms of exploring new technologies. Another is the influence of my friend Chris Williams—the main organiser of JSConf and the newer RobotsConf—who has been a key player in helping introduce the JS community to hardware hacking.
The ubiquity of JS has made it a lot easier for people to program on different kinds of JS-enabled devices, such as the BeagleBone Black and Raspberry Pi. Working in a higher-level language such as JS lets devs spend less time just trying to get things to work, and more time actually making something useful.
The platforms you support seem to be a mix of UI elements, pre-built hardware, software, and boards. How do they interact?
We call it “full-stack robotics,” and we have adopted several different software design patterns to integrate different layers together in a seamless way. Similar to how web developers can switch between different database engines, we allow you to connect to different devices, and even switch from one platform to another with a minimal number of code changes. We also support “Test-Driven Robotics” to allow devs to write automated tests before writing code on the actual hardware.
Cylon.js also supports many different kinds of communication with devices, such as serial or TCP/UDP. In the case of the Arduino we communicate using the Firmata protocol, and in the case of the Digispark we support a protocol named Littlewire created by the brilliant Jenna Fox that runs on even smaller micro-controllers such as the Digispark.
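Firmata, mentioned above, is a simple serial protocol: the host sends small byte messages that standard firmware on the Arduino interprets, so the host never needs board-specific code. As a hedged illustration of what a host library does under the hood, here is a Python sketch that encodes a Firmata digital-port message following the published message layout (the function name and the example pin are my own; this assumes every other pin on the port is low):

```python
DIGITAL_MESSAGE = 0x90  # Firmata command: write/report a digital port


def digital_write_message(pin, value):
    """Encode a Firmata digital-port message for a single pin.

    Pins are grouped into 8-pin ports. The command byte carries the
    port number in its low nibble, and the pin states follow as two
    7-bit payload bytes (Firmata keeps the high bit of data bytes clear).
    """
    port = pin // 8
    mask = (1 << (pin % 8)) if value else 0
    return bytes([DIGITAL_MESSAGE | port, mask & 0x7F, (mask >> 7) & 0x7F])


# "Set pin 13 high": pin 13 is bit 5 of port 1.
print(digital_write_message(13, 1).hex())
```

Three bytes over a serial line is all it takes to flip an LED, which is why the same approach scales down to tiny boards like the Digispark using the lighter Littlewire protocol mentioned above.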
You seem to run a lot of workshops to promote the framework, tell me how those go? Why do you run them?
We have had an amazing response to the robot hacking workshops that we’ve been running at conferences all over the world.
From people who are already makers, to those who have never had a chance to program any hardware at all, we have seen a really high level of enthusiasm and happiness. We try to incorporate the artistic and creative side as well. For example, at our recent workshops we show people how to make wearable controllers out of Popsicle sticks and conductive foil to drive around Sphero robots.
Where do you see Cylon.js heading?
We are starting to see a very active community growing. At JSConf, we had a group of people that built “NodeRockets” using Cylon.js, the Raspberry Pi, and Arduino, which they then launched into the sky using compressed air. They had telemetry readings, deployed their parachutes, and everything all using Cylon.js. No surprise that Cylon.js is demonstrating space superiority, of course!
We are adding new hardware support for more devices, some of which are not released, so we cannot talk about them yet—but more on that in the upcoming months. Our company is the “software company that makes hardware companies look good,” so we’re here to help out both as open source contributors, as well as professionals when we’re needed.
With the ability to hack hardware in their native language, I think we’re going to see a lot more hardware hacking from web developers.
Robert Wessels with a hacked Hexbug and Launchpad-enabled remote control.
The Texas Instruments MSP430 is similar to the Atmel ATmega microcontroller; however, there are some differences, including a very low price and some interesting refinements for low power consumption.
If you want to get your hands on one, the easiest way is to pick up a TI Launchpad developer board. However, the big problem—at least until recently—for both the Launchpad and the MSP430 itself was the programming environment. For a generation of makers used to the Arduino, the Eclipse-based development environment of the MSP430 was overly complicated and hard to use.
This was solved with the arrival of Energia. With cross platform support—for Windows, OS X and Linux—just like the Arduino environment itself, it brings the Wiring and Arduino frameworks to the MSP430, and the TI Launchpad. That means you can take your Arduino source code—your sketch—and drop it directly onto the MSP430. It makes the MSP430, once horribly hard to use, as easy to use as the Arduino.
I talked with Energia creator Robert Wessels and Texas Instruments’ Adrian Fernandez about the TI Launchpad and the Energia Project, and about the hacked Hexbug toys they’ve brought with them to Maker Faire this year.