The Interacket from Drap og Design mimics chameleons.
Four students from the Oslo School of Architecture and Design decided to enter a cool project into the Hackaday competition. Their project, called the Interacket, attempts to give the user a sense of how animals experience and view the world around them.
It would be hard to recreate a bat's power of echolocation using technology, so this bunch decided instead to mimic the way chameleons blend in with their environment. It is a simple and effective design that gives an inkling of superhuman possibilities and can change your perspective on the world around you. They have a video of the Interacket in action on Vimeo.
A diagram of how the Interacket works.
The picture above shows the design for the Interacket and the components involved. Two Arduino Unos serve as the microcontrollers for the jacket (one for each arm), with 9V batteries powering the boards and the LEDs. LED strips are housed inside the jacket, down each arm. Adafruit's NeoPixel library and code were used to control the LED strips based on data from the RGB color sensors worn on each of the user's hands.
They used Adafruit's TCS34725, an RGB color sensor with an IR filter and a white LED. All of this allows the user to touch objects in their environment to change the color of the jacket, blending into their surroundings like a chameleon. If nothing else, it would make a great novelty.
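The core trick is converting the sensor's raw readings into a color the LED strip can display. Here is a minimal sketch of that step, assuming 16-bit raw channel values like those the TCS34725 reports; the function and struct names are my own for illustration, not the team's actual code.

```cpp
#include <algorithm>
#include <cstdint>

struct Rgb { uint8_t r, g, b; };

// Scale raw 16-bit color-channel readings down to the 8-bit values a
// NeoPixel strip expects, normalizing by the clear (unfiltered) channel
// so overall brightness doesn't skew the hue.
Rgb sensorToPixel(uint16_t r, uint16_t g, uint16_t b, uint16_t clear) {
    if (clear == 0) return {0, 0, 0};             // nothing detected
    auto scale = [clear](uint16_t ch) -> uint8_t {
        uint32_t v = (uint32_t)ch * 255 / clear;  // normalize to 0..255
        return (uint8_t)std::min<uint32_t>(v, 255);
    };
    return {scale(r), scale(g), scale(b)};
}
```

On the jacket, the result of a call like this would be fed to the NeoPixel library's color-setting functions for every pixel on the strip.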
The Arduino Uno and LED strips that are housed inside the jacket.
The photo above shows the Arduino Uno and LED strips functioning outside the jacket. The jacket itself is simply made from painter's coveralls lined with aluminum foil to reflect the light of the LEDs outward. Hopefully the jacket didn't get too hot for the wearer. They are currently working on the Interacket 2.0. Check out their Hackaday.io page or drapogdesign.com.
Your Help Needed!
A Make reader named Monique is about to get married. She would like to adorn her dress with LEDs; she's just not sure how. She wrote to us for help, and we thought it better to go to you: how would you tackle this problem?
Better yet, we thought we'd gamify it. Send us your answers or post them in the comments below. If Monique chooses your solution (and, we're afraid, if you reside in the U.S.), we'll send you a free copy of Kate Hartman's new title, "Make: Wearable Electronics."
Ready? Monique writes:
I want to put LED lights on my wedding dress . . . a lot of them. I don’t know what kind to get, or what type of battery power I will need. I would like the dress to change colors, and cycle thru patterns. I really don’t know anything about LED lights or programming anything. Can you recommend something for me to complete my project?
Monique likes the look of this dress. How would you do it?
Immediately after Ryan Slaugh’s Sound Card Oscilloscope went live last week, I serendipitously began stumbling upon more oscilloscope projects and videos. One is this gem of a video by Austria-based musician Jerobeam Fenderson, who makes “oscilloscope music,” performing live gigs using the o’scope to mesmerizing effect.
In this case he appears to draw mushrooms dancing on the o’scope screen. Before they “move into space.” Yes, really. Watch:
Designed to engage and empower the maker community, Intel's "Inside the Blue" project, developed in tandem with digital media agency Noise, encourages makers to create robotic creatures using its Galileo board. These creatures are meant to respond to the invisible waves all around us. To help makers get inspired, Intel recruited beta teams to build a few of these creatures and develop tutorials so that you can make them yourself. Make: got a first look at the creatures at World Maker Faire New York.
The Brain Coral can take any kind of sensor input and display activity on the RGB LEDs inside. It also acts as a base station for wireless sensors. Powered by Node.js, its web-based interface lets you access the sensor data in real time on your tablet, laptop, or mobile device. A full how-to is available on the Intel community site.
The Brain Coral can be used to control the Signal Fish, a flying robotic sensor platform. It can explore an area using a sonar-based obstacle avoidance system and a random walk algorithm. When it finds a sufficiently strong Wi-Fi signal, its onboard LEDs respond with dazzling patterns. Of course, you could also switch the Signal Fish into manual mode and control its flight from your phone or tablet. Intel's step-by-step instructions show you how to establish wireless communication, construct the rig, and build out the circuit.
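To give a feel for how obstacle avoidance and a random walk combine, here is a hedged sketch of one wander step, entirely under my own assumptions (the safety distance, turn angle, and function name are mine, not from Intel's tutorial): turn hard away when the sonar reports something close, otherwise drift by a small random amount.

```cpp
// One step of a simple wander behavior: headings are in degrees, 0..359.
// kSafetyCm and the 120-degree escape turn are assumed values chosen for
// illustration only.
int nextHeading(int heading, long sonarCm, int randomDrift /* e.g. -15..15 */) {
    const long kSafetyCm = 60;   // assumed obstacle-avoidance threshold
    if (sonarCm < kSafetyCm)
        heading += 120;          // hard turn away from the obstacle
    else
        heading += randomDrift;  // small random wander
    return ((heading % 360) + 360) % 360;  // wrap into 0..359
}
```

A flight loop would call this repeatedly with fresh sonar readings and a new random drift each pass, stopping the wander once the Wi-Fi signal strength crosses its target.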
Both creatures can be replicated, expanded, and modified. Or if you’re inspired to create your own creature, the resources behind these projects provide a fantastic jumping off point.
Halloween is one of my favorite holidays for one reason: candy! However, by the end of the night, the neighborhood kids have usually picked over my candy bucket. This year I'm going to change that! To keep kids away, I'm going to use an Arduino to detect when someone has their hand in the candy bowl, and a solenoid to shoot silly string at the candy thief. To detect when a hand is in the bowl, I used an infrared LED and an infrared sensor to create an invisible beam across the opening of the plastic pumpkin.
When the beam is broken, the Arduino sends a command to a PowerSwitch Tail, which in turn makes a solenoid push down on the silly string can.
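The beam-break logic boils down to watching for the moment the IR sensor reading drops below a threshold. Here is a small sketch of that edge detection, with an assumed threshold and class name of my own invention (see the GitHub page linked below for the actual sketch); firing only on the intact-to-broken transition means one grab triggers one blast, rather than spraying continuously.

```cpp
// Hypothetical threshold logic for the IR beam: the sensor reading drops
// when a hand blocks the beam between the IR LED and the detector.
class BeamGuard {
public:
    explicit BeamGuard(int threshold) : threshold_(threshold) {}

    // Returns true exactly once per beam break (rising edge of "broken").
    bool shouldFire(int irReading) {
        bool broken = irReading < threshold_;
        bool fire = broken && !wasBroken_;
        wasBroken_ = broken;
        return fire;
    }

private:
    int threshold_;
    bool wasBroken_ = false;
};
```

In the Arduino loop, a `true` result is the cue to switch on the PowerSwitch Tail and drive the solenoid.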
I mounted the solenoid and silly string to a few pieces of foam board so the solenoid hits the silly string every time.
To allow for easy connection of the solenoid and IR LED and sensor, I mounted a terminal block on a project enclosure. The Arduino and 9V battery sit inside the project box and the terminal block connects to the Arduino through short jumper wires.
The Arduino code for this project can be found on my GitHub page.
Open-source software powers many consumer drones and UAVs today, and now a new initiative will put those applications under one unified platform managed by the Linux Foundation.
The program, called Dronecode, aims to help accelerate and broaden drone software through the deep Linux community. Announced today by 3D Robotics' CEO Chris Anderson at the Embedded Linux Conference in Düsseldorf, Germany, it will focus on the major drone applications, including 3DR-sponsored APM (autonomous autopilot software for embedded copter, plane, and wheeled controllers), MissionPlanner and DroidPlanner (laptop/Android-based flight-path management), and MavLink (aircraft flight information communications). It will also take oversight of the PX4 project, a cutting-edge autonomous flight endeavor that is being utilized in the 3D Robotics "Pixhawk" flight controllers.
“…we are entering the consumer and commercial drone age and I’m delighted that an open source platform is helping lead the way,” Anderson writes on dronecode.org. “Now that we have reached this level of adoption and maturity, it’s time to adopt the best practices of other highly successful open source projects, including professional management and governance structures, to ensure the continued growth and independence of these efforts. There is no better organization to lead this than the Linux Foundation.”
Along with 3D Robotic’s inclusion, the program comes with the support of major players in the drone community, including DroneDeploy, jDrones, Walkera, and Yuneec. Anderson also notes the support of Intel, Box, and Baidu for the project.
“By becoming a Linux Foundation Collaborative Project, the Dronecode community will receive the support required of a massive project right at its moment of breakthrough,” says Jim Zemlin, executive director at The Linux Foundation, in a press release. “The result will be even greater innovation and a common platform for drone and robotics open source projects.”
Beyond the Dronecode announcement, it’s been a busy past couple months for 3D Robotics.
Last month, the company announced Richard Branson as its latest investor, bringing considerable business acumen and flight experience to the company through his work with Virgin Atlantic, Virgin America, and the space tourism endeavor Virgin Galactic.
To mark his official welcome, 3DR and Virgin posted a video of the company's visit to Branson's private getaway in the British Virgin Islands that demonstrated new flight functions for its aircraft, including a new GPS-powered follow-me mode. The video also includes 3D-rendered shots of the island made from quadcopter-shot footage.
3D Robotics also recently announced the next iteration of their Iris quadcopter, the Iris+, which incorporates many of these new flight functions along with double the flight time of its predecessor, improved landing apparatus, easier spin-on propellers, and direction-indicating lights.
And at the Intel Developer Forum, 3DR disclosed partnership plans to use the diminutive Edison compute module in its next-generation autopilot as a computing companion — allowing for more advanced functions like an optical-based follow-me mode (instead of tracking your phone's GPS). "Our next-generation autopilot will be built around the notion of carrier boards," Anderson says, explaining that different boards will be used for different functions.
Got a new iPhone? Have you thought about what you’ll do with your old phone? Media artist Julia Christensen is interested in what happens to pieces of technology when we’re done with them, so she made a fascinating work called BURNOUTS from upcycled iPhones. The work consists of donated iPhones that Christensen installed in 3D printed containers, featuring mirrors and lenses from discarded overhead projectors, in order to cast animated images of retired star constellations upwards onto a ceiling.
Just like the constellations illustrated in her work, Christensen connects the dots between our relationships with old technologies to create a new picture of how we understand them. Not only were these constellations dropped from star charts due to increasing light pollution on Earth, but they were all named for antiquated technologies, much like the old phones and overhead projectors now used to project their images.
The projection of these five constellations is a poetic metaphor for the technology producing the image––just as the constellations are still there and yet no longer in use, so are our own outdated gadgets.
Although we clearly have a long way to go toward improving how we deal with electronic waste, Christensen’s work is a stunning demonstration of the tremendous potential for creative expression using technologies that we might otherwise throw away. So, if you’re looking for a new project (or a new projector), you might want to think twice before you get rid of that old phone!
[via prosthetic knowledge]
Remember the spell book from Hocus Pocus with the creepy moving eyeball? Since it is the season of Halloween-movie-replica-prop-making, I’ve found a great DIY tutorial that covers the steps required to make a pretty accurate copy!
This tutorial doesn’t include how to make the eyeball actually move, but I would love to see someone hack it with a servo to really up the creep factor! Using something like this Arduino controlled rig created by Tod Kurt.
On to the prop making! This tutorial uses a combination of air-drying and baked polymer clay for almost all of the details. A mold was used to create exact copies of some of the book's embellishments. Very smart!
Check out Mizerella’s blog for the full how-to on this project.
This next book prop is not specifically a Hocus Pocus replica, but I give it two big warty thumbs up for creative use of materials. Those eyeballs are just magazine cutouts with clear glass gems glued on top! This was spotted over at Design DNA.
Tinkernut’s Motion Controlled Ultrasonic Lamp takes uses sound to detect motion
Anybody can go to the hardware store and pick up a motion-activated light, but that's so '80s, and installing one requires nothing more than talent with a screwdriver. Ultrasonic waves are better at tracking movement, at least when it comes to ninja attacks and other unseen entities, which is where Tinkernut's Motion Controlled Ultrasonic Lamp comes in. The lamp uses ultrasonic waves not only to detect movement but also the direction it's coming from.
The lamp is outfitted with three ultrasonic sensors housed in a control box that sits underneath the lamp itself, which uses two servos to move in the direction of the movement it detects. Those sensors are connected to a simple breadboard with an Arduino Uno running the show. A double-A battery pack powers the unit, which is toggled on/off with a simple switch.
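With three sensors facing different directions, the direction-finding step can be as simple as aiming at whichever sensor reports the shortest distance. Here is a small sketch of that idea; the sensor layout, servo angles, and function name are my guesses for illustration, not Tinkernut's actual values.

```cpp
#include <cstddef>

// Pick a pan-servo angle from three distance readings, assuming the
// sensors face left, center, and right. The sensor reporting the
// shortest distance is closest to whatever is moving, so aim there.
int pickPanAngle(const long distancesCm[3]) {
    static const int angles[3] = {45, 90, 135};  // left, center, right
    size_t nearest = 0;
    for (size_t i = 1; i < 3; ++i)
        if (distancesCm[i] < distancesCm[nearest]) nearest = i;
    return angles[nearest];
}
```

On the real lamp, the returned angle would be written to the pan servo (e.g. via the Arduino Servo library), with the second servo handling tilt.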
Everything needed to build your own Motion Controlled Ultrasonic Lamp, minus the Dremel
Everything wired up and ready to be attached to the lamp
Obviously, the lamp is nothing without the code to run it and Tinkernut provides everything users need to get the ball rolling. Will it actually detect ninjas? If they’re worth their salt, yes. You can never be too careful. See the plans after this link.
This post is sponsored by Freescale.
Among the rows of makers under the trees at World Maker Faire New York, a friendly man with a bowtie played the guitar, showing off the advanced signal processing ability of the Freescale microcontroller that he’s working with in a project dubbed MonkeyJam. It’s made by Eli Hughes (who plays guitar quite well, from what I can tell) and it’s one of three in a series called Hack It Together in partnership with Freescale. The projects are part of an initiative to extend the embedded system skillset among junior and senior engineering students as well as makers, hackers, and younger students. Each project has a set of free online resources including board designs, schematics, code, and explanatory video.
Monkey Listen is another one of the projects in Hack It Together. It uses the Freescale FRDM-K20D50 to make a neat spectrum analyzer display. Just speak, sing, or whistle near the microphone and you’ll see some impressive FFT visuals. The project is meant to teach microphone and OLED display interfacing, audio capture with an ADC, and spectrum analysis via FFT.
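To make the display concrete: each bar on a spectrum analyzer is the magnitude of one frequency bin of a Fourier transform of the audio samples. The real project uses an optimized FFT on the microcontroller; this naive O(n²) DFT is just my own illustration of what each bar represents, not Freescale's code.

```cpp
#include <cmath>
#include <vector>

// Compute the magnitude of each frequency bin up to the Nyquist limit
// for a block of audio samples. Bin k corresponds to a sinusoid that
// completes k cycles over the block.
std::vector<double> magnitudeSpectrum(const std::vector<double>& samples) {
    const size_t n = samples.size();
    std::vector<double> mags(n / 2);
    for (size_t k = 0; k < n / 2; ++k) {
        double re = 0.0, im = 0.0;
        for (size_t t = 0; t < n; ++t) {
            double angle = 2.0 * M_PI * k * t / n;
            re += samples[t] * std::cos(angle);
            im -= samples[t] * std::sin(angle);
        }
        mags[k] = std::sqrt(re * re + im * im);
    }
    return mags;
}
```

A whistle near the microphone shows up as one tall bar: a pure tone at bin k puts nearly all its energy into mags[k] and almost none elsewhere.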
Even though he was well into day two of Maker Faire when I encountered him, he still had tons of energy and patience with everyone who came to his table. "It's been a learning experience," says Eli. "I've been getting a lot of practice explaining the projects to people at all skill levels."
Keep an eye out for a Kickstarter if you’d like to get your own expansion boards!