World’s lightest wireless flying robot created by NIFTI PI Sawyer Fuller’s team

Weighing in at 190 mg, “RoboFly” is only slightly larger than an actual fly. 

NIFTI PI Sawyer Fuller and his team have created what is to date the world’s lightest wireless flying robot.  The team also includes Vikram Iyer, Johannes James, Shyam Gollakota, and NIFTI graduate student Yogesh Chukewad.  See the research paper here.

Currently, insect-sized flying machines must be tethered in order to deliver the power required for flight (check out Fuller’s “RoboBee”).  To circumvent this issue, RoboFly is powered by a laser beam aimed at a photovoltaic cell.  An on-board circuit boosts the seven volts generated by the cell to the 240 volts necessary to power the wings.  The circuit also contains a microcontroller that controls the movement of the wings.  “The microcontroller acts like a real fly’s brain telling wing muscles when to fire,” according to Vikram Iyer.
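As a rough illustration of the control idea, the sketch below shows how a microcontroller might synthesize a periodic wing-drive signal.  This is a hypothetical sketch, not RoboFly’s actual firmware: the 170 Hz wingbeat frequency and the sinusoidal waveform are assumptions, and only the 240-volt drive level comes from the description above.

```python
import numpy as np

# Hypothetical sketch of the timing role described above: the on-board
# microcontroller tells the wings when to "fire" by synthesizing a
# periodic drive signal.  The wingbeat frequency and waveform shape
# are assumptions; only the 240 V drive level comes from the article.

FLAP_HZ = 170.0        # assumed wingbeat frequency (not from the article)
DRIVE_VOLTS = 240.0    # boost-converter output cited in the article

def wing_drive(t):
    """Drive voltage at time t (seconds): a sinusoid swinging between
    0 V and the boosted 240 V rail at the flapping frequency."""
    return 0.5 * DRIVE_VOLTS * (1.0 + np.sin(2.0 * np.pi * FLAP_HZ * t))

t = np.linspace(0.0, 0.02, 2000)   # 20 ms of signal
signal = wing_drive(t)             # ~3.4 flap cycles at 170 Hz
```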

RoboFly’s flexible circuit. The copper coil and black boxes to the right comprise the boost converter, and the microcontroller is the small square box in the top right.

In the future, autonomous robo-insects could be used for tasks such as surveying crop growth or detecting gas leaks.  “I’d really like to make one that finds methane leaks,” says Fuller. “You could buy a suitcase full of them, open it up, and they would fly around your building looking for plumes of gas coming out of leaky pipes. If these robots can make it easy to find leaks, they will be much more likely to be patched up, which will reduce greenhouse emissions. This is inspired by real flies, which are really good at flying around looking for smelly things. So we think this is a good application for our RoboFly.”

The RoboFly team. Front row: Vikram Iyer (left) and Johannes James; back row (from left): Yogesh Chukewad, Sawyer Fuller, and Shyam Gollakota.

At the moment, RoboFly is only capable of taking off and landing, since there is no way for the laser beam to track the robot’s movement; but the team hopes soon to be able to steer the laser so that the machine can hover and fly around.  Shyam Gollakota says that future versions could use tiny batteries or harvest energy from radio-frequency signals, so that the power source could be tailored to specific tasks.

See a video below of the RoboFly in action!

RoboFly has received extensive publicity; see coverage by WIRED, The Economist, IEEE Spectrum, MIT Tech Review, TechCrunch, Discover Magazine, GeekWire, Popular Mechanics, Engadget, CNET, Digital Trends, Siliconrepublic, and SlashGear.

PNAS paper on sensory integration from lab of NIFTI Director Tom Daniel

Hawkmoth sensory integration

University of Washington postdoctoral fellow Eatai Roth, working in the lab of NIFTI Director Tom Daniel, recently published a paper in Proceedings of the National Academy of Sciences on how multiple types of sensory information are used by hawkmoths to govern flight behavior.  The paper, entitled “Integration of parallel mechanosensory and visual pathways resolved through sensory conflict”, describes work that investigated how moths combine sensory cues to follow the motion of wavering flowers while feeding.

While hovering in front of a flower, a feeding moth receives information about how the flower is moving from two sensory modalities: visual information from the eyes and mechanosensory information from the proboscis in contact with the flower.  By building a two-part artificial flower that allows independent manipulation of visual and mechanosensory cues, Roth et al. disentangled the contribution of each sensory modality to the moth’s flower-following behavior.  They found that the brain linearly sums information from the visual and mechanosensory domains to maintain this behavior.  They further demonstrated that either sensory modality alone is sufficient for the behavior, and that this redundancy makes it robust to changes in the availability of sensory information.
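The linear-summation finding can be illustrated with a toy model.  The sketch below is a minimal illustration, not the paper’s fitted model: the weights and stimulus signals are hypothetical, and the moth’s predicted tracking response to a sensory-conflict stimulus is simply the sum of its responses to each cue presented alone.

```python
import numpy as np

# Minimal sketch of the linear-summation hypothesis (hypothetical
# weights and stimuli; not the paper's fitted model).  The moth's
# predicted tracking response to a sensory-conflict stimulus is the
# sum of its responses to each cue presented alone.

def tracking_response(visual_motion, mech_motion, g_visual=0.5, g_mech=0.5):
    """Predicted flower-following output as a weighted linear sum of
    the visually driven and mechanosensory-driven responses."""
    return g_visual * visual_motion + g_mech * mech_motion

t = np.linspace(0.0, 10.0, 1000)
visual_cue = np.sin(2.0 * np.pi * 0.5 * t)  # visual part of the flower, 0.5 Hz
mech_cue = np.sin(2.0 * np.pi * 2.0 * t)    # proboscis-contact part, 2 Hz

# Sensory conflict: the two flower parts move independently, yet the
# predicted behavior is still the sum of the unimodal responses.
conflict = tracking_response(visual_cue, mech_cue)

# Redundancy: silencing one modality still yields a usable response,
# consistent with either cue alone sufficing for flower-following.
visual_only = tracking_response(visual_cue, np.zeros_like(t))
```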

This work furthers NIFTI’s second research thrust on sensory architecture and processing, and provides a better understanding of how multiple sensory modalities are used in nature to govern complex behaviors.

This research was also featured in a UW Today article, “Tricking moths into revealing the computational underpinnings of sensory integration”.

Photo credit: Rob Felt, Georgia Tech

Media coverage on NIFTI-funded Science article

NIFTI-funded research from Tom Daniel’s lab at the University of Washington was recently published in Science and has been receiving a wide array of media coverage.  The research investigated how hawkmoths track the location of flowers in low-light conditions.

The article, “Luminance-dependent visual processing enables moth flight in low light”, by Simon Sponberg, Jonathan P. Dyhr, Robert W. Hall, and Thomas L. Daniel, can be found on the Science website.  There is also a video associated with the published article (see below).
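The title refers to the paper’s central finding: in dim light, the moth’s visual system integrates over longer time windows, gaining sensitivity at the cost of speed.  The sketch below is a toy illustration of that trade-off, not the paper’s model: flower motion is passed through a first-order low-pass filter whose time constant grows (with a hypothetical scaling) as luminance falls.

```python
import numpy as np

# Toy illustration (not the paper's model) of luminance-dependent
# visual processing: in dim light the visual system integrates over a
# longer window, gaining sensitivity at the cost of speed.  Here that
# trade-off is sketched as a first-order low-pass filter whose time
# constant grows as luminance falls (hypothetical scaling).

def visual_response(stimulus, dt, luminance):
    """Low-pass filter a flower-motion signal; the time constant is
    longer at lower luminance, so tracking becomes sluggish."""
    tau = 0.02 / max(luminance, 1e-6)  # seconds; slower when dim
    alpha = dt / (tau + dt)
    out = np.zeros_like(stimulus)
    for i in range(1, len(stimulus)):
        out[i] = out[i - 1] + alpha * (stimulus[i] - out[i - 1])
    return out

dt = 0.001
t = np.arange(0.0, 5.0, dt)
flower = np.sin(2.0 * np.pi * 1.0 * t)     # 1 Hz flower sway
bright = visual_response(flower, dt, 1.0)  # fast, faithful tracking
dim = visual_response(flower, dt, 0.01)    # slower, attenuated tracking
```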

Science has compiled a summary of the media coverage on the research.  Articles of note include: