Researchers are trying to replicate the honeybee’s remarkable navigational ability to develop more efficient and versatile aerial drones.
By Thomas K. Grose
Next time a honeybee buzzes near you, think about how it got there. A bee can forage for nectar as far away as six miles from its hive, easily find its way back home, and return to the same flowerbed the next day despite having relatively weak vision as a guide. It also accomplishes these feats with very little computational power: Its tiny brain contains a mere 1 million neurons, compared with a human’s 100 billion, and yet it has “amazing cognitive abilities,” says James A. R. Marshall, a professor of computational biology at Britain’s Sheffield University. “They’re consummate navigators, far in excess of the best robots we have available.”
Figuring out how honeybees navigate with such limited brainpower could make for much more efficient and versatile unmanned aerial vehicles (UAVs). Currently, UAVs are guided from Point A to Point B by a power-intensive combination of artificial intelligence, mainly deep-learning algorithms and machine vision, and a variety of sensing tools, including lidar and GPS. Lidar sensors add weight to the vehicle, and the need for GPS limits where a drone can operate.
Marshall leads a multiuniversity research team seeking to understand how the honeybees’ neural circuits work and what enables the creatures to navigate by vision only. That knowledge would be used to build a computer model for UAVs that replicated the bees’ computational- and energy-efficient circuits and could run on a lightweight GPU chip. Drones need to be lightweight and energy-efficient if they’re to fly varied missions—such as search and rescue—and stay aloft for long periods of time. “If you want to have flexible, autonomous robot controls, then trying to reverse-engineer the honeybee brain is a very useful approach,” Marshall says. The U.K. government-funded project, dubbed Brains on Board, is two years into a five-year, $6.3 million grant that brings together a diverse team of engineers, computational neuroscientists, roboticists, and biologists from Sheffield, the University of Sussex, and Queen Mary University of London.
For ground-based robots, particularly driverless cars, weight and power aren’t big issues. You can pump them up with a lot of computers and sensors and, as Marshall says, “brute-force any problems. Here we’re trying to do things as elegantly as possible, just like a bee does, with minimal neural hardware” and using only camera vision. Bees generalize visual input and automatically adapt to new places. That’s thanks to hundreds of millions of years of evolution, which has crunched all those navigational problems into a simpler one that’s easier to solve computationally. “They’re very impressive little creatures,” says Alexander Cope, a computational neuroscientist in Sheffield’s computer science department.
Brains on Board isn’t trying to duplicate the mechanics bees use to fly. The project uses small, four-rotor helicopters or “quadcopters” as a platform to reproduce the aerial capabilities of honeybees. Quadcopters lack fixed wings, so like bees they can hover and turn in place. “Even though it’s not flying in the same way as a bee—it’s not got flapping wings—its control dynamics are similar enough. So when our brain models are saying, ‘Do something,’ it can respond in the appropriate manner,” Marshall says.
Before a bee’s neural circuits can be reverse-engineered, they have to be decoded. Part of that effort means diving into nearly 80 years’ worth of scientific literature. Cope, Marshall’s Sheffield colleague, starts by looking for the simpler neural loops that may be involved in keeping bees flying and not crashing into things. That said, he adds, there are plenty of circuits that remain unexamined, “so there’s quite a bit of detective work” in trying to figure out what a combination of neurons is doing and separating out the actual sequence of computations from noise created by other tasks a bee’s brain performs.

The researchers also don’t solely rely on honeybee research; they’re using previous studies on fruit flies. “I always describe fruit flies as stupid honeybees,” Marshall quips, because they have far fewer neurons. But their brain structure is similar, they have a common ancestor, and they also fly. “The basic problems of how do you move in a world are common,” Marshall says. Fruit fly investigators can use optogenetic tools, which require modifying DNA, to tease out the workings of the insects’ neurons. Those tools don’t work with bees because, unlike fruit flies, they live in big social groups and can’t be bred in test tubes. “Fruit flies are the kind of workhorses for understanding how neural circuits work in an insect brain,” Marshall explains, “and from that we’re kind of generalizing” so they can reproduce honeybee navigational abilities by duplicating some of those found in fruit flies.
Bees use two types of visual navigation, one egocentric, the other geocentric. The former relies on something called optic-flow estimation. If you’re on a train and look out the window, the objects closest to you appear to rush past, while those farther away seem to creep along. Honeybees can keep track of those flow speeds. “And if you can estimate how fast it’s going, you can estimate how far you’ve gone, and bees are good at doing that,” says Thomas Nowotny, a professor in the School of Engineering and Informatics at Sussex. The team has already created a model based on bee optic-flow estimating circuits. “What we came up with,” Marshall says, “was an algorithm that was much more robust when compared to classical computer-vision techniques.”
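The odometer idea behind optic-flow estimation can be sketched in a few lines of code. This is only an illustration of the principle, not the team’s algorithm: translational optic flow is roughly the angular speed v/d at which scenery at lateral distance d sweeps past at flight speed v, so summing the flow over time yields an estimate of distance traveled. All numbers and function names here are invented for the example.

```python
# Toy optic-flow odometer: estimate distance traveled by integrating
# the angular speed at which the surroundings sweep past the eye.
def optic_flow(speed, lateral_distance):
    """Angular velocity (rad/s) of scenery at a given lateral distance."""
    return speed / lateral_distance

def integrate_distance(flows, lateral_distance, dt):
    """Recover distance by summing flow * lateral_distance * dt over time."""
    return sum(f * lateral_distance * dt for f in flows)

dt = 0.1                                       # seconds per sample
speeds = [2.0] * 50                            # fly at 2 m/s for 5 s -> 10 m
flows = [optic_flow(v, 1.5) for v in speeds]   # walls assumed 1.5 m away
print(integrate_distance(flows, 1.5, dt))      # -> 10.0 (approximately)
```

In the real problem the lateral distance is unknown, which is why bee-inspired estimates come out in "flight-specific" units rather than meters; the sketch fixes it only to make the arithmetic visible.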
When honeybees return to a location they’ve already visited, they tend to use a geocentric approach, something called snapshot navigation, which means they guide themselves by visual landmarks. Some other insects, like desert ants, keep track of their routes the same way. Coming up with a model that duplicates how bee brains do this is challenging, Marshall says, because, unlike ants, bees are moving through three-dimensional space, not on a two-dimensional plane.
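A heavily simplified sketch can show the snapshot idea in one dimension: store the view seen at the goal, then repeatedly step in whichever direction most reduces the mismatch with that stored view. The landmark layout, the fake “view,” and every number below are invented for illustration; they are not the project’s model.

```python
# Toy snapshot homing in 1-D: guide yourself back to a goal by
# minimizing the difference between the current view and a
# remembered snapshot taken at the goal.
def view_at(pos, landmarks):
    """Fake panoramic view: apparent brightness of each landmark
    falls off with its distance from the agent."""
    return [1.0 / (1.0 + abs(pos - lm)) for lm in landmarks]

def mismatch(a, b):
    """Squared difference between two views."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

landmarks = [2.0, 9.0]
snapshot = view_at(4.0, landmarks)      # remembered view at the goal

pos = 8.0                               # start away from the goal
for _ in range(40):                     # greedy descent on view mismatch
    candidates = [pos - 0.2, pos, pos + 0.2]
    pos = min(candidates,
              key=lambda p: mismatch(view_at(p, landmarks), snapshot))
print(round(pos, 1))                    # homes in near 4.0
```

The hard part the researchers face is exactly what this sketch leaves out: real bees do this with panoramic views in three-dimensional space, where the same landmarks look different from every altitude and angle.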
Computation to Mimic a Neuron
Once Cope has isolated a neuron he thinks is key to a certain behavior, he has to design a computational model that mimics what that brain cell does to allow a bee to integrate landmarks and motion. He then constructs an array with anywhere from 100 to 10,000 virtual neurons. Each brain cell, Cope says, will have the same computation but different parameters and also connect to other, different neurons. Those connections and slight differences between neurons basically replicate the bee’s neural compass, theorizes Cope.

To test the models they develop, the researchers use virtual simulations and three types of robots. At first, models are flown in simulations on computers, where tests can be run repeatedly around the clock. “We couldn’t fly a drone 24 hours a day,” Nowotny says. Once a model is ready for a trial run in a robot, it’s initially tested in one that’s ground-based. For instance, in Sussex, they employ a gantry robot with a large arm that can be driven into any position and thus test a model as if it were flying in 3-D space. “That robot works much better to do the first run in the real world,” Nowotny says, “because you don’t have instability, you don’t have any vibrations, you don’t have drones crashing.” They also use wheeled robots—machines that don’t break as easily as quadcopters if they do bash into something. But such collisions are becoming less of a worry. The team already has had a quadcopter autonomously flying around an indoor test space without hitting anything.
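The compass idea can be illustrated with a toy population in which every cell runs the same computation but carries its own parameter, a preferred heading; reading out the whole population’s activity recovers the agent’s heading. This cosine-tuning, population-vector construction is a standard textbook device, used here only to make Cope’s same-equation-different-parameters point concrete; it is not the project’s actual circuit.

```python
import math

# Toy "neural compass": many copies of one neuron equation, each with
# a different preferred heading (its parameter). Decoding the
# population's activity recovers the encoded heading.
N = 100
preferred = [2 * math.pi * i / N for i in range(N)]   # per-neuron parameter

def response(heading, pref):
    """The same computation in every cell: rectified cosine tuning."""
    return max(0.0, math.cos(heading - pref))

def decode(rates):
    """Population-vector readout: sum each cell's vote for its
    preferred direction, weighted by its firing rate."""
    x = sum(r * math.cos(p) for r, p in zip(rates, preferred))
    y = sum(r * math.sin(p) for r, p in zip(rates, preferred))
    return math.atan2(y, x)

true_heading = 1.2
rates = [response(true_heading, p) for p in preferred]
print(decode(rates))   # close to 1.2
```

No single cell knows the heading; it emerges from the connections and the slight differences between otherwise identical units, which is the flavor of replication the team is after.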
Soon, the team will be able to test early models in a 3-D virtual world it’s constructing that replicates a meadow where real honeybees forage. Investigators at Queen Mary strapped antennas to the backs of some bees that send signals to two radar dishes. One dish monitors a bee’s direction and distance, the other the angle up to the bee, which gives them 3-D tracking. The researchers also have put together a set of photographs of the area they’re using for the 3-D reconstruction. Once it’s completed, Cope says, researchers can see what each bee viewed at each point of its travels and how it reacted to changes in the environment. That’s not only useful information for building models, he says, “but we can validate what we’ve done and see if we’ve reproduced how the bees responded to their environment.”
Perhaps the key piece of hardware involved is the GPU chip, a processor mainly used in computers as a graphics card. A few years ago, manufacturers began making credit-card-size GPUs for mobile devices, each with around 250 cores and a CPU on a single chip. “These [neural] models are quite hungry for computers, so you need a really fast computer with lots of memory to run them” in real time on a drone, Nowotny says. “And 250 cores on a mobile robot is quite powerful.” Also efficient. Each chip needs only 10 to 15 watts to run. “It takes such little power it makes no difference to the flight time of the drone.” There’s one hitch, however. “Brain models don’t fit naturally on GPUs because they’re graphics cards, they’re not brain simulators,” Nowotny explains. So his team also builds the software that can take a brain-model description and run it effectively on a GPU.
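Why such models map well onto a many-core GPU can be seen in miniature: every neuron applies the same update rule to its own state, so one simulation step is naturally data-parallel. The leaky integrate-and-fire dynamics and all the constants below are invented for illustration, not drawn from the team’s software.

```python
# Sketch of one simulation step for a population of simple spiking
# neurons. Every neuron runs the identical equation on its own state,
# so on a GPU each loop iteration could run on a separate core.
def step(voltages, inputs, leak=0.9, threshold=1.0):
    spikes = []
    new_v = []
    for v, i in zip(voltages, inputs):
        v = leak * v + i           # identical update rule per neuron
        if v >= threshold:         # fire and reset
            spikes.append(True)
            v = 0.0
        else:
            spikes.append(False)
        new_v.append(v)
    return new_v, spikes

v = [0.0] * 5
for _ in range(3):                 # three time steps
    v, s = step(v, [0.5, 0.4, 0.3, 0.2, 0.1])
print(s)                           # -> [True, True, False, False, False]
```

The serial loop above is the part that software like Nowotny’s has to translate into thousands of simultaneous per-core updates to run a brain model in real time on a drone.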
Although honeybees live in social groups and communicate with one another, Brains on Board is focused solely on how the brains of single bees work. “They have a social dimension, but their individual abilities are very attractive, and that’s really the goal of the project,” Marshall says. However, in the future he’d like to develop bee-inspired drones that can swarm. (Researchers elsewhere have developed robot swarms applying different technology.) Marshall has another project that’s studying the collective behavior of honeybees, and “we have to figure out how do we marry these two approaches, so we can have swarms [of UAVs] interacting with each other, but they’re still individually complex and highly competent.” Marshall also envisions a time when drones will fly not only by vision but also using other senses, like a magnetic compass, as a backup. (Sidebar, page TK.) “These are things that are simple and lightweight, and engineers can capture them and we can fuse them in.”
The grant runs for three more years, but there’s much left to be done. Eventually all the models will have to be forged into one. “We still have to nail them down and get them working together,” Cope says. “These are very complex sets of equations, and while you have an idea of how they should work together, it quite often doesn’t work out that way. These are things you discover when you actually build the model.” Cope expects they’ll clear those remaining hurdles. “We should have something by the end of the project that can navigate, that can understand its environment, and can solve problems without any user input.” Confident of success, Marshall points to the quadcopter that can fly indoors. It’s operating “in a very challenging environment, even though it’s just doing the simplest of tasks—not running into walls,” he says. “For me, that shows that it works.”
Just like a honeybee.
Thomas K. Grose is Prism’s chief correspondent, based in the United Kingdom.
Design by Nicola Nittoli
Buzz of Enthusiasm
From flapping wings to swarms, researchers—and defense innovators—flatter bees with imitation.
To many engineers and scientists, honeybees are the bee’s knees.
They’re providing inspiration for a new generation of autonomous drones, or unmanned aerial vehicles (UAVs), and have helped improve Internet traffic. Moreover, because bee populations are under threat, many researchers are working to improve their understanding of bee behavior in hopes of finding ways not just to mimic them but to save them.
While the Brains on Board project led by Britain’s Sheffield University (see “Just Enough Brainpower,” page 24) is working to reverse-engineer honeybee neural circuits and develop autonomous drones that can navigate by vision like bees, Harvard University’s RoboBee drones use deep learning, computer vision, accelerometers, and gyroscopes to duplicate the flying mechanics of honeybees. Much smaller than the British drone, the RoboBee flaps its wings by means of piezoelectric actuators: the materials composing its “muscles” contract when voltage is applied.
Tiny UAVs need minute computer chips. Recently, a group of MIT engineers unveiled a chip that’s only 20 square millimeters: small enough to fit inside honeybee-size nanodrones. The Navion chip consumes only 24 milliwatts of power, or one-thousandth of the electricity a lightbulb uses.
Honeybees may not rely exclusively on vision to navigate. Last year, a team of scientists at Canada’s Simon Fraser University discovered that honeybee abdomens contain ferromagnetic material similar to magnetite that allows them to navigate by sensing magnetic fields.
There are many potential uses for bee-inspired drones, including search-and-rescue missions, surveillance, and climate and environmental monitoring. At NASA’s Ames Research Center, researcher Terry Fong has developed the Astrobee, a free-flying robot, to help with chores and survey sound levels, air quality, and other factors inside the International Space Station.
Bee robots could eventually be used as pollinators, substituting for bee populations that are shrinking worldwide—possibly as a result of disease, pesticides, loss of foraging areas, pests, or some combination. Bees pollinate almost a third of the crops humans consume.
Eijiro Miyako, an engineer at Japan’s National Institute of Advanced Industrial Science and Technology, earlier this year demonstrated a robotic drone that’s roughly 1.6 inches in diameter—a bit bigger than a bee, but small enough to work as a pollinator. In experiments, Miyako showed that the gel-coated horse hairs on his bee bot could pick up and release the pollen of Japanese lilies. However, his robo-pollinator isn’t autonomous.
Scientists and engineers want to learn more about bee behavior and the social organization of their colonies to help save them. One ongoing study at the University of Puerto Rico is using videos and sensors to monitor bee colonies and designing algorithms to analyze the big sets of data they collect. To help track individual bees, the UPR team places small, lightweight barcodes on them. A Harvard study is looking into how the widely used group of neonicotinoid pesticides is affecting bumblebees. To track the bees, researchers chill them in a refrigerator to make them immobile, then use superglue to place tiny barcodes on their backs.
Part of the life of bees—and of other insects—is swarming. In bees, it occurs when a large group splits off from an existing colony to form a new one. The Defense Advanced Research Projects Agency (DARPA) views swarming behavior as a model for urban warfare, in which troops control hundreds of unmanned air and ground vehicles at once. A key challenge is giving humans the ability to interact with and influence the swarming drones.
You can thank bees for making it easier to surf the Internet. Back in 1992, Georgia Tech systems engineers worked with a Cornell University biologist to model how bees forage. It was a neat trick, but at the time they couldn’t find much use for it. Some 10 years later, one of the engineers was approached by a computer-science graduate student who was trying to find a way for Web hosts to better allocate their servers. Their subsequent collaboration resulted in the Honeybee Algorithm, a model for biologically inspired programming that has led to a $50 billion market for Web-hosting services. The formula has helped increase Internet speeds—and it’s a rather sweet way to put bee research to practical use. – T.G.