
Bee Vaccines: Saving Pollinators One Queen at a Time

Good news from scientists for our planet’s poor beleaguered bees! The United States Department of Agriculture has approved a first-of-its-kind vaccine for the little pollinators, protecting them from American foulbrood, a bacterial disease that can take out entire colonies in one fell swoop.

The science behind the vaccine is fascinating: insects don’t have an immune system like that of mammals, where, say, you or I can be injected with an inactivated flu virus and our systems develop antibodies to fight the real, live flu when it shows up later. So most researchers didn’t think a vaccine-like intervention into honeybee health was possible. Enter the folks at the University of Helsinki and Dalan Animal Health, who observed a kind of immunity passing from queen to offspring a few years ago and tried to puzzle it out.

“[Professor Dalial] Freitak and colleagues discovered a key egg-yolk protein called vitellogenin was the transport mechanism for trans-generational immunity in insects. This foundational discovery laid the groundwork for a novel kind of insect vaccine, and the team’s first target was honeybees.

Over the following years the researchers developed a vaccine to target a disease called American Foulbrood. The disease is caused by Paenibacillus larvae bacteria and once it takes hold in a bee population often the only option is to completely destroy the colony. The vaccine works by binding inactive bacterial cells to the vitellogenin protein so when it is consumed by a queen it can be directly transferred to her larvae.

‘The vaccine is incorporated into the royal jelly by the worker bees, who then feed it to the queen,’ a statement from developing company Dalan Animal Health explains. ‘She ingests it, and fragments of the vaccine are deposited in her ovaries. Having been exposed to the vaccine, the developing larvae have immunity as they hatch.’”

With USDA approval in hand and a widespread rollout on the way, scientists, farmers, and bee enthusiasts are looking forward to getting foulbrood under control. This successful proof of concept may also mean other vaccines, for bees and other insects, will quickly follow—wouldn’t it be terrific if we could vaccinate mosquitoes against malaria? Not only would we be taking care of some of the smallest cousins we share our Earth with, but we’d be taking care of ourselves as well. To borrow an arachnid metaphor, such is the Web of Life!

Hungry Hungry Smartwatches: The Tamagotchi of the Future

Way back in the primordial ooze called the 1990s, one of the many trends that swept the planet was the tiny pocket pet known as the Tamagotchi. For those of you unlucky enough not to have had a preteen in your household at the time, these gadgets were little eggs on a keychain with LCD screens, on which the life cycle of a tiny alien creature would play out, influenced by the buttons you pressed to feed and play with it. They were very fun, if stressful, especially if your kid enlisted you to take care of theirs during the school day so it wouldn’t die, since the teacher had banned them in class.

A team of scientists from the University of Chicago has pulled this once-trendy gizmo into the modern age, both technologically and morally. They have adapted the concept into a smartwatch that depends on a slime mold the wearer has to feed and keep alive. This living organism/device symbiosis interrogates and complicates the dependent relationships people have with their smart devices.

“They created an enclosure attached to the smartwatch and placed a species of slime mold known as Physarum polycephalum inside it. To enjoy one of the key functions of the accessory – heart rate monitoring – they would need to keep the mold alive by feeding and caring for it.

Here is exactly how it works – the slime mold is placed in one side of the enclosure and as it is fed with a mixture of water and oats it grows to the other side of the enclosure forming an electrical circuit that activates the heart rate monitor function. If the mold is ignored, it goes dormant and the circuit is cut off.

Interestingly, users can forget about their pet slime mold for days, months, or even years, as it can be ‘revived’ by resuming care for it. But scientists wanted to know if simply knowing that there is a living, dormant organism in there affected people’s relationship with the gadget.”

The team engaged a group of five people to wear the smartwatches for a two-week period and to write down their feelings about the devices, feeding the slime molds as normal for the first week and then deliberately starving them for the second. The participants reported feeling attachment to their watches, even going so far as to name them, as well as emotions like grief and guilt as the slime molds died.

While this particular watch will likely never be mass-produced, we can apply these philosophical findings to the other devices we burn through in our ultra-connected lives. Might we be more interested in conservation and re-use of resources if we fed—a nurturing act for our species—our laptops, phones, or watches? If we were more conscious of our use of them, since they would be in a literal relationship with us, would that help with doomscrolling and other forms of device-based dissociation? Lots of interesting questions come up—definitely food for thought!

 

Taste Test Crowns Quinoa Cookies

Quinoa

 

Quinoa—that ancient grain with a nutty flavour and a persistent presence in chic salad bowls over the past decade—has surprised us again. Not only is it a fibre, protein, and vitamin powerhouse; Washington State University researchers have discovered that quinoa flour also works demonstrably well as a cookie additive. And, unlike many other “healthy” additions to indulgent foods that should never have happened (ugh, carob), the quinoa flour actually improves the taste, texture, and “spreadability” of the cookie dough. In a study published in the Journal of Food Science, preliminary results showed that more people preferred a sugar cookie made with some quinoa flour over one made entirely with wheat flour.

“[R]esearchers looked at ten different quinoa breeding lines and tested them as a flour in cookies at 25% up to 100% quinoa. Many of the breeding lines held up well at the lower levels but the cookies tended to crumble as they approached 100% quinoa flour.

The preliminary results from the taste tests also show that using up to 25% quinoa flour tended to have better results. The researchers purposely chose sugar cookies for the taste test because they are plain as opposed to chocolate chip cookies which might mask any flavor from the quinoa. For the sugar cookie, a little quinoa might have an advantage, said Elizabeth Nalbandian, the study’s first author and a Ph.D. student in [study author Girish] Ganjyal’s lab.

‘I think at 10%, quinoa added a type of nutty flavor that people really liked,’ she said, noting the testers liked it even more than the control whole flour cookie.”

The Washington state connection meant researchers had skin in the game: The two types of quinoa that came out on top in the results are specifically bred to grow in the Pacific Northwest climate. But I think this good news applies to all cookie lovers—especially those of us wanting to consider our health, while not giving up the fun things in life! Hmm, I wonder if David might be willing to whip up a batch of his famous chocolate chip cookies with a dash of quinoa in them…? For fun—and for science!

Edited Chicks May Mean a More Ethical Breakfast

chicks and eggs

Vegetarianism and veganism are hard roads to travel, not least because it can be difficult to determine where your personal hard moral line is. For example, a lacto-ovo vegetarian won’t chow down on a piece of chicken but will eat an egg, making a distinction between the different levels of animal exploitation they represent; a vegan will eat neither. But even if an egg itself doesn’t represent a death to a vegetarian, the industry that surrounds its production is filled with harm, including the mass killing of male chicks, who are valueless to the industry because they will never lay eggs themselves.

But a team out of Israel’s Volcani Institute has devised a method to prevent this slaughter—by preventing male chicks from existing in the first place. They’ve done this, they claim, by gene-editing hens so that only female chicks hatch from the eggs they lay.

“The scientists have gene-edited DNA into the Golda hens that can stop the development of any male embryos in eggs that they lay. The DNA is activated when the eggs are exposed to blue light for several hours.

Female chick embryos are unaffected by the blue light and develop normally. The chicks have no additional genetic material inside them nor do the eggs they lay, according to Dr Cinnamon [the project’s chief scientist].

‘Farmers will get the same chicks they get today and consumers will get exactly the same eggs they get today,’ he said. ‘The only minor difference in the production process is that the eggs will be exposed to blue light.’”

Certain jurisdictions in the EU are already banning the culling of male chicks while simultaneously warming up to the idea of light genetic modification of livestock. If fully adopted, this innovation—not yet published, because the team wants to license it ASAP—could square that circle: no more culling, and no world overrun with unwanted roosters! And once scientists have the chance to make sure the resulting eggs are fit for human consumption, eaters will have one less moral choice to make before breakfast.

Vitamin A Problem? Microparticle Solution!

I’ve come to my love for food (and making the best condiments for it that I can!) via a circuitous route, through many years in the wilderness of IT and business solutions. But before that, I was a chemist—and this recent news about a fascinating advance in vitamin applications made me feel like I’d come full circle.

A team from MIT has shown that encapsulating vitamin A in polymer microparticles before fortifying food with it enables the vitamin to better weather storage and cooking, allowing higher-than-typical amounts of the key nutrient to reach the people eating it. As vitamin A deficiency is prevalent in developing countries (and is the world’s leading cause of preventable childhood blindness), this easy, low-barrier way of boosting vitamin A intake could be a game changer.

“In a 2019 study, the MIT team showed that they could use a polymer called BMC to encapsulate nutrients, including iron, vitamin A, and several others. They showed that this protective coating improved the shelf life of the nutrients, and that people who consumed bread fortified with encapsulated iron were able to absorb the iron. […]

Using an industrial process known as a spinning disc process, the researchers mixed vitamin A with the polymer to form particles 100 to 200 microns in diameter. They also coated the particles with starch, which prevents them from sticking to each other.

The researchers found that vitamin A encapsulated in the polymer particles [was] more resistant to degradation by intense light, high temperatures, or boiling water. Under those conditions, much more vitamin A remained active than when the vitamin A was free or when it was delivered in a form called VitA 250, which is currently the most stable form of vitamin A used for food fortification.”

The technology was trialed in flour and bouillon cubes, both used extensively in sub-Saharan Africa, an area deeply affected by vitamin A deficiency. Testing showed the bioavailability of the encapsulated vitamin A to be nearly the same as that of vitamin A consumed on its own. Two companies are now the proud licensees of the tech and are planning to roll it out to market soon. This tiny fix to the nutrient profile of common foods could mean a big change for health worldwide—and what a delicious way to do it!

How a Crab Kerfuffle Cancelled the Fishing Season

crab

I’ve been keeping an eye on the story of the snow crab population crash for a couple of months now, and as someone interested in shellfish from both a culinary and an environmental standpoint, I’m getting a bit concerned! NOAA (the National Oceanic and Atmospheric Administration, the American body in charge of the crabs’ habitats and the fishing thereof) has been gesturing vaguely at climate change as the culprit behind October’s cancellation of the Bering Sea snow crab season, after 11 billion crabs basically up and disappeared. But, thanks to a new analysis in Nautilus, it seems the situation is far more complicated. And it has everything to do with math.

The tale spun by Spencer Roberts is worth the full read, but the gist is as follows: officials—and fishers—may be repeating history; Spencer cites the 1980s crash of a similar species, the Alaskan king crab, as precedent. Then, as now, it may come down to ignorance of the crabs’ natural behaviours. Survey nets can drag through a huge pod of hundreds of crabs (which gather in dome-like piles to rest between foraging sessions), and that highly concentrated count can then be erroneously extrapolated to an entire area. In other words, the 11 billion death toll may be inflated because there were never that many crabs to begin with (the toy calculation after the excerpt below shows how easily this can happen).

“This opens the possibility for inflated population estimates if surveys happen to intersect aggregations of crabs. That may have happened twice with king crabs: their Cold War collapse in the Bering Sea was preceded by a “recruitment pulse”—a cohort of maturing males—that motivated regulators to double catch limits every three years. […]

“We know that recruitment boom was real,” [NOAA’s Alaska Fisheries Science Center program manager Mike] Litzow responded when asked about the possibility that survey methods had caused crab populations to be overestimated. He cited crab reproductive cycles, improved survey coverage, and the fact that the boom persisted for two consecutive years. But while a pulse did occur, was it truly as large as the models suggested? And should NOAA regulators have raised catch limits when its assessments also suggested that the abundance of harvest-sized males had dropped by half in the decade prior?”
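To make the math concrete, here is a minimal back-of-the-envelope simulation in Python. Everything in it (the grid size, the number of pods, the number of survey tows, the crab counts) is invented purely for illustration and has nothing to do with NOAA’s actual assessment models; the point is only to show how scaling up the average of a few tows can swing wildly when the animals pile into dense aggregations.

```python
# Toy illustration of how trawl-survey extrapolation can inflate (or deflate)
# a population estimate when animals aggregate in dense pods.
# All numbers below are invented for illustration; this is NOT NOAA's model.
import random

random.seed(1)

NUM_CELLS = 1_000          # hypothetical survey grid cells covering the fishing grounds
TRUE_POPULATION = 50_000   # the "real" number of crabs out there
POD_CELLS = 10             # crabs resting together in a handful of dense pods
SAMPLED_CELLS = 30         # cells a survey actually tows through

# Scenario A: crabs spread evenly across the whole area.
even = [TRUE_POPULATION / NUM_CELLS] * NUM_CELLS

# Scenario B: the same total number of crabs, but 90% of them piled into a few pod cells.
pod_ids = set(random.sample(range(NUM_CELLS), POD_CELLS))
crabs_per_pod = TRUE_POPULATION * 0.9 / POD_CELLS
leftover = TRUE_POPULATION * 0.1 / (NUM_CELLS - POD_CELLS)
clumped = [crabs_per_pod if cell in pod_ids else leftover for cell in range(NUM_CELLS)]

def survey_estimate(cell_counts):
    """Tow through a random subset of cells, then scale the mean count up to the full area."""
    tows = random.sample(cell_counts, SAMPLED_CELLS)
    return sum(tows) / SAMPLED_CELLS * NUM_CELLS

print("true population:         ", TRUE_POPULATION)
print("estimate (even spread):  ", round(survey_estimate(even)))
# With clumping, most surveys miss the pods entirely (an underestimate), but a
# survey that happens to tow through a pod balloons the area-wide estimate instead.
for trial in range(5):
    print(f"estimate (clumped), run {trial + 1}:", round(survey_estimate(clumped)))
```

Run it a few times and the clumped-crab estimates bounce between a fraction of the true number and several times it, depending on whether a tow happens to hit a pod, which is exactly the kind of instability the Nautilus piece describes.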

Spencer falls heavily on the “No” side here, but the situation gets tragic for the fishers involved, who sank their livelihoods into an industry that may never have been robust enough to support them. The crabs themselves got the short end of the stick; the limits raised to harvest crabs that didn’t exist truly decimated the ones that did. Only time will tell if the populations can recover—in the meantime, it might be worth voting with our dinner plates.

Spicy Cetaceans Steal Snacks, Stumping Science

spicy dolphin

If there’s one thing I’ve learned from the dulcet tones of David Attenborough narrating yet another amazing animal documentary over a soothing soundtrack, it’s that dolphins are very smart. They use highly intelligent strategies to live their lives, hunt, and even play. They also love to eat—all the better if their meal is generously pre-caught for them by human fishers, who are deeply chagrined at the highway robbery occurring in their nets. After noisemakers and reflective materials failed to deter this dolphin behaviour (called depredation) in the Aegean Sea, a team of Greek researchers recently brought out the big guns: hot-sauce-laced nets.

More precisely, they coated fishing nets with a resin that contained capsaicin, the chemical irritant that gives chili peppers their famous heat. Capsaicin has been used successfully on land to prevent squirrels, deer, and other mammals from eating what they shouldn’t (like seed from a bird feeder, or crops). But it had never been tried on dolphins…

“Yet after five months of test fishing with capsaicin-coated nets, the research team co-led by Maria Garagouni, a marine biologist at Aristotle University of Thessaloniki in Greece, faced a tough realization: their idea didn’t work. The bottlenose dolphins that interacted with their nets were entirely unfazed. […]

While it’s known that many cetaceans, including bottlenose dolphins, lack four of the five primary tastes—they can only pick up salty—spiciness is registered by a different set of sensory cells through chemesthesis. This process, which signals sensations such as pain and heat, is little studied in the species. Other toothed whales do appear to have the hardware required for capsaicin detection, notes [neuroscientist Aurélie] Célérier, but there’s a lot left to learn.

There could be something else at play in the dolphins’ triumph over spice: cetacean super smarts. […] The dolphins may simply have figured out a way to break into the spicy nets without making much contact.”

Intriguingly, a mystery predator, unseen by the researchers, did avoid the spicy nets while massacring the control nets. The team is putting their investigation of the strange snacker, along with their central question, on ice for now. The wheel of science turns slowly—which I’m sure the hungry dolphins appreciate!

Prehistoric Fish Dish Earliest Yet

prehistoric cooking by fire

Have you ever been on a salad kick, say in the height of summer when it’s too hot to cook, and all those crunchy veggies are fine and dandy, until the weather turns and you can finally fire up the oven, and that first bite of roast beef or chicken or lasagne with a bolognese sauce—the taste of cooked food that has undergone that irreversible chemical transformation—just, as the kids say, hits different?

Now imagine that moment, but on a species-wide scale. Scientists have been trying for centuries to nail down exactly when we (or one of our Homo genus cousins) harnessed and controlled fire to cook food. Cooking is known to make food easier to digest and to unlock certain nutrients, which allowed us both to grow our brains and to spend less time using them to acquire raw grazing material. A game-changing discovery by a team from three Israeli universities has shown that our ancestors were cooking at least 780,000 years ago, a whopping 610,000 years earlier than the previous estimate!

The proof was in the remains of a giant fire-roasted barb (a carp-like fish), found at the Gesher Benot Ya’aqov archaeological site by the team. The findings of their study were recently published in Nature Ecology and Evolution.

“In the study, the researchers focused on pharyngeal teeth (used to grind up hard food such as shells) belonging to fish from the carp family. These teeth were found in large quantities at different archaeological strata at the site. By studying the structure of the crystals that form the teeth enamel (whose size increases through exposure to heat), the researchers were able to prove that the fish caught at the ancient Hula Lake, adjacent to the site, were exposed to temperatures suitable for cooking, and were not simply burned by a spontaneous fire.”

The team found extensive evidence of roasted barb at the site, which points to a long tradition of settlement there and of passing down cooking skills. Further investigation may also support the hypothesis that eating fish in particular represented a “quantum leap” in human development; as we now know, omega-3 fatty acids, present at high levels in fish, as well as zinc and iodine, support cognition.

It seems that this ancient barbecue, on the shores of Hula Lake, wasn’t just a get-together for the family group that ate that day—in a way, all humans were there, changing the future, one flame-grilled bite at a time.

Greasing the Wheels of the Future with Cooking Oil-Boosted Seeds

Cooking oil is one of those things in a kitchen that I think of as a means to an end—deliciously fried food—rather than an ingredient in itself. Thankfully, a team of scientists working at Singapore’s Nanyang Technological University has spared much more than a passing thought for this culinary workhorse. They’ve discovered a way of editing the genes of plants to produce seeds with a staggering 15-18% more oil in them. They expect this increased yield to reduce the space required to raise oil-producing plants like sunflower, peanut, and soy, and therefore to decrease the pressure of industrial agriculture on our environment.

“The secret to helping plants store more oil in their seeds is one of their proteins called WRINKLED1 (WRI1). Scientists have known for over two decades that WRI1 plays an important role in controlling plant seed oil production. […]

Published in the scientific journal Science Advances, the team detailed the molecular structure of WRI1 and how it binds to plant DNA—which signals to the plant how much oil to accumulate in its seeds.

Based on the understanding that the atomic structure of the WRI1-DNA complex revealed, the team modified WRI1 to enhance its affinity for DNA in a bid to improve oil yield. In this approach, some portions in WRI1 were selected for modifications to improve its binding to DNA and several forms of WRI1 were produced.

These candidate WRI1s were then further tested to assess their ability to activate oil production in plant cells. As expected by the team, they showed that their modified versions of WRI1 increased DNA binding ten-fold compared to the original WRI1—ultimately leading to more oil content in its seeds.”

The team also determined that the binding mechanism between WRI1 and the DNA of their test plants (Nicotiana benthamiana and Arabidopsis thaliana) was “extensively conserved,” meaning it may be common to a large number of plant species. In this, they may have uncovered a bonus feature: upping the fat content of nuts and seeds that are eaten as-is (and not just pressed for oil) means that the people who consume them can feel satisfied faster, and meet their nutritional needs with less bulk—a boon for those living in places where sourcing food is a problem.

We at DFC do love a bit of judicious gene editing—anything that gets food into the mouths of hungry folks is a good thing. Add in the space-saving aspect, and this new invention is primed for a well-oiled future!

First Pasta the Post: New Processing Keeps Noodles Fresh

The dried pasta you can get off the shelf has wonderful applications, but in many dishes, I think fresh pasta is best. Making it yourself takes time, though, and the stuff from the supermarket’s fridge section doesn’t last long enough to make it a reliable staple. But Italian scientists have been bending their minds toward this (distinctly Italian) problem, and have figured out a way to extend the shelf life of fresh pasta: by reinventing the packaging, the atmosphere inside it, and the microbial profile of the pasta itself. This has, on average, doubled the lifespan of the pasta, which (the researchers hope) can help reduce food waste and, in turn, the carbon footprint of pasta production.

“Scientists in Italy report that they worked with a pasta factory in Altamura to create 144 samples of short, thin twisted pasta known as trofie. One set of 48 samples was packaged using conventional film and a packaging atmosphere composed of 20% carbon dioxide to 80% nitrogen.

A second set of 48 samples was packaged with a film that was less permeable to water and oxygen and with an atmosphere of 40% carbon dioxide to 60% nitrogen, while the third set of 48 samples also used these new conditions but, in addition, had a multi-strain probiotic mixture added to the pasta dough. The samples were all stored at 4C.

The team reported that the conventionally packaged pasta showed decreasing carbon dioxide levels over a 90-day storage period, resulting in the growth of visible moulds. By contrast, the two types of experimental samples had an almost stable atmosphere, and no fungal growth, over a 120-day period.”

As a culinarily distinct culture, Italy is very careful about the provenance and creation of its food. So the researchers made sure their interventions were well within regulations—and were indeed undertaken at the express request of the factory with which they collaborated. Next steps involve investigating the long-term feasibility of this starchy overhaul. I’m a big fan of any innovation that brings more pasta to more people—almost as much of a fan as I am of pasta itself!