New in fascinating nature updates! Everyone’s favourite adorable wrinkly horror, the naked mole rat, has been found by scientists at the lifespan-research company Calico Life Sciences LLC to defy everyone’s least favourite law of mortality: the Gompertz law. This means naked mole rats do not age in the typical way. Besides being a trippy addition to this mammal’s already extraordinary list of cool attributes, the finding could have an impact on human health.
Among the fun facts about naked mole rats: they spend their lives underground in eusocial colonies of up to 300 individuals, with a dominant female “queen” and just a few select males involved in breeding. They have evolved beyond the need to see well or to regulate their body temperatures. They are (almost) immune to cancer, and don’t seem to experience pain. And now:
“The team collected what they describe as 3,000 points of data regarding the lifespan of the naked mole rat, and found that many had lived for 30 years. But perhaps more surprisingly, they found that the chance of dying for the mole rats did not increase as they aged. All other mammals that have been studied have been found to conform to what is known as Gompertz’s mortality law, which states that the risk of death for a typical mammal grows exponentially after they reach sexual maturity — for humans, that means the odds of dying double every eight years after reaching age 30. This, the researchers claim, suggests that mole rats do not age — at least in the conventional sense. They do eventually die, after all.”
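The doubling rule quoted above is just the Gompertz law in miniature. Here is a minimal sketch, purely illustrative: the parameters come from the article’s human figures (risk doubling every eight years past age 30), not from anything measured in the mole-rat study.

```python
import math

# Gompertz's law models the hazard of death as growing exponentially
# with age past maturity: h(t) = a * exp(b * t). The article's human
# figure (risk doubles every 8 years after age 30) fixes b = ln(2) / 8.
DOUBLING_YEARS = 8
b = math.log(2) / DOUBLING_YEARS

def relative_hazard(age, baseline_age=30):
    """Risk of death at `age`, relative to the risk at `baseline_age`."""
    return math.exp(b * (age - baseline_age))

print(round(relative_hazard(38), 2))   # one doubling: 2.0
print(round(relative_hazard(62), 2))   # four doublings: 16.0
```

For naked mole rats, the study’s striking result amounts to a flat hazard (b ≈ 0), so this relative risk would stay near 1 at any age.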
The scientists involved in this study state that what “determines naked mole-rat lifespans in captivity [i.e. without predation skewing the numbers] is currently unknown,” and needs an awful lot more research. Given that investigation into the imperviousness of naked mole rats to pain has shed light on human congenital pain insensitivity, could that research lead to a greater understanding of human aging? Here’s hoping these sweet, kind-of-horrifying, adorable little old men of the animal kingdom will soon let us in on more of their secrets!
It’s been so long since I’ve seen a flower out in the wild; I’m having a hard time believing they’ll ever return! But, of course, they will, fulfilling a genetic legacy that has made flowering plants (or angiosperms) the dominant plant type on Earth.
Flowering plants edged out former first-place gymnosperms (which include conifers and ginkgo) around 150 million years ago, but until a recent joint study from teams at San Francisco State University and Yale, scientists were stumped (pun intended!) as to how the small, colourful plants did it. It turns out it likely had to do with smallness of a different kind: the size of their genome.
The team analyzed data held by the Royal Botanic Gardens, Kew (in London, UK), and compared genome sizes to physical properties like the number of leaf pores, and rates of leaf water loss and photosynthesis. The researchers found that the rise and continued success of flowering plants on Earth is a result of what they termed “genome downsizing.” From the BBC:
“By shrinking the size of the genome, which is contained within the nucleus of the cell, plants can build smaller cells.
In turn, this allows greater carbon dioxide uptake and carbon gain from photosynthesis, the process by which plants use light energy to turn carbon dioxide and water into glucose and oxygen.
Angiosperms can pack more veins and pores into their leaves, maximizing their productivity.
The researchers say genome-downsizing happened only in the angiosperms, and this was ‘a necessary prerequisite for rapid growth rates among land plants.’”
Plants with smaller genomes were therefore far more efficient at, well, being plants, which has led to their dominance. Their aesthetic appeal to humans is a lucky side effect!
This landmark discovery has resulted in further, refined questions that scientists should have fun answering: like, why have ferns, ginkgo, and conifers still survived, if, by the metric set by this research, they are no longer the “fittest” plants? While I’m sure the answer will be fascinating, I confess I’m happy with all plants… Though at this moment deep in February, I’m not going to blame myself for treasuring crocuses or daffodils just a tiny bit more!
My two grandsons are still in the visual-entertainment-for-distraction phase (hello there, animated Wheels on the Bus or Paw Patrol for the forty-seventh time). But, when they grow old enough to grasp plot and theme, we’re looking forward to introducing them to all the modern film classics, including Star Wars (or “The Space Hero’s Journey”) and The Lion King (or “Animal Hamlet”).
But a recent article by Isabel Fattal in The Atlantic has shed light on a phenomenon that occurs in both these movies, that I honestly hadn’t clocked — and that would be good to keep in mind when the grandkids graduate to a real movie night. Building on their initial 1998 study, sociolinguist Calvin Gidney and professor Julie Dobrow (both of Tufts University) have shown that many “bad guys” in American children’s entertainment speak with foreign accents or non-standard English dialects. This marks them as “other” in their narratives — creating an Us-vs.-Them conflict that potentially imparts a harmful subliminal message to kids about diversity.
The study found that the accent coded as most “evil” in the entertainments studied was British: In The Lion King, main villain Scar is voiced by Jeremy Irons, who trained at the Bristol Old Vic. But:
“German and Slavic accents are also common for villain voices. Henchmen or assistants to villains often spoke in dialects associated with low socioeconomic status, including working-class Eastern European dialects or regional American dialects such as ‘Italian-American gangster’ (like when Claude in Captain Planet says ‘tuh-raining’ instead of ‘training’). None of the villains in the sample studied seemed to speak Standard American English; when they did speak with an American accent, it was always in regional dialects associated with low socioeconomic status.”
(Interestingly, the preponderance of German, Slavic, and Russian accents sported by villains points to an inherited bias in American culture: The study authors say it’s likely a holdover from World War II and the Cold War — a time still within the memory of many creators of children’s entertainment. Though world conflicts have changed over the past 70 years, no new accents have knocked the above from their second-place “evil” spot behind British.)
Isabel Fattal goes into detail about the cognitive repercussions of these lessons, absorbed by kids who, on the surface, are “just” watching a cartoon. Not only do children use television to sort out the concept of ethnic identities and where they themselves fit, they also (like adults) use linguistic cues to make judgments on the perceived intelligence and education of a speaker — and use those assumptions to determine how they treat others.
Fattal concludes that this problem seems deeply entrenched in our culture and entertainment, but that it can be turned around and used to educate kids when they watch these shows and movies with adults who can discuss the issue with them. So, I’m actually looking forward to hanging out with my grandkids in a way that involves more media awareness — and a lot less Wheels on the Bus.