Ascension, a company out of the UK, is catching a bit of flak from geeks for misrepresenting its primary service: scattering human ashes in “space.” Unlike other companies, whose space-burial services launch a sealed craft into orbit for later earthly retrieval, or NASA, which piggybacked Eugene Shoemaker’s ashes on the deliberately crashed Lunar Prospector probe for the first (and so far only) burial on the Moon, Ascension pledges to release your loved one’s ashes in space proper. The contentious issue is their precise definition of “space.”
The company claims it will carry an ashes-bearing helium weather balloon 35 kilometers up, into the stratosphere. According to their website, this is well within “near space” under the definition that treats the Armstrong limit as the border of our atmosphere. (The Armstrong limit sits about 19 km up and marks the altitude above which a protective suit must be worn, because the ambient pressure is so low that the water in our bodies boils at body temperature.)
But experts don’t necessarily consider that high enough to count as space. The most commonly accepted border between atmosphere and space is the Kármán line, which sits about 100 km above the surface of the Earth, well above the Armstrong limit.
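If you want to see how the numbers stack up, here is a minimal sketch of the arithmetic. The thresholds are the ones cited above; the labels are just for illustration.

```python
# Rough altitude thresholds cited above, in kilometres.
ARMSTRONG_LIMIT_KM = 19   # protective suit required above this
KARMAN_LINE_KM = 100      # most commonly accepted start of space

def classify_altitude(km):
    """Label an altitude against the two boundaries discussed above."""
    if km >= KARMAN_LINE_KM:
        return "space (above the Kármán line)"
    if km >= ARMSTRONG_LIMIT_KM:
        return "'near space' at best (above the Armstrong limit only)"
    return "ordinary atmosphere"

print(classify_altitude(35))   # Ascension's balloon: 'near space' at best
print(classify_altitude(100))  # the Kármán line: space proper
```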
Ascension may be using a more romantic definition of where space starts, for laypeople who just want a bit of Grandpa to join the constellations he loved so much. But, fudging aside, it may be better in the long run than actually sending Grandpa into orbit. Because up there, he becomes debris.
“‘If ashes were scattered in orbit, which these are not, then they’d join the millions of tiny bits of space junk which are traveling at speeds of 7-8 km per second,’ [space archaeologist Alice Gorman of Flinders University (Australia)] said.
‘Junk this size causes damage to spacecraft by constant bombardment. Each impact is trivial but there’s a cumulative effect. Fortunately, in Low Earth Orbit, this stuff usually enters the Earth’s atmosphere quickly.’”
And burns up harmlessly.
So, while we don’t yet have the opportunity to mingle, post-mortem, with the stars whence we came, I think it’s a fair tradeoff to not accidentally take out the Soyuz. Besides, there are still so many intriguing options for burial here on Earth, and plenty of time to look to the skies.
I love stories of humans who have managed to develop a closer bond with machines. The optimist in me believes wholeheartedly that humankind is that much closer to the Singularity every time a new prosthetic arm is developed! Which is why this piece by Finnish programmer Tuukka Ojala, over at the blog of Vincit, the software development company where he works, is THE BEST.
Ojala, who is blind, describes the nuts and bolts of his working procedure. Most of his strategies offer an experience that seems closer to the way a computer might “think” than to that of sighted monitor-and-mouse users. (In fact, he has long used a screen reader that fires off what he’s working on at a staggering 450 words per minute!) Ojala says he is most at home on the command line, the most basic, text-based way of interacting with a computer. Unfortunately, too much of his working style is dictated by a lack of accessibility among his tools, which remains a problem across the industry.
“[G]iven my love of the command line, why am I sticking with Windows, the operating system not known for its elegant command line tools? The answer is simple: Windows is the most accessible operating system there is. NVDA, my screen reader of choice is open source and maintained more actively than any other screen reader out there. If I had the choice I would use Mac OS since in my opinion it strikes a neat balance between usability and functionality. Unfortunately VoiceOver, the screen reader built in to Mac OS, suffers from long release cycles and general neglect, and its navigation models aren’t really compatible with my particular way of working. There’s also a screen reader for the Gnome desktop and, while excellently maintained for such a minor user base, there are still rough edges that make it unsuitable for my daily use. So, Windows it is.”
In addition to coding like a demon, Ojala has taken up the mantle of general accessibility consultant at his job: “Or police, [depending] on how you look at it.” He has ideas about how to make coding, and web pages in general, more accessible for users who are blind or visually impaired. I’m looking forward to reading more of his blog, for the inside scoop on accessibility, as well as the wonderful world of coding. (Singularity, here we come!)
As a woman in the technical field of computing (having transitioned from a long career in chemistry, another technical field), I’ve come up against many insidious examples of sexism in my time. Oh, how often have I wanted to give the perpetrators their comeuppance, but been unable to do so without blowback! So I applaud in vicarious glee the ingenious solution a pair of L.A. artists cooked up.
Penelope Gazin and Kate Dwyer co-founded the online bizarre-art marketplace Witchsy. When developing the platform, they started detecting condescension and disrespect in emails from outside developers and designers, who were often male. Gazin and Dwyer had an inkling that these men were addressing them this way because they were young women jumping into a tech endeavor. So they invented a third, fictional cofounder, a man named (get this) “Keith Mann,” and started corresponding with troublesome contacts as him.
“‘It was like night and day,’ says Dwyer. ‘It would take me days to get a response, but Keith could not only get a response and a status update, but also be asked if he wanted anything else or if there was anything else that Keith needed help with.’
Dwyer and Gazin continued to deploy Keith regularly when interacting with outsiders and found that the change in tone wasn’t just an anomaly. In exchange after exchange, the perceived involvement of a man seemed to have an effect on people’s assumptions about Witchsy and colored how they interacted with the budding business. One developer in particular seemed to show more deference to Keith than he did to Dwyer or Gazin, right down to the basics of human interaction.
‘Whenever he spoke to Keith, he always addressed Keith by name,’ says Gazin. ‘Whenever he spoke to us, he never used our names.’”
There’s an awful lot of light being shone on sexism in the tech industry these days. What Dwyer and Gazin have contributed to this conversation is concrete evidence of how ridiculous the sexist impulse is: all it took to get their correspondents to wise up was a man’s signature at the bottom of an email. Hopefully, their evidence will join the growing body of undeniable proof of the sneakiness of sexism, and help turn the tide. For now, Dwyer and Gazin say they have retired Keith, but they can still envision having to resort to him again.
While it seems that only the whiz-bang-iest robots in the field get the attention these days, they are also the most complicated. And, as we all know, the more complicated a thing is, the more opportunities it has to fail. This is especially true in robotics (DARPA competition blooper reel, anyone?). That reality has led some researchers to simplify their approaches, stripping away the humanoid characteristics and multipurpose appendages to create robots that do one thing really well.
One such bot is the “Deformation-driven rolling robot with a soft outer shell,” invented by Yoichi Masuda and Masato Ishikawa of Osaka University, and presented in a paper at this year’s IEEE International Conference on Advanced Intelligent Mechatronics. Their robot is structured after one of our simplest machines, the wheel.
“Instead of motors and gears, however, the wheel surrounding this robot is made from a soft material that’s squished and stretched by a set of four wires connected to an inner core. It’s still mostly dependent on gravity to get around, as the robot is essentially repeatedly falling over as its changing shape makes it unstable. But that also greatly reduces the amount of power it needs to move.”
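To make the mechanism a little more concrete, here is a minimal sketch in the spirit of that description, not the authors’ actual controller: it assumes four hypothetical wire actuators and a placeholder set_tension interface, and simply cycles tension around the shell so the robot keeps tipping itself over.

```python
# A minimal sketch of the deformation-driven idea, not the authors' code.
# We assume four hypothetical wires that pull the soft shell toward an
# inner core; tensioning one wire at a time deforms the shell enough to
# make the robot unstable, so gravity tips it forward.

WIRES = ["front", "right", "back", "left"]  # hypothetical actuator names

def roll_once(set_tension, pull=0.8, slack=0.1, reverse=False):
    """Cycle tension around the shell once (one 'fall' per wire).

    set_tension(wire, value) stands in for whatever interface the real
    hardware would expose; value is a normalized tension in [0, 1].
    """
    order = list(reversed(WIRES)) if reverse else WIRES
    for active in order:
        for wire in WIRES:
            set_tension(wire, pull if wire == active else slack)
        # On real hardware, wait here for the shell to deform and tip over.

if __name__ == "__main__":
    # Stand-in for an actuator driver: just print the commands it would send.
    roll_once(lambda wire, value: print(f"{wire}: tension={value:.1f}"))
```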
This robot may mark a trend in the field away from generality and toward specificity of purpose. But this little rolly guy’s simplicity doesn’t mean it’s single-purpose: the interior can hold up to two 360° cameras and a multitude of sensors, and it could be sent into a war zone, an industrial accident, or a natural hazard like a volcano with equal impunity. It would be able to transmit a great deal of information about a situation before its destruction, a loss that would be no hardship given its low cost and ease of construction!
Though I’ll miss potentially having my own personal protocol droid hanging around, this robotics concept does seem far more practical, and far more achievable. I like that the calming adage “simple is good” extends as far as the helper machines of the future. And I also like how it leaves room for us and our human intelligence: we have hope again of not being rendered obsolete! (We’ll see how it goes…!)
It is a truth universally acknowledged that being a human can be stressful. Whether it comes from an unfulfilled desire for a more flexible workplace, ill health, or family tension, none of us is immune to the physical and mental effects of stress.
Unfortunately, there seems to be a growing body of evidence that the effects of your stress might not end with you. Stressors you experience may actually result in changes to the expression of your genetic code, which can then be passed down to your offspring. While recent research on humans and animals has been controversial, because it calls into question the strict definition of what is “inheritable,” it has also been galvanizing. A brand-new study published in the journal Science Advances last month has now traced (in worms) how these epigenetic changes can show up in offspring five generations down the line.
“How and why these changes are transmitted between generations is what [lead author Ben] Lehner and his colleagues [at the Barcelona-based Centre for Genomic Regulation] were interested in studying. In their worm study, they inserted a gene into the worm genome that would normally be silenced, and found the worms with the gene also carried mutations in proteins involved in the copying of DNA. Their offspring did not carry the same mutation in DNA replication, but for the next five generations the gene in question was still incorrectly activated.”
How this happens is not yet fully understood by researchers, but in humans it could point to a way for a community to respond to stresses like famine that pivots faster than the traditional Darwinian concept of evolution. In fact, to a lot of folks, epigenetics is looking more and more like Lamarckian heredity, the now-rejected theory that “soft” traits acquired over an organism’s lifetime can be passed down alongside “hard” genetics. (Jean-Baptiste Lamarck’s classic example was the giraffe’s long neck, lengthened over time by each generation’s stretching to reach and eat the topmost leaves on trees.)
This research could someday allow us to help people with conditions that otherwise seem mysterious in origin: for example, severe depression in young children, which may actually result from experiences their mothers had before they were born, or diabetes and shortened lifespan in men whose fathers or grandfathers lived through a nutritional boom just before puberty.
While the relationship between epigenetics and evolution is uncertain, it certainly can’t hurt to try to reduce your own stress levels. Not only will you feel better, but science may one day show you’ve helped your great-great-great-grandkids as well. Until then, we’ll keep exploring!
With DFC’s offices located in the Frontenac Arch Biosphere, we’ve gotten used to being close to nature. Sometimes, after work, I’ll take a quick jaunt out to the lake right across the road and watch from the shore as the fish do their thing. It’s quite relaxing! But I’ve long been curious about how our aquatic neighbours survive our harsh snowy winters. I mean, way down at the bottom of the lake, covered with ice, they must run out of oxygenated water pretty quickly…?
Turns out, they do, but some species that winter in frozen ponds and lakes have evolved a workaround. A study led by the University of Oslo looked at wild crucian carp (and their domesticated relatives, goldfish) and determined that, when faced with reduced oxygen levels, these fish metabolize the carbohydrates in their systems into easily eliminated alcohol rather than the usual toxic lactic acid. Essentially, their tiny bodies turn into living breweries!
“This comprises a modification of a set of the enzymes that channel energy-rich carbohydrates into mitochondria, the energy-producing parts of a cell. During their evolution, the fish gained a second set of the enzymes, which helps turn the metabolic products into alcohol when oxygen levels drop. The enzymes act in essentially the same way as brewer’s yeast.
‘Usually, other species die long before the decrease in oxygen availability is even a problem for the crucian carp,’ says [team leader Catherine] Fagernes. ‘By using this method, the fish gets rid of the dangerous end products.’”
It’s a true case of survival of the fittest, and also of the tipsiest: for most of the winter, the carp’s blood alcohol levels would make them legally too drunk to drive. (If they could drive to begin with, of course…) I now think a little differently about the hardiness of my carp friends in the lake: instead of feeling pity, I’m really impressed that they’ve harnessed a bit of biochemistry to make the winter a little less bleak, and more energy-efficient!
When the calendar clicks over to September, it’s time to start accepting that summer is nearly over. While I will miss the long days and the hours outside in beautiful sun, what I won’t miss is how those can combine into scary opportunities for melanoma (skin cancer) to strike.* Though I’m diligent with the SPF, there’s always a tiny “what-if” in the back of my mind…
That “what-if” is calmed somewhat by fascinating news from the University of Waterloo and the Sunnybrook Research Institute, which have combined forces to develop an AI that can help doctors identify melanoma sooner. Melanoma is very curable in its early stages, but can become quite dangerous if it’s allowed to become entrenched. (Please talk to your doctor if you have a mole or an odd-looking patch on your skin that you’re unsure about!)
This new AI has learned from tens of thousands of images of skin lesions and their underlying biological data, so it can reliably identify which are harmless and which are worth a biopsy. This saves a great deal in health-care costs, both in doctors’ hours and in surgical resources, and helps patients who don’t need a biopsy avoid the discomfort of one.
“Currently, dermatologists largely rely on subjective visual examinations of skin lesions such as moles to decide if patients should undergo biopsies to diagnose the disease.
The new system deciphers levels of biomarker substances in lesions, adding consistent, quantitative information to assessments currently based on appearance alone. In particular, changes in the concentration and distribution of eumelanin, a chemical that gives skin its colour, and hemoglobin, a protein in red blood cells, are strong indicators of melanoma.”
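To give a rough feel for the approach, here is a minimal sketch, emphatically not the Waterloo/Sunnybrook system: it assumes each lesion has already been reduced to two hypothetical biomarker features (eumelanin and hemoglobin concentration) and trains a toy classifier on synthetic data to flag lesions worth a biopsy. Every number and feature name here is a placeholder.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# A minimal sketch, not the actual research model. Each row is a lesion
# described by two hypothetical biomarker features:
# [eumelanin concentration, hemoglobin concentration] (placeholder values).
rng = np.random.default_rng(0)
benign = rng.normal(loc=[0.3, 0.2], scale=0.05, size=(100, 2))
suspicious = rng.normal(loc=[0.6, 0.5], scale=0.05, size=(100, 2))

X = np.vstack([benign, suspicious])
y = np.array([0] * 100 + [1] * 100)  # 0 = likely harmless, 1 = refer for biopsy

model = LogisticRegression().fit(X, y)

new_lesion = [[0.55, 0.45]]  # hypothetical measurements from one new lesion
print("biopsy probability:", model.predict_proba(new_lesion)[0, 1])
```

The point of the sketch is only that quantitative biomarker features give a classifier something consistent to work with, which is exactly what the excerpt above contrasts with appearance-only assessments.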
The ultimate goal of this technology is to help reduce the time spent diagnosing cases, a window during which a lesion can rapidly go from worrisome to a major problem. The team is happy to report that their AI could be available for doctors to use next year: hopefully well before the sunburn-happy prime summer months!
* The Canadian Dermatology Association recommends sunscreen use during the winters in Canada as well, especially if you engage in snow sports, where the sun’s rays can reflect off the snow.
Humans share a great many similarities with our chimpanzee cousins, including our use of stone tools, our love of fruit, and the fact that we both pass the mirror test. But for all the ways we are strikingly alike, humans have one heartbreaking difference lurking in our brains: we develop Alzheimer’s disease, and no other primate seems to.
However, researchers have recently discovered the first signs of Alzheimer’s-like cognitive decline in chimpanzees, an early stage that, for some mysterious reason, seems to progress no further. This self-limiting pattern, in so close a relative, may point to a way of avoiding Alzheimer’s in humans, and scientists are now looking at chimp brains for clues.
Researchers out of Kent State were granted full access to a bank of 20 brain samples, collected by the National Chimpanzee Brain Resource from chimps who had died naturally in captivity between 37 and 62 years of age. They looked for elevated levels of amyloid beta, a protein that breaks down quickly in healthy human brains but doesn’t in cases of Alzheimer’s. Excess amyloid beta leads to an accumulation of plaques between neurons; plaques, in turn, cause another protein, tau, to collect into tangles that affect healthy brain cells, and hence cognition.
“Interestingly, traces of amyloid beta were higher in chimp blood vessels than in plaques — that’s not what typically happens in humans. A build-up of amyloid beta deposits in the brain’s blood vessels does occur in humans (a condition known as cerebral amyloid angiopathy), but the predominant effect of amyloid beta in our species is the production of excess plaque. ‘This suggests that amyloid buildup in the brain’s blood vessels precedes plaque formation in chimpanzees,’ noted study co-author Melissa Edler.”
Amyloid beta’s accumulation in the chimps’ cerebral blood vessels rather than in plaques might be an indicator of why they don’t seem to experience severe cognitive decline and we do. But the scientists on the team acknowledge that this tiny spark of possibility needs to be fanned into a flame with full research, which must be conducted ethically, with the help of our furry brethren. Until then, the attempt to unravel the terrifying mystery of the human brain will continue.
We at DFC have had friends celebrating a couple of birthdays recently, and I only just realized that it’s been a long while since any of us have blown out candles on a cake! I guess it’s something that grown-ups don’t really do anymore, and I briefly felt sad about it, until I read about a new study published recently in the Journal of Food Research. A group of scientists has determined that the act of blowing out candles on a birthday cake sprays the festive confection with a staggering amount of bacteria.
They did this by undertaking the most charmingly crafty experiment I’ve ever heard of, preparing:
“two test birthday ‘cakes’ made of Styrofoam which they then spread with real icing […] and decorated with exactly 17 candles. Before having volunteers blow out the candles on both cakes, they had all of them smell and consume a piece of hot pizza — ‘to simulate a meal-dessert sequence.’ Afterwards, they compared the amount of bacteria present on each cake surface, and then repeated the whole exercise three times”
This experiment found that the force of the puff not only increased the cakes’ natural bacteria population by a whopping 1,400%, but also increased the range of bacteria by a factor of 100.
Thankfully, the scientists say, it’s not all bad. As we have learned from recent research into the population of healthy bacteria in our bodies (our microbiome), exposure to new bacteria, courtesy of little Timmy’s overenthusiastic candle snuffing, can actually boost our immune systems. It’s when little Timmy is sick, though, that the researchers warn us to skip the cake, and the cold along with it.
We at DFC love it when science turns to our animal friends for answers to human problems. It keeps everything in perspective, reminding us that, after all, humans are animals too!
Now, researchers are looking to the elegant slug for insight into a new type of surgical adhesive. As we learned when looking into the adhesive properties of the octopus tentacle, current adhesives often struggle to stick to wet and/or irregularly shaped organs. But slugs are experts at producing mucus that keeps them stuck to all kinds of surfaces.
A team out of Harvard has just published a paper in Science, building on a previous analysis of the mucus of Arion subfuscus, a slug common in Western Europe. The mucus has two key parts: polycations that create the physical bond between mucus and surface, and a matrix that dissipates stresses on that bond. The Harvard team, led by David J. Mooney, sought to replicate these two components with artificial materials.
“[The] team created a stress-dissipating matrix from cross-linked polymers, polyacrylamide, and alginate. The researchers then coated the matrix with the polycation chitosan, which inserts itself into the matrix and produces an adhesive surface.
[…]The researchers tested the adhesive on pig skin, liver, heart, and cartilage and found that it was stronger than both cyanoacrylate (superglue) and a surgical sealant called CoSeal.”
The field is particularly excited by this adhesive’s success in closing wounds on the liver, a notoriously finicky organ to repair. Further development and testing are still to come; slow and steady may yet win this race for medical achievement!