HELP SAVE THE WORLD TODAY

Sunday, December 31, 2017

NASA warning: ‘Unseen’ asteroid set to skim Earth

 

EXCLUSIVE: NASA is closely monitoring the arrival of a previously unseen asteroid that is set to brush past Earth today at 21,362 miles per hour.

The space rock had been invisible to astronomers until Christmas Day when it was first identified.
Now called asteroid 2017 YZ4, it will pass between the Earth and the Moon at a distance of just 139,433 miles.
The Moon is about 238,000 miles from Earth, so in astronomical terms this pass is a cat's whisker away.
NASA monitors anything that comes within six million miles of our planet as a near-Earth asteroid.
"This year, we discovered 1,985 new near-Earth asteroids. There were 1,888 such objects discovered in 2016 and 1,571 in 2015," a NASA spokesman said.
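A quick arithmetic check of the figures quoted above (a minimal sketch in Python; the mileage and speed values are the ones given in the article, rounded as printed):

```python
# Rough check of the flyby figures quoted in the article.
LUNAR_DISTANCE_MI = 238_000      # Earth-Moon distance cited above
FLYBY_DISTANCE_MI = 139_433      # closest approach of 2017 YZ4
SPEED_MPH = 21_362               # quoted flyby speed
NEO_THRESHOLD_MI = 6_000_000     # NASA's near-Earth monitoring cutoff

print(f"Flyby distance: {FLYBY_DISTANCE_MI / LUNAR_DISTANCE_MI:.0%} of the Earth-Moon distance")
print(f"Speed: {SPEED_MPH * 1.60934:,.0f} km/h")
print(f"Fraction of the six-million-mile monitoring zone: {FLYBY_DISTANCE_MI / NEO_THRESHOLD_MI:.1%}")
```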
 

Wednesday, August 16, 2017

Mystery of how first animals appeared on Earth solved

Research has solved the mystery of how the first animals appeared on Earth, a pivotal moment for the planet without which humans would not exist.

Research led by The Australian National University (ANU) has solved the mystery of how the first animals appeared on Earth, a pivotal moment for the planet without which humans would not exist.
Lead researcher Associate Professor Jochen Brocks said the team found the answer in ancient sedimentary rocks from central Australia.
"We crushed these rocks to powder and extracted molecules of ancient organisms from them," said Dr Brocks from the ANU Research School of Earth Sciences.
"These molecules tell us that it really became interesting 650 million years ago. It was a revolution of ecosystems, it was the rise of algae."
Dr Brocks said the rise of algae triggered one of the most profound ecological revolutions in Earth's history, without which humans and other animals would not exist.
"Before all of this happened, there was a dramatic event 50 million years earlier called Snowball Earth," he said.
"The Earth was frozen over for 50 million years. Huge glaciers ground entire mountain ranges to powder that released nutrients, and when the snow melted during an extreme global heating event rivers washed torrents of nutrients into the ocean."
Dr Brocks said the extremely high levels of nutrients in the ocean, and cooling of global temperatures to more hospitable levels, created the perfect conditions for the rapid spread of algae. It was the transition from oceans being dominated by bacteria to a world inhabited by more complex life, he said.
"These large and nutritious organisms at the base of the food web provided the burst of energy required for the evolution of complex ecosystems, where increasingly large and complex animals, including humans, could thrive on Earth," Dr Brocks said.
The research is published in Nature, and the findings will be presented at the Goldschmidt Conference in Paris, France, this week.
Co-lead researcher Dr Amber Jarrett discovered ancient sedimentary rocks from central Australia that related directly to the period just after the melting of Snowball Earth.
"In these rocks we discovered striking signals of molecular fossils," said Dr Jarrett, an ANU Research School of Earth Sciences PhD graduate.
"We immediately knew that we had made a ground-breaking discovery that snowball Earth was directly involved in the evolution of large and complex life."

Wednesday, July 12, 2017

Creating music by thought alone

A newly developed hands-free musical instrument allows people to make music with their minds.

Neurologists have created a hands-free, thought-controlled musical instrument. They hope that this new instrument will help empower and rehabilitate patients with motor disabilities such as those from stroke, spinal cord injury, amputation, or amyotrophic lateral sclerosis (ALS).

"The Encephalophone is a musical instrument that you control with your thoughts, without movement," explains Thomas Deuel, a neurologist at Swedish Medical Center and a neuroscientist at the University of Washington, and first author of the report.
"I am a musician and neurologist, and I've seen many patients who played music prior to their stroke or other motor impairment, who can no longer play an instrument or sing," says Deuel. "I thought it would be great to use a brain-computer instrument to enable patients to play music again without requiring movement."
The Encephalophone collects brain signals through a cap that transforms specific signals into musical notes. The invention is coupled with a synthesizer, allowing the user to create music using a wide variety of instrumental sounds.
Dr. Deuel originally developed the Encephalophone (patent pending) in his own independent laboratory, in collaboration with Dr. Felix Darvas, a physicist at the University of Washington. In this first report, they describe their development of the instrument, as well as their initial studies showing evidence of how easily the instrument might be used. This preliminary study showed that a trial group of 15 healthy adults were able to use the instrument to correctly recreate musical tones, with no prior training.
"We first sought to prove that novices -- subjects who had no training on the Encephalophone whatsoever -- could control the device with an accuracy that was better than random," says Deuel. "These first subjects did quite well, way above chance probability on their very first try.
The Encephalophone can be controlled via two independent types of brain signals: either those associated with the visual cortex (i.e. closing one's eyes), or those associated with thinking about movement. Control by thinking about movement may be the most useful for disabled patients, and Deuel plans to continue researching this application. But for now, this current study shows that, at least for this small group of novice users, control by eye closing is more accurate than control by imagining movements.
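The report itself does not spell out the signal chain, but the general idea of turning an EEG band-power signal into notes can be sketched roughly as below. This is an illustrative toy in Python, not Deuel's implementation: the sample rate, the 8-12 Hz alpha band, the pentatonic scale, and the power-to-note mapping are all assumptions.

```python
import numpy as np

FS = 256                                      # sample rate in Hz (assumed)
PENTATONIC_MIDI = [60, 62, 64, 67, 69, 72]    # candidate notes, C-major pentatonic (assumed)

def alpha_power(eeg_window: np.ndarray, fs: int = FS) -> float:
    """Mean spectral power in the 8-12 Hz alpha band for one EEG window."""
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    band = (freqs >= 8) & (freqs <= 12)
    return float(spectrum[band].mean())

def power_to_note(power: float, p_min: float, p_max: float) -> int:
    """Map alpha power onto one of the candidate MIDI notes (higher power -> higher note)."""
    level = np.clip((power - p_min) / (p_max - p_min), 0.0, 1.0)
    return PENTATONIC_MIDI[int(round(level * (len(PENTATONIC_MIDI) - 1)))]

# Example: one second of synthetic EEG with a strong 10 Hz (alpha) component,
# roughly what posterior electrodes show when the user closes their eyes.
t = np.arange(FS) / FS
eeg = 30e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(FS)
p = alpha_power(eeg)
print("alpha power:", p, "-> MIDI note:", power_to_note(p, p_min=0.0, p_max=p))
```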
The Encephalophone is based on brain-computer interfaces using an old method, called electroencephalography, which measures electrical signals in the brain. Scientists first began converting these signals into sounds in the 1930s and, later, into music in the 1960s. But these methods were still difficult to control and were not easily accessible to non-specialist users.
In a collaboration with the Center for Digital Arts and Experimental Media (DXARTS), Deuel has built upon such research to make the Encephalophone more musically versatile, as well as easier to use.
Deuel and his collaborators are already working with more people to see how much users can improve with training. Deuel also plans to begin clinical trials of the Encephalophone later this year to see whether it may be useful or enjoyable for disabled patients.
"There is great potential for the Encephalophone to hopefully improve rehabilitation of stroke patients and those with motor disabilities," Deuel says.

Saturday, July 1, 2017

Cocoa and chocolate are not just treats -- they are good for your cognition

Cocoa can be seen as a dietary supplement to protect human cognition and can counteract different types of cognitive decline.
Researchers have examined the available literature for the effects of acute and chronic administration of cocoa flavanols on different cognitive domains. It turns out that cognitive performance was improved by a daily intake of cocoa flavanols.
 
A balanced diet is chocolate in both hands -- a phrase commonly used to justify one's chocolate-snacking behavior. A phrase now shown to actually harbor some truth, as the cocoa bean is a rich source of flavanols: a class of natural compounds that has neuroprotective effects.
In their recent review published in Frontiers in Nutrition, Italian researchers examined the available literature for the effects of acute and chronic administration of cocoa flavanols on different cognitive domains. In other words: what happens to your brain up to a few hours after you eat cocoa flavanols, and what happens when you sustain such a cocoa flavanol enriched diet for a prolonged period of time?
Although randomized controlled trials investigating the acute effect of cocoa flavanols are sparse, most of them point towards a beneficial effect on cognitive performance. Participants showed, among other things, enhanced working memory performance and improved visual information processing after having had cocoa flavanols. And for women, eating cocoa after a night of total sleep deprivation actually counteracted the cognitive impairment (i.e. reduced accuracy in performing tasks) that such a night brings about. These are promising results for people who suffer from chronic sleep deprivation or work shifts.
It has to be noted, though, that the effects depended on the length and mental demands of the cognitive tests used to measure the effect of acute cocoa consumption. In young and healthy adults, for example, a highly demanding cognitive test was required to uncover the subtle, immediate behavioral effects that cocoa flavanols have on this group.
The effects of relatively long-term ingestion of cocoa flavanols (ranging from 5 days up to 3 months) have generally been investigated in elderly individuals. It turns out that for them, cognitive performance was improved by a daily intake of cocoa flavanols. Factors such as attention, processing speed, working memory, and verbal fluency were greatly affected. These effects were, however, most pronounced in older adults with early memory decline or other mild cognitive impairments.
And this was exactly the most unexpected and promising result according to authors Valentina Socci and Michele Ferrara from the University of L'Aquila in Italy. "This result suggests the potential of cocoa flavanols to protect cognition in vulnerable populations over time by improving cognitive performance. If you look at the underlying mechanism, the cocoa flavanols have beneficial effects for cardiovascular health and can increase cerebral blood volume in the dentate gyrus of the hippocampus. This structure is particularly affected by aging and therefore the potential source of age-related memory decline in humans."
So should cocoa become a dietary supplement to improve our cognition? "Regular intake of cocoa and chocolate could indeed provide beneficial effects on cognitive functioning over time. There are, however, potential side effects of eating cocoa and chocolate. Those are generally linked to the caloric value of chocolate, some inherent chemical compounds of the cocoa plant such as caffeine and theobromine, and a variety of additives we add to chocolate such as sugar or milk."
Nonetheless, the scientists are the first to put their results into practice: "Dark chocolate is a rich source of flavanols. So we always eat some dark chocolate. Every day."

Monday, May 1, 2017

Gene editing strategy eliminates HIV-1 infection in live animals

Scientists have demonstrated that HIV-1 replication can be completely shut down and the virus eliminated from infected cells in animals with a powerful gene editing technology known as CRISPR/Cas9.
Credit: © gamjai / Fotolia


A permanent cure for HIV infection remains elusive due to the virus's ability to hide away in latent reservoirs. But now, scientists show that they can excise HIV DNA from the genomes of living animals to eliminate further infection.

A permanent cure for HIV infection remains elusive due to the virus's ability to hide away in latent reservoirs. But now, in new research published in print May 3 in the journal Molecular Therapy, scientists at the Lewis Katz School of Medicine at Temple University (LKSOM) and the University of Pittsburgh show that they can excise HIV DNA from the genomes of living animals to eliminate further infection. They are the first to perform the feat in three different animal models, including a "humanized" model in which mice were transplanted with human immune cells and infected with the virus.
The team is the first to demonstrate that HIV-1 replication can be completely shut down and the virus eliminated from infected cells in animals with a powerful gene editing technology known as CRISPR/Cas9. The work was led by Wenhui Hu, MD, PhD, currently Associate Professor in the Center for Metabolic Disease Research and the Department of Pathology (previously in the Department of Neuroscience) at LKSOM; Kamel Khalili, PhD, Laura H. Carnell Professor and Chair of the Department of Neuroscience, Director of the Center for Neurovirology, and Director of the Comprehensive NeuroAIDS Center at LKSOM; and Won-Bin Young, PhD. Dr. Young was Assistant Professor in the Department of Radiology at the University of Pittsburgh School of Medicine at the time of the research. Dr. Young recently joined LKSOM.
The new work builds on a previous proof-of-concept study that the team published in 2016, in which they used transgenic rat and mouse models with HIV-1 DNA incorporated into the genome of every tissue of the animals' bodies. They demonstrated that their strategy could delete the targeted fragments of HIV-1 from the genome in most tissues in the experimental animals.
"Our new study is more comprehensive," Dr. Hu said. "We confirmed the data from our previous work and have improved the efficiency of our gene editing strategy. We also show that the strategy is effective in two additional mouse models, one representing acute infection in mouse cells and the other representing chronic, or latent, infection in human cells."
In the new study, the team genetically inactivated HIV-1 in transgenic mice, reducing the RNA expression of viral genes by roughly 60 to 95 percent, confirming their earlier findings. They then tested their system in mice acutely infected with EcoHIV, the mouse equivalent of human HIV-1.
"During acute infection, HIV actively replicates," Dr. Khalili explained. "With EcoHIV mice, we were able to investigate the ability of the CRISPR/Cas9 strategy to block viral replication and potentially prevent systemic infection." The excision efficiency of their strategy reached 96 percent in EcoHIV mice, providing the first evidence for HIV-1 eradication by prophylactic treatment with a CRISPR/Cas9 system.
In the third animal model, latent HIV-1 infection was recapitulated in humanized mice engrafted with human immune cells, including T cells, followed by HIV-1 infection. "These animals carry latent HIV in the genomes of human T cells, where the virus can escape detection," Dr. Hu explained. Following a single treatment with CRISPR/Cas9, viral fragments were successfully excised from latently infected human cells embedded in mouse tissues and organs.
In all three animal models, the researchers utilized a recombinant adeno-associated viral (rAAV) vector delivery system based on a subtype known as AAV-DJ/8. "The AAV-DJ/8 subtype combines multiple serotypes, giving us a broader range of cell targets for the delivery of our CRISPR/Cas9 system," Dr. Hu said. They also re-engineered their previous gene editing apparatus to now carry a set of four guide RNAs, all designed to efficiently excise integrated HIV-1 DNA from the host cell genome and avoid potential HIV-1 mutational escape.
To determine the success of the strategy, the team measured levels of HIV-1 RNA and used a novel live bioluminescence imaging system. "The imaging system, developed by Dr. Young while at the University of Pittsburgh, pinpoints the spatial and temporal location of HIV-1-infected cells in the body, allowing us to observe HIV-1 replication in real-time and to essentially see HIV-1 reservoirs in latently infected cells and tissues," Dr. Khalili explained.
The new study marks another major step forward in the pursuit of a permanent cure for HIV infection. "The next stage would be to repeat the study in primates, a more suitable animal model where HIV infection induces disease, in order to further demonstrate elimination of HIV-1 DNA in latently infected T cells and other sanctuary sites for HIV-1, including brain cells," Dr. Khalili said. "Our eventual goal is a clinical trial in human patients."

Tuesday, April 25, 2017

Mission control: Salty diet makes you hungry, not thirsty

Salty snacks. Surprisingly, in the long run, a salty diet causes people to drink less.
Credit: © fotofabrika / Fotolia


New studies show that salty food diminishes thirst while increasing hunger, due to a higher need for energy

We've all heard it: eating salty foods makes you thirstier. But what sounds like good nutritional advice turns out to be not true in the long run. In a study carried out during a simulated mission to Mars, an international group of scientists has found exactly the opposite to be true. 'Cosmonauts' who ate more salt retained more water, weren't as thirsty, and needed more energy.

For some reason, no one had ever carried out a long-term study to determine the relationship between the amount of salt in a person's diet and his drinking habits. Scientists have known that increasing a person's salt intake stimulates the production of more urine -- it has simply been assumed that the extra fluid comes from drinking. Not so fast! say researchers from the German Aerospace Center (DLR), the Max Delbrück Center for Molecular Medicine (MDC), Vanderbilt University and colleagues around the world. Recently they took advantage of a simulated mission to Mars to put the old adage to the test. Their conclusions appear in two papers in the current issue of The Journal of Clinical Investigation.
What does salt have to do with Mars? Nothing, really, except that on a long space voyage conserving every drop of water might be crucial. A connection between salt intake and drinking could affect your calculations -- you wouldn't want an interplanetary traveler to die because he liked an occasional pinch of salt on his food. The real interest in the simulation, however, was that it provided an environment in which every aspect of a person's nutrition, water consumption, and salt intake could be controlled and measured.
The studies were carried out by Natalia Rakova (MD, PhD) of the Charité and MDC and her colleagues. The subjects were two groups of 10 male volunteers sealed into a mock spaceship for two simulated flights to Mars. The first group was examined for 105 days; the second over 205 days. They had identical diets except that over periods lasting several weeks, they were given three different levels of salt in their food.
The results confirmed that eating more salt led to a higher salt content in urine -- no surprise there. Nor was there any surprise in a correlation between amounts of salt and overall quantity of urine. But the increase wasn't due to more drinking -- in fact, a salty diet caused the subjects to drink less. Salt was triggering a mechanism to conserve water in the kidneys.
Before the study, the prevailing hypothesis had been that the charged sodium and chloride ions in salt grabbed onto water molecules and dragged them into the urine. The new results showed something different: salt stayed in the urine, while water moved back into the kidney and body. This was completely puzzling to Prof. Jens Titze, MD of the University of Erlangen and Vanderbilt University Medical Center and his colleagues. "What alternative driving force could make water move back?" Titze asked.
Experiments in mice hinted that urea might be involved. This substance is formed in muscles and the liver as a way of shedding nitrogen. In mice, urea was accumulating in the kidney, where it counteracts the water-drawing force of sodium and chloride. But synthesizing urea takes a lot of energy, which explains why mice on a high-salt diet were eating more. Higher salt didn't increase their thirst, but it did make them hungrier. The human "cosmonauts" receiving a salty diet likewise complained about being hungry.
The project revises scientists' view of the function of urea in our bodies. "It's not solely a waste product, as has been assumed," Prof. Friedrich C. Luft, MD of the Charité and MDC says. "Instead, it turns out to be a very important osmolyte -- a compound that binds to water and helps transport it. Its function is to keep water in when our bodies get rid of salt. Nature has apparently found a way to conserve water that would otherwise be carried away into the urine by salt."
The new findings change the way scientists have thought about the process by which the body achieves water homeostasis -- maintaining a proper amount and balance. That must happen whether a body is being sent to Mars or not. "We now have to see this process as a concerted activity of the liver, muscle and kidney," says Jens Titze.
"While we didn't directly address blood pressure and other aspects of the cardiovascular system, it's also clear that their functions are tightly connected to water homeostasis and energy metabolism."

Policymakers 'flying blind' into the future of work

New kinds of data needed to assess technology's impact on jobs

Will a robot take away my job? Many people ask that question, yet policymakers don't have the kind of information they need to answer it intelligently, say the authors of a new study.

Will a robot take away my job? Many people ask that question, yet policymakers don't have the kind of information they need to answer it intelligently, say the authors of a new study from the National Academies of Sciences, Engineering and Medicine (NASEM).
"Policymakers are flying blind into what has been called the fourth industrial revolution," said Tom M. Mitchell, the E. Fredkin University Professor in the Carnegie Mellon University School of Computer Science, and Erik Brynjolfsson, the Schussel Family Professor in the MIT Sloan School of Management, co-chairs of the NASEM study.
Government agencies need to collect different kinds of labor data to accurately assess and predict how computer and robotic technologies will affect the workplace, Mitchell and Brynjolfsson said. Failure to do so could, at best, result in missed opportunities; at worst, it could be disastrous.
The study, "Information Technology and the U.S. Workforce: Where Are We and Where Do We Go From Here," and a related commentary by Mitchell and Brynjolfsson was published today by the journal Nature.
Information technology, artificial intelligence and robotics will affect almost all occupations, but how that will occur for each is unclear. Many people will be displaced by technology, while the demand for other jobs will increase. New industries will be born and other as-yet-unimagined jobs will be created.
These future effects likely will be larger than have already been seen, the NASEM report says, but it's hard to say definitively if technology will expand or shrink the workforce.
"There is a dramatic shortage of information and data about the exact state of the workforce and automation, so policymakers don't know answers to even basic questions such as 'Which types of technologies are currently having the greatest impacts on jobs?' and 'What new technologies are likely to have the greatest impact in the next few years?'" Mitchell said.
"Our NASEM study report details a number of both positive and negative influences technology has had on the workforce," Mitchell said. "These include replacing some jobs by automation, creating the opportunity for new types of freelance work in companies like Uber and Lyft, and making education and retraining courses available to everyone through the internet. But nobody can judge today the relative impact these different forces have made on the workforce, or their net outcome."
More research is needed to better understand these different influences of technology on the workforce, and how they will add up. Automation is better than humans at some tasks, but not all. Routine information-processing and manual tasks are readily automated, for instance, but people remain more creative and adaptable, and have better interpersonal skills. Some occupations may be reorganized accordingly and some skills that today aren't recognized or directly compensated may grow in value.
The NASEM panel recommended that to prepare students for a constantly changing workforce, schools should focus attention on those uniquely human characteristics that could differentiate people from machines in the workplace, and emphasize training in fields expected to drive the future economy.
The panel said new data sources, methods and infrastructures are necessary to support this research. In their Nature commentary, Mitchell and Brynjolfsson go further, calling for the government to create an integrated information strategy to combine public and privately held data.
"Governments must learn the lessons that industry has learned over the past decade, about how to take advantage of the exploding volume of online, real-time data to design more attractive products and more effective management policies," Mitchell said.
Similarly, he and Brynjolfsson argue, governments must shift from the current "plan then implement" paradigm for making policy, to a more iterative "sense and respond" paradigm that monitors the impacts of new policies, measures their effectiveness and adapts to optimize those policies based on their observed impacts.

Is soda bad for your brain? (And is diet soda worse?)

Matthew Pase is lead author on two studies that link higher consumption of both sugary and artificially sweetened drinks to adverse brain effects.
Credit: Cydney Scott


Both sugary, diet drinks correlated with accelerated brain aging

Excess sugar -- especially the fructose in sugary drinks -- might damage your brain, new research suggests. Researchers found that people who drink sugary beverages frequently are more likely to have poorer memory, smaller overall brain volume, and a significantly smaller hippocampus. A follow-up study found that people who drank diet soda daily were almost three times as likely to develop stroke and dementia when compared to those who did not.

Americans love sugar. Together we consumed nearly 11 million metric tons of it in 2016, according to the US Department of Agriculture, much of it in the form of sugar-sweetened beverages like sports drinks and soda.
Now, new research suggests that excess sugar -- especially the fructose in sugary drinks -- might damage your brain. Researchers using data from the Framingham Heart Study (FHS) found that people who drink sugary beverages frequently are more likely to have poorer memory, smaller overall brain volume, and a significantly smaller hippocampus -- an area of the brain important for learning and memory.
But before you chuck your sweet tea and reach for a diet soda, there's more: a follow-up study found that people who drank diet soda daily were almost three times as likely to develop stroke and dementia when compared to those who did not.
Researchers are quick to point out that these findings, which appear separately in the journals Alzheimer's & Dementia and Stroke, demonstrate correlation but not cause-and-effect. While researchers caution against over-consuming either diet soda or sugary drinks, more research is needed to determine how -- or if -- these drinks actually damage the brain, and how much damage may be caused by underlying vascular disease or diabetes.
"These studies are not the be-all and end-all, but it's strong data and a very strong suggestion," says Sudha Seshadri, a professor of neurology at Boston University School of Medicine (MED) and a faculty member at BU's Alzheimer's Disease Center, who is senior author on both papers. "It looks like there is not very much of an upside to having sugary drinks, and substituting the sugar with artificial sweeteners doesn't seem to help."
"Maybe good old-fashioned water is something we need to get used to," she adds.
Matthew Pase, a fellow in the MED neurology department and an investigator at the FHS who is corresponding author on both papers, says that excess sugar has long been associated with cardiovascular and metabolic diseases like obesity, heart disease, and type 2 diabetes, but little is known about its long-term effects on the human brain. He chose to study sugary drinks as a way of examining overall sugar consumption. "It's difficult to measure overall sugar intake in the diet," he says, "so we used sugary beverages as a proxy."
For the first study, published in Alzheimer's & Dementia on March 5, 2017, researchers examined data, including magnetic resonance imaging (MRI) scans and cognitive testing results, from about 4,000 people enrolled in the Framingham Heart Study's Offspring and Third-Generation cohorts. (These are the children and grandchildren of the original FHS volunteers enrolled in 1948.) The researchers looked at people who consumed more than two sugary drinks a day of any type -- soda, fruit juice, and other soft drinks -- or more than three per week of soda alone. Among that "high intake" group, they found multiple signs of accelerated brain aging, including smaller overall brain volume, poorer episodic memory, and a shrunken hippocampus, all risk factors for early-stage Alzheimer's disease. Researchers also found that higher intake of diet soda -- at least one per day -- was associated with smaller brain volume.
In the second study, published in Stroke on April 20, 2017, the researchers, using data only from the older Offspring cohort, looked specifically at whether participants had suffered a stroke or been diagnosed with dementia due to Alzheimer's disease. After measuring volunteers' beverage intake at three points over seven years, the researchers then monitored the volunteers for 10 years, looking for evidence of stroke in 2,888 people over age 45, and dementia in 1,484 participants over age 60. Here they found, surprisingly, no correlation between sugary beverage intake and stroke or dementia. However, they found that people who drank at least one diet soda per day were almost three times as likely to develop stroke and dementia.
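To make "almost three times as likely" concrete, here is a toy, unadjusted relative-risk calculation in Python. The counts are invented purely for illustration; the actual analyses used adjusted Cox proportional-hazards models, so the published estimate is not derived this simply.

```python
# Hypothetical illustration only -- the counts below are invented, not from the study.
def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Unadjusted relative risk: incidence in the exposed group / incidence in the unexposed group."""
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

# e.g. 9 strokes among 300 daily diet-soda drinkers vs 26 among 2,588 non-drinkers
print(round(relative_risk(9, 300, 26, 2588), 2))   # ~2.99, i.e. roughly "three times as likely"
```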
Although the researchers took age, smoking, diet quality, and other factors into account, they could not completely control for preexisting conditions like diabetes, which may have developed over the course of the study and is a known risk factor for dementia. Diabetics, as a group, drink more diet soda on average, as a way to limit their sugar consumption, and some of the correlation between diet soda intake and dementia may be due to diabetes, as well as other vascular risk factors. However, such preexisting conditions cannot wholly explain the new findings.
"It was somewhat surprising that diet soda consumption led to these outcomes," says Pase, noting that while prior studies have linked diet soda intake to stroke risk, the link with dementia was not previously known. He adds that the studies did not differentiate between types of artificial sweeteners and did not account for other possible sources of artificial sweeteners. He says that scientists have put forth various hypotheses about how artificial sweeteners may cause harm, from transforming gut bacteria to altering the brain's perception of "sweet," but "we need more work to figure out the underlying mechanisms."

Genetics, environment combine to give everyone a unique sense of smell

Genetically identical mice exposed to different smells as they grow up develop different olfactory receptors in their noses.
Credit: © Marion Wear / Fotolia


Genetically identical mice develop different smell receptors in response to their environments.

New research shows that receptors in the noses of mice exposed to certain smells during their lives differ from those of genetically similar mice that lived without those smells. The study found it is this combination of genetics and experience that gives each individual a unique sense of smell.

Researchers from the Wellcome Trust Sanger Institute and their collaborators have shown that receptors in the noses of mice exposed to certain smells during their lives differ from those of genetically similar mice that lived without those smells. Published today in eLife, the study found it is this combination of genetics and experience that gives each individual a unique sense of smell.
Our sense of smell comes from the olfactory organ in the nose, which is made up of sensory neurons containing receptors that can detect odours. There are about one thousand types of olfactory receptors in the nose, compared with only three types of visual receptors in the eye, and 49 types of taste receptors on the tongue. Of our senses, the olfactory system is the most complex, and combinations of signals from different olfactory receptors allow people to smell an enormously large repertoire of odours. However, how different people vary in their smelling abilities is not well understood.
To investigate the sense of smell the researchers used laboratory mice as a model, comparing the olfactory neurons from genetically identical animals that grew up in different environments. They also compared animals that grew up in the same environment but were genetically different.
The team used RNA sequencing to see which receptor genes were active. The researchers found that genetics controlled which receptors were present in the mice. Crucially however, they found that the environment that the individual had lived in had a significant effect on the number of cells able to identify each smell.
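As a rough illustration of the kind of comparison involved (a sketch only, with invented numbers; the actual study used full RNA-sequencing pipelines and statistics far beyond this), one can picture per-receptor expression counts from the two environments being normalized and compared:

```python
import numpy as np

# Invented expression counts (reads per olfactory receptor gene) for genetically
# identical mice raised in two different environments -- purely illustrative.
receptors = ["Olfr1", "Olfr2", "Olfr3", "Olfr4"]
env_a = np.array([120.0,  80.0, 300.0, 10.0])
env_b = np.array([115.0, 240.0, 310.0, 55.0])

# Normalize each sample to the same total so the samples are comparable,
# then look at the per-receptor fold change between environments.
a = env_a / env_a.sum()
b = env_b / env_b.sum()
fold_change = b / a

for name, fc in zip(receptors, fold_change):
    flag = "  <-- environment-dependent?" if fc > 2 or fc < 0.5 else ""
    print(f"{name}: fold change {fc:.2f}{flag}")
```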
Professor Fabio Papes, an author on the paper from the University of Campinas in Brazil, said: "It became clear that the role of genes, especially those that encode olfactory receptors in the genome, is very important in the construction of nasal tissue, but there was a very remarkable contribution of the environment, something that has not been previously described to this extent. We found the cellular and molecular construction of the olfactory tissue at a given moment is prepared not only by the organism's genes but also by its life history."
Olfactory neurons are formed throughout an individual's lifetime, and the study showed the olfactory system adapted to the environment, leading to more cells capable of detecting scents to which there has been greater exposure. As a consequence, different individuals, even if genetically similar, may have completely different olfactory abilities. This could contribute to the individuality of the sense of smell, even in humans.
The knowledge that an individual's history can affect the structure of olfactory tissue neurons may have implications for personalised medicine as different people's sense organs could be constructed differently and respond in different ways. Studying olfactory neurons can also provide information about how the neurons in the brain are organised and function.
Dr Darren Logan, the lead author on the study from the Wellcome Trust Sanger Institute, said: "The neurons in the olfactory system are highly connected to the neurons in the brain and studying these can help us understand neuronal development. We have shown that each individual has a very different combination of possible olfactory neurons, driven by genetics. In this study we also show that, with experience of different smells, these combinations of neurons change, so both genetics and environment interplay to give every individual a unique sense of smell."

Friday, April 21, 2017

Macrophages conduct electricity, help heart to beat

The image shows a volumetric reconstruction of a human atrioventricular node. Cardiomyocytes (red) appear densely interspersed with macrophages (green).
Credit: Maarten Hulsmans & Matthias Nahrendorf


Macrophages have a previously unrecognized role in helping the mammalian heart beat in rhythm. Researchers have discovered that macrophages aggregate around central cardiac cells that regulate electrical impulses within the mouse heart, helping the cells conduct electricity. Mice that were genetically engineered to lack macrophages have irregular heartbeats, hinting that these immune cells may also play a role in heart disease.

Macrophages, immune cells known for their PAC-MAN-like ingestion of microbial intruders and biological waste, have a previously unrecognized role in helping the mammalian heart beat in rhythm. Massachusetts General Hospital researchers discovered that macrophages aggregate around central cardiac cells that regulate electrical impulses within the mouse heart, helping the cells conduct electricity. Mice that were genetically engineered to lack macrophages have irregular heartbeats, hinting that these immune cells may also play a role in heart disease. The findings appear April 20 in the journal Cell.

"This work opens up a completely new view on electrophysiology; now, we have a new cell type on the map that is involved in conduction," says senior author Matthias Nahrendorf, a systems biologist at Massachusetts General Hospital, Harvard Medical School. "Macrophages are famous for sensing their environment and changing their phenotype very drastically, so you can think about a situation where there is inflammation in the heart that may alter conduction, and we now need to look at whether these cells are causally involved in conduction abnormalities."
Researchers have known for decades that macrophages are in high abundance around inflamed or diseased hearts, but Nahrendorf's investigation began when he asked what the immune cells were doing in a healthy heart. After sending a mouse model depleted of macrophages for a heart MRI and electrocardiogram, the technician reported back that something was wrong; the mouse's heart was beating too slowly. Tests in a healthy rodent revealed a high density of resident macrophages at the heart's atrioventricular node, which passes electricity from the atria to the ventricles.
Nahrendorf showed the results to his colleagues, David Milan and Patrick Ellinor, both electrophysiologists at Massachusetts General Hospital, who responded by opening the doors to their labs. Together, the teams found that macrophages extend their cell membranes between cardiac cells and create pores, also called gap junctions, for the electrical current to flow through. The macrophages contribute by preparing the conducting heart cells for the next burst of electricity so conducting cells are able to keep up with a fast contraction rhythm.
"When we got the first patch clamp data that showed the macrophages in contact with cardiomyoctes were rhythmically depolarizing, that was the moment I realized they weren't insulating, but actually helping to conduct," Nahrendorf says. "This work was very exciting because it was an example of how team science can help to connect fields that are traditionally separated -- in this case, immunology and electrophysiology."
The group will follow up by looking at whether macrophages are involved in common conduction abnormalities. There are also potential connections between macrophages and anti-inflammatory drugs, which are widely reported to help with heart disease. If macrophages do play a role in disease, the researchers say it can open up a new line of therapeutics, as these immune cells naturally consume foreign molecules in their presence and are easy to target as a result.

Water is streaming across Antarctica


New survey finds liquid flow more widespread than thought

In the first such continent-wide survey, scientists have found extensive drainages of meltwater flowing over parts of Antarctica's ice during the brief summer.

In the first such continent-wide survey, scientists have found extensive drainages of meltwater flowing over parts of Antarctica's ice during the brief summer. Researchers already knew such features existed, but assumed they were confined mainly to Antarctica's fastest-warming, most northerly reaches. Many of the newly mapped drainages are not new, but the fact they exist at all is significant; they appear to proliferate with small upswings in temperature, so warming projected for this century could quickly magnify their influence on sea level. An accompanying study looks at how such systems might influence the great ice shelves ringing the continent, which some researchers fear could collapse, bringing catastrophic sea-level rises. Both studies appear this week in the leading scientific journal Nature.

Explorers and scientists have documented a few Antarctic melt streams starting in the early 20th century, but no one knew how extensive they were. The authors found out by systematically cataloging images of surface water in photos taken from military aircraft from 1947 onward, and satellite imagery from 1973 on. They found nearly 700 seasonal systems of interconnected ponds, channels and braided streams fringing the continent on all sides. Some run as far as 75 miles, with ponds up to several miles wide. They start as close as 375 miles from the South Pole, and at 4,300 feet above sea level, where liquid water was generally thought to be rare to impossible.
"This is not in the future -- this is widespread now, and has been for decades," said lead author Jonathan Kingslake, a glaciologist at Columbia University's Lamont-Doherty Earth Observatory. "I think most polar scientists have considered water moving across the surface of Antarctica to be extremely rare. But we found a lot of it, over very large areas." The data are too sparse in many locations for the researchers to tell whether the extent or number of drainages have increased over the seven decades covered by the study. "We have no reason to think they have," said Kingslake. "But without further work, we can't tell. Now, looking forward, it will be really important to work out how these systems will change in response to warming, and how this will affect the ice sheets."
Many of the newly mapped drainages start near mountains poking through glaciers, or in areas where powerful winds have scoured snow off underlying bluish ice. These features are darker than the mostly snow-covered ice sheet, and so absorb more solar energy. This causes melting, and on a slope, liquid water then melts a path downhill through overlying snow. If the continent warms this century as projected, this process will occur on a much larger scale, say the authors. "This study tells us there's already a lot more melting going on than we thought," said coauthor Robin Bell, a Lamont-Doherty polar scientist. "When you turn up the temperature, it's only going to increase."
Antarctica is already losing ice, but the direct effects of meltwater, which generally refreezes in winter, are probably negligible for now. The concern among glaciologists is that this could change in the future. Most loss right now is taking place near the edges, where giant, floating shelves of ice attached to the land are being eroded from underneath by warming ocean currents. The shelves, which ring three-quarters of Antarctica, help hold back the land-bound glaciers behind them, and as they lose mass, glaciers appear to be accelerating their march to the sea.
The most dramatic example is the Antarctic Peninsula, which juts far north from the main ice sheet, and where average temperatures have soared 7 degrees Fahrenheit in the last 50 years. In 1995 and 2002, large chunks of the peninsula's Larsen Ice Shelf suddenly disintegrated into the ocean within days. Scientists now suspect that pooling water was at work; liquid tends to burrow down, fracturing the ice with heat or pressure, or both, until a shattering point is reached. Today, another giant piece of the Larsen is cracking, and could come apart at any time.
Further south, temperatures have remained more or less stable, but many of the newly spotted streams there already make their way from the interior out onto ice shelves, or originate on the shelves themselves. That raises the specter that such collapses could happen across much vaster reaches of Antarctica this century, should warming proceed as expected, said Kingslake.
On the other hand, an accompanying study led by Bell found that a longtime drainage on West Antarctica's Nansen Ice Shelf may actually be helping keep the shelf together. The elaborate river-like system on the 30-mile-long shelf was first observed in 1909, by a team from the expedition led by British explorer Ernest Shackleton. Aerial imagery and remote sensing since then shows it has remained remarkably stable, efficiently draining excess meltwater during summer through a series of deep sinkholes and a roaring 400-foot-wide waterfall into the ocean. "It could develop this way in other places, or things could just devolve into giant slush puddles," said Bell. "Ice is dynamic and complex, and we don't have the data yet."
Near the other pole, seasonal melt streams and ponds are far more common on the fast-warming Greenland ice sheet, and their growing influence may hold lessons. In recent years as much as 90 percent of Greenland's ice surface has undergone some degree of seasonal melting. Much of the water probably stays at or near the surface and refreezes in winter. But in some areas, it is plunging through deep holes to underlying rock, lubricating glaciers' slide to the sea. In others, water may be refreezing near the surface into solid sheets that can more easily channel surface melt to the sea in succeeding seasons. Until recently, icebergs discharged from glaciers were Greenland's main contributor to sea-level rise. But between 2011 and 2014, 70 percent of the 269 million tons of Greenland's ice and snow lost to the ocean came directly from meltwater, not icebergs.
Antarctica's visible drainages may be the tip of the proverbial iceberg. Another study by a separate team published in January revealed that East Antarctica's Roi Baudouin Ice Shelf harbors a largely invisible liquid drainage just under the snow. The team, led by Utrecht University polar scientist Jan Lenaerts, detected it using radar images and drilling. They suspect that such features lurk in many places. And unlike surface streams, these ones are insulated, so may stay liquid year-round.
Helen Fricker, a glaciologist at Scripps Institution of Oceanography who was not involved in the new studies, said of the continent-wide survey, "We knew there were other [melt] zones, but we didn't know exactly how extensive they are. This is a really nice study, as it does just that." Douglas MacAyeal, a glaciologist at the University of Chicago also not involved in the studies, said that until recently, "nobody's been that interested in melting," because most scientists thought it was relatively rare. Now, he said, "We're working hard to figure out if this stuff is relevant to sea-level predictions."

Naked mole-rats 'turn into plants' when oxygen is low

Ignore the whiskers and teeth -- these are plants.
Credit: Thomas Park/UIC


Discovery could lead to treatments for heart attack, stroke

Deprived of oxygen, naked mole-rats can survive by metabolizing fructose just as plants do -- a finding that could lead to treatments for heart attacks and strokes.

Deprived of oxygen, naked mole-rats can survive by metabolizing fructose just as plants do, researchers report this week in the journal Science.
Understanding how the animals do this could lead to treatments for patients suffering crises of oxygen deprivation, as in heart attacks and strokes.
"This is just the latest remarkable discovery about the naked mole-rat -- a cold-blooded mammal that lives decades longer than other rodents, rarely gets cancer, and doesn't feel many types of pain," says Thomas Park, professor of biological sciences at the University of Illinois at Chicago, who led an international team of researchers from UIC, the Max Delbrück Institute in Berlin and the University of Pretoria in South Africa on the study.
In humans, laboratory mice, and all other known mammals, when brain cells are starved of oxygen they run out of energy and begin to die.
But naked mole-rats have a backup: their brain cells start burning fructose, which produces energy anaerobically through a metabolic pathway that is only used by plants -- or so scientists thought.
In the new study, the researchers exposed naked mole-rats to low oxygen conditions in the laboratory and found that they released large amounts of fructose into the bloodstream. The fructose, the scientists found, was transported into brain cells by molecular fructose pumps that in all other mammals are found only on cells of the intestine.
"The naked mole-rat has simply rearranged some basic building-blocks of metabolism to make it super-tolerant to low oxygen conditions," said Park, who has studied the strange species for 18 years.
At oxygen levels low enough to kill a human within minutes, naked mole-rats can survive for at least five hours, Park said. They go into a state of suspended animation, reducing their movement and dramatically slowing their pulse and breathing rate to conserve energy. And they begin using fructose until oxygen is available again.
The naked mole-rat is the only known mammal to use suspended animation to survive oxygen deprivation.
The scientists also showed that naked mole-rats are protected from another deadly aspect of low oxygen -- a buildup of fluid in the lungs called pulmonary edema that afflicts mountain climbers at high altitude.
The scientists think that the naked mole-rats' unusual metabolism is an adaptation for living in their oxygen-poor burrows. Unlike other subterranean mammals, naked mole-rats live in hyper-crowded conditions, packed in with hundreds of colony mates. With so many animals living together in unventilated tunnels, oxygen supplies are quickly depleted.

Thursday, April 13, 2017

Research uses mirrors to make solar energy cost competitive

Concentrating solar power technologies use mirrors to reflect and concentrate sunlight to produce heat, which can then be used to produce electricity, according to ongoing work by mechanical engineers. These technologies present a distinct advantage over photovoltaic (PV) cells in their ability to store the sun’s energy as thermal energy, experts say.

If the current national challenge to make solar energy cost competitive with other forms of energy by the end of this decade is met, Ranga Pitchumani, the John R. Jones III Professor of Mechanical Engineering at Virginia Tech, will have played a significant role in the process.
U.S. Secretary of Energy Steven Chu announced the Department of Energy's SunShot Initiative in February 2011. Its objective was to reduce the installed cost of solar energy systems by about 75 percent in order to allow widespread, large-scale adoption of this renewable clean energy technology.
Following the announcement, Pitchumani was invited to direct the Concentrating Solar Power (CSP) program for the SunShot Initiative towards its ambitious goals.
"The SunShot goal is to get solar energy technologies to achieve cost-parity with other energy generation sources on the grid without subsidy by the year 2020. That's an aggressive mission which calls for several subcomponent innovations and ingenious system designs to drive costs down, while improving efficiencies," said Pitchumani.
"Concentrating solar power technologies use mirrors to reflect and concentrate sunlight to produce heat, which can then be used to produce electricity," Pitchumani explained. These technologies present a distinct advantage over photovoltaic (PV) cells in their ability to store the sun's energy as thermal energy, and represent a subset of the SunShot Initiative. Pitchumani is a leading expert in the field of concentrating solar power. He and his research group at Virginia Tech have developed novel thermal energy storage technologies for concentrating solar power applications that are widely published. He is the overall conference chair for SolarPACES 2013 this year, the foremost international meeting in the area of concentrating solar power systems, and is an editor for Solar Energy.
"Fossil fueled power plants pose a potential risk to the environment through an increased carbon footprint, and my efforts are in supplanting fossil energy with renewable sources including solar energy. Concentrating solar power plants capture the solar energy and store it as heat, which can, in turn, be used to drive a turbine and produce electricity. In fact, studies have shown that CSP with thermal energy storage also facilitates greater incorporation of other renewables such as wind and photovoltaic on the grid. That's a win-win on all fronts," Pitchumani said.
"Due to the intermittent nature of solar energy availability, it is often desirable to store thermal energy from a concentrating solar power plant for use on demand, including at times when solar energy is unavailable such as during cloud cover or overnight. Energy can be stored either as sensible heat (in solid or molten media), latent heat (using phase change materials), or as products of a thermochemical process, of which latent heat and thermochemical storage offer high volumetric energy density and potentially high power cycle efficiency, provided costs can be tamed," he added.
In his role, Pitchumani oversees a team of several program managers, technical, financial and support personnel, who actively manage the awards in the portfolio. During his leadership, the SunShot Concentrating Solar Power Program has launched over $130 million in new funding initiatives since October 2011. Combined with the awards continuing from prior funding opportunities, the program maintains an appropriately balanced portfolio of projects at industry, national laboratories, and universities dedicated to applied scientific research, development and demonstration to advance cutting edge concentrating solar power technologies for the near-, mid- and long-terms.
Concentrating solar power plants could provide low-cost energy generation and have the potential to become the leading source of renewable energy for future power generation. In the U.S., several large-scale commercial plants (e.g., the Ivanpah Solar Electric Generating Station, the Crescent Dunes Solar Energy Project, and the Abengoa Solana Generating Station) are currently under construction, with some ready to be commissioned within a few months; together they would more than triple the total capacity of concentrating solar power-generated electricity to about 1.8 gigawatts and place the U.S. among the global leaders in CSP capacity.
On a worldwide scale, studies suggest that concentrating solar power technology systems could provide approximately one-quarter of the global electricity needs by 2050, Pitchumani said.

Watching too much television could cause fatal blood clots

  Spending too much time in front of the television could increase your chance of developing potentially fatal blood clots known as ve...