Duke Research Blog

Following the people and events that make up the research community at Duke.

Author: Sarah Haurin

Does aging make our brains less efficient?

We are an aging population. Demographic projections predict the largest population growth will be in the oldest age group – one study predicted a doubling of people age 65 and over between 2012 and 2050. Understanding aging and prolonging healthy years is thus becoming increasingly important.

Michele Diaz and her team explore the effects of aging on cognition.

For Michele Diaz, PhD, of Pennsylvania State University, understanding aging is most important in the context of cognition. She’s a former Duke faculty member who visited campus recently to update us on her work.

Diaz said the relationship between aging and how we think is much more nuanced than the usual stereotype of a steady cognitive decline with age.

Research has found that change in cognition with age cannot be explained as a simple decline: while older people tend to decline in fluid intelligence, or information processing, they maintain crystallized intelligence, or accumulated knowledge.

Diaz’s work explores the relationship between aging and language. Aging in the context of language shows an interesting phenomenon: older people have more diverse vocabularies, but may take longer to produce these words. In other words, as people age, they continue to learn more words but have a more difficult time retrieving them, leading to a more frequent tip-of-the-tongue experience.

In order to understand the brain activation patterns associated with such changes, Diaz conducted a study in which participants of varying ages were asked to name objects depicted in images while undergoing fMRI scanning. As expected, both groups were less accurate at naming less common objects, and the older adults showed slightly lower naming accuracy than the younger adults.

Diaz also found that older adults may approach more difficult tasks differently than younger adults do: in younger adults, less common objects elicited an increase in brain activation, while older adults showed less activation for these more difficult items.

Additionally, an increase in activation was associated with a decrease in accuracy. Taken together, these results show that younger and older adults rely on different regions of the brain when presented with difficult tasks, and that the approach younger adults take is more efficient.

In another study, Diaz and her team explored picture recognition of objects varying in semantic and phonological neighborhood density. Rather than manipulating how common the depicted objects are, this approach looks at networks of words based on whether they sound similar or have similar meanings. Words with denser networks, that is, more similar-sounding or similar-meaning words, should be easier to recognize.

An example of a dense (left) and sparse (right) phonological neighborhood. Words with a greater number of similar sounding or meaning words should be more easily recognized. Image courtesy of Vitevitch, Ercal, and Adagarla, Frontiers in Psychology, 2011.
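
To make the idea of neighborhood density concrete, here is a minimal sketch in Python. It is not taken from Diaz's study, and it approximates phonological neighbors with spelling (single-letter edits) rather than phonemes, so the word list and counts are purely illustrative.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance between two strings."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,          # deletion
                                     dp[j - 1] + 1,      # insertion
                                     prev + (ca != cb))  # substitution (or match)
    return dp[-1]

def neighborhood_density(word: str, lexicon: list[str]) -> int:
    """Count words within one edit of `word` -- a rough stand-in for
    phonological neighbors, which are properly defined over phonemes."""
    return sum(1 for w in lexicon if w != word and edit_distance(word, w) == 1)

# Toy lexicon for illustration only.
lexicon = ["cat", "hat", "bat", "cot", "cast", "dog", "fog", "log"]
print(neighborhood_density("cat", lexicon))  # relatively dense neighborhood
print(neighborhood_density("dog", lexicon))  # sparser neighborhood
```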

With this framework, Diaz found no age effect on recognition ability for differences in semantic or phonological neighborhood density. These results suggest that adults may experience stability in their ability to process phonological and semantic characteristics as they age.

Teasing out these patterns of decline and stability in cognitive function is just one part of understanding aging. Research like Diaz's will only become more important for improving care of this growing demographic as our population ages.

Post by undergraduate blogger Sarah Haurin

Predicting sleep quality with the brain

Modeling functional connectivity allows researchers to compare brain activation to behavioral outcomes. Image: Chu, Parhi, & Lenglet, Nature, 2018.

For undergraduates, sleep can be as elusive as it is important. For undergraduate researcher Katie Freedy, Trinity ’20, understanding sleep is even more important because she works in Ahmad Hariri’s Lab of Neurogenetics.

After taking a psychopharmacology class while studying abroad in Copenhagen, Freedy became interested in the default mode network, a brain network implicated in autobiographical thought, self-representation and depression. Upon returning to her lab at Duke, Freedy wanted to explore the interaction between brain regions like the default mode network with sleep and depression.

Freedy’s project uses data from the Duke Neurogenetics Study, a study that collected data on brain scans, anxiety, depression, and sleep in 1,300 Duke undergraduates. While previous research has found connections between brain connectivity, sleep, and depression, Freedy was interested in a novel approach.

Connectome predictive modeling (CPM) is a statistical technique that uses fMRI data to build models of connections within the brain. In Freedy's project, the model takes in both resting-state and task-based scans to model intrinsic functional connectivity, which captures how strongly activity in two different parts of the brain rises and falls together during a given task. By looking at both resting-state and task-based scans, Freedy's models can paint a broader picture of connectivity.

To build and validate the model, a procedure called leave-one-out cross-validation is repeated for each subject: one subject's data is left out, the model is constructed from everyone else's data, and its validity is then tested by checking how well it predicts the held-out subject's behavioral data from their brain scans. Repeating this for every subject trains the model to make predictions of behavior from brain connectivity that are accurate yet generalize beyond the sample.
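
The sketch below shows the general shape of such a leave-one-out pipeline in Python with scikit-learn. It is not Freedy's actual analysis: the connectivity values and sleep scores are simulated stand-ins, and a plain linear regression stands in for the edge-selection step that connectome predictive modeling normally uses.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

# Stand-in data: one row of functional-connectivity features per subject,
# one sleep-quality score per subject (both simulated here).
rng = np.random.default_rng(0)
n_subjects, n_edges = 100, 300
connectivity = rng.normal(size=(n_subjects, n_edges))
sleep_quality = connectivity[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_subjects)

predictions = np.empty(n_subjects)
for train_idx, test_idx in LeaveOneOut().split(connectivity):
    model = LinearRegression()
    # Fit on everyone except the held-out subject...
    model.fit(connectivity[train_idx], sleep_quality[train_idx])
    # ...then predict that subject's behavioral score from their brain data.
    predictions[test_idx] = model.predict(connectivity[test_idx])

# How well do predicted scores track observed scores across subjects?
print(np.corrcoef(predictions, sleep_quality)[0, 1])
```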

Freedy presented the preliminary results from her model this past summer at the BioCORE Symposium as a Summer Neuroscience Program fellow. The preliminary results showed that patterns of brain connectivity were able to predict overall sleep quality. With additional analyses, Freedy is eager to explore which specific patterns of connectivity can predict sleep quality, and how this is mediated by depression.

Freedy presented the preliminary results of her project at Duke’s BioCORE Symposium.

Understanding the links between brain connectivity, sleep, and depression is of particular importance to often sleep-deprived undergraduates.

“Using data from Duke students makes it directly related to our lives and important to those around me,” Freedy says. “With the field of neuroscience, there is so much we still don’t know, so any effort in neuroscience to directly tease out what is happening is important.”

Post by undergraduate blogger Sarah Haurin

These Microbes ‘Eat’ Electrons for Energy

The human body is populated by a greater number of microbes than its own cells. These microbes survive using metabolic pathways that vary drastically from humans’.

Arpita Bose’s research explores the metabolism of microorganisms.

Arpita Bose, PhD, of Washington University in St. Louis, is interested in understanding the metabolism of these ubiquitous microorganisms, and putting that knowledge to use to address the energy crisis and other applications.

Photoferrotrophic organisms use light and electrons from the environment as an energy source

One of the biggest research questions for her lab involves understanding photoferrotrophy, the use of light and electrons from an external source for carbon fixation. Much of the energy humans consume originates from carbon fixation in phototrophic organisms like plants. Carbon fixation uses energy from light to fuel the production of sugars that we then consume for energy.

Before Bose began her research, scientists had found that some microbes interact with electricity in their environments, even donating electrons to the environment. Bose hypothesized that the reverse could also be true and sought to show that some organisms can also accept electrons from metal oxides in their environments. Using a bacterial strain called Rhodopseudomonas palustris TIE-1 (TIE-1), Bose identified this process called extracellular electron uptake (EEU).

After showing that some microorganisms can take in electrons from their surroundings and identifying a collection of genes that code for this ability, Bose found that this ability was dependent on whether a light source was also present. Without the presence of light, these organisms lost 70% of their ability to take in electrons.   

Because the organisms Bose was studying can rely on light as a source of energy, she hypothesized that this dependence on light for electron uptake could mean the electrons play a role in photosynthesis. In subsequent studies, Bose's team found that the electrons the microorganisms were taking up were entering their photosystem.

To show that the electrons were playing a role in carbon fixation, Bose and her team looked at the activity of an enzyme called RuBisCo, which plays an integral role in converting carbon dioxide into sugars that can be broken down for energy. They found that RuBisCo was most strongly expressed and active when EEU was occurring, and that, without RuBisCo present, these organisms lost their ability to take in electrons. This finding suggests that organisms like TIE-1 are able to take in electrons from their environment and use them in conjunction with light energy to synthesize molecules for energy sources.  

In addition to broadening our understanding of the great diversity in metabolisms, Bose’s research has profound implications in sustainability. These microbes have the potential to play an integral role in clean energy generation.

Post by undergraduate blogger Sarah Haurin

How the Flu Vaccine Fails

Influenza is ubiquitous. Every fall, we line up to get our flu shots with the hope that we will be protected from the virus that infects 10 to 20 percent of people worldwide each year. But some years, the vaccine is less effective than others.

Every year, CDC scientists engineer a new flu vaccine. By examining phylogenetic relationships, patterns of shared ancestry and relatedness among circulating strains, researchers identify the virus strains to target with the vaccine for the following flu season.

Sometimes, they do a good job predicting which strains will flourish in the upcoming flu season; other times, they pick wrong.

Pekosz’s work has identified why certain flu seasons saw less effective vaccines.

Andrew Pekosz, PhD, is a researcher at Johns Hopkins who examines why we sometimes fail to predict which strains to target with vaccines. In particular, he studies years when the vaccine was ineffective, examining the viruses that were most prevalent to identify the properties of those strains.

A virus consists of RNA enclosed in a membrane. Vaccines work by targeting membrane proteins that facilitate movement of the viral genome into the host cells it infects. For the flu virus, this protein is hemagglutinin (HA). An additional membrane protein, neuraminidase (NA), allows the virus to release itself from a cell it has infected and keeps it from re-attaching to cells that are already infected.

The flu vaccine targets proteins on the membrane of the RNA virus. Image courtesy of scienceanimations.com.

Studying the viruses that flourished in the 2014-2015 and 2016-2017 flu seasons, Pekosz and his team have identified mutations to these surface proteins that allowed certain strains to evade the vaccine.

In the 2014-2015 season, a mutation in HA conferred an advantage to the virus, but only in the presence of the antibodies elicited by the vaccine. In the absence of these antibodies, the mutation was actually detrimental to the virus's fitness. The strain was present in low numbers at the beginning of the flu season, but the selective pressure of the vaccine pushed it to become the dominant strain by the end.

The 2016-2017 flu season saw a similar pattern of mutation, but in the NA protein. The part of the viral surface where antibodies bind, called the epitope, was masked in the mutated strain. Since the antibodies produced in response to the vaccine could no longer recognize the virus effectively, the vaccine was ineffective against these mutated strains.

With the speed at which the flu virus evolves, and the fact that numerous strains can circulate in any given flu season, engineering an effective vaccine is daunting. Pekosz's findings on how past vaccines have failed will likely prove invaluable in combating such a persistent and common public health concern.

Post by undergraduate blogger Sarah Haurin


The Costs of Mental Effort

Every day, we are faced with countless decisions regarding cognitive control, or the process of inhibiting automatic or habitual responses in order to perform better at a task.

Amitai Shenhav, PhD, of Brown University, and his lab are working on understanding the factors that influence this decision-making process. Having a high level of cognitive control is what allows us to complete hard tasks like a math problem or a dense reading, so we might expect that the optimal strategy is to exert a high level of control at all times.

Shenhav’s lab explores motivation and decision making related to cognitive control.

Experimental results show this is not the case: people tend to choose easier tasks over harder ones, demand more money to complete harder tasks, and exert more mental effort only as the reward value increases. These behaviors all suggest that our default state is not the highest possible level of control.

Shenhav’s research centers on why we see this variation in level of control. Because cognitive control is a costly process, there must be a limit to how much we can exert. These costs can be understood both as tradeoffs, since resources devoted to control are unavailable for other mental functions, and as the negative affective consequences of difficult tasks, like stress.

To understand how people make decisions about cognitive control in real time, Shenhav has developed a computational framework called the Expected Value of Control (EVC) model, which describes how individuals weigh the costs and benefits of increasing control.

Employing this model has helped Shenhav and his colleagues identify situations in which people are likely to invest a lot of cognitive control. In one study, they simulated variability in the efficacy of control by varying whether the reward was paired only with a correct response or was given at random. People learned fairly quickly whether increasing their effort increased the likelihood of earning the reward, and adjusted their control accordingly: they invested more effort when their effort correlated with the likelihood of reward than when rewards were handed out independent of performance.
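
A toy version of the EVC calculation makes the efficacy result intuitive: when reward does not depend on effort, the cost term dominates and the optimal control level falls to zero; when it does, higher control pays for itself. This is my own simplification with arbitrary numbers, not the lab's published equations.

```python
import numpy as np

def expected_value_of_control(control, reward, efficacy, cost_weight=1.0):
    """Toy EVC: net payoff of exerting `control` (0..1).

    The chance of success rises with control in proportion to how much
    control actually matters (`efficacy`); effort cost grows with control.
    """
    p_success = 0.5 + 0.5 * efficacy * control
    effort_cost = cost_weight * control ** 2
    return p_success * reward - effort_cost

controls = np.linspace(0, 1, 101)

for efficacy in (0.0, 1.0):  # reward independent of effort vs. effort-contingent
    evc = expected_value_of_control(controls, reward=10, efficacy=efficacy)
    best = controls[np.argmax(evc)]
    print(f"efficacy={efficacy}: best control level = {best:.2f}")
```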

Another study explored how we adjust our strategies after difficult tasks. Experiments on cognitive control often rely on paradigms like the Stroop task, in which subjects must name a target cue (the ink color of a word) while being presented with a distractor (the word itself, which may name a different color). Shenhav found that when subjects face a difficult trial or make a mistake, they adjust by paying less attention to the distractor.

The Stroop task is a classic experimental design for studying cognitive control: successful completion requires overriding the reflex to read the word whenever the word and its ink color are mismatched.
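
For readers who have never run one, a Stroop trial is simple to generate in code: pair a color word with an ink color that either matches (congruent) or conflicts (incongruent), and ask the subject to name the ink. The sketch below is purely illustrative and not taken from Shenhav's experiments.

```python
import random

COLORS = ["red", "green", "blue", "yellow"]

def make_stroop_trial(congruent: bool) -> dict:
    """One Stroop trial: a color word displayed in some ink color.
    The task is to name the ink color while ignoring the word itself."""
    word = random.choice(COLORS)
    ink = word if congruent else random.choice([c for c in COLORS if c != word])
    return {"word": word, "ink": ink, "congruent": congruent}

# Half congruent, half incongruent, shuffled into a short block of trials.
trials = [make_stroop_trial(congruent=(i % 2 == 0)) for i in range(20)]
random.shuffle(trials)
print(trials[0])
```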

A final interesting finding from Shenhav’s work suggests that part of the value of hard work may lie in the work itself: people value rewards following a task in a way that scales with the effort they put in.

Style Recommendations From Data Scientists

A combination of data science and psychology is behind the recommendations for products we get when shopping online.

At the intersection of social psychology, data science and fashion is Amy Winecoff.

Amy Winecoff uses her background in psychology and neuroscience to improve recommender systems for shopping.

After earning a Ph.D. in psychology and neuroscience here at Duke, Winecoff spent time teaching before moving over to industry.

Today, Winecoff works as a senior data scientist at True Fit, a company that provides tools to retailers to help them decide what products they suggest to their customers.

True Fit’s software relies on collecting data about how clothes fit people who have bought them. With this data on size and type of clothing, True Fit can make size recommendations for a specific consumer looking to buy a certain product.    

In addition to recommendations on size, True Fit is behind many sites’ recommendations of products similar to those you are browsing or have bought.

While these recommender systems have been shown to work well for sites like Netflix, where you may have watched many different movies and shows in the recent past that can be used to make recommendations, Winecoff points out that this can be difficult for something like pants, which people don’t tend to buy in bulk.

To overcome this barrier, True Fit has engineered its system, called the Discovery engine, to parse a single piece of clothing into fifty different traits. With this much information, making recommendations for similar styles can be easier.
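
Under the hood, recommending “similar” items from trait vectors usually comes down to a vector-similarity measure such as cosine similarity. The sketch below illustrates that general idea with invented trait vectors; True Fit’s actual features and Discovery engine are proprietary and certainly more sophisticated.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two trait vectors (1.0 = same direction)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical trait vectors, e.g. [sleeve length, skirt length, pattern density, formality].
catalog = {
    "midi floral dress": np.array([0.2, 0.7, 0.9, 0.1]),
    "maxi solid dress":  np.array([0.2, 1.0, 0.1, 0.1]),
    "mini floral dress": np.array([0.2, 0.3, 0.9, 0.2]),
}
query = np.array([0.2, 0.6, 0.8, 0.1])  # the garment currently being browsed

# Rank catalog items by similarity to the query garment.
ranked = sorted(catalog.items(),
                key=lambda kv: cosine_similarity(query, kv[1]),
                reverse=True)
for name, vec in ranked:
    print(f"{name}: {cosine_similarity(query, vec):.2f}")
```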

However, Winecoff’s background in social psychology has led her to question how well these algorithms make predictions that are in line with human behavior. She argues that understanding how people form their preferences is an integral part of designing a system to make recommendations.

One way Winecoff is testing how true the predictions are to human preferences is by running psychological studies to gain insight into how to fine-tune the mathematically based recommendations.

With the general goal of understanding how people judge similarity between clothes, Winecoff designed an online study in which subjects are shown a piece of clothing and told the garment is out of stock. They are then presented with two options and must pick one to replace the out-of-stock item. By varying one attribute at a time in the two choices, such as color, pattern, or skirt length, Winecoff and her colleagues can determine which traits matter most to a person judging similarity.

Winecoff’s work illustrates the power of combining algorithmic recommendations with insights from social psychology, and shows that science reaches into unexpected places, like influencing your shopping choices.

Post by undergraduate blogger Sarah Haurin

Bias in Brain Research

Despite apparent progress in achieving gender equality, sexism continues to be pervasive — and scientists aren’t immune.  

In a cyber talk delivered to the Duke Institute for Brain Sciences, professor Cordelia Fine of the University of Melbourne highlighted compelling evidence that neuroscientific research is yet another place where gender bias takes hold.

Fine says the persistent idea of gender essentialism contributes to this lack of progress. Gender essentialism is the idea that men and women are fundamentally different, specifically at a neurological level. This “men are from Mars, women are from Venus” attitude has spread from pop culture into experimental design and interpretation.

However, studies that look for sex differences in behavior tend to find more similarities than differences. One study examined 106 meta-analyses of psychological differences between men and women. The researchers found that in areas as diverse as temperament, communication styles, and interests, gender had only a small effect, representing statistically small differences between the sexes.

Looking at fMRI data casts further doubt on how pronounced gender differences really are. A meta-analysis of fMRI studies investigating functional differences between men and women found a large reporting bias. Studies finding brain differences across genders were overrepresented compared to those finding similarities.

Of those small sex differences found in the central nervous system, Fine points out how difficult it is to determine their functional significance. One study found no difference between men and women in self-reported emotional experience, but found via fMRI that men exhibited more processing in the prefrontal cortex, or the executive center of the brain, than women. Although subjective experience of emotion was the same between men and women, the researchers reported that men are more cognitive, while women are more emotional.

Fine argues that conclusions like this are biased by gender essentialism. In a study she co-authored, Fine found that gender essentialism correlates with stronger belief in gender stereotypes, with the view that gender roles are fixed, and with the belief that our current understanding of gender does not need to change.

When scientists allow preconceived notions about gender to bias their interpretation of results, our collective understanding suffers. The best way to overcome these biases is to ensure we are continuing to bring more and more diverse voices to the table, Fine said.

Fine spoke last month as part of the Society for Neuroscience Virtual Conference, “Mitigating Implicit Bias: Tools for the Neuroscientist.” The Duke Institute for Brain Sciences (@DukeBrain) made the conference available to the Duke community.  

Post by undergraduate blogger Sarah Haurin

Nature vs. Nurture and Addiction

Epigenetics involves modifications to DNA that do not change its sequence but only affect which genes are active, or expressed. Photo courtesy of whatisepigenetics.com

The progressive understanding of addiction as a disease rather than a choice has opened the door to better treatment and research, but there are aspects of addiction that make it uniquely difficult to treat.

One exceptional characteristic of addiction is its persistence even in the absence of drug use: during periods of abstinence, symptoms get worse over time, and response to the drug increases.

Researcher Elizabeth Heller, PhD, of the University of Pennsylvania Epigenetics Institute, is interested in understanding why we observe this persistence in symptoms even after drug use, the initial cause of the addiction, is stopped. Heller, who spoke at a Jan. 18 biochemistry seminar, believes the answer lies in epigenetic regulation.

Elizabeth Heller is interested in how changes in gene expression can explain the chronic nature of addiction.

Epigenetic regulation represents the nurture part of “nature vs. nurture.” Without changing the actual sequence of DNA, we have mechanisms in our body to control how and when cells express certain genes. These mechanisms are influenced by changes in our environment, and the process of influencing gene expression without altering the basic genetic code is called epigenetics.

Heller believes that we can understand the persistent nature of the symptoms of drugs of abuse even during abstinence by considering epigenetic changes caused by the drugs themselves.

To investigate the role of epigenetics in addiction, specifically cocaine addiction, Heller and her team have developed a series of tools that bind to DNA and influence the expression of molecules that play a role in epigenetic regulation, known as transcription factors. They identified the FosB gene, previously implicated as a regulator of drug addiction, as a target site for these changes.

Increased expression of the FosB gene has been shown to increase sensitivity to cocaine, meaning individuals expressing this gene respond more than those not expressing it. Heller found that cocaine users show decreased levels of the protein responsible for inhibiting expression of FosB. This suggests cocaine use itself is depleting the protein that could help regulate and attenuate response to cocaine, making it more addictive.

Another gene, Nr4a1, is important in dopamine signaling, the reward pathway that is “hijacked” by drugs of abuse. This gene has been shown to attenuate the reward response to cocaine in mice: mice that underwent epigenetic changes suppressing Nr4a1 showed an increased reward response to cocaine. A drug currently used in cancer treatment has been shown to suppress Nr4a1, and Heller has shown that it can consequently reduce cocaine reward behavior in mice.

The identification of genes like FosB and Nr4a1, along with evidence that changes in gene expression are even greater during periods of abstinence than during drug use, represents an exciting leap in our understanding of addiction, and a step toward treatments suited to such a unique and devastating disease.

Post by undergraduate blogger Sarah Haurin

Drug Homing Method Helps Rethink Parkinson’s

The brain is the body’s most complex organ, and consequently the least understood. In fact, researchers like Michael Tadross, MD, PhD, wonder if the current research methods employed by neuroscientists are telling us as much as we think.

Michael Tadross is using novel approaches to tease out the causes of neuropsychiatric diseases at a cellular level.

Current methods such as gene editing and pharmacology can reveal how certain genes and drugs affect the cells in a given area of the brain, but they’re limited in that they don’t account for differences among different cell types. With his research, Tadross has tried to target specific cell types to better understand mechanisms that cause neuropsychiatric disorders.

To do this, Tadross developed a method to ensure that a drug injected into a region of the brain will affect only specific cell types. He genetically engineered the cell type of interest so that a special receptor protein, called HaloTag, is expressed at the cell membrane, and the drug of interest is modified so that it is tethered to the molecule that binds the HaloTag receptor. By connecting the drug to the HaloTag ligand, and engineering only the cell type of interest to express the HaloTag receptor, Tadross effectively limits the drug’s effects to just one cell type. He calls this method “Drugs Acutely Restricted by Tethering,” or DART.

Tadross has been using the DART method to better understand the mechanisms underlying Parkinson’s disease. Parkinson’s is a neurological disease that affects a region of the brain called the striatum, causing tremors, slow movement, and rigid muscles, among other motor deficits.

Only cells expressing the HaloTag receptor can bind to the AMPA-repressing drug, ensuring virtually perfect cell-type specificity.

Patients with Parkinson’s show decreased levels of the neurotransmitter dopamine in the striatum. Consequently, treatments that involve restoring dopamine levels improve symptoms. For these reasons, Parkinson’s has long been regarded as a disease caused by a deficit in dopamine.

With his technique, Tadross is challenging this assumption. In addition to death of dopaminergic neurons, Parkinson’s is associated with an increase of the strength of synapses, or connections, between neurons that express AMPA receptors, which are the most common excitatory receptors in the brain.

In order to simulate the effects of Parkinson’s, Tadross and his team induced the death of dopaminergic neurons in the striatum of mice. As expected, the mice displayed significant motor impairments consistent with Parkinson’s. In addition to inducing the death of these neurons, Tadross engineered the AMPA-expressing cells to produce the HaloTag protein.

Tadross then treated the mice’s striatum with a common AMPA receptor blocker tethered to the HaloTag ligand. Remarkably, blocking the activity of these AMPA-expressing neurons, even in the absence of the dopaminergic neurons, reversed the effects of Parkinson’s so that the previously affected mice moved normally.

Tadross’s findings in the Parkinson’s mice exemplify how little we know about cause and effect in the brain. The key to designing effective treatments for neuropsychiatric diseases, and possibly diseases outside the nervous system as well, may lie in teasing out the relationship between specific cell types and symptoms and targeting the disease that way.

The ingenious work of researchers like Tadross will undoubtedly help bring us closer to understanding how the brain truly works.

Post by undergraduate blogger Sarah Haurin

Aging and Decision-Making

Who makes riskier decisions, the young or the old? And what matters more in our decisions as we age — friends, health or money? The answers might surprise you.

Kendra Seaman works at the Center for the Study of Aging and Human Development and is interested in decision-making across the lifespan.

Duke postdoctoral fellow Kendra Seaman, Ph.D., uses mathematical models and brain imaging to understand how decision-making changes as we age. In a talk to a group of cognitive neuroscientists at Duke, Seaman explained that we have good reason to be concerned with how older people make decisions.

Statistically, older people in the U.S. have more money and also more expenses, particularly for healthcare. And by 2030, 20 percent of the U.S. population will be over the age of 65.

One key component to decision-making is subjective value, which is a measure of the importance a reward or outcome has to a specific person at a specific point in time. Seaman used a reward of $20 as an example: it would have a much higher subjective value for a broke college student than for a wealthy retiree. Seaman discussed three factors that influence subjective value: reward, cost, and discount rate, or the determination of the value of future rewards.
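
The discount-rate component is often formalized with a simple discounting function, for example hyperbolic discounting, in which a reward’s subjective value shrinks the longer you have to wait for it. The sketch below is a generic textbook formulation with an arbitrary parameter, not Seaman’s specific model.

```python
def hyperbolic_value(reward: float, delay_days: float, k: float = 0.05) -> float:
    """Subjective value of `reward` received after `delay_days`.

    Larger k means steeper discounting of future rewards.
    """
    return reward / (1 + k * delay_days)

# How a $20 reward loses subjective value as the delay grows.
for delay in (0, 7, 30, 180):
    print(f"$20 in {delay:>3} days feels worth ${hyperbolic_value(20, delay):.2f}")
```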

Brain imaging research has found that subjective value is represented similarly in the medial prefrontal cortex (MPFC) across all ages. Despite this common network, Seaman and her colleagues have found significant differences in decision-making in older individuals.

The first difference comes in the form of reward. Older individuals are likely to be more invested in the outcome of a task if the reward is social or health-related rather than monetary. Consequently, they are more likely to want these health and social rewards sooner and with higher certainty than younger individuals are. Understanding the salience of these rewards is crucial to designing future experiments to identify decision-making differences in older adults.

A preference for positive skew becomes more pronounced with age.

Older individuals also differ in their preferences for something called “skewed risks.” In these tasks, positive skew means a high probability of a small loss and a low probability of a large gain, such as buying a lottery ticket. Negative skew means a low probability of a large loss and a high probability of a small gain, such as undergoing a common medical procedure that has a low chance of harmful complications.
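
A quick calculation with made-up numbers shows why skew, not average payoff, is doing the work here: the two gambles below have the same expected value, yet one “feels” like a lottery ticket and the other like a risky medical procedure.

```python
def expected_value(outcomes):
    """Expected value of a gamble given (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

# Positive skew: almost always lose a little, rarely win big (lottery-like).
positive_skew = [(0.99, -1), (0.01, 99)]
# Negative skew: almost always gain a little, rarely lose big (risky-procedure-like).
negative_skew = [(0.99, 1), (0.01, -99)]

# Both come out to roughly 0, so any preference between them
# reflects sensitivity to skew rather than to average payoff.
print(expected_value(positive_skew))
print(expected_value(negative_skew))
```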

Older people tend to prefer positive skew to a greater degree than younger people, and this bias toward positive skew becomes more pronounced with age.

Understanding these tendencies could be vital to explaining why older people fall victim to fraud or choose to undergo risky medical procedures, and could help us better motivate an aging population to stay engaged in physical and mental activities.

Post by undergraduate blogger Sarah Haurin
