Duke Research Blog

Following the people and events that make up the research community at Duke.

Category: Lecture

ECT: Shockingly Safe and Effective

Husain is interested in putting to rest misconceptions about the safety and efficacy of ECT.

Few treatments have proven as controversial and effective as electroconvulsive therapy (ECT), or ‘shock therapy’ in common parlance.

Hippocrates himself saw the therapeutic benefits of inducing seizures in patients with mental illness, observing that convulsions caused by malaria helped attenuate symptoms of mental illness. However, depictions of ECT as a form of medical abuse, as in the infamous scene from One Flew Over the Cuckoo’s Nest, have prevented ECT from becoming a first-line psychiatric treatment.

The Duke Hospital Psychiatry program recently welcomed back Duke Medical School alumnus Mustafa Husain to deliver the 2018 Ewald “Bud” Busse Memorial Lecture, which is held to commemorate a Duke doctor who pioneered the field of geriatric psychiatry.

Husain, from the University of Texas Southwestern, delivered a comprehensive lecture on neuromodulation, a term for the emerging subspecialty of psychiatric medicine that focuses on physiological treatments that are not medication.

The image most people have of ECT is probably the gruesome depiction seen in “One Flew Over the Cuckoo’s Nest.”

Husain began his lecture by stating that ECT is one of the most effective treatments for psychiatric illness. While medication and therapy help many people with depression, a considerable proportion of patients have what is categorized as "treatment-resistant depression" (TRD). In one of the largest controlled trials of ECT, Husain and colleagues showed that 82 percent of TRD patients treated with ECT achieved remission. While this remission rate is impressive, the relapse rate is also substantial: more than 50 percent of remitted individuals will experience a return of symptoms.

Husain's study went on to test whether continuing ECT could prevent relapse in the first six months after acute ECT. He found that continuation ECT worked as well as the current best combination of drugs.

From this study, Husain made an interesting observation – the people who were doing best in the 6 months after ECT were elderly patients. He then set out to study the best form of treatment for these depressed elderly patients.

Typically, ECT involves stimulation of both sides of the brain (bilateral), but this treatment is associated with adverse cognitive effects like memory loss. Using right unilateral ECT effectively decreased cognitive side effects while maintaining an appreciable remission rate.

After the initial treatment, patients were again assigned to either receive continued drug treatment or continued ECT. In contrast to the previous study, however, the treatment for continued ECT was designed based on the individual patients’ ratings from a commonly used depression scaling system.

The results of this study show the potential that ECT has in becoming a more common treatment for major depressive disorder: maintenance ECT showed a lower relapse rate than drug treatment following initial ECT. If psychiatrists become more flexible in their prescription of ECT, adjusting the treatment plan to accommodate the changing needs of the patients, a disorder that is exceedingly difficult to treat could become more manageable.

In addition to discussing ECT, Husain shared his research into other methods of neuromodulation, including magnetic seizure therapy (MST). MST uses magnetic fields to induce seizures in a more localized region of the brain than is possible with ECT.

Importantly, MST does not cause the cognitive deficits observed in patients who receive ECT. Husain’s preliminary investigation found that a treatment course relying on MST was comparable in efficacy to ECT. While further research is needed, Husain is hopeful in the possibilities that interventional psychiatry can provide for severely depressed patients.

By Sarah Haurin 

DNA Breakage: What Doesn’t Kill You…

What doesn’t kill you makes you stronger―at least according to Kelly Clarkson’s recovery song for middle school crushes, philosopher Friedrich Nietzsche, and New York University researcher Viji Subramanian.

During the creation of sperm or eggs, DNA molecules exchange genetic material. This exchange increases the genetic differences between offspring and their parents, boosting overall species diversity, and is thought to make both the individual and the species stronger.

However, to trade genetic information — through a process called recombination — the DNA molecules must break at points along the chromosomes, risking permanent damage and loss of genomic integrity. In humans, errors during recombination can lead to infertility, fetal loss, and birth defects.

Subramanian, a postdoctoral researcher in the lab of Andreas Hochwagen at NYU, spoke at Duke on February 26. She studies how cells prevent excessive DNA breakage and how they regulate repair.

Subramanian uses budding yeast to study the 'synaptonemal complex,' a structure that forms between pairing chromosomes as shown in the above image. Over three hundred DNA breakage hotspots exist in the budding yeast's synaptonemal complex. Normally, double-stranded DNA breaks form during recombination and are then fully repaired, returning to none by the end of the process.

However, when Subramanian removed the synaptonemal complex, the breaks still appeared, but they did not completely disappear by the end of the process. She concluded that the synaptonemal complex shuts down DNA break formation and is therefore one way cells prevent excessive DNA breakage.

The formation of the synaptonemal complex


During DNA break repair, the repair machinery must prefer the pairing chromosome in order for recombination to transpire correctly. A protein called Mek1 promotes this bias by suppressing DNA repair in select areas. Early in the process of DNA breakage and repair, Mek1 levels are high while synaptonemal complex density is low. Later, the synaptonemal complex increases while Mek1 decreases.

This led to Subramanian's conclusion that the synaptonemal complex is responsible for removing Mek1, allowing DNA repair. She then explored whether the protein Pch2 regulates the removal of Mek1. In pch2-mutant budding yeast cells, DNA breaks were not repaired.

Subramanian showed that at least one aspect of DNA breakage and repair works through Mek1's suppression of repair, creating selectivity between chromosomes. The synaptonemal complex then uses Pch2 to remove Mek1, allowing DNA break repair.

Subramanian had another question about this process, though: how is breakage ensured in small chromosomes? Because there are fewer possible breaking points, the chance of recombination seems lower in small chromosomes. However, Subramanian discovered that zones of high DNA break potential exist near the chromosome ends, allowing numerous breaks to form even in smaller chromosomes. This explains why smaller chromosomes actually exhibit a higher density of DNA breaks and recombination: their end zones occupy a larger fraction of their total length.
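The end-zone argument is simple arithmetic over chromosome length. A toy calculation (with invented break rates and end-zone sizes, not Subramanian's measured values) shows why a short chromosome ends up with a higher average break density:

```python
def break_density(length_kb, end_zone_kb=20, end_rate=5.0, mid_rate=1.0):
    """Average break density for a chromosome whose two end zones
    break at a higher rate than its middle (all numbers invented)."""
    ends = 2 * end_zone_kb
    middle = max(length_kb - ends, 0)
    return (ends * end_rate + middle * mid_rate) / length_kb

# End zones make up a larger fraction of a short chromosome,
# so its average break density comes out higher:
print(break_density(100))   # short chromosome: 2.6
print(break_density(1000))  # long chromosome: 1.16
```

With the same end zones and rates, only the total length differs, yet the short chromosome's density is more than double the long one's, mirroring the pattern Subramanian observed.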

In the future, Subramanian wants to continue studying the specific mechanics behind DNA breaks and repair, including how the chromosomes reorganize during and after this process. She is also curious about how Mek1 suppresses repair and has more than 200 Mek1 mutants in her current study.

Kelly Clarkson may prove that heartbreaks don’t destroy you, but Viji Subramanian proves that DNA breaks create a stronger, more unique genetic code.         

Post by Lydia Goff


Obesity: Do Your Cells Have a Sweet Tooth?

Obesity is a global public health crisis; its prevalence has doubled since 1980. That is why Damaris N. Lorenzo, a professor of Cell Biology and Physiology at UNC-Chapel Hill, has devoted her research to this topic.

Specifically, she examines the role of ankyrin-B variants in metabolism. Ankyrins play a role in the movement of substances such as ions into and out of the cell. One of the ways that ankyrins affect this movement is through the glucose transporter protein GLUT4 which is present in the heart, skeletal muscles, and insulin-responsive tissues. GLUT4 plays a large role in glucose levels throughout the entire body.

Through her research, Lorenzo discovered that with modern life spans and high calorie diets, ankyrin-B variants can be a risk factor for metabolic disease. She presented her work for the Duke Developmental & Stem Cell Biology department on March 7th.

Prevalence of Self-Reported Obesity Among U.S. Adults by State, 2016

GLUT4 helps remove glucose from the body’s circulation by moving it into cells. The more GLUT4, the more sugar cells absorb.

Ankyrin-B's role in regulating GLUT4 is therefore important for overall health. Through experiments on mice, Lorenzo discovered that mice engineered to carry ankyrin-B mutations also had high levels of cell-surface GLUT4. This led to increased uptake of glucose into cells. Ankyrin-B therefore regulates how quickly glucose enters adipocytes, the cells that store fat. These ankyrin-B-deficient mice end up with adipocytes containing larger lipid droplets, the structures that store fatty acids.

Lorenzo concluded that ankyrin-B deficiency leads to age-dependent obesity in mutant mice, though young ankyrin-B mutant mice fed high-fat diets are actually even more likely to be affected by this change.

Obese mouse versus a regular mouse

Ankyrin-B has only recently been recognized as part of GLUT4 movement into the cell. As cell sizes grow through increased glucose uptake, not only does the risk of obesity rise but also inflammation is triggered and metabolism becomes impaired, leading to overall poor health.

With obesity becoming a greater problem due to increased calorie consumption, poor dietary habits, physical inactivity, environmental and life stressors, medical conditions, and drug treatments, understanding factors inside of the body can help. Lorenzo seeks to discover how ankyrin-B protein might play a role in the amount of sugar our cells internalize.

Post by Lydia Goff

Understanding the Link Between ADHD and Binge Eating Could Point to New Treatments

 

Binge eating disorder is the most prevalent eating disorder in the United States. Infographic courtesy of Multi-Service Eating Disorders Association

With more than a third of the adult population of the United States meeting criteria for obesity, doctors are becoming increasingly interested in behaviors that contribute to these rates.

Allan Kaplan is interested in improving treatment of binge eating disorder.

Allan Kaplan, MD, of the University of Toronto, is interested in eating disorders, specifically binge eating disorder, which is observed in about 35 percent of people with obesity.

Binge eating disorder (BED) is a pattern of disordered eating characterized by consumption of a large number of calories in a relatively short period of time. In addition to these binges, patients report lack of control and feelings of self-disgust. Because of these patterns of excessive caloric intake, binge eating disorder and obesity go hand-in-hand, and treatment of the disorder could be instrumental in decreasing rates of obesity and improving overall health.

In addition to the health risks associated with obesity, binge eating disorder is associated with anxiety disorders, affective disorders, substance abuse and attention deficit hyperactivity disorder (ADHD) – in fact, about 30 percent of individuals with binge eating disorder also have a history of ADHD.

Binge eating disorder displays a high comorbidity with mood and affective disorders. Infographic courtesy of American Addiction Centers.

ADHD is characterized by inability to focus, hyperactivity, and impulsivity, while substance abuse involves cravings and patterns of losing control followed by regret. These patterns of mental and physiological symptoms resemble those seen in patients with binge eating disorder. Kaplan and other researchers are linking the neurological patterns observed in these disorders to better understand BED.

Researchers have found that reward-related neural pathways become active when a patient with binge eating disorder is shown a food-related stimulus. Individuals with the eating disorder are more sensitive to food-related rewards than most people. Researchers have also identified a genetic basis: certain genes make individuals more susceptible to reward and thus more likely to engage in binges.

Because patients with ADHD exhibit similar neurological patterns, doctors are looking to drugs already approved by the FDA to treat ADHD as possible treatments for binge eating disorder. The first of these approved drugs, Vyvanse, has proven not much better than the traditional form of treatment, cognitive behavioral therapy, a form of talk therapy that aims to identify and correct dysfunctions in behavior and thought patterns that lead to disordered behaviors.

Another drug, however, proved promising in a study conducted by Kaplan and his colleagues. The ADHD drug methylphenidate, combined with CBT, led to significant clinical outcomes: patients engaged in fewer binges, reported fewer cravings, and showed decreased body mass index. Kaplan argues that the most effective treatment would reduce binges, treat physiological symptoms like obesity, improve psychological disturbances like low self-esteem, and, of course, be safe. So far, the combination of psychostimulants like methylphenidate and CBT has met these criteria.

Kaplan emphasized a need to make information about binge eating disorder and its treatments more available. Most individuals currently being treated for BED do not obtain treatment knowing they have an eating disorder — they are usually diagnosed only after seeking help with obesity-related health issues or help in weight loss. Making clinicians more familiar with the disorder and its associated behaviors as well as encouraging patients to seek treatment could prove instrumental in combating the current healthcare issue of obesity.

By Sarah Haurin

High as a Satellite — Integrating Satellite Data into Science

Professor Tracey Holloway researches air quality at the University of Wisconsin-Madison.


Satellite data are contributing more and more to understanding air quality trends, and professor Tracey Holloway wants the world to know.

As a professor in the Department of Atmospheric and Oceanic Sciences at the University of Wisconsin-Madison and the current Team Lead of the NASA Health and Air Quality Applied Sciences Team (HAQAST), she helps not only with the science related to satellites but also with communicating findings to larger audiences.

Historically, ground-based monitors have provided estimates on changes in concentrations of air pollutants, Holloway explained in her March 2, 2018 seminar, “Connecting Science with Stakeholders,” organized by Duke’s Earth and Ocean Sciences department.

Despite the valuable information ground-based monitors provide, factors like high costs limit their widespread use. For example, only about 400 ground-based monitors for nitrogen dioxide currently exist, and many U.S. states lack even a single one. Before satellites came into the picture, almost no information on nitrogen dioxide levels existed.

To close the gap, HAQAST employed earth-observing and polar-orbiting satellites — with fruitful results. Not only have these satellites provided enough data to make more comprehensive maps of nitrogen dioxide distributions and concentrations, but they have also detected formaldehyde, a known carcinogen, in our atmosphere for the first time.

Satellites have additional long-term benefits. They can help determine potential monitoring sites before actually having to invest large amounts of resources. In the case of formaldehyde, satellite-generated information located areas of higher concentrations — or formaldehyde “hotspots” —  in which HAQAST can now prioritize placing a ground-based monitor. Once established, the site can evaluate air dispersion models, provide air quality information to the public and add to scientific research.

A slide from Holloway's presentation, in the LSRC A building on March 2, explaining the purposes of a monitoring site.

Holloway underscored the importance of effectively communicating science. She explained that many policymakers don't have strong science backgrounds and therefore need quick and friendly explanations of research from scientists.

Perhaps more significant, though, is the fact that some people don't even realize this information exists: more satellites are producing new data every day. Holloway has made it a personal goal to have more one-on-one conversations with stakeholders to increase transparency.

Breakthroughs in science aren't made by individuals: science and change are collaborative. And for Holloway, stakeholders also include the general public. She founded the Earth Science Women's Network, with one of her goals being to change the vision of what a "scientist" looks like. Through photo campaigns and other communication and engagement activities, she has interacted with adults and children to make science more appealing. Making science sexier, she believes, will spark new discussions and sustain old ones, create a more diverse research environment, and open the field to all.

Professor Tracey Holloway, air quality researcher at University of Wisconsin-Madison, presented her research at Duke on March 2, 2018.


Post by Stella Wang, class of 2019


What is a Model?

When you think of the word “model,” what do you think?

As an Economics major, the first thing that comes to my mind is a statistical model, modeling phenomena such as the effect of class size on student test scores. A car connoisseur's mind might go straight to a model of their favorite vintage Aston Martin. Someone studying fashion might even imagine a runway model. The point is, the term "model" is used in popular discourse incredibly frequently, but are we even sure what it implies?

Annabel Wharton, a professor of Art, Art History, and Visual Studies at Duke, gave a talk entitled “Defining Models” at the Visualization Friday Forum. The forum is a place “for faculty, staff and students from across the university (and beyond Duke) to share their research involving the development and/or application of visualization methodologies.” Wharton’s goal was to answer the complex question, “what is a model?”

Wharton began the talk by defining the term "model," knowing that it can often be rather ambiguous. She observed that models are "a prolific class of things," from architectural models, to video game models, to runway models. Some of these things seem unrelated, but Wharton, throughout her talk, pointed out the similarities between them and ultimately tied them all together as models.

The word "model" itself has become a heavily loaded term. According to Wharton, the dictionary definition of "model" runs nine columns of text. Wharton then stressed that a model "is an autonomous agent." This implies that models must be independent of the world and of theory, as well as of their makers and consumers. For example, architecture, once built, becomes independent of its architect.

Next, Wharton outlined different ways to model. They include modeling iconically, in which the model resembles the actual thing, such as how the video game Assassin's Creed models historical architecture. Another way to model is indexically, in which parts of the model are always ordered the same way, such as the order of utensils at a traditional place setting. The final way to model is symbolically, in which a model symbolizes the mechanism of what it is modeling, as in a mathematical equation.

Wharton then discussed the difference between a “strong model” and a “weak model.” A strong model is defined as a model that determines its weak object, such as an architect’s model or a runway model. On the other hand, a “weak model” is a copy that is always less than its archetype, such as a toy car. These different classifications include examples we are all likely aware of, but weren’t able to explicitly classify or differentiate until now.

Wharton finally transitioned to discussing one of her favorite models of all time, a model of the Istanbul Hagia Sophia, a former Greek Orthodox church and later imperial mosque. She detailed how the model that provides the best sense of the building without being there is found in a surprising place: an Assassin's Creed video game. This model not only closely resembles the actual Hagia Sophia, but is also experiential and immersive. Wharton joked that, even better, the model allows explorers to avoid the tourists found in the actual Hagia Sophia.

Wharton described why the Assassin’s Creed model is a highly effective agent. Not only does the model closely resemble the actual architecture, but it also engages history by being surrounded by a historical fiction plot. Further, Wharton mentioned how the perceived freedom of the game is illusory, because the course of the game actually limits players’ autonomy with code and algorithms.

After Wharton's talk, it's clear that models are indeed "a prolific class of things." My big takeaway is that so many things in our everyday lives are models, even if we don't classify them as such. Duke's East Campus is a model of the University of Virginia's campus, subtraction is a model of the loss of an entity, and an academic class is a model of an actual phenomenon in the world. Leaving my first Visualization Friday Forum, I am even more certain that models are powerful and stretch far beyond the statistical models in my Economics classes.


By Nina Cervantes

Game-Changing App Explores Conservation’s Future

In the first week of February, students, experts and conservationists from across the country came together for the second annual Duke Blueprint symposium. Focused on the theme of "Nature and Progress," the conference hoped to harness the power of diversity and interdisciplinary collaboration to develop solutions to some of the world's most pressing environmental challenges.

Scott Loarie spoke at Duke’s Mary Duke Biddle Trent Semans Center.

One of the most exciting parts of the symposium's first night was without a doubt its all-star cast of keynote speakers. The experiences and advice each of these researchers had to offer were far too diverse for any single blog post to capture, but one particularly interesting presentation was that of National Geographic fellow Scott Loarie, co-director of the game-changing iNaturalist app.

iNat, as Loarie explained, is a collaborative citizen scientist network with aspirations of developing a comprehensive mapping of all terrestrial life. Any time they go outside, users of this app can photograph and upload pictures of any wildlife they encounter. A network of scientists and experts from around the world then helps the users identify their finds, generating data points on an interactive, user-generated map of various species’ ranges.

Simple, right? Multiply that by 500,000 users worldwide, though, and it’s easy to see why researchers like Loarie are excited by the possibilities an app like this can offer. The software first went live in 2008, and since then its user base has roughly doubled each year. This has meant the generation of over 8 million data points of 150,000 different species, including one-third of all known vertebrate species and 40% of all known species of mammal. Every day, the app catalogues around 15 new species.

“We’re slowly ticking away at the tree of life,” Loarie said.

Through iNaturalist, researchers are able to analyze and connect to data in ways never before thought possible. Changes to environments and species’ distributions can be observed or modeled in real time and with unheard-of collaborative opportunities.

To demonstrate the power of this connectedness, Loarie recalled one instance of a citizen scientist in Vietnam who took a picture of a snail, a species that had never been captured or photographed and hadn't been observed in over a century. One of iNat's users recognized it anyway. How? He'd seen it in one of the journals from Captain James Cook's 18th-century voyage to circumnavigate the globe.

It’s this kind of interconnectivity that demonstrates not just the potential of apps like iNaturalist, but also the power of collaboration and the possibilities symposia like Duke Blueprint offer. Bridging gaps, tearing down boundaries, building up bonds—these are the heart of conservationism’s future. Nature and Progress, working together, pulling us forward into a brighter world.

Post by Daniel Egitto


How A Bat’s Brain Navigates

Most of what we know about the hippocampus, a region of the brain associated with memory formation and spatial representation, comes from research done on rodents. Rat brains have taught us a lot, but researchers in Israel have found an interesting alternative model for understanding how the hippocampus helps mammals navigate: bats.

The Egyptian fruit bat proved the perfect subject for studies of mammalian navigation.

Weizmann Institute neurophysiologist Nachum Ulanovsky, PhD, and his team have looked to bats to understand the nuances of navigation through space. While previous research has identified specific cells in the hippocampus, called place cells, that are active when an animal is located in a specific place, there is not much literature describing how animals actually navigate from point A to point B.

Nachum Ulanovsky

Ulanovsky believes that bats are an ingenious model for studying mammalian navigation. While bats have the same types of hippocampal neurons found in rats, the firing patterns of bats' neurons more closely match those of humans than rats' do.

Ulanovsky sought to test how bats know where they are going. Using GPS tracking equipment, his team found that wild bats living in a cave would travel up to 20 kilometers to forage for fruit from specific trees. Night after night, these bats followed similar routes past perfectly viable food sources to reach the same tree over and over again.

The understanding of hippocampal place cells firing at specific locations doesn’t explain the apparent guided travel of the bat night after night, and other explanations like olfactory input do not explain why the bats fly over good food sources to their preferred tree.

The researchers designed an experiment to test how bats encode the 3D information necessary for this navigation. By letting the bats fly around while recording brain activity, Ulanovsky and his team found that the bats' 3D spatial representations are actually spherical in shape. They also found another type of hippocampal cell that encodes the orientation the bat is facing. These head-direction cells operate in a coordinate system that gives the animal a continuous awareness of its orientation as it moves through space.


Ulanovsky found bats relied on memory to navigate toward the goal.

To understand how the bats navigate toward a specific goal, the researchers devised another experiment. They constructed a goal with a landing place and a food incentive, and the bats learned where the goal was and found it. To test whether the bats' ability to find the goal was memory-based (that is, whether it relied on the hippocampus), the researchers then conducted trials in which the goal was hidden from the bats' view.

To test whether the bats relied on memory, Ulanovsky's team measured the goal-direction angle, the angle between the bat's head orientation and the direction to the goal. After being familiarized with the goal's location, the bats tended toward a goal-direction angle of zero, meaning they oriented themselves toward the goal even when it was out of sight.
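As a rough illustration of this measure (my own toy sketch, not the lab's analysis code), the goal-direction angle can be computed from the bat's head bearing and the bearing from the bat to the goal:

```python
import math

def goal_direction_angle(head_bearing_deg, bat_pos, goal_pos):
    """Unsigned angle (degrees) between the bat's head orientation
    and the direction to the goal; 0 means it faces the goal."""
    dx = goal_pos[0] - bat_pos[0]
    dy = goal_pos[1] - bat_pos[1]
    bearing_to_goal = math.degrees(math.atan2(dy, dx))
    # Wrap the difference into [-180, 180), then take its magnitude
    diff = (bearing_to_goal - head_bearing_deg + 180.0) % 360.0 - 180.0
    return abs(diff)

# A bat facing a 45-degree bearing with the goal diagonally ahead is aligned:
print(round(goal_direction_angle(45.0, (0, 0), (1, 1)), 6))  # 0.0
```

A distribution of these angles clustered near zero, as the team observed, indicates the bats kept steering toward the remembered goal location.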

Continued research identified cells that encode information about the distance the bat is from the goal, the final piece allowing bats to navigate to a goal successfully. These hippocampal cells selectively fire when the bat is within specific distances of the goal, allowing for an awareness of location over distance.

While Ulanovsky and his team have had remarkable success identifying new types of cells as well as new functions of known cells in the hippocampus, further research in more natural settings is required.

“If we study only under these very controlled and sterile environments, we may miss the very thing we are trying to understand, which is behavior,” Ulanovsky concluded.

By Sarah Haurin

Dopamine, Drugs, and Depression

The neurotransmitter dopamine plays a major role in mental illnesses like substance abuse disorders and depressive disorders, as well as a more general role in reward and motivational systems of the brain. But there are still certain aspects of dopamine activity in the brain that we don’t know much about.

Nii Antie Addy and his lab are interested in the role of dopamine in substance abuse and mood disorders.

Duke graduate Nii Antie Addy, PhD, and his lab at Yale School of Medicine have been focusing on dopamine activity in a part of the brain that has been less studied in this context: the ventral tegmental area (VTA).

To understand the mechanisms underlying this association, Addy and his team looked at cue-induced drug-seeking behavior. Using classical conditioning, rats can be trained to pair certain cues with the reward of drug administration. When a rat receives an unexpected reward, dopamine activity increases. After conditioning, dopamine is released in response to the cue more than to the drug itself. Looking at the patterns of dopamine release in rats forced to undergo detoxification can thus provide insight into how these cues and neurotransmitter activity relate to relapse in substance abuse.

When rats are taught to self-administer cocaine, with each administration of the drug paired with the cue, the rodents continue trying to self-administer the drug after a period of forced detoxification, even when the drug is withheld and only the cue is presented. This finding again demonstrates the connection between the cue and drug-seeking behavior.

Studying the activity in the VTA gave additional insights into the regulation of this system. During the period of abstinence, when the rodents are forced to detox, researchers observed an increase in the activity of cholinergic neurons, the neurons that respond to the neurotransmitter acetylcholine.

Using these observations, Addy and his team sought to identify which of the various receptors that respond to acetylcholine can be used to regulate the dopamine system behind drug-seeking behaviors. They discovered that a specific type of acetylcholine receptor, the muscarinic receptor, is involved in more general reward-seeking behaviors and thus may be a target for therapies.

Using Isradipine, a drug already approved by the FDA for treatment of high blood pressure, Addy designed an experiment to test the role of these muscarinic receptors. He co-opted the drug to act as a calcium antagonist in the VTA, increasing dopamine activity in rodents during their forced detox before returning them to access to cocaine. The outcome was promising: administration of Isradipine was associated with a decrease in cocaine-seeking behavior when the rodents were placed back in the chamber with the cue.

Understanding the role of cholinergic neurons in regulating dopamine-related mental illnesses like substance abuse also offers insights into depressive and anxiety disorders. If the same pathway implicated in cue-induced drug-seeking were involved in depressive and anxious behaviors, then increasing cholinergic activity should increase pro-depressive behavior.

Addy’s experiment yielded exactly these results, opening up new areas to be further researched to improve the treatment of mood disorders.

Post by Sarah Haurin


Morphogenesis: All Guts and Morning Glories

What is morphogenesis? Morphogenesis is the study of how living organisms develop their forms.

It is also an area of research for Lakshminarayanan Mahadevan, Professor of Applied Mathematics, Organismic and Evolutionary Biology and Physics at Harvard University. In his presentation in the Public Lectures Unveiling Math (PLUM) series here at Duke, he credited the beginnings of morphogenesis to D’Arcy Wentworth Thompson, author of the book On Growth and Form.

Mathematically, morphogenesis focuses on how different rates of growth change the shapes of organisms as they develop. Cell number, cell size, cell shape, and cell position comprise the primary cellular factors of multicellular morphogenesis, which studies larger structures than individual cells and is Mahadevan’s focus.

Effects on tissues appear through changes in sizes, connectivities, and shapes, altering the phenotype, or the outward physical appearance. All these variables change in space and time. Professor Mahadevan presented on morphogenesis studies that have been conducted on plant shoots, guts, and brains.

Research on plant shoots often concentrates on the question, “Why do plant shoots grow in such a wide variety of directions, and what determines their shapes?” The picture below shows the different postures of plant shoots, from completely straight to leaning to hanging.

Can morphogenesis make sense of these differences? Through mathematical modeling, two stimuli shaping shoots were identified: gravity and the shoot’s sensing of its own posture. Additionally, elasticity as a function of the shoot’s weight plays a role in the mathematical models of plant shoots’ shapes, which appear in Mahadevan’s paper co-written with fellow professor Raghunath Chelakkot. Mahadevan also explored the formation of flower and leaf shapes in these morphogenesis studies.

Over twenty feet of guts are coiled up inside you. To fit inside the body, mammalian intestines must coil and loop. But what variables determine how these guts loop around? To answer this question, Mahadevan and other researchers examined chick embryos, whose gut length increases by a factor of more than twenty over a twelve-day span. They were able to create a physical model, using a rubber tube sewn to a sheet, that followed the same looping patterns as the chicks’ guts. Through observations of not only chicks but also quail and mice, Mahadevan concluded that the morphogenesis of the gut does not depend on genetics or other microscopic factors.
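That reported growth, a factor of more than twenty in length over roughly twelve days, corresponds to a striking per-day rate. A quick calculation makes it concrete (the numbers come from the passage above; the assumption of smooth exponential growth is ours, for illustration):

```python
import math

factor = 20.0  # gut length increases ~20x (figure from the lecture)
days = 12.0    # over a 12-day span

# Assuming smooth exponential growth L(t) = L0 * exp(k * t):
k = math.log(factor) / days   # continuous growth rate per day
daily_multiple = math.exp(k)  # length multiplier per day

print(f"continuous rate k ≈ {k:.3f}/day")
print(f"i.e. the gut lengthens by ≈ {100 * (daily_multiple - 1):.0f}% per day")
```

Under that assumption the embryonic gut lengthens by roughly a quarter of its size every day, which helps explain why the tube must buckle into loops against the sheet it is attached to.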

Mahadevan’s study of how the brain folds proceeds through MRI images of human fetal development. Initially, barely any folding exists on fetal brains, but eventually the geometry of the surroundings, along with local stress, forms folds on the brain. By creating a template with gel and treating it to mimic the relationship between the brain’s gray matter and white matter, Mahadevan and other researchers discovered that they could reproduce the brain’s folds. Because they were able to recreate the folds through only global geometry and local stress, they concluded that the evolution of this morphogenesis does not depend on microscopic factors such as genetics. Further, by examining whether folding regions correlate with the active regions of the brain, researchers can begin to ask how physical form shapes the abilities and inner functions of the brain.


