Duke Research Blog

Following the people and events that make up the research community at Duke.


Becoming the First: Erika Weinthal

Editor’s Note: In the “Becoming the First” series, first-generation college student and Rubenstein Scholar Lydia Goff explores the experiences of Duke researchers who were the first in their families to attend college.

Erika Weinthal

In her corner office with a wall of windows and stuffed bookshelves, Erika Weinthal keeps a photo of her father. He came to the United States from Germany in 1940, and for a German Jew, that was extremely late. According to the family stories, Weinthal’s father left on the second-to-last boat from Italy. It is no surprise that he was never a big traveler after his arrival in America. As Weinthal describes it, “America…was the country that saved him.” Not only did it protect him, but it also gave his children opportunities that he did not have, such as going to college.

Weinthal, Lee Hill Snowdon Professor of Environmental Policy in Duke’s Nicholas School of the Environment, took this opportunity to become the first in her family to attend college, launching her career researching environmental policy and water security in areas including the former Soviet Union, Middle East, East Africa, India and the United States.

In high school, Weinthal traveled as an exchange student to Germany, a country her relatives could never understand her wanting to visit. “As a child of a refugee, you didn’t talk about the war,” she explains, describing how that silence fed her curiosity about what had happened. That journey to Bremen was only the first of many trips around the world. In the Middle East, she examines environmental policy between countries that share water. In India, she has researched the relationship between wildlife and humans near protected areas. “What do you do when protected wildlife destroys crops and threatens livelihoods?” she asks, proof that she has never stopped asking the questions that began with her curiosity about the war.

However, her specific interest in environmental science and policy came straight from a different war: the Cold War. She became obsessed with everything Russian, thanks partly to a high school teacher who agreed to teach her Russian one-on-one. The teacher introduced Weinthal to Russian literature and poetry. While her parents, like many parents, would have loved for her to become a doctor or a lawyer, they still trusted her when she enrolled at Oberlin College intent on studying Soviet politics. A class on Soviet environmental politics further deepened her interest in water security.

Currently, her work contends that water should be viewed as a basic human need separate from the political conflicts in Palestine and Israel. She has studied how protracted conflict in the region has led to the deterioration of water quality in the Gaza Strip, creating a situation in which water is now unfit for human consumption. Weinthal argues that these regions should not view water as property to be secured but rather as a human right they should guarantee.

Erika Weinthal’s father in 1940

As a child of a refugee and a first-generation college student, Weinthal says “you grow up essentially so grateful for what others have sacrificed for you.” Her dad believed in giving back to the next generation. He accomplished that goal and, in the process, gave the world a researcher who’s invested in environmental policy and human rights.

Post by Lydia Goff

 

Detangling Stigma and Mental Illness

Can you imagine a world without stigma? Where a diagnosis of autism or schizophrenia didn’t inevitably stick people with permanent labels of “handicap,” “abnormal,” “disturbed,” or “dependent”?

Roy Richard Grinker can. In fact, he thinks we’re on the way to one.

It’s a subject he’s studied and lectured on extensively—stigmas surrounding mental health conditions, that is. His expertise, influence, and unique insight in the field made him the distinguished speaker at this year’s annual lecture commemorating Autism Awareness Month, held April 12. The event was co-sponsored by the Duke Center for Autism and Brain Development, the Duke Institute for Brain Sciences, and the Department of Cultural Anthropology.

Roy Richard Grinker was the invited speaker to this year’s annual Autism Awareness Month commemorative lecture. Photo credit: Duke Institute for Brain Sciences

Grinker’s credentials speak to his expertise. He is a professor of Anthropology, International Affairs, and Human Sciences at George Washington University; he has authored five books, several New York Times op-eds, and a soon-to-be-published 600-page volume on the anthropology of Africa; he studied in the Democratic Republic of the Congo as a Fulbright scholar in his early career; and, in the words of Geraldine Dawson, director of the Center for Autism and Brain Development, “he fundamentally changed the way we think about autism.”

Grinker began with an anecdote about his daughter, Isabel, who is 26 years old and “uses the word ‘autism’ to describe herself—not just her identity, but her skills.”

She likes to do jigsaw puzzles, he said, but in a particular fashion: with the pieces face-down so their shape is the only feature she can use to assemble them, always inexplicably leaving one piece out at the end. He described this as one way she embraces her difference, and a metaphor for her understanding that “there’s always a piece missing for all of us.”

Grinker and Geraldine Dawson, director of the Center for Autism and Brain Development, pose outside Love Auditorium in the minutes before his talk. Source: Duke Institute for Brain Sciences

“What historical and cultural conditions made it possible for people like Isabel to celebrate forms of difference that were a mark of shame only a few decades ago?” Grinker asked.  “To embrace the idea that mental illnesses are an essential feature of what it means to be human?”

He identified three processes as drivers of what he described as the “pivotal historical moment” of the decoupling of stigma and mental illness: high-profile figures, from celebrity talk-show hosts to the Pope, speaking up about their mental illnesses instead of hiding them; a shift from boxing identities into racial, spiritual, gender, and other categories to placing them on a spectrum; and economies learning to appreciate the unique skills of people with mental illness.

This development in the de-stigmatization of mental illness is recent, but so is stigma itself. Grinker explained how the words “normal” and “abnormal” didn’t enter the English vocabulary until the mid-19th century—the idea of “mental illness” had yet to make its debut.

“There have always been people who suffer from chronic sadness or had wildly swinging moods, who stopped eating to the point of starvation, who were addicted to alcohol, or only spoke to themselves,” Grinker said. “But only recently have such behaviors defined a person entirely. Only recently did a person addicted to alcohol become an alcoholic.”

Grinker then traced the development of mental illness as an idea through modern European and American history. He touched on how American slaveowners ascribed mental illness to African Americans as justification for slavery, how hysteria evolved into a feminized disease whose diagnoses became a classist tool after World War I, and how homosexuality was gradually removed from the Diagnostic and Statistical Manual of Mental Disorders (DSM) by secretly gay psychiatrists who worked their way up the rankings of the American Psychiatric Association in the 1960s and 70s.


Next, Grinker described his anthropological research around the world on perceptions of mental illness, from urban South Korea to American Indian tribes to rural villages in the Kalahari Desert. His findings were wide-ranging and eye-opening: while, at the time of Grinker’s research, Koreans viewed mental illness of any kind as a disgrace to one’s heritage, members of Kalahari Desert communities showed no shame in openly discussing their afflictions. Grinker told of one man who spoke unabashedly of his monthly 24-mile walk to the main village for antipsychotic drugs, without which, as was common knowledge among the other villagers, he would hear voices in his head urging him to kill them. Yet, by Grinker’s account, they didn’t see him as ill — “a man who never hallucinates because he takes his medicine is not crazy.”

I could never do justice to Grinker’s presentation without surpassing an already-strained word limit on this post. Suffice it to say, the talk was full of interesting social commentary, colorful insights into the history of mental illness, and words of encouragement for the future of society’s place for diversity in mental health. Grinker concluded on such a note:

“Stigma decreases when a condition affects us all, when we all exist on a spectrum,” Grinker said. “We see this in the shift away from the categorical to the spectral dimension. Regardless, we might need the differences of neurodiversity to make us, humans, interesting, vital, and innovative.”

Post by Maya Iskandarani

Better Butterfly Learners Take Longer to Grow Up

Emilie Snell-Rood studies butterflies to understand the factors that influence plasticity.

The ability of animals to vary their phenotypes, or physical expression of their genes, in different environments is a key element to survival in an ever-changing world.

Emilie Snell-Rood, PhD, of the University of Minnesota, is interested in why this phenomenon of plasticity varies. Some animals’ phenotypes are relatively stable despite varying environmental pressures, while others display a wide range of behaviors.

Researchers have looked into how the costs of plasticity limit its variability. Many biologists expected energetic costs to explain the limits of plasticity, but only about 30 percent of studies that have looked for plasticity-related costs have found them.

Butterflies’ learning has provided insight into developmental plasticity.

With her butterfly model, Snell-Rood has worked to understand why these researchers have turned up so little.

Snell-Rood hypothesized that the life history of an animal, or the timing of major developmental events like weaning, should be of vital importance in the constraints on plasticity, specifically on the type of plasticity involved in learning. Much of learning involves trial and error, which is costly – it requires time, energy, and exposure to potential predators while exploring the environment.

Additionally, behavioral flexibility requires an investment in developing brain tissue to accommodate this learning.

Because of these costs, animals that engage in this kind of learning must forgo reproduction until later in life.

To test the costs of learning, Snell-Rood used butterflies as a subject. Butterflies require developmental plasticity to explore their environments and optimize their food finding strategies. Over time, butterflies get more efficient at landing on the best host plants, using color and other visual cues to find the best food sources.

Studying butterfly families shows that families that are better learners have increased volume in the part of the brain associated with sensory integration. Furthermore, experimentally speeding up an organism’s life history leads to a decline in learning ability.

These results support a tradeoff between an organism’s developmental plasticity and its life history. While this strategy demands a greater investment in neural development and energy, it allows more effective adaptation to the environment. However, further pressures from resource availability can also influence plasticity.

Looking to the butterfly model, Snell-Rood found that quality nutrition increases egg production as well as areas of the brain associated with plasticity.

Understanding factors that influence an animal’s plasticity is becoming increasingly important. Not only does it allow us to understand the role of plasticity in evolution up to this point, but it allows us to predict how organisms will adapt to novel and changing environments, especially those that are changing because of human influence. For the purposes of conservation, these predictions are vital.

By Sarah Haurin

Looking at Cooking as a Science Experiment

From five-star restaurants to Grandma’s homemade cookies, cooking is an art that has transformed the way we taste food. But haven’t you ever wondered how cooking works? How in the world did people discover how to make Dippin’ Dots or Jell-O?

Patrick Charbonneau is an associate professor of chemistry here at Duke, and last Friday he gave a delicious talk about the science of cooking (with samples!).

Patrick Charbonneau, Duke Chemist and Foodie

Around 10,000 years ago humans discovered that by fermenting milk you could turn it into yogurt, something that is more transportable, lasts longer, and is easier to digest. In the 1600s a new cooking apparatus called the “bone digester” (an early pressure cooker) allowed you to cook things faster while enhancing the flavor. When the 1800s came around, a scientist named Eben Horsford discovered that combining an acid with sodium bicarbonate creates baking powder. Soon enough scientific and kitchen minds started to collaborate, and new creations were made in the culinary world. As you can see, a lot of the fundamental cooking techniques and ingredients we use today are products of scientific discoveries.

Old-school pressure cookers. Forerunners of the Instant Pot.

Whisked Toffee

Freezer toffee, AKA caramel

A huge part of cooking is controlling the transformation of matter, or “a change in phase.” Professor Charbonneau presented a very cool example demonstrating how controlling this phase shift can affect your experience eating something. He made the same toffee recipe twice, but he changed it slightly as the melted toffee mixture was cooling. One version you stick straight in the freezer; the other you whisk as it cools. The whisked version turns out crumbly and sweeter; the other one turns into a chewy, shiny caramel. The audience got samples, and I could easily tell how different each version looked and tasted.

Charbonneau explained that while both toffees have the same ingredients, most people prefer the crumbly one because it seems sweeter (I agreed). This is because the chewier one takes longer to dissolve onto your taste buds, so your brain registers it as less sweet.

I was fascinated to learn that a lot of food is mostly just water. It’s weird to think a solid thing could be made of water, yet some foods are up to 99% water and still elastic! We have polymers — long repeating patterns of atoms in a chain — to thank for that. In fact, you can turn almost any liquid into a gel. Polymers take up little space but play a vital role in not only foods but other everyday objects, like contact lenses.

Charbonneau also showed us a seemingly magical way to make cake. He took about half a Dixie cup of cake batter, stuck a whipping siphon charged with nitrous oxide inside it for a second, then threw it in the microwave for thirty seconds. Boom, easy as cake. Out came a cup full of some pretty darn good fluffy chocolate cake. The gas bubbles in the butter and egg batter expand when they are heated up, causing the batter to gel and form a solid network.

Professor Charbonneau is doing stuff like this in his class here at Duke, “The Chemistry and Physics of Cooking,” all the time.

In the past ten years, a surge of classes connecting science and cooking has emerged. The experiments you could do in a kitchen-lab are so cool and can make science appealing to those who might normally shy away from it.

Another cool thing I learned at the stations outside Charbonneau’s talk was that Dippin’ Dots are made by dripping melted ice cream into a bowl of liquid nitrogen. The nitrogen is so cold that it flash-freezes each ice cream droplet into a ball-like shape!

Post by Will Sheehan


Duke Alumni Share Their SpaceX Experiences

It was 8 o’clock on a Monday night and Teer 203 was packed. A crowd of largely Pratt Engineering students had crammed into practically every chair in the room, as if for lecture. Only, there were no laptops out tonight. No one stood at the blackboard, teaching.


SpaceX’s Falcon Heavy and Dragon rockets in simultaneous liftoff

No, these students had given up their Monday evening for something more important. Tonight, engineering professor Rebecca Simmons was videoconferencing with six recent Duke grads—all of whom are employed at the legendary aerospace giant SpaceX, brainchild of tech messiah Elon Musk.

Eager to learn as much as possible about the mythic world of ultracompetitive engineering, the gathered students spent the next hour and fifteen minutes grilling Duke alumni Anny Ning (structures design engineering), Kevin Seybert (integration and test engineering), Matthew Pleatman and Daniel Lazowski (manufacturing engineering), and Zachary Loncar (supply chain) with as many questions as they could squeeze through.

Over the course of the conversation, Duke students seemed particularly interested in the overall culture of SpaceX: What was it like to actually work there? What do the employees think of the SpaceX environment, or the way the company approaches engineering?

One thing all of the alumni were quick to key in on was the powerful emphasis their company placed on flexibility and engagement.

“It’s much harder to find someone that says ‘no’ at SpaceX,” Pleatman said. “It’s way easier to find someone who says ‘yes.’ ”

SpaceX’s workflow, Seybert added, is relentlessly adaptive. There are no strict boundaries on what you can work on in your job, and the employee teams are made up of continually evolving combinations of specialists and polymaths.

“It’s extremely dynamic,” Seybert said. “Whatever the needs of the company are, we will shift people around from week to week to support that.”

“It’s crazy—there is no typical week,” Lazowski added. “Everything’s changing all the time.”


Launch of Hispasat 30W-6 Mission

Ning, for her part, focused a great deal on the flexibility SpaceX both offers and demands. New ideas and a willingness to question old ways of thinking are critical to this company’s approach to innovation, and Ning noted that one of the first things she had to learn was to be continuously on the lookout for ways her methods could be improved.

“You should never hear someone say, ‘Oh, we’re doing this because this is how we’ve always done it,’ ” she said.

The way SpaceX approaches engineering and innovation, Seybert explained, is vastly different from how traditional aerospace companies have tended to operate. SpaceX employees are there because of their passion for their work. They focus on the projects they want to focus on, they move between projects on a day-to-day basis, and they don’t expect to stay at any one engineering company for more than a few years. Everything is geared around putting out the best possible product, as quickly as humanly possible.

So now, the million-dollar question: How do you get in?

“One thing that I think links us together is the ability to work hands-on,” Loncar offered.

Pleatman agreed. “If you want to get a job at SpaceX directly out of school, it’s really important to have an engineering project that you’ve worked on. It doesn’t matter what it is, but just something where you’ve really made a meaningful contribution, worked hard, and can really talk through the design from start to finish.”

Overall, passion, enthusiasm and flexibility were overarching themes. And honestly, that seems pretty understandable. We are talking about rockets, after all — what’s not to be excited about? These Duke alums are out engineering the frontier of tomorrow — bringing our species one step closer to its place among the stars.

As Ning put it, “I can’t really picture a future where we’re not out exploring space.”

Post by Daniel Egitto

Artificial Intelligence Knows How You Feel

Ever wondered how Siri works? Afraid that super smart robots might take over the world soon?

On April 3rd, researchers from Duke, NCSU and UNC came together for Triangle Machine Learning Day to spark curiosity about the complex field that is Artificial Intelligence. A.I. is an overarching term for smart technologies, ranging from self-driving cars to targeted advertising. We can arrive at artificial intelligence through what’s known as “machine learning.” Instead of explicitly programming a machine with the basic capabilities we want it to have, we can write its code so that it adapts based on the information it’s presented with. Its knowledge grows as a result of training. In other words, we’re teaching a computer to learn.
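To make that concrete, here is a minimal sketch of the “learn from examples instead of hard-coding rules” idea. It uses Python and the scikit-learn library, which are my choices for illustration; the speakers didn’t name specific tools, and the data below is made up.

```python
# A toy "machine learning" example: the model is never told the rule,
# it infers one from labeled examples.
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: [hours of exercise per week, hours of sleep per night],
# labeled 1 for "healthy habits" and 0 otherwise. Purely illustrative numbers.
X_train = [[0, 5], [1, 6], [2, 5], [5, 8], [6, 7], [7, 8]]
y_train = [0, 0, 0, 1, 1, 1]

model = LogisticRegression()
model.fit(X_train, y_train)  # "training": the model adapts its parameters to the examples

# The trained model now generalizes to an input it has never seen.
print(model.predict([[4, 7]]))  # likely prints [1] for this toy data
```

The point is not the specific numbers but the workflow: the mapping from inputs to answers is learned from data rather than written out by a programmer.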

Matthew Philips is working with Kitware to get computers to “see,” a capability known as “machine vision.” Given thousands and thousands of images, a computer with the right code can learn to actually make sense of what an image shows, beyond different colored pixels.

Machine vision has numerous applications. An effective way to search satellite imagery for arbitrary objects could be huge in the advancement of space technology – a satellite could potentially identify obscure objects or potential lifeforms that stick out in those images. This is something we as humans can’t do ourselves just because of the sheer amount of data there is to go through. Similarly, we could teach a machine to identify cancerous or malignant cells in an image, thus giving us a quick diagnosis if someone is at risk of developing a disease.

The problem is, how do you teach a computer to see? Machines don’t easily understand things like similarity, depth or orientation — things that we as humans grasp automatically without even thinking about them. That’s exactly the type of problem Kitware has been tackling.
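As a rough sketch of what “learning to see” looks like in practice (not Kitware’s actual pipeline, which the talk didn’t detail), the same train-on-examples recipe applies to images. Here, scikit-learn’s built-in handwritten-digit images stand in for the satellite or cell images described above.

```python
# Teaching a computer to "see" digits: it learns the visual patterns
# from labeled example images rather than from hand-written rules.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()  # ~1,800 tiny 8x8 grayscale images of handwritten digits, labeled 0-9
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)  # learn from thousands of labeled pixel arrays

print("accuracy on unseen images:", clf.score(X_test, y_test))
```

A real machine-vision system would use far larger images and deep convolutional networks, but the principle is the same: show the computer many labeled pictures and let it find the distinguishing patterns itself.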

One hugely successful piece of Artificial Intelligence you may be familiar with is IBM’s Watson. Labeled as “A.I. for professionals,” Watson was featured on 60 Minutes and even played Jeopardy! on national television. Watson has visual recognition capabilities, can work as a translator, and can even understand things like tone, personality or emotional state. And obviously it can answer crazy hard questions. What’s even cooler is that it doesn’t matter how you phrase the question – Watson will know what you mean. Watson is basically Siri on steroids, and the world got a taste of its power after watching it smoke its competitors on Jeopardy!. However, Watson is not a single physical supercomputer. It is a collection of technologies that can be used in many different ways, depending on how you train it. This is what makes Watson so astounding – through machine learning, its knowledge adapts to the context it’s being used in.

Source: CBS News.

IBM has been able to develop such a powerful tool thanks to data. Stacy Joines from IBM noted, “Data has transformed every industry, profession, and domain.” From our smartphones to our fitness devices, data is being collected about us as we speak (see: digital footprint). While that’s definitely pretty scary, the point is that a lot of data is out there, and the more data you feed Watson, the smarter it gets. IBM has combined this abundance of data with machine learning to produce some of the most sophisticated AI out there.

Sure, it’s a little creepy how much data is being collected on us. Sure, there are tons of movies and theories out there about how intelligent robots will one day outsmart humans and take over. But A.I. isn’t a thing to be scared of. It’s a beautiful creation that surpasses the capabilities of even the most advanced purely pre-programmed model. It’s joining the health care system to save lives, advising businesses, and could potentially help find a new habitable planet. What we choose to do with A.I. is entirely up to us.

Post by Will Sheehan


ECT: Shockingly Safe and Effective

Husain is interested in putting to rest misconceptions about the safety and efficacy of ECT.

Few treatments have proven as controversial and effective as electroconvulsive therapy (ECT), or ‘shock therapy’ in common parlance.

Hippocrates himself saw the therapeutic potential of inducing seizures in patients with mental illness, observing that convulsions caused by malaria helped attenuate psychiatric symptoms. However, depictions of ECT as a form of medical abuse, as in the infamous scene from One Flew Over the Cuckoo’s Nest, have prevented ECT from becoming a first-line psychiatric treatment.

The Duke Hospital Psychiatry program recently welcomed back Duke Medical School alumnus Mustafa Husain to deliver the 2018 Ewald “Bud” Busse Memorial Lecture, which is held to commemorate a Duke doctor who pioneered the field of geriatric psychiatry.

Husain, from the University of Texas Southwestern, delivered a comprehensive lecture on neuromodulation, a term for the emerging subspecialty of psychiatric medicine that focuses on physiological treatments that are not medication.

The image most people have of ECT is probably the gruesome depiction seen in “One Flew Over the Cuckoo’s Nest.”

Husain began his lecture by stating that ECT is one of the most effective treatments for psychiatric illness. While medication and therapy are helpful for many people with depression, a considerable proportion of patients have what is categorized as “treatment-resistant depression” (TRD). In one of the largest controlled trials of ECT, Husain and colleagues showed that 82 percent of TRD patients treated with ECT achieved remission. While this remission rate is impressive, the relapse rate is also substantial: more than 50 percent of remitted individuals will experience a return of symptoms.

Husain’s study went on to test whether continuing ECT could prevent relapse in the first six months after acute ECT. He found that continuation ECT worked as well as the current best combination of drugs.

From this study, Husain made an interesting observation – the people who were doing best in the 6 months after ECT were elderly patients. He then set out to study the best form of treatment for these depressed elderly patients.

Typically, ECT involves stimulation of both sides of the brain (bilateral), but this treatment is associated with adverse cognitive effects like memory loss. Using right unilateral ECT effectively decreased cognitive side effects while maintaining an appreciable remission rate.

After the initial treatment, patients were again assigned to either receive continued drug treatment or continued ECT. In contrast to the previous study, however, the treatment for continued ECT was designed based on the individual patients’ ratings from a commonly used depression scaling system.

The results of this study show the potential that ECT has in becoming a more common treatment for major depressive disorder: maintenance ECT showed a lower relapse rate than drug treatment following initial ECT. If psychiatrists become more flexible in their prescription of ECT, adjusting the treatment plan to accommodate the changing needs of the patients, a disorder that is exceedingly difficult to treat could become more manageable.

In addition to discussing ECT, Husain shared his research into other methods of neuromodulation, including Magnetic Seizure Therapy (MST). MST uses magnetic fields to induce seizures in a more localized region of the brain than available via ECT.

Importantly, MST does not cause the cognitive deficits observed in patients who receive ECT. Husain’s preliminary investigation found that a treatment course relying on MST was comparable in efficacy to ECT. While further research is needed, Husain is hopeful in the possibilities that interventional psychiatry can provide for severely depressed patients.

By Sarah Haurin 

First Population Health Conference Shares Energy, Examples

‘Population Health’ is the basis of a new department in the School of Medicine, a byword for a lot of new activity across campus, and on Tuesday the subject of a half-day symposium that attempted to bring all this energy together.

For now, population health means a lot of different things to a lot of different people.

The half-day symposium drew an overflow crowd of faculty and staff. (photo – Colin Huth)

“We’re still struggling with a good definition of what population health is,” said keynote speaker Clay Johnston, MD, PhD, dean of the new Dell Medical School in Austin, Texas. Smoking cessation programs are something almost everyone would agree count as taking care of the population outside the clinic. But improved water quality? Where does that fit?

“We have an intense focus on doctors and their tools,” Johnston said. Our healthcare system is optimized for maximum efficiency in fee-for-service care, that is, getting the most revenue out of the most transactions. “But most of health is outside the clinic,” Johnston said.

Perhaps as a result, the United States pays much more for health care but lives less well, he said. “We are noticeably off the curve” when U.S. health care costs and outcomes are compared with those of other countries.

This graphic from a handout shared at the conference shows how population health spans the entire university.

As an example of what might be achieved in population health with some re-thinking and a shift in resources, the Dell school went after the issue of joint pain with input from its engineering and business schools. Rather than steering patients toward orthopedic surgery – for which there was a waitlist of about 14 months – their system worked with patients on alternatives such as weight loss, physical therapy and behavioral changes before surgery. The 14-month backlog was gone in just three months. Surgeries still happen, of course, but not if they can be comfortably delayed or avoided.

“Payment for prevention needs serious work,” Johnston said. “You need to get people to buy into it,” but in diabetes or depression for example, employers should stand to gain a lot from having healthier employees who miss fewer days, he said.

Health Affairs Chancellor Eugene Washington commented several times, calling the discussion “very interesting and very valuable.” (photo – Colin Huth)

Other examples flowed freely the rest of the afternoon. Duke is testing virtual ‘telemedicine’ appointments versus office visits. Evidence-based prenatal care is being applied to try to avoid expensive neonatal ICU care. Primary care and Emergency Department physicians are being equipped with an app that helps them steer sickle cell patients to appropriate care resources so that they might avoid expensive ED visits.

Family practitioner Eugenie Komives, MD, is part of a team using artificial intelligence and machine learning to try to predict which patients are most likely to be hospitalized in the next six months. That prediction, in turn, can guide primary care physicians and care managers to pay special attention to these patients to help them avoid the hospital. The system is constantly being evaluated, she added. “We don’t want to be doing this if it doesn’t work.”

Community health measures like walkability and grocery stores are being mapped for Durham County on a site called Durham Neighborhood Compass, said Michelle Lyn, MBA, chief of the division of community health. The aim is not only to see where improvements can be made, but to democratize population health information and put it in peoples’ hands. “(Community members) will have ideas we never could have thought of,” Lyn said. “We will be able to see change across our neighborhoods and community.”

Patient input is key to population health, agreed several speakers. “I don’t think we’ve heard them enough,” said Paula Tanabe, PhD, an associate professor of nursing and medicine who studies pain and sickle cell disease.  “We need a bigger patient voice.”

Health Affairs Chancellor and Duke Health CEO Eugene Washington, MD, has made population health one of the themes of his leadership. “We really take seriously this notion of shaping the future of population health,” he said in his introductory remarks. “When I think of the future, I think about how well-positioned we are to have impact on the lives of the community we serve.”

Lesley Curtis, PhD, chair of the newly formed Department of Population Health Sciences in the School of Medicine, said Duke is creating an environment where this kind of work can happen.

“I, as an organizer of this, didn’t know about half of these projects today!” Curtis said. “There’s so much going on at an organic level that the challenge to us is to identify what’s going on and figure out how to go forward at scale.”

Post by Karl Leif Bates

Stretchable, Twistable Wires for Wearable Electronics

A new conductive “felt” carries electricity even when twisted, bent and stretched. Credit: Matthew Catenacci

The exercise-tracking power of a Fitbit may soon jump from your wrist and into your clothing.

Researchers are seeking to embed electronics such as fitness trackers and health monitors into our shirts, hats, and shoes. But no one wants stiff copper wires or silicon transistors deforming their clothing or poking into their skin.

Scientists in Benjamin Wiley’s lab at Duke have created a new conductive “felt” that can be easily patterned onto fabrics to create flexible wires. The felt, composed of silver-coated copper nanowires and silicone rubber, carries electricity even when bent, stretched and twisted, over and over again.

“We wanted to create wiring that is stretchable on the body,” said Matthew Catenacci, a graduate student in Wiley’s group.

The conductive felt is made of stacks of interwoven silver-coated copper nanowires filled with a stretchable silicone rubber (left). When stretched, felt made from more pliable rubber is more resilient to small tears and holes than felt made of stiffer rubber (middle). These tears appear as small cavities in the felt (right). Credit: Matthew Catenacci

To create a flexible wire, the team first sucks a solution of copper nanowires and water through a stencil, creating a stack of interwoven nanowires in the desired shape. The material is similar to the interwoven fibers that comprise fabric felt, but on a much smaller scale, said Wiley, an associate professor of chemistry at Duke.

“The way I think about the wires are like tiny sticks of uncooked spaghetti,” Wiley said. “The water passes through, and then you end up with this pile of sticks with a high porosity.”

The interwoven nanowires are heated to 300 F to melt the contacts together, and then silicone rubber is added to fill in the gaps between the wires.

To show the pliability of their new material, Catenacci patterned the nanowire felt into a variety of squiggly, snaking patterns. Stretching and twisting the wires up to 300 times did not degrade the conductivity.

The material maintains its conductivity when twisted and stretched. Credit: Matthew Catenacci

“On a larger scale you could take a whole shirt, put it over a vacuum filter, and with a stencil you could create whatever wire pattern you want,” Catenacci said. “After you add the silicone, so you will just have a patch of fabric that is able to stretch.”

Their felt is not the first conductive material to display the agility of a gymnast. Flexible wires made of silver microflakes also exhibit this unique set of properties. But the new material has the best performance of any such material so far, and at a much lower cost.

“This material retains its conductivity after stretching better than any other material with this high of an initial conductivity. That is what separates it,” Wiley said.

“Stretchable Conductive Composites from Cu-Ag Nanowire Felt,” Matthew J. Catenacci, Christopher Reyes, Mutya A. Cruz and Benjamin J. Wiley. ACS Nano, March 14, 2018. DOI: 10.1021/acsnano.8b00887

Post by Kara Manke

DNA Breakage: What Doesn’t Kill You…

What doesn’t kill you makes you stronger―at least according to Kelly Clarkson’s recovery song for middle school crushes, philosopher Friedrich Nietzsche, and New York University researcher Viji Subramanian.

During the creation of sperm or eggs, DNA molecules exchange genetic material. This increases the differences between offspring and their parents, boosts the overall diversity of the species, and is thought to make both individuals and species stronger.

However, to trade genetic information — through a process called recombination — the DNA molecules must break at points along the chromosomes, risking permanent damage and loss of genomic integrity. In humans, errors during recombination can lead to infertility, fetal loss, and birth defects.

Subramanian, a postdoctoral researcher in the lab of Andreas Hochwagen at NYU, spoke at Duke on February 26. She studies how cells prevent excessive DNA breakage and how they regulate repair.

Subramanian uses budding yeast to study the ‘synaptonemal complex,’ a structure that forms between pairing chromosomes as shown in the above image. Over three hundred DNA breakage hotspots exist in the budding yeast’s synaptonemal complex. Normally, double-stranded DNA breaks go from none to some and then return to none.

However, when Subramanian removed the synaptonemal complex, the breaks still appeared, but they did not completely disappear by the end of the process. She concluded that the synaptonemal complex shuts down DNA break formation. The synaptonemal complex is therefore one way cells prevent excessive DNA breakage.

The formation of the synaptonemal complex

 

During DNA break repair, the cell must show a preference between the pairing chromosomes in order for recombination to transpire correctly. A protein called Mek1 promotes this bias by suppressing DNA repair in select areas. Early in the process of DNA breakage and repair, Mek1 levels are high while synaptonemal complex density is low. Later, the synaptonemal complex increases while Mek1 decreases.

This led to Subramanian’s conclusion that the synaptonemal complex is responsible for removing Mek1, allowing DNA repair to proceed. She then explored whether the protein pch2 regulates the removal of Mek1. In pch2-mutant budding yeast cells, DNA breaks were not repaired.

Subramanian showed that at least one aspect of DNA breakage and repair works through Mek1’s suppression of repair, which creates selectivity between chromosomes. The synaptonemal complex then uses pch2 to remove Mek1, allowing DNA break repair.

Subramanian had another question about this process, though: how is breakage ensured in small chromosomes? Because there are fewer possible breaking points, the chance of recombination seems lower in small chromosomes. However, Subramanian discovered that zones of high DNA break potential exist near the chromosome ends, allowing numerous breaks to form even in smaller chromosomes. This explains why smaller chromosomes actually exhibit a higher density of DNA breaks and recombination: their end zones occupy a larger fraction of their total length.

In the future, Subramanian wants to continue studying the specific mechanics behind DNA breaks and repair, including how the chromosomes reorganize during and after this process. She is also curious about how Mek1 suppresses repair and has more than 200 Mek1 mutants in her current study.

Kelly Clarkson may prove that heartbreaks don’t destroy you, but Viji Subramanian proves that DNA breaks create a stronger, more unique genetic code.         

Post by Lydia Goff

        

