Duke Research Blog

Following the people and events that make up the research community at Duke.

Heating Up the Summer, 3D Style

While some students like to spend their summer recovering from a long year of school work, others are working diligently in the Innovation Co-Lab in the Telcom building on West Campus.

They’re working on the impacts of dust and particulate matter (PM) pollution on solar panel performance, and discovering new technologies that map out the 3D volume of the ocean.

The Co-Lab is one of three 3D printing labs located on campus. It allows students and faculty the opportunity to creatively explore research through the use of new and emerging technologies.

Third-year PhD candidate Michael Valerino said his long-term research project focuses on how dust and air pollution impact the performance of solar panels.

“I’ve been designing a low-cost prototype which will monitor the impact of dust and air pollution on solar panels,” said Valerino. “The device is going to be used to monitor the impacts of dust and particulate matter (PM) pollution on solar panel performance. This process is known as soiling. This is going to be a low-cost alternative (~$200) to other monitoring options that are at least $5,000.”

Most of the 3D printers come with standard polylactic acid (PLA) material for printing. However, because his first prototype completely melted in India’s heat, Valerino decided to switch to black carbon-fiber-infused nylon.

“It really is a good fit for what I want to do,” he said. “These low-cost prototypes will be deployed in China, India, and the Arabian Peninsula to study global soiling impacts.”

In a step-by-step process, he applied acid-free glue to the base plate that holds the black carbon-fiber-infused nylon. He then placed the glass plate into the printer and closely examined how the thick carbon fiber holds his project together.

Michael Bergin, a professor of civil and environmental engineering at Duke, collaborated with the Indian Institute of Technology-Gandhinagar and the University of Wisconsin last summer to work on a study about soiling.

The study indicated that solar energy output decreased as the panels became dirtier over time. The solar cells jumped 50 percent in efficiency after being cleaned for the first time in several weeks. Valerino’s device will be used to expand Bergin’s work.

As Valerino tackles his project, Duke student volunteers and high school interns are in another part of the Co-Lab developing technology to map the ocean floor.

The Blue Devil Ocean Engineering team will be competing in the Shell Ocean Discovery XPRIZE, a global technology competition challenging teams to advance deep-sea technologies for autonomous, fast and high-resolution ocean exploration. (Their mentor, Martin Brooke, was recently featured on Science Friday.)

The team is developing large, highly redundant carbon drones that are eight feet across. The drones will fly over the ocean and drop pods into the water that will sink to collect sonar data.

Tyler Bletsch, a professor of the practice in electrical and computer engineering, is working alongside the team. He describes the team as having the most creative approach in the competition.

“We have many parts of this working, but this summer is really when it needs to come together,” Bletsch said. “Last year, we made it through round one of the competition and secured $100,000 for the university. We’re now using that money for the final phase of the competition.”

The final phase of the competition is scheduled for fall 2018.

Though campus is slow this summer, the Innovation Co-Lab is keeping busy. You can keep up-to-date with their latest projects here.

Post by Alexis Owens

 

Becoming the First: Nick Carnes

Editor’s Note: In the “Becoming the First” series, first-generation college student and Rubenstein Scholar Lydia Goff explores the experiences of Duke researchers who were the first in their families to attend college.

A portrait of Duke Professor Nick Carnes

Nick Carnes

Should we care that we are governed by professionals and millionaires? This is one of the questions Nick Carnes, an assistant professor in the Sanford School of Public Policy, seeks to answer with his research. He explores unequal social class representation in the political process and how it affects policy making. But do any real differences even exist between politicians from lower socioeconomic classes and those from the upper classes? Carnes believes they do, not only because of his research but also because of his personal experiences.

When Carnes entered Princeton University as a political science graduate student, he was the only member of his cohort who had done restaurant, construction or factory work. While obtaining his undergraduate degree from the University of Tulsa, he worked twenty hours a week and during the summer clocked in at sixty to seventy hours a week between two jobs. He considered himself and his classmates “similar on paper,” much as politicians from a variety of socioeconomic classes can appear comparable. However, Carnes noticed that he approached some problems differently than his classmates and wondered why. After attributing his distinct approach to his working class background, without the benefits of established college graduate family members (his mother did go to college while he was growing up), he began developing his current research interests.

Carnes considers “challenging the negative stereotypes about working class people” the most important aspect of his research. When he entered college, his first meeting with his advisor was filled with confusion as he tried to decipher what a syllabus was. While his working class status did restrict his knowledge of college norms, he overcame these limitations. He is now a researcher, writer, and professor who considers his job “the best in the world” and whose own story proves that working class individuals can conquer positions more often inhabited by the experienced. As Carnes states, “There’s no good reason to not have working class people in office.” His research seeks to reinforce that.

His biggest challenge is that the data he needs to analyze does not exist in a well-documented manner. Much of his research involves gathering data so that he can generate results. His published book, White-Collar Government: The Hidden Role of Class in Economic Policy Making, and his book coming out in September, The Cash Ceiling: Why Only the Rich Run for Office–and What We Can Do About It, contain the data and results he has produced. Presently, he is beginning a project on transnational governments because “cash ceilings exist in every advanced democracy.” Carnes’ research proves we should care that professionals and millionaires run our government. Through his story, he exemplifies that students who come from families without generations of college graduates can still succeed.    

 

Post by Lydia Goff

 

Quantifying Sleepiness and How It Relates to Depression

Sleep disturbance is a significant issue for many individuals with depressive illnesses. While most of these patients deal with an inability to sleep, or insomnia, about 20-30% of depressed patients report the opposite problem – hypersomnia, or excessive sleep duration.

David Plante’s work investigates the relationship between depressive disorders and hypersomnolence. Photo courtesy of sleepfoundation.org

Patients who experience hypersomnolence report excessive daytime sleepiness (EDS) and often seem to be sleep-deprived, making the condition difficult to identify and poorly researched.

David Plante’s research focuses on a neglected type of sleep disturbance: hypersomnolence.

David T. Plante, MD, of the University of Wisconsin School of Medicine and Public Health, studies the significance of hypersomnolence in depression. He said the condition is resistant to treatment, often persisting even after depression has been treated, and its role in increasing risk of depression in previously healthy individuals needs to be examined.

One problem in studying daytime sleepiness is quantifying it. Subjective measures include the Epworth sleepiness scale, a quick self-report of how likely you are to fall asleep in a variety of situations. Objective scales are often involved processes, such as the Multiple Sleep Latency Test (MSLT), which requires an individual to attempt to take 4-5 naps, each 2 hours apart, in a lab while EEG records brain activity.
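To make the subjective side concrete, here is a minimal sketch of how an Epworth-style score could be tallied. The eight example ratings are hypothetical, and the cutoff of 10 is a commonly cited convention rather than a clinical rule.

```python
# Sketch of scoring the Epworth Sleepiness Scale (ESS).
# The ESS asks how likely you are to doze in 8 everyday situations,
# each rated 0 (would never doze) to 3 (high chance of dozing).

def epworth_score(ratings):
    """Sum the 8 item ratings; the total ranges from 0 to 24."""
    if len(ratings) != 8:
        raise ValueError("the ESS has exactly 8 items")
    if any(r not in (0, 1, 2, 3) for r in ratings):
        raise ValueError("each item is rated 0-3")
    return sum(ratings)

ratings = [2, 1, 0, 3, 2, 1, 1, 2]  # hypothetical self-report
total = epworth_score(ratings)
# A total above 10 is conventionally read as suggesting excessive
# daytime sleepiness; here total == 12.
print(total)
```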

The MSLT measures how long it takes a person to fall asleep. Individuals with hypersomnolence will fall asleep faster than other patients, but determining a cutoff for what constitutes healthy and what qualifies as hypersomnolence has made the test an inexact measure. Typical cutoffs of 5-8 minutes provide a decent measure, but further research has cast doubt on this test’s value in studying depression.

The Wisconsin Sleep Cohort Study is an ongoing project begun in 1988 that follows state employees and includes a sleep study every four years. From this study, Plante has found an interesting and seemingly paradoxical relationship: while an increase in subjective measures of sleepiness is associated with increased likelihood of depression, objective measures like the MSLT associate depression with less sleepiness. Plante argues that this paradoxical relationship does not represent an inability for individuals to report their own sleepiness, but rather reflects the limitations of the MSLT.

Plante proposed several promising candidates for quantitative measures of excessive daytime sleepiness. One candidate, which is already a tool for studying sleep deprivation, is a ‘psychomotor vigilance task,’ where lapses in reaction time correlate with daytime sleepiness. Another method involves infrared measurements of the dilation of the pupil. Pupils dilate when a person is sleepy, so this somatic reaction could be useful.
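The vigilance-task idea can be sketched in a few lines. The reaction times below are invented, and the 500 ms "lapse" threshold is the common convention in the sleep-deprivation literature, not a fixed rule.

```python
# Sketch of scoring a psychomotor vigilance task (PVT).
# Reaction times (in ms) to a stimulus are recorded over a session;
# responses slower than the threshold are counted as "lapses," and
# more lapses correlate with greater daytime sleepiness.

def count_lapses(reaction_times_ms, threshold_ms=500):
    """Count responses slower than the lapse threshold."""
    return sum(1 for rt in reaction_times_ms if rt > threshold_ms)

trials = [250, 310, 620, 280, 910, 330, 505, 295]  # hypothetical session
print(count_lapses(trials))  # 3 responses exceeded 500 ms
```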

High density EEG allowed Plante to identify the role of disturbed slow wave sleep in hypersomnolence.

Another area of interest for Plante is the signs of depressive sleepiness in the brain. Using high density EEG, which covers the whole head of the subject, Plante found that individuals with hypersomnolence experience less of the sleep cycle most associated with restoration, known as slow wave sleep. He identified a potential brain circuitry associated with sleepiness, but emphasized a need for methods like transcranial magnetic stimulation to get a better picture of the relationship between this circuitry and observed sleepiness.

By Sarah Haurin

Becoming the First: Erika Weinthal

Editor’s Note: In the “Becoming the First” series, first-generation college student and Rubenstein Scholar Lydia Goff explores the experiences of Duke researchers who were the first in their families to attend college.

A portrait of Erika Weinthal

Erika Weinthal

In her corner office with a wall of windows and stuffed bookshelves, Erika Weinthal keeps a photo of her father. He came to the United States from Germany in 1940. And for a German Jew, that was extremely late. According to the family stories, Weinthal’s father left on the second to last boat from Italy. It is no surprise that he was never a big traveler after his arrival in America. As Weinthal describes it, “America…was the country that saved him.” Not only did it protect him, but it also gave his children opportunities that he did not have, such as going to college.

Weinthal, Lee Hill Snowdon Professor of Environmental Policy in Duke’s Nicholas School of the Environment, took this opportunity to become the first in her family to attend college, launching her career researching environmental policy and water security in areas including the former Soviet Union, Middle East, East Africa, India and the United States.

In high school, Weinthal traveled as an exchange student to Germany, a country her relatives could never understand her desire to visit. “As a child of a refugee, you didn’t talk about the war,” she explains as she describes how this silence created her curiosity about what happened. That journey to Bremen marked only the first of many trips around the world. In the Middle East, she examines environmental policy between countries that share water. In India, she has researched the relationship between wildlife and humans near protected areas. “What do you do when protected wildlife destroys crops and threatens livelihoods?” she asks, proving that since her curiosity about the war, she has not stopped asking questions.

However, her specific interest in environmental science and policy came straight from a different war: the Cold War. She became obsessed with everything Russian partly thanks to a high school teacher who agreed to teach her Russian one-on-one. The teacher introduced Weinthal to Russian literature and poetry. While her parents, like many parents, would have loved for her to become a doctor or a lawyer, they still trusted her when she enrolled in Oberlin College intent on studying Soviet politics. A class on Soviet environment politics further increased her interest in water security.

Currently, her work contends that water should be viewed as a basic human need separate from the political conflicts in Palestine and Israel. She has studied how protracted conflict in the region has led to the deterioration of water quality in the Gaza Strip, creating a situation in which water is now unfit for human consumption. Weinthal argues that these regions should not view water as property to be secured but rather as a human right they should guarantee.

Erika Weinthal’s father in 1940

As a child of a refugee and a first-generation college student, Weinthal says “you grow up essentially so grateful for what others have sacrificed for you.” Her dad believed in giving back to the next generation. He accomplished that goal and, in the process, gave the world a researcher who’s invested in environmental policy and human rights.

Post by Lydia Goff

 

Detangling Stigma and Mental Illness

Can you imagine a world without stigma? Where a diagnosis of autism or schizophrenia didn’t inevitably stick people with permanent labels of “handicap,” “abnormal,” “disturbed,” or “dependent”?

Roy Richard Grinker can. In fact, he thinks we’re on the way to one.

It’s a subject he’s studied and lectured on extensively—stigmas surrounding mental health conditions, that is. His expertise, influence, and unique insight in the field brought him to Duke on April 12 as the distinguished speaker of an annual lecture commemorating Autism Awareness Month. The event was co-sponsored by the Duke Center for Autism and Brain Development, the Duke Institute for Brain Sciences, and the Department of Cultural Anthropology.

Roy Richard Grinker was the invited speaker to this year’s annual Autism Awareness Month commemorative lecture. Photo credit: Duke Institute for Brain Sciences

Grinker’s credentials speak to his expertise. He is a professor of Anthropology, International Affairs, and Human Sciences at George Washington University; he has authored five books, several New York Times op-eds, and a soon-to-be-published 600-page volume on the anthropology of Africa; he studied in the Democratic Republic of the Congo as a Fulbright scholar in his early career; and, in the words of Geraldine Dawson, director of the Center for Autism and Brain Development, “he fundamentally changed the way we think about autism.”

Grinker began with an anecdote about his daughter, who is 26 years old and “uses the word ‘autism’ to describe herself—not just her identity, but her skills.”

She likes to do jigsaw puzzles, he said, but in a particular fashion: with the pieces face-down so their shape is the only feature she can use to assemble them, always inexplicably leaving one piece out at the end. He described this as one way she embraces her difference, and a metaphor for her understanding that “there’s always a piece missing for all of us.”

Grinker and Geraldine Dawson, director of the Center for Autism and Brain Development, pose outside Love Auditorium in the minutes before his talk. Source: Duke Institute for Brain Sciences

“What historical and cultural conditions made it possible for people like Isabel to celebrate forms of difference that were a mark of shame only a few decades ago?” Grinker asked.  “To embrace the idea that mental illnesses are an essential feature of what it means to be human?”

He identified three processes as drivers of what he described as the “pivotal historical moment” of the decoupling of stigma and mental illness: high-profile figures, from celebrity talk-show hosts to the Pope, speaking up about their mental illnesses instead of hiding them; a shift from boxing identities into racial, spiritual, gender, and other categories to placing them on a spectrum; and economies learning to appreciate the unique skills of people with mental illness.

This development in the de-stigmatization of mental illness is recent, but so is stigma itself. Grinker explained how the words “normal” and “abnormal” didn’t enter the English vocabulary until the mid-19th century—the idea of “mental illness” had yet to make its debut.

“There have always been people who suffer from chronic sadness or had wildly swinging moods, who stopped eating to the point of starvation, who were addicted to alcohol, or only spoke to themselves,” Grinker said. “But only recently have such behaviors defined a person entirely. Only recently did a person addicted to alcohol become an alcoholic.”

Grinker then traced the development of mental illness as an idea through modern European and American history. He touched on how American slaveowners ascribed mental illness to African Americans as justification for slavery, how hysteria evolved into a feminized disease whose diagnoses became a classist tool after World War I, and how homosexuality was gradually removed from the Diagnostic and Statistical Manual of Mental Disorders (DSM) by secretly gay psychiatrists who worked their way up the rankings of the American Psychiatric Association in the 1960s and 70s.

Source: Duke Institute for Brain Sciences

Next, Grinker described his anthropological research around the world on perceptions of mental illness, from urban South Korea to American Indian tribes to rural villages in the Kalahari Desert. His findings were wide-ranging and eye-opening: while, at the time of Grinker’s research, Koreans viewed mental illness of any kind as a disgrace to one’s heritage, members of Kalahari Desert communities showed no shame in openly discussing their afflictions. Grinker told of one man who spoke unabashedly of his monthly 24-mile walk to the main village for antipsychotic drugs, without which, as was common knowledge among the other villagers, he would hear voices in his head urging him to kill them. Yet, by Grinker’s account, they didn’t see him as ill — “a man who never hallucinates because he takes his medicine is not crazy.”

I could never do justice to Grinker’s presentation without surpassing an already-strained word limit on this post. Suffice it to say, the talk was full of interesting social commentary, colorful insights into the history of mental illness, and words of encouragement for the future of society’s place for diversity in mental health. Grinker concluded on such a note:

“Stigma decreases when a condition affects us all, when we all exist on a spectrum,” Grinker said. “We see this in the shift away from the categorical to the spectral dimension. Regardless, we might need the differences of neurodiversity to make us, humans, interesting, vital, and innovative.”

Post by Maya Iskandarani

Better Butterfly Learners Take Longer to Grow Up

Emilie Snell-Rood studies butterflies to understand the factors that influence plasticity.

The ability of animals to vary their phenotypes, or physical expression of their genes, in different environments is a key element to survival in an ever-changing world.

Emilie Snell-Rood, PhD, of the University of Minnesota, is interested in why this phenomenon of plasticity varies. Some animals’ phenotypes are relatively stable despite varying environmental pressures, while others display a wide range of behaviors.

Researchers have looked into how the costs of plasticity limit its variability. While many biologists expected that energetic costs should be adequate explanations for the limits to plasticity, only about 30 percent of studies that have looked for plasticity-related costs have found them.

Butterflies’ learning has provided insight into developmental plasticity.

With her butterfly model, Snell-Rood has worked to understand why these researchers have turned up so few results.

Snell-Rood hypothesized that the life history of an animal, or the timing of major developmental events like weaning, should be of vital importance in the constraints on plasticity, specifically on the type of plasticity involved in learning. Much of learning involves trial and error, which is costly – it requires time, energy, and exposure to potential predators while exploring the environment.

Additionally, behavioral flexibility requires an investment in developing brain tissue to accommodate this learning.

Because of these costs, animals that engage in this kind of learning must forgo reproduction until later in life.

To test the costs of learning, Snell-Rood used butterflies as a subject. Butterflies require developmental plasticity to explore their environments and optimize their food finding strategies. Over time, butterflies get more efficient at landing on the best host plants, using color and other visual cues to find the best food sources.

Studying butterfly families shows that families that are better learners have increased volume in the part of the brain associated with sensory integration. Furthermore, experimentally speeding up an organism’s life history leads to a decline in learning ability.

These results support a tradeoff between an organism’s developmental plasticity and life history. While this strategy is more costly in terms of investment in neural development and energy investment, it provides greater efficacy in adaptation to environment. However, further pressures from resource availability can also influence plasticity.

Looking to the butterfly model, Snell-Rood found that quality nutrition increases egg production as well as areas of the brain associated with plasticity.

Understanding factors that influence an animal’s plasticity is becoming increasingly important. Not only does it allow us to understand the role of plasticity in evolution up to this point, but it allows us to predict how organisms will adapt to novel and changing environments, especially those that are changing because of human influence. For the purposes of conservation, these predictions are vital.

By Sarah Haurin

Looking at Cooking as a Science Experiment

From five-star restaurants to Grandma’s homemade cookies, cooking is an art that has transformed the way we taste food. But haven’t you ever wondered how cooking works? How in the world did people discover how to make Dippin’ Dots or Jell-O?

Patrick Charbonneau is an Associate Professor of Chemistry here at Duke, and last Friday he gave a delicious talk about the science of cooking (with samples!).

Patrick Charbonneau, Duke Chemist and Foodie

Around 10,000 years ago, humans discovered that by fermenting milk you could turn it into yogurt, something that is more transportable, lasts longer, and is easier to digest. In the 1600s, a new cooking apparatus called the “bone digester” (the forerunner of the pressure cooker) allowed you to cook things faster while enhancing the flavor. When the 1800s came around, a scientist named Eben Horsford discovered that combining an acid with sodium bicarbonate creates baking powder. Soon enough, scientific and kitchen minds started to collaborate, and new creations were made in the culinary world. As you can see, a lot of the fundamental cooking techniques and ingredients we use today are products of scientific discoveries.

Old-school pressure cookers. Forerunners of the Instant Pot.

Whisked Toffee

Freezer toffee, AKA caramel

A huge part of cooking is controlling the transformation of matter, or “a change in phase.” Professor Charbonneau presented a very cool example demonstrating how controlling this phase shift can affect your experience eating something. He made the same toffee recipe twice, but he changed it slightly as the melted toffee mixture was cooling. One version you stick straight in the freezer; the other you whisk as it cools. The whisked version turns out crumbly and sweeter; the other one turns into a chewy, shiny caramel. The audience got samples, and I could easily tell how different each version looked and tasted.

Charbonneau explained that while both toffees have the same ingredients, most people prefer the crumbly one because it seems sweeter (I agreed). This is because the chewier one takes longer to dissolve onto your taste buds, so your brain registers it as less sweet.

I was fascinated to learn that a lot of food is mostly just water. It’s weird to think a solid thing could be made of water, yet some foods are up to 99% water and still elastic! We have polymers — long repeating patterns of atoms in a chain — to thank for that. In fact, you can turn almost any liquid into a gel. Polymers take up little space but play a vital role in not only foods but other everyday objects, like contact lenses.

Charbonneau also showed us a seemingly magical way to make cake. He took about half a Dixie cup of cake batter, stuck a whipping siphon charged with nitrous oxide inside it for a second, then threw it in the microwave for thirty seconds. Boom, easy as cake. Out came a cup full of some pretty darn good fluffy chocolate cake. The gas bubbles in the butter and egg batter expand when they are heated up, causing the batter to gel and form a solid network.

Professor Charbonneau is doing stuff like this in his class here at Duke, “The Chemistry and Physics of Cooking,” all the time.

The past ten years have seen a surge in science-of-cooking classes. The experiments you could do in a kitchen-lab are so cool and can make science appealing to those who might normally shy away from it.

Another cool thing I learned at the stations outside of Charbonneau’s talk was that Dippin’ Dots are made by dripping melted ice cream into a bowl of liquid nitrogen. The nitrogen is so cold that it flash-freezes each ice cream droplet into a ball-like shape!

Post by Will Sheehan


Duke Alumni Share Their SpaceX Experiences

It was 8 o’clock on a Monday night and Teer 203 was packed. A crowd of largely Pratt Engineering students had crammed into practically every chair in the room, as if for lecture. Only, there were no laptops out tonight. No one stood at the blackboard, teaching.

SpaceX launches

SpaceX’s Falcon Heavy and Dragon rockets in simultaneous liftoff

No, these students had given up their Monday evening for something more important. Tonight, engineering professor Rebecca Simmons was videoconferencing with six recent Duke grads—all of whom are employed at the legendary aerospace giant SpaceX, brainchild of tech messiah Elon Musk.

Eager to learn as much as possible about the mythic world of ultracompetitive engineering, the gathered students spent the next hour and fifteen minutes grilling Duke alumni Anny Ning (structures design engineering), Kevin Seybert (integration and test engineering), Matthew Pleatman and Daniel Lazowski (manufacturing engineering), and Zachary Loncar (supply chain) with as many questions as they could squeeze through.

Over the course of the conversation, Duke students seemed particularly interested in the overall culture of SpaceX: What was it like to actually work there? What do the employees think of the SpaceX environment, or the way the company approaches engineering?

One thing all of the alumni were quick to key in on was the powerful emphasis their company placed on flexibility and engagement.

“It’s much harder to find someone that says ‘no’ at SpaceX,” Pleatman said. “It’s way easier to find someone who says ‘yes.’ ”

SpaceX’s workflow, Seybert added, is relentlessly adaptive. There are no strict boundaries on what you can work on in your job, and the employee teams are made up of continually evolving combinations of specialists and polymaths.

“It’s extremely dynamic,” Seybert said. “Whatever the needs of the company are, we will shift people around from week to week to support that.”

“It’s crazy—there is no typical week,” Lazowski added. “Everything’s changing all the time.”

SpaceX Launch

Launch of Hispasat 30W-6 Mission

Ning, for her part, focused a great deal on the flexibility SpaceX both offers and demands. New ideas and a willingness to question old ways of thinking are critical to this company’s approach to innovation, and Ning noted that one of the first things she had to learn was to be continuously on the lookout for ways her methods could be improved.

“You should never hear someone say, ‘Oh, we’re doing this because this is how we’ve always done it,’ ” she said.

The way SpaceX approaches engineering and innovation, Seybert explained, is vastly different from how traditional aerospace companies have tended to operate. SpaceX employees are there because of their passion for their work. They focus on the projects they want to focus on, they move between projects on a day-to-day basis, and they don’t expect to stay at any one engineering company for more than a few years. Everything is geared around putting out the best possible product, as quickly as humanly possible.

So now, the million dollar question: How do you get in?

“One thing that I think links us together is the ability to work hands-on,” Loncar offered.

Pleatman agreed. “If you want to get a job at SpaceX directly out of school, it’s really important to have an engineering project that you’ve worked on. It doesn’t matter what it is, but just something where you’ve really made a meaningful contribution, worked hard, and can really talk through the design from start to finish.”

Overall, passion, enthusiasm and flexibility were overarching themes. And honestly, that seems pretty understandable. We are talking about rockets, after all — what’s not to be excited about? These Duke alums are out engineering the frontier of tomorrow — bringing our species one step closer to its place among the stars.

As Ning put it, “I can’t really picture a future where we’re not out exploring space.”

Post by Daniel Egitto

Artificial Intelligence Knows How You Feel

Ever wondered how Siri works? Afraid that super smart robots might take over the world soon?

On April 3rd, researchers from Duke, NCSU and UNC came together for Triangle Machine Learning Day to spark everyone’s curiosity about the complex field that is Artificial Intelligence. A.I. is an overarching term for smart technologies, ranging from self-driving cars to targeted advertising. We can arrive at artificial intelligence through what’s known as “machine learning.” Instead of explicitly programming a machine with the basic capabilities we want it to have, we can make it so that its code is flexible and adapts based on information it’s presented with. Its knowledge grows as a result of training it. In other words, we’re teaching a computer to learn.
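The contrast between explicit programming and learning from data can be shown with a deliberately tiny, hypothetical example: instead of hard-coding a rule, we let a one-parameter "model" (a single threshold) adapt to labeled examples. Real machine learning uses far richer models, but the principle is the same.

```python
# Toy "machine learning": learn a decision rule from examples
# rather than writing the rule by hand.

def fit_threshold(values, labels):
    """Pick the cutoff that best separates label 0 from label 1."""
    best_t, best_correct = None, -1
    for t in sorted(values):
        # How many examples does the rule "value > t means class 1" get right?
        correct = sum((v > t) == bool(l) for v, l in zip(values, labels))
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

# Hypothetical training data: small values are class 0, large are class 1.
values = [1.0, 2.0, 3.0, 6.0, 7.0, 8.0]
labels = [0, 0, 0, 1, 1, 1]
t = fit_threshold(values, labels)
print(t)  # 3.0 — the learned cutoff separates the two groups perfectly
```

Feeding the same function different examples produces a different rule, which is the sense in which the program's behavior comes from its training data rather than from the programmer.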

Matthew Philips is working with Kitware to get computers to “see,” a field known as “machine vision.” Given thousands and thousands of images, a computer with the right code can learn to actually make sense of what an image shows, beyond different colored pixels.

Machine vision has numerous applications. An effective way to search satellite imagery for arbitrary objects could be huge in the advancement of space technology – a satellite could potentially identify obscure objects or potential lifeforms that stick out in those images. This is something we as humans can’t do ourselves just because of the sheer amount of data there is to go through. Similarly, we could teach a machine to identify cancerous or malignant cells in an image, thus giving us a quick diagnosis if someone is at risk of developing a disease.

The problem is, how do you teach a computer to see? Machines don’t easily understand things like similarity, depth or orientation – things that we as humans grasp automatically without even thinking about it. That’s exactly the type of problem Kitware has been tackling.

One hugely successful piece of Artificial Intelligence you may be familiar with is IBM’s Watson. Labeled as “A.I. for professionals,” Watson was featured on 60 Minutes and even played Jeopardy! on live television. Watson has visual recognition capabilities, can work as a translator, and can even understand things like tone, personality or emotional state. And obviously it can answer crazy hard questions. What’s even cooler is that it doesn’t matter how you ask the question – Watson will know what you mean. Watson is basically Siri on steroids, and the world got a taste of its power after watching it smoke its competitors on Jeopardy! However, Watson is not a single physical supercomputer. It is a collection of technologies that can be used in many different ways, depending on how you train it. This is what makes Watson so astounding – through machine learning, its knowledge adapts to the context it’s being used in.


IBM has been able to develop such a powerful tool thanks to data. Stacy Joines from IBM noted, “Data has transformed every industry, profession, and domain.” From our smartphones to fitness devices, data is being collected about us as we speak (see: digital footprint). While it’s definitely pretty scary, the point is that a lot of data is out there. The more data you feed Watson, the smarter it gets. IBM has combined this abundance of data with machine learning to produce some of the most sophisticated A.I. out there.

Sure, it’s a little creepy how much data is being collected on us. Sure, there are tons of movies and theories out there about how intelligent robots will someday outsmart humans and take over. But A.I. isn’t a thing to be scared of. It’s a remarkable creation that surpasses the capabilities of even the most advanced purely pre-programmed model. It’s joining the health care system to save lives, advising businesses, and could potentially find a new habitable planet. What we choose to do with A.I. is entirely up to us.

Post by Will Sheehan


ECT: Shockingly Safe and Effective

Husain is interested in putting to rest misconceptions about the safety and efficacy of ECT.

Few treatments have proven as controversial and effective as electroconvulsive therapy (ECT), or ‘shock therapy’ in common parlance.

Hippocrates himself saw the therapeutic benefits of inducing seizures, observing that the convulsions caused by malarial fever helped attenuate symptoms of mental illness. However, depictions of ECT as a form of medical abuse, as in the infamous scene from One Flew Over the Cuckoo’s Nest, have prevented ECT from becoming a first-line psychiatric treatment.

The Duke Hospital Psychiatry program recently welcomed back Duke Medical School alumnus Mustafa Husain to deliver the 2018 Ewald “Bud” Busse Memorial Lecture, which is held to commemorate a Duke doctor who pioneered the field of geriatric psychiatry.

Husain, from the University of Texas Southwestern, delivered a comprehensive lecture on neuromodulation, a term for the emerging subspecialty of psychiatric medicine that focuses on physiological treatments that are not medication.

The image most people have of ECT is probably the gruesome depiction seen in “One Flew Over the Cuckoo’s Nest.”

Husain began his lecture by stating that ECT is one of the most effective treatments for psychiatric illness. While medication and therapy are helpful for many people with depression, a considerable proportion of patients have what is categorized as “treatment resistant depression” (TRD). In one of the largest controlled trials of ECT, Husain and colleagues showed that 82 percent of TRD patients treated with ECT achieved remission. While this remission rate is impressive, the rate of relapse among remitted individuals is also substantial – more than 50 percent will experience a return of symptoms.

Husain’s study went on to test whether continuing ECT could prevent relapse in the first six months after the acute course. He found that continuation ECT worked as well as the current best combination of drugs.

From this study, Husain made an interesting observation – the people who were doing best in the 6 months after ECT were elderly patients. He then set out to study the best form of treatment for these depressed elderly patients.

Typically, ECT involves stimulating both sides of the brain (bilateral ECT), but this approach is associated with adverse cognitive effects like memory loss. Using right unilateral ECT effectively decreased cognitive side effects while maintaining an appreciable remission rate.

After the initial treatment, patients were again assigned to either receive continued drug treatment or continued ECT. In contrast to the previous study, however, the treatment for continued ECT was designed based on the individual patients’ ratings from a commonly used depression scaling system.

The results of this study show the potential that ECT has in becoming a more common treatment for major depressive disorder: maintenance ECT showed a lower relapse rate than drug treatment following initial ECT. If psychiatrists become more flexible in their prescription of ECT, adjusting the treatment plan to accommodate the changing needs of the patients, a disorder that is exceedingly difficult to treat could become more manageable.

In addition to discussing ECT, Husain shared his research into other methods of neuromodulation, including Magnetic Seizure Therapy (MST). MST uses magnetic fields to induce seizures in a more localized region of the brain than is possible with ECT.

Importantly, MST does not cause the cognitive deficits observed in patients who receive ECT. Husain’s preliminary investigation found that a treatment course relying on MST was comparable in efficacy to ECT. While further research is needed, Husain is hopeful about the possibilities that interventional psychiatry can offer severely depressed patients.

By Sarah Haurin 

