Duke Research Blog

Following the people and events that make up the research community at Duke.

Category: Neuroscience (Page 1 of 8)

New Blogger Nirja Trivedi: Neuroscience Junior with Infinite Curiosity

My name is Nirja Trivedi and I’m a junior from Seattle interested in the intersections between health, technology and business. At Duke, I’m the co-president of P.A.S.H., a writer for the Standard and a member of B.O.W.

Nirja Trivedi blocking the sun with her hand

Nirja Trivedi

During high school, I considered liberal arts and scientific research to be separate disciplines: if technology was my strength, then philosophy must be my weakness. In my two years at Duke, I have experienced the duality of these fields through participating in the Global Health Focus Program, developing my own research projects, working with professors and now applying to write for Duke Research. Science truly is for everyone, no matter your field, interests or opinions. Research and discovery are conduits for every mind. Research isn’t just the forefront of innovation; it paves the way for the future.

Growing up with a passion for service and influenced by my family in the medical field, the research I leaned towards combined aspects of community and health. My senior project in high school examined traumatic brain injury (TBI) in youth sports, which provided the research-based approach for designing my own Concussion Prevention Program. After my first semester, I wanted to discover what kinds of research I could fully immerse myself in. I began research with the Duke Institute of Brain Sciences and spent my summer volunteering for the Richman Lab, which examines the effects of psychosocial factors like discrimination, social hierarchies and power. After I declared my Neuroscience major, I spent the year assisting in studies at the Autism Clinic, sparking my interest in technology.

Nirja Trivedi on a mountain top


Now going into my third year, my interests in scientific discovery have only grown. From insight into the human psyche and social economic behavior to medical advances, I love the complexity of the human mind and how it fuels innovation.

My unrestricted interests guided me to the Innovation & Entrepreneurship Certificate as well as this writing position, both of which foster an environment of curiosity and inspiration. Through writing, I hope to connect with faculty, discover areas of research I never knew existed, widen my breadth of scientific knowledge, and connect students to research opportunities. The threshold of knowledge is where you draw the line – why not make it infinite?

Post by Nirja Trivedi

New Blogger Sarah Haurin, Neuroscience Sophomore With a Thing for Criminal Minds

Hello! My name is Sarah Haurin (rhymes with Heron), and I am a sophomore at Duke. Along with being pre-med, I am pursuing a double major in neuroscience and German. I grew up outside of Philadelphia, Pennsylvania, and I originally fell in love with Duke both for its vast research opportunities and its mild winters. In grade school, a requirement to read nonfiction books led me to start reading popular science books for fun. Beginning with books about forensic science and articles about the chemistry of cooking, I soon expanded my interest to include natural and health sciences.

Since then, I have discovered my favorite genres to be abnormal psychology and biomedical research (my favorites being You Are Not So Smart and The Psychopath Whisperer), which interestingly enough make great beach reads (as evidenced by this picture of me from my family’s most recent vacation to Hilton Head Island, SC). In high school, I decided to take this love of reading scientific literature to a new place, and I joined the school newspaper, which allowed me to share recent and exciting findings with my peers through my articles in our health and science pages.

Sarah reading non-fiction at the beach.

I have always loved writing, which is what originally led me to joining my high school newspaper, and through my roles as section editor and eventually editor-in-chief, I came to appreciate the whole writing and publishing process. At Duke, I have written several articles for The Chronicle about the impressive and diverse research going on here at Duke.

I hope that being a well-rounded person, by allowing myself to enjoy activities not directly related to my majors, will eventually help me to be a better doctor, but for now I just enjoy the ability to combine my loves of writing and science. I hope to be able to further pursue this combination by writing for the Duke Research Blog.

One of the aspects of Duke’s community that I love the most is its diversity, which extends from the people who make up the student body and faculty to the passions and interests that they pursue. I hope that writing for the Duke Research Blog will provide me with the opportunity to meet more of the incredibly passionate people who make up Duke’s campus.

Post by Sarah Haurin

Students Share Research Journeys at Bass Connections Showcase

From the highlands of north central Peru to high schools in North Carolina, student researchers in Duke’s Bass Connections program are gathering data in all sorts of unique places.

As the school year wound down, they packed into Duke’s Scharf Hall last week to hear one another’s stories.

Students and faculty gathered in Scharf Hall to learn about each other’s research at this year’s Bass Connections showcase. Photo by Jared Lazarus/Duke Photography.

The Bass Connections program brings together interdisciplinary teams of undergraduates, graduate students and professors to tackle big questions in research. This year’s showcase, which featured poster presentations and five “lightning talks,” was the first to include teams spanning all five of the program’s diverse themes: Brain and Society; Information, Society and Culture; Global Health; Education and Human Development; and Energy.

“The students wanted an opportunity to learn from one another about what they had been working on across all the different themes over the course of the year,” said Lori Bennear, associate professor of environmental economics and policy at the Nicholas School, during the opening remarks.

Students seized the chance, eagerly perusing peers’ posters and gathering for standing-room-only viewings of other teams’ talks.

The different investigations took students from rural areas of Peru, where teams interviewed local residents to better understand the transmission of deadly diseases like malaria and leishmaniasis, to the North Carolina Museum of Art, where mathematicians and engineers worked side-by-side with artists to restore paintings.

Machine learning algorithms created by the Energy Data Analytics Lab can pick out buildings from a satellite image and estimate their energy consumption. Image courtesy Hoël Wiesner.

Students in the Energy Data Analytics Lab didn’t have to look much farther than their smartphones for the data they needed to better understand energy use.

“Here you can see a satellite image, very similar to one you can find on Google maps,” said Eric Peshkin, a junior mathematics major, as he showed an aerial photo of an urban area featuring buildings and a highway. “The question is how can this be useful to us as researchers?”

With the help of new machine-learning algorithms, images like these could soon give researchers oodles of valuable information about energy consumption, Peshkin said.

“For example, what if we could pick out buildings and estimate their energy usage on a per-building level?” said Hoël Wiesner, a second year master’s student at the Nicholas School. “There is not really a good data set for this out there because utilities that do have this information tend to keep it private for commercial reasons.”

The lab has had success developing algorithms that can estimate the size and location of solar panels from aerial photos. Peshkin and Wiesner described how they are now creating new algorithms that can first identify the size and locations of buildings in satellite imagery, and then estimate their energy usage. These tools could provide a quick and easy way to evaluate the total energy needs in any neighborhood, town or city in the U.S. or around the world.
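To make the second stage of that idea concrete, here is a minimal, hypothetical sketch of how per-building energy might be estimated once footprints have been detected in imagery. The building types and energy-intensity figures below are illustrative assumptions, not numbers from the Duke lab, and the real pipeline uses machine-learning models rather than a lookup table.

```python
# Hypothetical second stage of such a pipeline: given building footprints
# detected in satellite imagery, estimate per-building annual energy use
# from footprint area. Intensity values are assumptions for illustration.

ENERGY_INTENSITY_KWH_PER_M2 = {  # assumed annual energy use intensities
    "residential": 150,
    "commercial": 250,
}

def estimate_energy(footprints):
    """footprints: list of (area_m2, building_type) tuples from the detector."""
    return [area * ENERGY_INTENSITY_KWH_PER_M2[kind] for area, kind in footprints]

detected = [(120.0, "residential"), (800.0, "commercial")]
estimates = estimate_energy(detected)
print(estimates)       # per-building kWh/year
print(sum(estimates))  # a neighborhood-level total, as the team describes
```

Summing the per-building estimates is what would let policymakers gauge the energy needs of a whole neighborhood or city at a glance.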

“It’s not just that we can take one city, say Norfolk, Virginia, and estimate the buildings there. If you give us Reno, Tuscaloosa, Las Vegas, Phoenix — my hometown — you can absolutely get the per-building energy estimations,” Peshkin said. “And what that means is that policy makers will be more informed, NGOs will have the ability to best service their community, and more efficient, more accurate energy policy can be implemented.”

Some students’ research took them to the sidelines of local sports fields. Joost Op’t Eynde, a master’s student in biomedical engineering, described how he and his colleagues on a Brain and Society team are working with high school and youth football leagues to sort out what exactly happens to the brain during a high-impact sports game.

While a particularly nasty hit to the head might cause clear symptoms that can be diagnosed as a concussion, the accumulation of lesser impacts over the course of a game or season may also affect the brain. Op’t Eynde and his team are developing a set of tools to monitor both these impacts and their effects.

A standing-room only crowd listened to a team present on their work “Tackling Concussions.” Photo by Jared Lazarus/Duke Photography.

“We talk about inputs and outputs — what happens, and what are the results,” Op’t Eynde said. “For the inputs, we want to actually see when somebody gets hit, how they get hit, what kinds of things they experience, and what is going on in the head. And the output is we want to look at a way to assess objectively.”

The tools include surveys to estimate how often a player is impacted, an in-ear accelerometer called the DASHR that measures the intensity of jostles to the head, and tests of players’ performance on eye-tracking tasks.

“Right now we are looking on the scale of a season, maybe two seasons,” Op’t Eynde said. “What we would like to do in the future is actually follow some of these students throughout their career and get the full data for four years or however long they are involved in the program, and find out more of the long-term effects of what they experience.”

Kara J. Manke, PhD

Post by Kara Manke

Mental Shortcuts, Not Emotion, May Guide Irrational Decisions

If you participate in a study in my lab, the Huettel Lab at Duke, you may be asked to play an economic game. For example, we may give you $20 in house money and offer you the following choice:

  1. Keep half of the $20 for sure
  2. Flip a coin: heads you keep all $20; tails you lose all $20

In such a scenario, most participants choose 1, preferring a sure win over the gamble.

Now imagine this choice, again starting with $20 in house money:

  1. Lose half of the $20 for sure
  2. Flip a coin: heads you keep all $20; tails you lose all $20

In this scenario, most participants prefer the gamble over a sure loss.

If you were paying close attention, you’ll note that both examples are actually numerically identical – keeping half of $20 is the same as losing half of $20 – but changing whether the sure option is framed as a gain or a loss results in different decisions to play it safe or take a risk. This phenomenon is known as the Framing Effect. The behavior that it elicits is weird, or as psychologists and economists would say, “irrational,” so we think it’s worth investigating!
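The numerical equivalence of the two framings is easy to verify directly. This tiny sketch computes the value of each sure option and the expected value of the coin flip:

```python
# Both framings of the $20 game reduce to the same numbers.
starting = 20.0

sure_gain = starting / 2               # "keep half" framing: $10 kept for sure
sure_loss = starting - starting / 2    # "lose half" framing: also $10 kept
gamble_ev = 0.5 * starting + 0.5 * 0.0  # coin flip: keep all or lose all

print(sure_gain, sure_loss, gamble_ev)  # all three equal $10
```

A fully "rational" decision-maker would therefore be indifferent between the sure option and the gamble in both versions, and would certainly not flip preferences between them.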

Brain activity when people make choices consistent with (hot colors) or against (cool colors) the Framing Effect.


In a study published March 29 in the Journal of Neuroscience, my lab used brain imaging data to test two competing theories for what causes the Framing Effect.

One theory is that framing is caused by emotion, perhaps because the prospect of accepting a guaranteed win feels good while accepting a guaranteed loss feels scary or bad. Another theory is that the Framing Effect results from a decision-making shortcut. It may be that a strategy of accepting sure gains and avoiding sure losses tends to work well, and adopting this blanket strategy saves us from having to spend time and mental effort fully reasoning through every single decision and all of its possibilities.

Using functional magnetic resonance imaging (fMRI), we measured brain activity in 143 participants as they each made over a hundred choices between various gambles and sure gains or sure losses. Then we compared our participants’ choice-related brain activity to brain activity maps drawn from Neurosynth, an analysis tool that combines data from over 8,000 published fMRI studies to generate neural maps representing brain activity associated with different terms, such as “emotions,” “resting,” or “working.”

As a group, when our participants made choices consistent with the Framing Effect, their average brain activity was most similar to the brain maps representing mental disengagement (i.e. “resting” or “default”). When they made choices inconsistent with the Framing Effect, their average brain activity was most similar to the brain maps representing mental engagement (i.e. “working” or “task”). These results supported the theory that the Framing Effect results from a lack of mental effort, or using a decision-making shortcut, and that spending more mental effort can counteract the Framing Effect.

Then we tested whether we could use individual participants’ brain activity to predict participants’ choices on each trial. We found that the degree to which each trial’s brain activity resembled the brain maps associated with mental disengagement predicted whether that trial’s choice would be consistent with the Framing Effect. The degree to which each trial’s brain activity resembled brain maps associated with emotion, however, was not predictive of choices.
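The core of that trial-level analysis is a similarity score between each trial's activity pattern and a template map. Here is a minimal sketch of the idea using spatial (Pearson) correlation on synthetic data; this is not the study's actual pipeline, and all maps and trial labels below are made up for illustration:

```python
# Sketch: score each trial's activity map by its spatial correlation with a
# "disengagement" template map, then check whether that score separates
# frame-consistent from frame-inconsistent trials. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 500
template = rng.standard_normal(n_voxels)  # stand-in for a Neurosynth term map

def similarity(trial_map, template):
    """Spatial (Pearson) correlation between a trial map and a template."""
    return np.corrcoef(trial_map, template)[0, 1]

# Synthetic trials: frame-consistent trials resemble the template more.
consistent = [template * 0.5 + rng.standard_normal(n_voxels) for _ in range(50)]
inconsistent = [rng.standard_normal(n_voxels) for _ in range(50)]

s_con = np.mean([similarity(t, template) for t in consistent])
s_inc = np.mean([similarity(t, template) for t in inconsistent])
print(s_con > s_inc)  # higher template similarity on frame-consistent trials
```

In the actual study, a score like this (computed per trial, per participant) served as the predictor of whether that trial's choice followed the Framing Effect.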

Our findings support the theory that the biased decision-making seen in the Framing Effect is due to a lack of mental effort rather than due to emotions.

This suggests potential strategies for prompting people to make better decisions. Instead of trying to appeal to people’s emotions – likely a difficult task requiring tailoring to different individuals – we would be better off taking the easier and more generalizable approach of making good decisions quick and easy for everyone to make.

Guest post by Rosa Li

What is Money Really Worth?

“Yesterday, I was at an event and I sat next to an economist,” Brian Hare told my class. “I asked him: how old is money? He was completely lost.”

I was in Hare’s class on a Monday at noon, laughing at his description of the interaction. We had so far been exploring the origins of humans’ particular ways of making sense of the world through his course in Human Cognitive Evolution and we were faced with a slide that established the industrial period as less than 200 years old. As compared to a hunting and gathering lifestyle, this stretch of time is minuscule on an evolutionary scale.

Slide from Dr. Hare’s class. Reproduced with permission.

Why then do so many studies employ money as a proxy for the measurement of human behaviors that have been shaped by hundreds of thousands of years? This kind of research is trying to get at “prosociality” (the ability to be altruistic and cooperative towards others), or empathy and guilt aversion, just to name a few.

I had started to wonder about this months before as a summer intern at the University of Tokyo. As I listened to a graduate student describe an experiment employing money to understand how humans behaved cooperatively, I grew puzzled. I eventually asked: Why was money used in this experiment? The argument was made that money was enough of a motivator for this sample population of college students to generalize that if they chose to share it, it must mean something.

During a panel discussion about prosociality at the American Association for the Advancement of Science meeting in Boston last month, my chance came to ask the question again. Alan Sanfey, professor at the Donders Institute for Brain, Cognition and Behavior, used experimental paradigms that rewarded participants with money to tease out the particular effects of guilt on generous behavior.

“Is money a good proxy for understanding evolutionarily ancient behavior?” I asked. Robin Dunbar, professor of evolutionary psychology at Oxford University, took a stab at my question and suggested that the barter system would likely have been the best ancient analogue of money. However, the barter system likely came to life during the agricultural period, which itself is less than 10,000 years old.

Dollar bills. Public domain.

Stephen Pluháček, an attendee at the event and a senior scholar at the University of New Hampshire, said in a followup email to me that he “was interested in [my] question to the panel and disappointed by their response — which I found indicative of the ways we can become so habituated to a way of looking at things that we find it difficult to even hear questions that challenge our foundational assumptions.”

“As I said in our brief conversation, I am not convinced that money can stand as a proxy for prosocial behavior (trust, generosity) in humans prior to the advent of agriculture,” Pluháček wrote. “And even barter or gift exchange may be limited in their applicability to early humans (as well as to modern humans prior to the cognitive revolution).” 

So, I’m not alone in my skepticism. However, in my discussion with Leonard White, my advisor and associate director for education in the Duke Institute for Brain Sciences, he pointed out:

“The brain is remarkably facile. We have this amazing capacity for proxy substitution.”

In essence, this would mean that our brain can treat money as a reward just like any reward that might have mediated the evolution of our behavior over time. It appears, then, that we can test subjects with “modern” stimuli.

It is clear that an evolutionary narrative is important to creating a more complete picture of contemporary human behavior. But sometimes the proxies we choose for these measurements don’t fit very well with our long history.

By Shanen Ganapathee

 

Creating Technology That Understands Human Emotions

“If you – as a human – want to know how somebody feels, for what might you look?” Professor Shaundra Daily asked the audience during an ECE seminar last week.

“Facial expressions.”
“Body Language.”
“Tone of voice.”
“They could tell you!”

Over 50 students and faculty gathered over cookies and fruit for Dr. Daily’s talk on designing applications to support personal growth. Dr. Daily is an Associate Professor in the Department of Computer and Information Science and Engineering at the University of Florida interested in affective computing and STEM education.

Dr. Daily explaining the various types of devices used to analyze people’s feelings and emotions. For example, pressure sensors on a computer mouse helped measure the frustration of participants as they filled out an online form.

Affective Computing

The visual and auditory cues proposed above give a human clues about the emotions of another human. Can we use technology to better understand our mental state? Is it possible to develop software applications that can play a role in supporting emotional self-awareness and empathy development?

Until recently, technologists have largely ignored emotion in understanding human learning and communication processes, partly because it has been misunderstood and hard to measure. Asking the questions above, affective computing researchers use pattern analysis, signal processing, and machine learning to extract affective information from signals that human beings express. This is integral to restoring a proper balance between emotion and cognition in designing technologies to address human needs.

Dr. Daily and her group of researchers used skin conductance as a measure of engagement and memory stimulation. Changes in skin conductance, or the measure of sweat secretion from sweat glands, are triggered by arousal. For example, a nervous person produces more sweat than a sleeping or calm individual, resulting in an increase in skin conductance.

Galvactivators, devices that sense and communicate skin conductivity, are often placed on the palms, which have a high density of the eccrine sweat glands.

Applying this knowledge to the field of education, can we give a teacher physiologically-based information on student engagement during class lectures? Dr. Daily initiated Project EngageMe by placing galvactivators like the one in the picture above on the palms of students in a college classroom. Professors were able to use the results chart to reflect on different parts and types of lectures based on the responses from the class as a whole, as well as analyze specific students to better understand the effects of their teaching methods.
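As a rough illustration of how per-student readings could become a class-level chart like EngageMe's, here is a hypothetical sketch that averages conductance traces across students at each time point. The sample values are invented, and the real system's processing is surely more sophisticated:

```python
# Hypothetical sketch: summarize per-student galvanic skin response traces
# into a single class-level engagement trace by averaging across students
# at each time point. Sample values are made up for illustration.

def class_engagement(readings):
    """readings: list of per-student conductance traces of equal length."""
    n_samples = len(readings[0])
    n_students = len(readings)
    return [sum(student[t] for student in readings) / n_students
            for t in range(n_samples)]

# Two students, three time points (arbitrary microsiemens values).
trace = class_engagement([[0.2, 0.6, 0.4],
                          [0.4, 0.8, 0.6]])
print(trace)  # class-average conductance over time
```

A peak in such a trace marks a moment of high average arousal, though, as noted below, arousal is not the same thing as attentiveness to the lecture.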

Project EngageMe: Screenshot of digital prototype of the reading from the galvactivator of an individual student.

The project ended up causing quite a bit of controversy, however, due to privacy issues as well as the limits of our understanding of skin conductance. Skin conductance can increase for a variety of reasons – a student watching a funny video on Facebook might display similar levels of conductance as an attentive student. Thus, the results on the graph are not necessarily correlated with events in the classroom.

Educational Research

Daily’s research blends computational learning with social and emotional learning. Her projects encourage students to develop computational thinking through reflecting on the community with digital storytelling in MIT’s Scratch, learning to use 3D printers and laser cutters, and expressing ideas using robotics and sensors attached to their body.

VENVI, Dr. Daily’s latest research, uses dance to teach basic computational concepts. By allowing users to program a 3D virtual character that follows dance movements, VENVI reinforces important programming concepts such as step sequences, ‘for’ and ‘while’ loops of repeated moves, and functions with conditions that determine when the character performs the steps created!
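To see how dance maps onto those concepts, here is a hypothetical text-only analogue of a VENVI-style program (VENVI itself is a visual environment; the step names and `repeat` helper below are invented for illustration):

```python
# A dance routine as a step sequence, with a loop repeating a sub-sequence,
# mirroring the 'for' loops of repeated moves that VENVI teaches.

def repeat(steps, times):
    """A 'for' loop over dance moves: repeat a sub-sequence `times` times."""
    return steps * times

routine = ["step-left", "step-right"] + repeat(["spin"], 3) + ["bow"]
print(routine)  # ['step-left', 'step-right', 'spin', 'spin', 'spin', 'bow']
```

Composing routines out of named, repeatable sub-sequences is exactly the kind of thinking that carries over from choreography to functions and loops in code.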

 

 

Dr. Daily and her research group observed increased interest from students in pursuing STEM fields as well as a shift in their opinion of computer science. Drawings from Dr. Daily’s Women in STEM camp completed on the first day depicted computer scientists primarily as frazzled males coding in a small office, while those drawn after learning with VENVI included more females and more engagement in collaborative activities.

VENVI is a programming software that allows users to program a virtual character to perform a sequence of steps in a 3D virtual environment!

In human-to-human interactions, we are able to draw on our experiences to connect and empathize with each other. As robots and virtual machines grow to take increasing roles in our daily lives, it’s time to start designing emotionally intelligent devices that can learn to empathize with us as well.

Post by Anika Radiya-Dixit

Science Meets Policy, and Maybe They Even Understand Each Other!

As we’ve seen many times, when complex scientific problems like stem cells, alternative energy or mental illness meet the policy world, things can get a little messy. Scientists generally don’t know much about law and policy, and very few policymakers are conversant with the specialized dialects of the sciences.

A screenshot of SciPol’s handy news page.

Add the recent rapid emergence of autonomous vehicles, artificial intelligence and gene editing, and you can see things aren’t going to get any easier!

To try to help, Duke’s Science and Society initiative has launched an ambitious policy analysis group called SciPol that hopes to offer great insights into the intersection of scientific knowledge and policymaking. Their goal is to be a key source of non-biased, high-quality information for policymakers, academics, commercial interests, nonprofits and journalists.

“We’re really hoping to bridge the gap and make science and policy accessible,” said Andrew Pericak, a contributor and editor of the service who holds a 2016 master’s in environmental management from the Nicholas School.

The program also will serve as a practical training ground for students who aspire to live and work in that rarefied space between two realms, and will provide them with published work to help them land internships and jobs, said SciPol director Aubrey Incorvaia, a 2009 master’s graduate of the Sanford School of Public Policy.

Aubrey Incorvaia chatted with law professor Jeff Ward (center) and Science and Society fellow Thomas Williams at the kickoff event.

SciPol launched quietly in the fall with a collection of policy development briefs focused on neuroscience, genetics and genomics. Robotics and artificial intelligence coverage began at the start of January. Nanotechnology will launch later this semester and preparations are being made for energy to come online later in the year. Nearly all topics are led by a PhD in that field.

“This might be a different type of writing than you’re used to!” Pericak told a meeting of prospective undergraduate and graduate student authors at an orientation session last week.

Some courses will be making SciPol brief writing a part of their requirements, including law professor Jeff Ward’s section on the frontier of robotics law and ethics. “We’re doing a big technology push in the law school, and this is a part of it,” Ward said.

Because the research and writing is a learning exercise, briefs are published only after a rigorous process of review and editing.

A quick glance at the latest offerings shows in-depth policy analyses of aerial drones, automated vehicles, genetically modified salmon, sports concussions and dietary supplements that claim to boost brain power.

To keep up with the latest developments, the SciPol staff maintains searches on WestLaw, the Federal Register and other sources to see where science policy is happening. “But we are probably missing some things, just because the government does so much,” Pericak said.

Post by Karl Leif Bates

Brain Makes Order From Disorder

A team of scientists from Duke, the National Institutes of Health and Johns Hopkins biomedical engineering has found that the formation and retrieval of new memories relies on disorganized brain waves, not organized ones, which is somewhat contrary to what neuroscientists have previously believed. Brain waves, or oscillations, are the brain’s way of organizing activity and are known to be important to learning, memory, and thinking.

Alex Vaz is a Duke MD/PhD student and biomedical engineering alumnus.

Although brain waves have been measured and studied for decades, neuroscientists still aren’t sure what they mean and whether or not they help cognition, said Alex Vaz, an M.D.-Ph.D. student at Duke who is the first author on the paper.

In a study appearing Jan. 6 in NeuroImage, the neuroscientists showed that brain activity became less synchronized during the formation and retrieval of new memories. This was particularly true in a brain region known as the medial temporal lobe, a structure thought to play a critical role in the formation of both short-term and long-term memories.

Excessive synchronization of brain oscillations has been implicated in Parkinson’s disease, epilepsy, and even psychiatric disorders. Decreasing brain wave synchronization by electrical stimulation deep in the brain has been found to decrease the tremors of Parkinson’s. But the understanding of brain waves in movement disorders is ahead of the understanding of human memory.

The researchers had neurosurgeons at the National Institutes of Health implant recording electrodes onto the brain surface of 33 epileptic patients during seizure evaluation and then asked them to form and retrieve memories of unrelated pairs of words, such as ‘dog’ and ‘lime.’

They found that during memory formation, brain activity became more disorganized in the frontal lobe, an area involved in executive control and attention, and in the temporal lobe, an area more implicated in memory and language.

A graphical abstract from Alex’s paper.

“We think this study, and others like it, provide a good starting point for understanding possible treatments for memory disorders,” Vaz said. “The aging American population will be facing major neurocognitive disorders such as Alzheimer’s disease and vascular dementia and will be demanding more medical attention.”

CITATION: “Dual origins of measured phase-amplitude coupling reveal distinct neural mechanisms underlying episodic memory in the human cortex,” Alex P. Vaz, Robert B. Yaffe, John H. Wittig, Sara K. Inati, Kareem A. Zaghloul. NeuroImage, Online Jan. 6, 2017. DOI: 10.1016/j.neuroimage.2017.01.001

http://www.sciencedirect.com/science/article/pii/S1053811917300010

Post by Karl Leif Bates


Treating Traumatic Brain Injury

After a traumatic brain injury (TBI), the brain produces an inflammatory response. This prolonged swelling is known as cerebral edema and can be fatal. Unfortunately, the only medications available just address symptoms and cannot directly treat the inflammation.

Daniel Laskowitz

Daniel Laskowitz, M.D., M.H.S., is a professor of neurology.

Some people can walk out okay after suffering from this injury, yet others can become comatose or may even die. This raises the intriguing question: why do people with similar injuries end up with vastly different outcomes? TBI affects nearly 2 million Americans every year and nearly 52,000 of these injuries are fatal.

“To a certain extent, the way the body responds to injury is probably genetically hardwired,” said Dr. Daniel Laskowitz, a neurologist at Duke who has been working on the mysteries of traumatic brain injuries for two decades. He said in medical school, he preferred the approach of treating the whole body and not super specializing. He chose to work specifically with brain injury because he could treat patients with other conditions along with brain injury.

One of Dr. Laskowitz’s first publications was about brain injury. As a fellow training in neurology in the mid-1990s, he looked at genetic factors that could make a difference in the outcome of a brain injury and found that genetic variation in a protein called apolipoprotein E (apoE) played a role.  ApoE comes in three slightly different flavors, and one of the common forms of apoE (apoE4) was associated with bad outcomes after brain injury. This raised the question of what apoE was doing in the brain to affect outcome after injury.

In 1997, he published an article about the effect of apoE on mice suffering a stroke and found that mice with the apoE allele had a better recovery than mice with an apoE deficiency. These findings were later repeated in a 2001 article, which found that following traumatic brain injury, animals with apoE had better outcomes than animals without this protein.

Since it was found that apoE could improve an injured patient’s neurologic outcomes, it became a model for medication to treat brain injuries. However, apoE does not easily cross the blood-brain-barrier, making it a challenging molecule to dispense as a drug.

Dr. Laskowitz’s lab has spent almost a decade looking at how apoE works. They have recently developed a peptide made of 5 amino acids, CN-105, that is based on this protein and is able to cross the blood-brain-barrier, giving it the potential to be distributed as a treatment. This has been tested in mice and shown to improve outcomes.

In July, CN-105 completed a first phase clinical trial, which found that drug administration was safe and well tolerated. In the coming year, a phase 2 study will look at whether CN-105 improves outcomes in patients with brain hemorrhages.

The plan is to give the peptide through an IV every six hours for three days, the time period when most of the swelling happens after injury.

Dr. Laskowitz’s research has already had a significant impact on the treatment of brain injury, and hopefully, this new medication could be another great contribution to this field.

Guest Post by Ryan Shelton, North Carolina School of Science and Mathematics, Class of 2017

Life Lessons from a Neuroscientist

I recently had the privilege of sitting down with Dr. Anne Buckley, a professor and neuropathologist working in Dr. Chay Kuo’s cell biology lab at Duke. I got a first-hand account of her research on neuron development and function in mice. But just as fascinating to me were the life lessons she had learned during her time as a researcher.

Anne Buckley, M.D., Ph.D., is an assistant professor of pathology.


Buckley’s research looks at brain tumors in mice. She recently found that some of the mice developed the tumors in an area full of neurons, the roof of the fourth ventricle, which is of particular interest because humans have developed tumors in the same location. This discovery could show how neurological pathways affect tumor formation and progression.

Buckley also gave me some critical words of advice, cautioning me that research isn’t for everyone.

“Research is not glamorous, and not always rewarding,” she warned me. When she first started research, Buckley learned a hard lesson: work doesn’t necessarily lead to results. “For every question I went after, I found ten more unresolved,” she said. “To be a researcher, it takes a lot of perseverance and resilience. A lot of long nights.”

But that’s also the beauty of research. Buckley says that she’s learned to find happiness in the small successes, and that she “enjoys the process, enjoys the challenge.”

And when discoveries happen?

“When I look at data, and I see something unexpected, I get really excited,” she says. “I know something that no one else knows. Tomorrow, everyone will know. But tonight, I’m the only person in the world who knows.”

Guest Post by Kendra Zhong, North Carolina School of Science and Mathematics, Class of 2017

