Duke Research Blog

Following the people and events that make up the research community at Duke.


Vulci 3000: Technology in Archaeology

This is Anna’s second post from a dig site in Italy this summer. Read the first one here.

Duke PhD Candidate Antonio LoPiano on Site

The site at Vulci was once home to Etruscan and Roman cities, and the ruins found there date back to before the 8th century B.C.E.

As archaeologists dig up the remains of these ancient civilizations, they are better able to understand how humans from the past lived their daily lives. The problem is, they can only excavate each site once.

No matter how careful the diggers are, artifacts and pieces of history can be destroyed in the process. Furthermore, excavations take a large amount of time, money and strenuous labor to complete. As a result, it’s important to carefully choose the location.

Map of the Vulci Landscape Created Using GIS Technology

In response to these challenges, Dr. Maurizio Forte decided to supplement the excavation of ancient Vulci sites with innovative, non-invasive technologies.

Considering that it once housed entire cities, Vulci is an extremely large site. To optimize excavation time, money, and resources, Dr. Forte used these technologies to predict the most important urban areas of the site. Forte and his team also used remote sensing, which allowed them to interpret the site before digging.

Georadar Imaging
Duke Post Doc Nevio Danelon Gathering Data for Photogrammetry

Having decided where on the site to look, the team was then able to digitally recreate both the landscape as well as the excavation trench in 3D. This allowed them to preserve the site in its entirety and uncover the history that lay below. Maps of the landscape are created using Web-GIS (Geographic Information Systems). These are then combined with 3D models created using photogrammetry to develop a realistic model of the site.

Forte decided to make the excavation entirely paperless. All “paperwork” on site is done on tablets. There is also an onsite lab that analyzes all of the archaeological discoveries and archives them into a digital inventory.

This unique combination of archaeology and technology allows Forte and his team to study, interpret and analyze the ancient Etruscan and Roman cities beneath the ground of the site in a way that has never been done before. He is able to create exact models of historic artifacts, chapels and even entire cities that could otherwise be lost for good.

3D Model Created Using Photogrammetry

Forte also thinks it is important to share what is uncovered with the public. One way he is doing this is by integrating the excavation with virtual reality applications.

I’m actually on site with Forte and the team now. One of my responsibilities is to take photos with the Insta360x, which is compatible with the Oculus Go, allowing people to experience what it’s like to be in the trench through virtual reality. The end goal is to create interactive applications that could be used by museums or individuals.

Ultimately, this revolutionary approach to archaeology brings to light new perspectives on historical sites and utilizes innovative technology to better understand discoveries made in excavations.

By: Anna Gotskind ’22

Vulci 3000: A High-Tech Excavation

This summer I have the incredible opportunity to work with the Vulci 3000 Bass Connections team. The project focuses on combining archaeology and innovative technology to excavate and understand an ancient Etruscan and Roman site. Over the next several weeks I will be writing a series of articles highlighting the different parts of the excavation. This first installment recounts the history of the project and what we plan to accomplish in Vulci.

Covered in tall grasses and dotted with grazing cows, the Vulci Archaeology Park hardly looks like it was ever anything more than beautiful countryside. In reality, however, it was home to one of the largest, most important cities of ancient Etruria. In fact, it was one of the biggest cities on the entire Italian peninsula in the 1st millennium BCE. Buried under the ground are the incredible remains of Iron Age, Etruscan, Roman, and Medieval settlements.

Duke’s involvement with the Vulci site began in 2015, when Maurizio Forte, the William and Sue Gross Professor of Classical Studies, Art, Art History, and Visual Studies, visited the site. What made the site so unique was that most of it was untouched.

One of the perils of archaeology is that any site can only be physically excavated once and it is inevitable for some parts to be damaged regardless of how careful the team is. Vulci presented a unique opportunity. Because much of the site was still undisturbed, Forte could utilize innovative technology to create digital landscapes that could be viewed in succession as the site was excavated. This would allow him and his team to revisit the site at each stage of excavation. In 2015 he applied for his first permit to begin researching the Vulci site.

In 2016 Forte created a Bass Connections project titled Digital Cities and Polysensing Environments. That summer they ventured to Italy to begin surveying the Vulci site. Because Vulci is such a large site, it would take too much time and money to excavate the entire city. Instead, Forte and his team decided to find the most important spots to excavate. They did this by combining remote sensing data and procedural modeling to analyze the various layers underground. They collected data using magnetometry and ground-penetrating radar. They also used drones to capture aerial photography of the site.

These technologies allowed the team to locate the urban areas of the site through the discovery of large buildings and streets revealed by the aerial photographs, radiometrically-calibrated orthomaps, and 3D point cloud/mesh models.

Anne-Lise Baylé Cleaning a Discovered Artifact on Site

The project continued into 2017 and 2018, with a team returning to the site each summer to excavate. Within the trench, archaeologists ranging from undergrads to postdocs spent months digging, scraping and brushing to discover what lay beneath the surface. As they began to uncover rooms, pottery, coins, and even a cistern, groups outside the trench continued to use advanced technology to collect data and improve the understanding of the site.

Nevio Danelon Releasing a Drone

One unit focused on drone sensing to digitally create multispectral imagery as well as high-resolution elevation models. This allowed them to use soil and crop marks to better interpret and classify the archaeological features.

By combining traditional archaeology and innovative technology the team has been able to more efficiently discover important, ancient artifacts and analyze them in order to understand the ancient Etruscan and Roman civilizations that once called Vulci their home.

Photo Taken Using the Insta360 Camera in “Planet” Mode

This year, archaeologists return to the site to continue excavation. As another layer of Vulci is uncovered, students and faculty will use technology like drones, photogrammetry, geophysical prospection and GIS to document and interpret the site. We will also be using a 360 camera to capture VR-compatible content for the Oculus Go in order to allow anybody to visit Vulci virtually.

By Anna Gotskind

800+ Teams Pitched Their Best Big Ideas. With Your Help, This Duke Team Has a Chance to Win

A Duke University professor says the time is ripe for new research on consciousness, and he needs your help.

More than 800 teams pitched their best “big ideas” to a competition sponsored by the National Science Foundation (@NSF) to help set the nation’s long-term research agenda. Only 33 are still in the running for the grand prize, and a project on the science of consciousness led by Duke artificial intelligence expert Vincent Conitzer is among them!

You can help shape the NSF’s research questions of the future by watching Conitzer’s video pitch and submitting your comments on the importance and potential impact of the ideas at https://nsf2026imgallery.skild.com/entries/theory-of-conscious-experience.

But act fast. The public comment period ends Wednesday, June 26. Winners will be announced and prizes awarded by October 2019. Stay tuned.

Watch all the video pitches until June 26 at nsf2026imgallery.skild.com.

Overdiagnosis and the Future of Cancer Medicine

For many years, the standard strategy for fighting against cancer has been to find it early with screening when the person is still healthy, then hit it with a merciless treatment regimen to make it go away.

But not all tumors will become life-threatening cancers. Many, in fact, would have caused no issues for the rest of the patients’ lives had they not been found by screening. These cases belong to the category of overdiagnosis, one of the chief complaints against population-level screening programs.

Scientists are reconsidering the way to treat tumors because the traditional hit-it-hard approach has often caused the cancer to seemingly go away, only to have a few cells survive and the entire tumor roar back later with resistance to previously effective medicine.

Dr. Marc Ryser, the professor who gave this meaty talk

In his May 23 talk to Duke Population Health, “Cancer Overdiagnosis: A Discourse on Population Health, Biologic Mechanism and Statistics,” Marc Ryser, an assistant professor at Duke’s Departments of Population Health Sciences and Mathematics, walked us through how parallel developments across different disciplines have been reshaping our cancer battle plan. He said the effort to understand the true prevalence of overdiagnosis is a point of focus in this shift.

Past to Future: the changing cancer battle plan
Credit: Marc Ryser, edit: Brian Du

Ryser started with the longstanding biological theory behind how tumors develop. Under the theory of clonal sweeps, a relatively linear progression of successive key mutations sweeps through the tumor, giving it increasing versatility until it is clinically diagnosed by a doctor as cancer.

Clonal sweeps model: each shade is a new clone that introduces a mutation. Credit: Sievers et al. 2016

With this as the underpinning model, the battle plan of screen early, treat hard (point A) makes sense because it would be better to break the chain of progression early rather than later when the disease is more developed and much more aggressive. So employing screening extensively across the population for the various types of cancer is the sure choice, right?

But the data at the population level for many different categories of cancers doesn’t support this view (point B). Excluding the cases of cervical cancer and colorectal cancer, which have benefited greatly from screening interventions, the incidence of advanced cases of breast cancer and other cancers has stayed at similar levels or actually continued to increase during the years of screening interventions. This has raised the question of when screening is truly the best option.

Scientists are thinking now in terms of a “benefit-harm balance” when mass-screening public health interventions are carried out. Overdiagnosis would pile up on the harms side, because it introduces unnecessary procedures that are associated with adverse effects.

Thinking this way would be a major adjustment, and it has brought with it major confusion.

Paralleling this recent development on the population level, new biological understanding of how tumors develop has also introduced confusion. Scientists have discovered that tumors are more heterogeneous than the clonal sweeps model would make it appear. Within one tumor, there may be many different subpopulations of cancer cells, of varying characteristics and dangerousness, competing and coexisting.

Additional research has since suggested a more complex, evolutionary and ecologically based model known as the Big Bang-mutual evolution model. Instead of a “stepwise progression from normal to increasingly malignant cells with the acquisition of successive driver mutations, some cancers appear to evolve more like a Big Bang, where the malignant ability is already concentrated in the founder cell,” Ryser said.

As the first cell starts to replicate, its descendants evolve in parallel into different subpopulations expressing different characteristics. While more research has been published in favor of this model, some scientists remain skeptical.

Ryser’s research contributes to this ongoing discussion. In comparing the patterns by which mutations are present or absent in cancerous and benign tumors, he obtained results favoring the Big Bang-mutual evolution model. Rather than seeing a neat region of mutation within the tumor, which would align with the clonal sweeps model, he saw mutations dispersed throughout the tumor, like the spreading of newborn stars in the wake of the Big Bang.

How to think about mutations within a tumor
credit: NASA

The more-complicated Big Bang-mutual evolution model justifies an increasingly nuanced approach to cancer treatment that has been developing in the past few years. Known as precision medicine (point C), its goal is to provide the best treatment available to a person based on their unique set of characteristics: genetics, lifestyle, and environment. As cancer medicine evolves with this new paradigm, when to screen will remain a key question, as will the benefit-harm balance.

There’s another problem, though: Overdiagnosis is incredibly hard to quantify. In fact, it’s by nature not possible to directly measure it. That’s where another area of Ryser’s research seeks to find the answers. He is working to accurately model overdiagnosis to estimate its extent and impact.

Going forward, his research goal is to bring together these different scales to best understand overdiagnosis. Considering it in the context of the multiscale developments he mentioned in his talk may be the key to understanding it better.

Post by Brian Du

When policy changes opportunities — intentionally and unintentionally

Assistant professor Deondra Rose researches the intersection between political history and policymaking. Photo from Duke Sanford School of Public Policy.

The intent of the National Defense Education Act of 1958 was not to expand rights for women in higher education. But it happened.

That’s something Duke Sanford assistant professor Deondra Rose called “accidental egalitarianism” in her talk at The Regulator Bookshop in Durham on Jan. 29. Rose discussed citizenship from the lens of historical policy research.

During the 1950s, the Soviet Union and the United States were in fierce competition to be the first country in space. This was the push the US government needed to start putting more funding towards higher education for a greater subset of the population. The 1944 GI Bill was the beginning of government-funded need-based financial aid for higher education, but until this point, aid had only been given to white men.

In Congress at this time, southern Democrats did not want to pass any legislation that would affect segregation, but at the same time, policymakers also needed to produce a policy that would be approved by northerners, who would not pass any policy that appeared discriminatory. They made the wording of the National Defense Education Act intentionally vague to please both sides, and as a result it greatly expanded provisions for scholarships and loans to all kinds of students in higher education.

Much of professor Rose’s talk, entitled “Citizens by Degree” after her book, was centered around breaking down the idea of citizenship into different degrees of afforded opportunities. “First class citizens” — usually white Americans — are generally afforded all of the rights that come with being an American citizen without opposition, she said. Second class citizens, usually minorities and women, can miss out on opportunities for advancement afforded to others because of their minority status.

Rose also discussed how we can redefine the implications of certain terms such as “welfare state” to be used positively. Government assistance is not simply temporary assistance to new families, families with children or food stamps; it also includes Pell Grants and need-based financial aid. Similarly, “regulation” sometimes carries negative connotations, but Title IX can be thought of as “regulation” that ensures women equal access in higher education.

Photo from Annual White House Summit on Historically Black Colleges and Universities (whitehouse.gov).

Rose’s latest research focuses on the relationship between policy, citizenship and education, and her next book is about historically black colleges and universities (HBCUs). In her political history research, she found that Title III is all about HBCUs, and the wording of the act suggests we as a nation ought to support and prize these institutions.

Rose wants to learn more about the role the US government has played in empowering HBCUs, and the role of HBCUs in restructuring political power — for example, 60 percent of black judges in the US have at least one degree from an HBCU.

At one point in history, the obstacle to higher education for second class citizens was access, then affordability, but have those two obstacles been completely overcome? What are new obstacles to higher education?

Rose believes that policies have the power to reshape politics by reshaping citizens, and we must keep finding and tackling obstacles to higher education.

By Victoria Priester

Understanding the Universe, Large and Small

From the minuscule particles underlying matter to vast amounts of data from the far reaches of outer space, Chris Walter, a professor of physics at Duke, pursues research into the great mysteries of the universe, from the infinitesimal to the infinite.

Chris Walter is a professor of physics

As an undergraduate at the University of California at Santa Cruz, he thought he would become a theoretical physicist, but while continuing his education at the California Institute of Technology (Caltech), he found himself increasingly drawn to experimental physics, deriving knowledge of the universe by observing its phenomena.

Neutrinos — minuscule particles emitted during radioactive decay — captured his attention, and he began work with the KamiokaNDE (Kamioka Nucleon Decay Experiment, now typically written as Kamiokande) at the Kamioka Observatory in Hida, Japan. Buried deep underground in an abandoned mine to shield the detectors from cosmic rays and submerged in water, Kamiokande offered Walter an opportunity to study a long-supposed but still unproven hypothesis: that neutrinos were massless.

Recalling one of his most striking memories from his time in the lab, he described observing and finding answers in Cherenkov light, a ‘sonic boom’ of light. Sonic booms are created by breaking the sound barrier in air. The speed of light, however, changes in different media – the speed of light in water is less than the speed of light in a vacuum – and a particle accelerator can push particles beyond the speed of light in water. Walter described the result as a ring of light bursting out of the darkness.
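For readers who want a number to attach to that image, the condition for Cherenkov light is simple (a back-of-the-envelope figure, not one quoted from Walter): a charged particle radiates whenever its speed v exceeds the local speed of light, c/n, where n is the refractive index of the medium. For water, n ≈ 1.33, so

$$v > \frac{c}{n} \approx 0.75\,c,$$

meaning any charged particle moving faster than about three-quarters of the vacuum speed of light through the detector’s water gives off the ring of light Walter describes.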

In his time at the Kamioka Observatory, he was part of groundbreaking research on the mass of neutrinos. Neutrinos had long been thought to be massless, but Kamiokande discovered the property of neutrino oscillation – that neutrinos can change from flavor to flavor – indicating that, contrary to popular belief, they have mass. Seventeen years later, in 2015, the leader of his team, Takaaki Kajita, would be co-awarded the Nobel Prize in Physics, citing research from their collaboration.

Chris Walter (left) and his Duke physics collaborator and partner, Kate Scholberg (right), on a lift inside the Super-Kamiokande neutrino detector.

The neutrinos Walter studied originate from cosmic rays arriving from outer space, but soon another mystery from the cosmos captured his attention.

“If you died and were given the chance to know the answer to just one question,” he said, “for me, it would be, ‘What is dark energy?’”

Observations made in the 1990s, as Walter was concluding his time at the Kamioka Observatory, found that the expansion of the universe was accelerating. The nature of the dark energy causing this accelerating expansion remained unknown to scientists, and it offered a new course of study in the field of astrophysics.

Walter has recently joined the Large Synoptic Survey Telescope (LSST) as part of a 10-year, 3D survey of the entire sky, gathering over 20 terabytes of data nightly and detecting thousands of changes in the night sky, observing asteroids, galaxies, supernovae, and other astronomical phenomena. With new machine learning techniques and supercomputing methods to process the vast quantities of data, the LSST offers incredible new opportunities for understanding the universe. 

To Walter, this is the next big step for research into the nature of dark energy and the great questions of science.

A rendering of the Large Synoptic Survey Telescope. (Note the naked humans for scale)

Guest Post by Thomas Yang, NCSSM 2019

HIV Can Be Treated, But Stigma Kills

Three decades ago, receiving an HIV diagnosis was comparable to being handed a death sentence. But today, this is no longer the case.

Advances in HIV research have led to treatments that can make the virus undetectable and untransmittable in less than six months, a fact that goes overlooked by many. Treatments today can make HIV entirely manageable for individuals.

However, thousands of Americans are still dying of HIV-related causes each year, despite the fact that HIV treatments are accessible and effective. So where is the disconnect coming from?

On the 30th anniversary of World AIDS Day, The Center for Sexual and Gender Diversity at Duke University hosted a series of events centered on this year’s international theme: “Know Your Status.”

One of these events was a panel discussion featuring three prominent HIV/AIDS treatment advocates on campus, Dr. Mehri McKellar, Dr. Carolyn McAllaster, and Dr. Kent Weinhold, who answered questions regarding local policy and current research at Duke.

From left to right: Kent Weinhold, Carolyn McAllaster, Mehri McKellar and moderator Jesse Mangold in Duke’s Center for Sexual and Gender Diversity

The reason HIV continues to spread and kill, Dr. McKellar explained, is less about accessibility and more about stigma. Research has shown that stigma and shame lead to poor health outcomes in HIV patients, and unfortunately, stigma and shame are a huge problem in communities across the US.

Especially in the South, she said, there is very little funding for initiatives to reduce stigma surrounding HIV/AIDS, and people are suffering as a result.

In 2016, the CDC reported that the South was responsible for 52 percent of all new HIV diagnoses and 47 percent of all HIV-related deaths in the US.

If people living with HIV don’t feel supported by their community and comfortable in their environment, it makes it very difficult for them to obtain proper treatment. Dr. McKellar’s patients have told her that they don’t feel comfortable getting their medications locally because they know the local pharmacist, and they’re ashamed to be picking up HIV medications from a familiar face.

 

HIV/AIDS Diagnoses and Deaths in the US 1981-2007 (photo from the CDC)

In North Carolina, the law previously required HIV-positive individuals to disclose their status and use a condom with sexual partners, even if they had received treatment and could no longer transmit the virus. Violating this law resulted in prosecution and a prison sentence for many individuals, which only reinforced the negative stigma surrounding HIV. Earlier this year, Dr. McAllaster helped efforts to create and pass a new version of the law, which will make life a lot easier for people living with HIV in North Carolina.

So what is Duke doing to help the cause? Well, in 2005, Duke opened the Center for AIDS Research (also known as CFAR), which is now directed by Dr. Kent Weinhold. In the last decade, the center has focused its efforts mainly on improving the efficacy of an HIV vaccine. The search for a successful vaccine has been long and frustrating for CFAR and the Duke Human Vaccine Institute, but Dr. Weinhold is optimistic that they will be able to reach the realistic goal of 60 percent effectiveness in the future, although he shied away from predicting any sort of timeline for this outcome.

Pre-exposure prophylaxis or PrEP (photo from NIAID)

Duke also opened a PrEP Clinic in 2016 to provide preventative treatment for individuals who might be at risk of getting HIV. PrEP stands for pre-exposure prophylaxis, and it is a medication that is taken before exposure to HIV to prevent transmission of the virus. Put into widespread use, this treatment is another way to reduce negative HIV stigma.

The problem persists, however, that the people who most need PrEP aren’t getting it. The group that has the highest incidence of HIV is males who are young, black and gay. But the group most commonly receiving PrEP is older, white, gay men. Primary care doctors, especially in the South, often won’t prescribe PrEP either. Not because they can’t, but because they don’t support it, or don’t know enough about it.

And herein lies the problem, the panelists said: Discrimination and bias are often the results of inadequate education. The more educated people are about the truth of living with HIV, and the effectiveness of current treatments, the more empathetic they will be towards HIV-positive individuals.

There’s no reason for the toxic shame that exists nationwide, and attitudes need to change. It’s important for us to realize that in today’s world, HIV can be treated, but stigma kills.

Post by Anne Littlewood

The Importance of Evidence in Environmental Conservation

What counts as good evidence?

In medical research, a professional might answer this question as you would expect: evidence can be trusted if it is the result of a randomized, controlled, double-blind experiment, meaning the evidence is only as strong as the experiment design. And in medicine, it’s possible (and important) to procure this kind of strong evidence.

But when it comes to conservation, it’s a whole different story.

Dr. David Gill (photo from The Nicholas School)

The natural world is complicated, and far beyond our control. When studying the implications of conservation, it’s not so easy to design the kind of experiment that will produce “good” evidence.

David Gill, a professor in Duke’s Nicholas School of the Environment, recently led a study featured in the journal Nature that had to define what constitutes good evidence in the realm of marine conservation. Last Wednesday, he made a guest appearance in my Bass Connections meeting to share his work and a perspective on the importance of quality evidence.

Gill’s research has been centered around evaluating the effectiveness of Marine Protected Areas (or MPAs) as a way of protecting marine life. Seven percent of the world’s oceans are currently designated as MPAs, and by 2020, the goal is to increase this number to 10 percent. MPAs arguably have massive effects on ecosystem health and coastal community functioning, but where is the evidence for this claim?

Although past investigations have provided support for creating MPAs,  Gill and his team were concerned with the quality of this evidence, and the link between how MPAs are managed and how well they work. There have historically been acute gaps in study design when researching the effects of MPAs. Few experiments have included pre-MPA conditions or an attempt to control for other factors. Most of these studies have been done in hindsight, and have looked only at the ecological effects within the boundaries of MPAs, without any useful baseline data or control sites to compare them to.

As a result of these limitations, the evidence base is weak. Generating good evidence is a massive undertaking when you are attempting to validate a claim by counting several thousand moving fish.

Gill’s measure of ecosystem health includes counting fish. (Photo from Avoini)

So is there no way to understand the impacts of MPAs? Should conservation scientists just give up? The answer is no, absolutely not.

To produce better evidence, Gill and his team needed to design a study that would isolate the effects of MPAs. To do this, they needed to account for location biases and other confounding variables such as the biophysical conditions of the environment, the population density of nearby human communities, and the national regulations in each place.

The solution they came up with was to compare observations of current conditions within MPAs to “counterfactual” evidence, which is defined as what would have happened had the MPA not been there. Using statistical matching of MPAs to nearby non-MPA and pre-MPA sites, they were able to obtain high-quality results.
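To make “statistical matching” a bit more concrete, here is a minimal sketch in Python of one common flavor of it: nearest-neighbor matching on standardized site covariates. The covariates, numbers, and simple biomass comparison below are hypothetical illustrations, not Gill’s data or his actual analysis pipeline.

```python
# Minimal illustration of covariate matching for a counterfactual comparison.
# All covariates and outcome values are hypothetical, for illustration only.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)

# Hypothetical site covariates: [sea surface temp (C), depth (m), nearby human density (per km2)]
mpa_sites = rng.normal(loc=[27.0, 12.0, 150.0], scale=[1.0, 4.0, 60.0], size=(50, 3))
control_sites = rng.normal(loc=[27.2, 13.0, 170.0], scale=[1.2, 5.0, 80.0], size=(400, 3))

# Standardize covariates so each contributes comparably to the distance metric.
mean, std = control_sites.mean(axis=0), control_sites.std(axis=0)
mpa_z = (mpa_sites - mean) / std
control_z = (control_sites - mean) / std

# For each MPA site, pick the most similar non-MPA site (1-nearest-neighbor match).
distances = cdist(mpa_z, control_z)      # pairwise Euclidean distances, shape (50, 400)
match_idx = distances.argmin(axis=1)     # index of the best-matching control for each MPA site

# Hypothetical outcome: fish biomass at each site. The estimated "MPA effect" is the
# average difference between each MPA site and its matched control.
mpa_biomass = rng.normal(120, 30, size=50)
control_biomass = rng.normal(95, 30, size=400)
effect = (mpa_biomass - control_biomass[match_idx]).mean()
print(f"Estimated biomass difference (MPA minus matched control): {effect:.1f}")
```

Real analyses refine this considerably (more covariates, caliper limits on how dissimilar a match may be, and sensitivity checks), but the core idea is the same: compare each protected site only to the unprotected or pre-protection sites that most resemble it.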

A happy sea turtle pictured in a marine protected area (photo from English Foreign and Commonwealth Office.)

The research showed that across 16,000 sampled sites, MPAs had positive ecological impacts on fish biomass in 71 percent of sites. They also discovered that MPAs with adequate staffing had far greater ecological impacts than those without, which is a pretty interesting piece of feedback when it comes to future development. It’s probably not worth it to create MPAs before there is sufficient funding in place to maintain them.

Gill doesn’t claim that his evidence is flawless; he fully admits to the shortcomings in this study, such as the fact that there is very little data on temperate, coldwater regions — mostly because there are few MPAs in these regions.

The field is ripe for improvement, and he suggests that future research look into the social impacts of MPAs and the implications of these interventions for different species. As the evidence continues to improve, it will be increasingly possible to maximize the win-wins when designing MPAs.

Conservation science isn’t perfect, but neither is medicine. We’ll get there.

Coding: A Piece of Cake


Imagine a cake, your favorite cake. Has your interest been piqued?

“Start with Cake” has proved an effective teaching strategy for Mine Cetinkaya-Rundel in her introductory statistics classes. In her talk “Teaching Computing via Visualization,” she lays out her classroom approaches to helping students maintain an interest in coding despite its difficulty. Just as in a cooking class, a taste of the final product can motivate students to master the process. Cetinkaya-Rundel therefore believes that instead of having students begin with the flour, sugar and milk, they should dive right into the sweet frosting.

While bringing cake to the first day of class has a great success rate for increasing a class’s attention span (they’ll sugar crash in their next classes, no worries), what this statistics professor actually refers to is showing the final visualizations first. Given large amounts of pre-written code and only one or two steps to complete during the first few class periods, students can immediately recognize coding’s potential. The possibilities become exciting and capture their attention, so fewer students attempt to vanish with the magic of the drop/add period. For a student unsure about coding, immediately writing their own code can seem overwhelming and steal the joy of creating.

Example of a visualization Cetinkaya-Rundel uses in her classes

To accommodate students with less background in coding, Cetinkaya-Rundel believes that skipping the baby steps proves a better approach than slowing the pace. By jumping straight into larger projects, students can spend more time wrestling with their code and discovering the best strategies rather than memorizing the definition of a histogram. The idea is to give the students everything on day one, and then slowly remove the pre-written code until they are writing on their own. The traditional classroom approach, by contrast, teaches students line-by-line until they know enough to create the desired visualizations. While Cetinkaya-Rundel admits that her style may not suit every individual and that creating the assignments does require more time, she stands by her eat-dessert-first perspective on teaching. Another way she helps students maintain their original curiosity is by making the most of day one: thanks to pre-installed packages, students can start playing with visualizations and altering code right away.
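As a rough illustration of what that give-them-everything-on-day-one scaffolding can look like, here is a minimal sketch in Python with matplotlib (her courses may use a different language and different datasets entirely); everything is pre-written except one clearly marked line for the student to change.

```python
# A "start with cake" style first exercise: the scaffolding is supplied,
# and the student edits one marked line to see a finished plot respond.
# Illustrative only: the dataset below is synthetic and hypothetical.
import numpy as np
import matplotlib.pyplot as plt

# Pre-written setup: a small made-up dataset of study hours vs. exam scores.
rng = np.random.default_rng(1)
hours = rng.uniform(0, 10, 100)
scores = 55 + 4 * hours + rng.normal(0, 6, 100)

fig, ax = plt.subplots()

# STUDENT TODO: this is the only line to touch today.
# Try a different color or point size, then re-run the script.
ax.scatter(hours, scores, color="steelblue", s=25, alpha=0.7)

# Pre-written finishing touches so the result already looks like "the cake."
ax.set_xlabel("Hours studied")
ax.set_ylabel("Exam score")
ax.set_title("A finished plot on day one")
plt.show()
```

As the course progresses, more and more of those pre-written lines become the student's responsibility, until the scaffolding is gone.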

Not only does Cetinkaya-Rundel give her students mouth-watering cakes as the end results, but she also sometimes shows them burnt and crumbling desserts. “People like to critique,” she explains as she lays out how to motivate students to begin writing original code. When she gives her students a sloppy graph and tells them to fix it, they are more likely to find creative solutions and explore how to make the graph most appealing to them. As the scaffolding falls away and students begin diverging from the style guides, Cetinkaya-Rundel has found that they have a greater understanding of and passion for coding. A spoonful of sugar really does help the medicine go down.

Post by Lydia Goff

Drug Homing Method Helps Rethink Parkinson’s

The brain is the body’s most complex organ, and consequently the least understood. In fact, researchers like Michael Tadross, MD, PhD, wonder if the current research methods employed by neuroscientists are telling us as much as we think.

Michael Tadross is using novel approaches to tease out the causes of neuropsychiatric diseases at a cellular level.

Current methods such as gene editing and pharmacology can reveal how certain genes and drugs affect the cells in a given area of the brain, but they’re limited in that they don’t account for differences among different cell types. With his research, Tadross has tried to target specific cell types to better understand mechanisms that cause neuropsychiatric disorders.

To do this, Tadross developed a method to ensure a drug injected into a region of the brain will only affect specific cell types. Tadross genetically engineered the cell type of interest so that a special receptor protein, called HaloTag, is expressed at the cell membrane. Additionally, the drug of interest is altered so that it is tethered to the molecule that binds with the HaloTag receptor. By connecting the drug to the HaloTag ligand, and engineering only the cell type of interest to express the HaloTag receptor, Tadross effectively limited the cells affected by the drug to just one type. He calls this method “Drugs Acutely Restricted by Tethering,” or DART.

Tadross has been using the DART method to better understand the mechanisms underlying Parkinson’s disease. Parkinson’s is a neurological disease that affects a region of the brain called the striatum, causing tremors, slow movement, and rigid muscles, among other motor deficits.

Only cells expressing the HaloTag receptor can bind to the AMPA-repressing drug, ensuring virtually perfect cell-type specificity.

Patients with Parkinson’s show decreased levels of the neurotransmitter dopamine in the striatum. Consequently, treatments that involve restoring dopamine levels improve symptoms. For these reasons, Parkinson’s has long been regarded as a disease caused by a deficit in dopamine.

With his technique, Tadross is challenging this assumption. In addition to the death of dopaminergic neurons, Parkinson’s is associated with an increase in the strength of synapses, or connections, between neurons that express AMPA receptors, which are the most common excitatory receptors in the brain.

In order to simulate the effects of Parkinson’s, Tadross and his team induced the death of dopaminergic neurons in the striatum of mice. As expected, the mice displayed significant motor impairments consistent with Parkinson’s. However, in addition to inducing the death of these neurons, Tadross also engineered the AMPA-expressing cells to produce the HaloTag protein.

Tadross then treated the striatum of these mice with a common AMPA receptor blocker tethered to the HaloTag ligand. Amazingly, blocking the activity of these AMPA-expressing neurons, even in the absence of the dopaminergic neurons, reversed the effects of Parkinson’s so that the previously affected mice moved normally.

Tadross’s findings with the Parkinson’s mice exemplify how little we know about cause and effect in the brain. The key to designing effective treatments for neuropsychiatric diseases, and possibly other diseases outside the nervous system, may be in teasing out the relationship of specific types of cells to symptoms and targeting the disease that way.

The ingenious work of researchers like Tadross will undoubtedly help bring us closer to understanding how the brain truly works.

Post by undergraduate blogger Sarah Haurin


 

