Duke Research Blog

Following the people and events that make up the research community at Duke.


Scientists Made a ‘T-Ray’ Laser That Runs on Laughing Gas

‘T-Ray’ laser finally arrives in practical, tunable form. Duke physicist Henry Everitt has worked on it for more than two decades. Courtesy of Chad Scales, US Army Futures Command

It was a Frankenstein moment for Duke alumnus and adjunct physics professor Henry Everitt.

After years of working out the basic principles behind his new laser, last Halloween he was finally ready to put it to the test. He turned some knobs and toggled some switches, and presto, the first bright beam came shooting out.

“It was like, ‘It’s alive!’” Everitt said.

This was no laser for presenting PowerPoint slides or entertaining cats. Everitt and colleagues have invented a new type of laser that emits beams of light in the ‘terahertz gap,’ the no-man’s-land of the electromagnetic spectrum between microwaves and infrared light.

Terahertz radiation, or ‘T-rays,’ can see through clothing and packaging without the health hazards of ionizing radiation, so they could be used in security scanners to spot concealed weapons without subjecting people to the dangers of X-rays.

It’s also possible to identify substances by the characteristic frequencies they absorb when T-rays hit them, which makes terahertz waves ideal for detecting toxins in the air or gases between the stars. And because such frequencies are higher than those of radio waves and microwaves, they can carry more bandwidth, so terahertz signals could transmit data many times faster than today’s cellular or Wi-Fi networks.

“Imagine a wireless hotspot where you could download a movie to your phone in a fraction of a second,” Everitt said.
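As a rough sanity check on that claim, here is a back-of-envelope calculation; the movie size and link rates below are illustrative assumptions, not figures from the research:

```python
# Back-of-envelope download times. The movie size and link rates are
# illustrative assumptions, not figures from the laser research.
movie_bytes = 5e9  # assume a ~5 GB high-definition movie

for label, bits_per_second in [
    ("typical fast Wi-Fi (1 Gbps)", 1e9),
    ("hypothetical terahertz link (100 Gbps)", 100e9),
]:
    seconds = movie_bytes * 8 / bits_per_second
    print(f"{label}: {seconds:g} s")
```

At an assumed 100 gigabits per second, the download drops from tens of seconds to under half a second, which is the “fraction of a second” Everitt describes.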

Yet despite the potential payoffs, T-rays aren’t widely used because there isn’t a portable, cheap or easy way to make them.

Now Everitt and colleagues at Harvard University and MIT have invented a small, tunable T-ray laser that might help scientists tap into the terahertz band’s potential.

While most terahertz molecular lasers take up an area the size of a ping pong table, the new device could fit in a shoebox. And while previous sources emit light at just one or a few select frequencies, their laser could be tuned to emit over the entire terahertz spectrum, from 0.1 to 10 THz.

The laser’s tunability gives it another practical advantage, researchers say: the ability to adjust how far the T-ray beam travels. Terahertz signals don’t go very far because water vapor in the air absorbs them. But because some terahertz frequencies are more strongly absorbed by the atmosphere than others, the tuning capability of the new laser makes it possible to control how far the waves travel simply by changing the frequency. This might be ideal for applications like keeping car radar sensors from interfering with each other, or restricting wireless signals to short distances so potential eavesdroppers can’t intercept them and listen in.
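To see how tuning controls reach, consider simple exponential (Beer-Lambert) attenuation. In the sketch below, the attenuation values and link budget are hypothetical placeholders, since real terahertz absorption varies enormously with frequency and humidity:

```python
# Sketch of how frequency-dependent atmospheric absorption sets range.
# Attenuation values and the link budget are hypothetical placeholders,
# not measurements; real THz loss depends on frequency and humidity.
attenuation_db_per_km = {
    "low-loss transmission window": 10.0,
    "strongly absorbed water-vapor line": 10_000.0,
}
link_budget_db = 60.0  # assumed tolerable total absorption loss

for band, alpha in attenuation_db_per_km.items():
    max_range_m = link_budget_db / alpha * 1000.0
    print(f"{band}: roughly {max_range_m:,.0f} m of reach")
```

Hopping the laser from a transmission window onto a water-vapor line shrinks the usable range from kilometers to a few meters, which is the eavesdropping-resistance idea in a nutshell.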

Everitt and a team co-led by Federico Capasso of Harvard and Steven Johnson of MIT describe their approach this week in the journal Science. The device works by harnessing discrete shifts in the energy levels of spinning gas molecules when they’re hit by another laser emitting infrared light.

Their T-ray laser consists of a pencil-sized copper tube filled with gas, and a 1-millimeter pinhole at one end. A zap from the infrared laser excites the gas molecules within, and when the molecules in this higher energy state outnumber the ones in a lower one, they emit T-rays.

The team dubbed their gizmo the “laughing gas laser” because it uses nitrous oxide, though almost any gas could work, they say.
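To get a feel for why a single gas offers so many laser lines, treat the nitrous oxide molecule as a rigid rotor: its rotational transitions form a nearly evenly spaced ladder of frequencies, each a candidate laser line. A minimal sketch, using an approximate literature value for the rotational constant (a real molecule deviates somewhat from this idealized ladder):

```python
# Rigid-rotor estimate of nitrous oxide's rotational transition
# frequencies: f = 2 * B * (J + 1) for the transition J -> J+1.
# B ~ 12.56 GHz is an approximate spectroscopic value for N2O.
B_HZ = 12.56e9

for J in (9, 19, 29, 37):
    f_thz = 2 * B_HZ * (J + 1) / 1e12
    print(f"J={J} -> J={J+1}: ~{f_thz:.2f} THz")
```

Changing which rotational level the infrared pump laser excites moves the emission up and down this ladder, which is the sense in which the device is broadly tunable.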

Duke professor Henry Everitt and MIT graduate student Fan Wang and colleagues have invented a new laser that emits beams of light in the ‘terahertz gap,’ the no-man’s-land of the electromagnetic spectrum.

Everitt started working on terahertz laser designs 35 years ago as a Duke undergraduate in the mid-1980s, when a physics professor named Frank De Lucia offered him a summer job.

De Lucia was interested in improving special lasers called “OPFIR lasers” (optically pumped far-infrared lasers), which were the most powerful sources of T-rays at the time. They were too bulky for widespread use, and they relied on an equally unwieldy infrared laser, called a CO2 laser, to excite the gas inside.

Everitt was tasked with trying to generate T-rays with smaller gas laser designs. A summer gig soon grew into an undergraduate honors thesis, and eventually a Ph.D. from Duke, during which he and De Lucia managed to shrink the footprint of their OPFIR lasers from the size of an axe handle to the size of a toothpick.

But the CO2 lasers they were paired with were still cumbersome and dangerous, and each time researchers wanted to produce a different frequency, they needed to use a different gas. When more compact and tunable sources of T-rays came along, OPFIR lasers were largely abandoned.

Everitt would shelve the idea for another decade, until a better alternative to the CO2 laser came along: the quantum cascade laser, a compact infrared laser invented by Harvard’s Capasso that could be tuned to any frequency over a swath of the infrared spectrum.

By replacing the CO2 laser with Capasso’s laser, Everitt realized they wouldn’t need to change the laser gas anymore to change the frequency. He thought the OPFIR laser approach could make a comeback. So he partnered with Johnson’s team at MIT to work out the theory, then with Capasso’s group to give it a shot.

The team has moved to patent their design, but there is still a long way to go before it finds its way onto store shelves or into consumers’ hands. Nonetheless, the researchers — who couldn’t resist a laser joke — say the outlook for the technique is “very bright.”

This research was supported by the U.S. Army Research Office (W911NF-19-2-0168, W911NF-13-D-0001) and by the National Science Foundation (ECCS-1614631) and its Materials Research Science and Engineering Center Program (DMR-1419807).

CITATION: “Widely Tunable Compact Terahertz Gas Lasers,” Paul Chevalier, Arman Amirzhan, Fan Wang, Marco Piccardo, Steven G. Johnson, Federico Capasso, Henry Everitt. Science, Nov. 15, 2019. DOI: 10.1126/science.aay8683.

How Small is a Proton? Smaller Than Anyone Thought

The proton, that little positively-charged nugget inside an atom, is fractions of a quadrillionth of a meter smaller than anyone thought, according to new research appearing Nov. 7 in Nature.

Haiyan Gao of Duke Physics

In work they hope solves the contentious “proton radius puzzle” that has been roiling some corners of physics in the last decade, a team of scientists including Duke physicist Haiyan Gao has addressed the question of the proton’s radius in a new way and discovered that its radius is 0.831 femtometers, which is about 4 percent smaller than the best previous measurement using electrons from accelerators. (Read the paper!)

A single femtometer is 0.000000000000039370 inches, if that helps, or think of it as a millionth part of a billionth part of a meter. And the new radius is about 83 percent of that.

But this is a big — and very small — deal for physicists, because any precise calculation of energy levels in an atom will be affected by this measure of the proton’s size, said Gao, who is the Henry Newson professor of physics in Trinity College of Arts & Sciences.

Bohr model of Hydrogen. One proton, one electron, as simple as they come.

What the physicists actually measured is the radius of the proton’s charge distribution, but that charge is never a smooth, static sphere, Gao explained. The proton is made of still smaller bits, called quarks, that have their own charges, and those aren’t evenly distributed. Nor does anything sit still. So it’s kind of a moving target.

One way to measure a proton’s charge radius is to scatter an electron beam from the nucleus of an atom of hydrogen, which is made of just one proton and one electron. But the electron must only perturb the proton very gently to enable researchers to infer the size of the charge involved in the interaction. Another approach measures the difference between two atomic hydrogen energy levels. Past results from these two methods have generally agreed.
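Some textbook background on why the collision must be so gentle (this is the standard definition of the charge radius, not this experiment’s own analysis): at low momentum transfer $Q^2$, the proton’s electric form factor $G_E$ falls off at a rate set by the radius,

$$G_E(Q^2) \approx 1 - \frac{\langle r_p^2 \rangle}{6}\,Q^2 + \cdots, \qquad \langle r_p^2 \rangle = -6 \left. \frac{dG_E}{dQ^2} \right|_{Q^2 = 0},$$

so the gentler the collision (the smaller the momentum transfer), the more cleanly the slope, and hence the radius, can be read off.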

Artist’s conception of a very happy muon by Particle Zoo

But in 2010, an experiment at the Paul Scherrer Institute replaced the electron in a hydrogen atom with a muon, a much heavier and shorter-lived member of the electron’s particle family. The muon is still negatively charged like an electron, but it’s about 200 times heavier, so it can orbit much closer to the proton. Measuring the difference between muonic hydrogen energy levels, these physicists obtained a proton charge radius that is highly precise, but much smaller than the previously accepted value. And this started the dispute they’ve dubbed the “proton charge radius puzzle.”
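The muon’s advantage is also textbook scaling rather than anything exotic: a hydrogen-like atom’s Bohr radius shrinks in proportion to the orbiting particle’s (reduced) mass, and the energy shift caused by the proton’s finite size grows with the orbit’s overlap with the proton,

$$a \propto \frac{1}{m}, \qquad \frac{a_e}{a_\mu} \approx 186, \qquad \Delta E_{\text{size}} \propto |\psi(0)|^2 \,\langle r_p^2 \rangle \propto \frac{1}{a^3},$$

so muonic hydrogen is roughly $186^3 \approx 6$ million times more sensitive to the proton’s size than ordinary hydrogen (186 rather than 207 because the reduced mass is what matters).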

To resolve the puzzle, Gao and her collaborators set out to do a completely new type of electron scattering experiment with a number of innovations. And they looked at electron scattering from both the proton and the electron of the hydrogen atom at the same time. They also managed to get the beam of electrons scattered at near zero degrees, meaning it came almost straight forward, which enabled the electron beam to “feel” the proton’s charge response more precisely.

Voila, a 4-percent-smaller proton. “But actually, it’s much more complicated,” Gao said, in a major understatement.

The work was done at the Department of Energy’s Thomas Jefferson National Accelerator Facility in Newport News, Virginia, using new equipment supported by both the National Science Foundation and the Department of Energy, and some parts that were purpose-built for this experiment. “To solve the argument, we needed a new approach,” Gao said.

Gao said she has been interested in this question for nearly 20 years, ever since she became aware of two different values for the proton’s charge radius, both from electron scattering experiments. “Each one claimed about 1 percent uncertainty, but they disagreed by several percent,” she said.

And as always in modern physics, had the answer not worked out so neatly, it might have called into question parts of the Standard Model of particle physics. But alas, not this time.

“This is particularly important for a number of reasons,” Gao said. The proton is a fundamental building block of visible matter, and the energy level of hydrogen is a basic unit of measure that all physicists rely on.

The new measure may also help advance new insights into quantum chromodynamics (QCD), the theory of strong interaction in quarks and gluons, Gao said. “We really don’t understand how QCD works.”

“This is a very, very big deal,” she said. “The field is very excited about it. And I should add that this experiment would not have been so successful without the heroic contributions from our highly talented and hardworking graduate students and postdocs from Duke.”

This work was funded in part by the U. S. National Science Foundation (NSF MRI PHY-1229153) and by the U.S. Department of Energy (Contract No. DE-FG02-03ER41231), including contract No. DE-AC05-06OR23177 under which Jefferson Science Associates, LLC operates Thomas Jefferson National Accelerator Facility.

CITATION: “A Small Proton Charge Radius from an Electron-Proton Scattering Experiment,” W. Xiong, A. Gasparian, H. Gao, et al. Nature, Nov. 7, 2019. DOI: 10.1038/s41586-019-1721-2 (ONLINE)

Across the Atlantic: Caribbean Music and Diaspora in the UK

According to Professor Deonte Harris, many of us here in the U.S. have a fascination with Black music. But at the same time, we tend not to realize that it’s … well, Black music.

Harris, an International Comparative Studies professor at Duke, holds a freshly minted Ph.D. in Ethnomusicology from UCLA. At the moment, his research focuses especially on the practice and influence of Afro-Caribbean music and diaspora in London.

Deonte Harris, Ph.D.,
Assistant Professor of the Practice of the International Comparative Studies Program

He chose to conduct his research in the UK because of its large overseas Caribbean population and because he found that not much scholarship was dedicated to Black Europe. “It’s such a rich space to think about different historical entanglements that affect the lives and trajectories of Black people,” he explained.

Those entanglements include the legacies of colonialism, the Slave Trade, empire, and much more. It is important to note how deeply racialized these historical processes were.

For example, Harris found that a major shift in Black British music occurred in the 1950s due to anti-Black racism in England. Black individuals were not allowed to socialize in white spaces, so they formed community in their own way: through sound systems.

These sound systems originated in Jamaica and debuted in the UK in the postwar years. A sound system was an assembly of Black individuals, music, and sound equipment, typically in basements and warehouses, for the enjoyment of Black music and company. It became a medium through which a Black community could form in a racialized nation.

Notting Hill Carnival, London: An annual celebration of Black British culture.
Photo by Dominic Alves.

Today, Black British music has expanded greatly, but it remains rooted in sound systems.

While the formation of community has been positive, Harris explains that much of his research is a highly complex and often disheartening commentary on Blackness.

Blackness has been created as a category by dominant society: the white community, mostly colonizers. Black music became a thing only because of the push to otherize Black Britons; in many ways, Black culture exists only as an “other” in relation to whiteness. This raises a question of identity that Harris continues to examine: Who has the power to represent self?

In the U.S. especially, Black music is a crucial foundation to American popular music. But as in the UK, it finds its origins in community, folk traditions, and struggle. The industrial nature of the U.S. allows that struggle to be commercialized and disseminated across the globe, creating a sort of paradox. According to Harris, Black individuals must reconcile “being recognized and loved globally, but understanding that people still despise who you are.”

To conduct his research, Harris mostly engages in fieldwork. He spends a significant amount of time in London, engaging with Black communities and listening to live music. His analysis typically involves both sonic and situational elements.

But the most valuable part of Harris’ fieldwork, perhaps, is the community that he himself finds. “Ethnomusicology has for me been a very transformative experience,” he said. “It has helped me to create new global relationships with people — I consider myself now to have homes in several different places.”

By Irene Park

A Community Dedicated to the Highest Quality: Duke

When I was named to the new position of Vice President for Research at Duke last month, it was the culmination of an extensive process that examined how Duke performs all aspects of research. But rather than being seen as an end-point, I hope that everyone will view this as a beginning.

Lawrence Carin, Vice President for Research

This re-examination of Duke’s research was led by President Price and included many leaders from across Duke, including Provost Sally Kornbluth, Chancellor Eugene Washington, and the Dean of the School of Medicine, Mary Klotman. We also engaged the services of an External Advisory Panel that included Ann M. Arvin (chair), professor of pediatrics and microbiology and former vice provost and dean of research at Stanford University; Edward M. Stolper, William E. Leonhard Professor of Geology and former provost at Caltech; and Barry S. Coller, David Rockefeller Professor, physician-in-chief and vice president for medical affairs at Rockefeller University.

(L-R) Ann Arvin, Edward Stolper and Barry Coller

During the Advisory Panel’s visit to Duke, Dr. Coller made a comment that stuck with me. He said: “You can always tell the effectiveness of an organization by how anyone within it responds when they answer the phone and hear this request: ‘Please connect me to the quality department.’” At a high-functioning organization, Dr. Coller said, anyone fielding that query would say that they themselves are responsible for quality, as is everyone in the organization.

There is no separate department of quality; everyone is a member of the quality department. This, I think, is something that all of us working and studying at Duke should take to heart in everything we do, including all aspects of research. We need not only to be in charge of quality within our own research, but also to take responsibility, to each other and to the institution, for assuring that everything throughout the Duke research enterprise is done with the highest quality. That means excellence in all its forms, including ethics and integrity.

Nursing students work with assistant professor Rémi Hueckel.

My hope is that the Duke research enterprise will be characterized as a highly functioning operation. Toward that end, if you should see something that can be improved, seek to improve it. Often this can be done directly, by utilizing the agency we may have in a given situation. If the issue requires broader engagement, communicate with your School leadership, or with the Duke Office of Scientific Integrity (DOSI). Let’s work as a team.

Duke’s move to a Vice President for Research position, with oversight over all aspects of our research in both the School of Medicine and the schools and departments on the campus side, is a reflection of what many in the administration have been calling a “One Duke” philosophy.

WE NEED TO … TAKE THE RESPONSIBILITY TO EACH OTHER AND TO THE INSTITUTION, TO ASSURE THAT EVERYTHING THROUGHOUT THE DUKE RESEARCH ENTERPRISE IS DONE WITH THE HIGHEST QUALITY.

Lawrence Carin

We are increasingly one university, with the lines between different schools blurring, and when it comes to research, that is one of our great competitive advantages over peer institutions.

I hope we also can go beyond “One Duke” and emphasize “Our Duke.” While every member of the Duke research community must be a member of the Duke Research Quality Department, this is particularly true of our faculty. Faculty should not just feel that they are employees of Duke; as faculty, each of us has a leadership responsibility for stewardship of Duke as a whole.

Student Sabrina Tran studying algae at the Marine Lab.

Given the pressure faculty face to find and secure research funding and to publish high-quality research, it is natural that faculty tend to focus on their own relatively narrow part of Duke: their lab and students. While this is to be expected for junior faculty, we must expect something more from our senior faculty.

Senior Duke faculty have a responsibility to foster an environment dedicated to the highest quality of ethics and integrity, if for no other reason than that the actions of a tiny minority can impact the reputation of the entire university, as we’ve seen in a few isolated but high-profile examples that affected all of us individually, and our institution’s reputation nationally.

By Our Duke, I mean a culture that emphasizes that the research enterprise at Duke belongs to all of us, and all of us have a responsibility to care for it.

Barton Haynes, MD, Director of the Human Vaccine Institute (with gloves).

To help me with the expanded duties and responsibilities of being Vice President for Research, we have identified four faculty members who will be engaged on research challenges, helping me and my OVPR colleagues with direction and implementation.

Susan Alberts (Biology & Evolutionary Anthropology), Sonke Johnsen (Biology), Hashim Al-Hashimi (Biochemistry) and Andrew Muir (Division of Gastroenterology in Department of Medicine) have agreed to become regular members of the VPR team, meeting with me and other Duke leaders regularly.

We intend to pull in a larger group of Duke faculty to supplement and help us. Our goal is not to make every faculty member an administrator. Our goal is to have increasing faculty ownership of Duke’s overall research enterprise, providing inputs and guidance, and helping us set priorities.

To say everyone is a member of the Quality Department is easy; to make it happen is another thing. We will be engaging faculty extensively in this process. It is critical that we do this together, as a community dedicated to achieving the highest quality in everything we do at Duke.

I look forward to working with all of you to pursue this new vision of an integrated, caring, high-quality Duke research community.

A student in biomedical engineering.
Post by Lawrence Carin, Vice President for Research

Vulci 3000: Technology in Archaeology

This is Anna’s second post from a dig site in Italy this summer. Read the first one here.

Duke PhD Candidate Antonio LoPiano on Site

The ruins found at Vulci, once home to Etruscan and Roman cities, date back to before the 8th century B.C.E.

As archaeologists dig up the remains of these ancient civilizations, they are better able to understand how humans from the past lived their daily lives. The problem is, they can only excavate each site once.

No matter how careful the diggers are, artifacts and pieces of history can be destroyed in the process. Furthermore, excavations take a large amount of time, money and strenuous labor to complete. As a result, it’s important to carefully choose the location.

Map of the Vulci Landscape Created Using GIS Technology

In response to these challenges, Maurizio Forte decided to supplement the excavation of ancient Vulci sites with innovative, non-invasive technologies.

Considering that it once housed entire cities, Vulci is an extremely large site. To optimize excavation time, money, and resources, Forte used these technologies to pinpoint the most important urban areas of the site. He and his team also used remote sensing, which allowed them to interpret the site prior to digging.

Georadar Imaging
Duke Post Doc Nevio Danelon Gathering Data for Photogrammetry

Having decided where on the site to look, the team was then able to digitally recreate both the landscape and the excavation trench in 3D. This allowed them to preserve the site in its entirety and uncover the history that lay below. Maps of the landscape were created using Web-GIS (Geographic Information Systems) and then combined with 3D models created using photogrammetry to develop a realistic model of the site.

Forte decided to make the excavation entirely paperless. All “paperwork” on site is done on tablets. There is also an onsite lab that analyzes all of the archaeological discoveries and archives them into a digital inventory.

This unique combination of archaeology and technology allows Forte and his team to study, interpret and analyze the ancient Etruscan and Roman cities beneath the ground of the site in a way that has never been done before. He is able to create exact models of historic artifacts, chapels and even entire cities that could otherwise be lost for good.

3D Model Created Using Photogrammetry

Forte also thinks it is important to share what is uncovered with the public. One way he is doing this is through integrating the excavation with virtual reality applications.

I’m actually on site with Forte and the team now. One of my responsibilities is to take photos with the Insta360x, which is compatible with the OculusGo, allowing people to experience what it’s like to be in the trench with virtual reality. The end goal is to create interactive applications that could be used by museums or individuals.

Ultimately, this revolutionary approach to archaeology brings to light new perspectives on historical sites and utilizes innovative technology to better understand discoveries made in excavations.

By: Anna Gotskind ’22

Vulci 3000: A High-Tech Excavation

This summer I have the incredible opportunity to work with the Vulci 3000 Bass Connections team. The project focuses on combining archaeology and innovative technology to excavate and understand an ancient Etruscan and Roman site. Over the next several weeks I will be writing a series of articles highlighting the different parts of the excavation. This first installment recounts the history of the project and what we plan to accomplish in Vulci.

Covered in tall grasses and grazing cows, the Vulci Archaeology Park is hard to imagine as anything more than beautiful countryside. In reality, however, it was home to one of the largest and most important cities of ancient Etruria. In fact, it was one of the biggest cities of the 1st millennium BCE on the entire Italian peninsula. Buried under the ground are the incredible remains of Iron Age, Etruscan, Roman, and Medieval settlements.

Duke’s involvement with the Vulci site began in 2015, when Maurizio Forte, the William and Sue Gross Professor of Classical Studies, Art, Art History, and Visual Studies, visited the site. What made the site so unique was that most of it was untouched.

One of the perils of archaeology is that any site can only be physically excavated once, and it is inevitable that some parts will be damaged regardless of how careful the team is. Vulci presented a unique opportunity. Because much of the site was still undisturbed, Forte could utilize innovative technology to create digital landscapes that could be viewed in succession as the site was excavated. This would allow him and his team to revisit the site at each stage of excavation. In 2015 he applied for his first permit to begin researching the Vulci site.

In 2016, Forte created a Bass Connections project titled Digital Cities and Polysensing Environments. That summer the team ventured to Italy to begin surveying the Vulci site. Because Vulci is such a large site, it would have taken too much time and money to excavate the whole city. Instead, Forte and his team decided to find the most important spots to excavate. They did this by combining remote sensing data and procedural modeling to analyze the various layers underground. They collected data using magnetometry and ground-penetrating radar, and used drones to capture aerial photography of the site.

These technologies allowed the team to locate the urban areas of the site through the discovery of large buildings and streets revealed by the aerial photographs, radiometrically-calibrated orthomaps, and 3D point cloud/mesh models.

Anne-Lise Baylé Cleaning a Discovered Artifact on Site

The project continued into 2017 and 2018, with a team returning to the site each summer to excavate. Within the trench, archaeologists ranging from undergrads to postdocs dug, scraped and brushed for months to discover what lay beneath the surface. As they began to uncover rooms, pottery, coins, and even a cistern, groups outside the trench continued to use advanced technology to collect data and improve the understanding of the site.

Nevio Danelon Releasing a Drone

One unit focused on drone sensing to digitally create multispectral imagery as well as high-resolution elevation models. This allowed them to use soil and crop marks to better interpret and classify the archaeological features.

By combining traditional archaeology and innovative technology the team has been able to more efficiently discover important, ancient artifacts and analyze them in order to understand the ancient Etruscan and Roman civilizations that once called Vulci their home.

Photo Taken Using the Insta360 Camera in “Planet” Mode

This year, archaeologists return to the site to continue excavation. As another layer of Vulci is uncovered, students and faculty will use technology like drones, photogrammetry, geophysical prospections and GIS to document and interpret the site. We will also be using a 360 camera to capture VR-compatible content for the OculusGo, in order to allow anybody to visit Vulci virtually.

By Anna Gotskind

800+ Teams Pitched Their Best Big Ideas. With Your Help, This Duke Team Has a Chance to Win

A Duke University professor says the time is ripe for new research on consciousness, and he needs your help.

More than 800 teams pitched their best “big ideas” to a competition sponsored by the National Science Foundation (@NSF) to help set the nation’s long-term research agenda. Only 33 are still in the running for the grand prize, and a project on the science of consciousness led by Duke artificial intelligence expert Vincent Conitzer is among them!

You can help shape the NSF’s research questions of the future by watching Conitzer’s video pitch and submitting your comments on the importance and potential impact of the ideas at https://nsf2026imgallery.skild.com/entries/theory-of-conscious-experience.

But act fast. The public comment period ends Wednesday, June 26. Winners will be announced and prizes awarded by October 2019. Stay tuned.

Watch all the video pitches until June 26 at nsf2026imgallery.skild.com.

Overdiagnosis and the Future of Cancer Medicine

For many years, the standard strategy for fighting against cancer has been to find it early with screening when the person is still healthy, then hit it with a merciless treatment regimen to make it go away.

But not all tumors will become life-threatening cancers. Many, in fact, would have caused no issues for the rest of the patients’ lives had they not been found by screening. These cases belong to the category of overdiagnosis, one of the chief complaints against population-level screening programs.

Scientists are reconsidering the way to treat tumors because the traditional hit-it-hard approach has often caused the cancer to seemingly go away, only to have a few cells survive and the entire tumor roar back later with resistance to previously effective medicine.

Dr. Marc Ryser, the professor who gave this meaty talk

In his May 23 talk to Duke Population Health, “Cancer Overdiagnosis: A Discourse on Population Health, Biologic Mechanism and Statistics,” Marc Ryser, an assistant professor in Duke’s Departments of Population Health Sciences and Mathematics, walked us through how parallel developments across different disciplines have been reshaping our cancer battle plan. He said the effort to understand the true prevalence of overdiagnosis is a point of focus in this shift.

Past to Future: the changing cancer battle plan
Credit: Marc Ryser, edit: Brian Du

Ryser started with the longstanding biological theory behind how tumors develop. Under the theory of clonal sweeps, a relatively linear progression of successive key mutations sweeps through the tumor, giving it increasing versatility until it is clinically diagnosed by a doctor as cancer.

Clonal sweeps model: each shade is a new clone that introduces a mutation. Credit: Sievers et al. 2016

With this as the underpinning model, the battle plan of screen early, treat hard (point A) makes sense because it would be better to break the chain of progression early rather than later when the disease is more developed and much more aggressive. So employing screening extensively across the population for the various types of cancer is the sure choice, right?

But the data at the population level for many different categories of cancers doesn’t support this view (point B). Excluding cervical cancer and colorectal cancer, which have benefited greatly from screening interventions, the incidence of advanced cases of breast cancer and other cancers has stayed at similar levels or actually continued to increase during the years of screening interventions. This has raised the question of when screening is truly the best option.

Scientists are thinking now in terms of a “benefit-harm balance” when mass-screening public health interventions are carried out. Overdiagnosis would pile up on the harms side, because it introduces unnecessary procedures that are associated with adverse effects.

Thinking this way would be a major adjustment, and it has brought with it major confusion.

Paralleling this recent development on the population level, new biological understanding of how tumors develop has also introduced confusion. Scientists have discovered that tumors are more heterogeneous than the clonal sweeps model would make it appear. Within one tumor, there may be many different subpopulations of cancer cells, of varying characteristics and dangerousness, competing and coexisting.

Additional research has since suggested a more complex model, grounded in evolution and ecology, known as the Big Bang-neutral evolution model. Instead of a “stepwise progression from normal to increasingly malignant cells with the acquisition of successive driver mutations, some cancers appear to evolve more like a Big Bang, where the malignant ability is already concentrated in the founder cell,” Ryser said.

As the first cell starts to replicate, its descendants evolve in parallel into different subpopulations expressing different characteristics. While more research has been published in favor of this model, some scientists remain skeptical.

Ryser’s research contributes to this ongoing discussion. In comparing the patterns by which mutations are present or absent in cancerous and benign tumors, he obtained results favoring the Big Bang-neutral evolution model. Rather than seeing a neat region of mutation within the tumor, which would align with the clonal sweeps model, he saw mutations dispersed throughout the tumor, like newborn stars spreading in the wake of the Big Bang.

How to think about mutations within a tumor. Credit: NASA

The more complicated Big Bang-neutral evolution model justifies an increasingly nuanced approach to cancer treatment that has been developing in the past few years. Known as precision medicine (point C), its goal is to provide the best treatment available to a person based on their unique set of characteristics: genetics, lifestyle, and environment. As cancer medicine evolves with this new paradigm, when to screen will remain a key question, as will the benefit-harm balance.

There’s another problem, though: Overdiagnosis is incredibly hard to quantify. In fact, it is by nature impossible to measure directly. That’s where another area of Ryser’s research comes in: he is working to accurately model overdiagnosis in order to estimate its extent and impact.
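One way to get intuition for how modeling can estimate something unobservable is to simulate individual life histories and count the tumors that would never have surfaced. The toy Monte Carlo below sketches that general idea; every rate in it is invented for illustration, and none comes from Ryser’s models:

```python
import random

random.seed(42)

# Toy Monte Carlo of overdiagnosis: a screen-detected tumor counts as
# "overdiagnosed" if the person would have died of other causes before
# the tumor ever surfaced clinically. All rates below are invented for
# illustration; they are not parameters from Ryser's work.
N = 100_000
overdiagnosed = detected = 0

for _ in range(N):
    screen_age = 65                              # a single screening round
    death_age = random.gauss(mu=80, sigma=10)    # death from other causes
    onset_age = random.uniform(40, 90)           # tumor onset
    sojourn = random.expovariate(1 / 8)          # years until symptoms

    present_at_screen = onset_age < screen_age < onset_age + sojourn
    if present_at_screen and death_age > screen_age:
        detected += 1
        if onset_age + sojourn > death_age:      # never would have surfaced
            overdiagnosed += 1

print(f"overdiagnosed fraction among screen-detected: {overdiagnosed/detected:.1%}")
```

A real model is far richer and is calibrated against registry and trial data, but the core bookkeeping, comparing when a tumor would surface clinically against when the person would die of other causes, is the essence of estimating what cannot be directly observed.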

Going forward, his research goal is to understand how to bring these different scales together to best understand overdiagnosis. Considering it in the context of the multiscale developments he described in his talk may be the key to better understanding it.

Post by Brian Du

When policy changes opportunities — intentionally and unintentionally

Assistant professor Deondra Rose researches the intersection between political history and policymaking. Photo from Duke Sanford School of Public Policy.

The intent of the National Defense Education Act of 1958 was not to expand rights for women in higher education. But it happened.

That’s something Duke Sanford assistant professor Deondra Rose called “accidental egalitarianism” in her talk at The Regulator Bookshop in Durham on Jan. 29. Rose discussed citizenship from the lens of historical policy research.

During the 1950s, the Soviet Union and the United States were in fierce competition to be the first country in space. This was the push the US government needed to start putting more funding towards higher education for a greater subset of the population. The 1944 GI Bill was the beginning of government-funded need-based financial aid for higher education, but until this point, aid had only been given to white men.

In Congress at this time, southern Democrats did not want to pass any legislation that would affect segregation, but at the same time, policymakers also needed to produce a policy that would be approved by northerners, who would not pass any policy that appeared discriminatory. They made the wording of the National Defense Education Act intentionally vague to please both sides, and as a result it greatly expanded provisions for scholarships and loans to all kinds of students in higher education.

Much of professor Rose’s talk, entitled “Citizens by Degree” after her book, was centered around breaking down the idea of citizenship into different degrees of afforded opportunities. “First class citizens” — usually white Americans — are generally afforded all of the rights that come with being an American citizen without opposition, she said. Second class citizens, usually minorities and women, can miss out on opportunities for advancement afforded to others because of their minority status.

Rose also discussed how we can redefine the implications of certain terms, such as “welfare state,” to be used positively. Government assistance is not simply temporary assistance to needy families, families with children or food stamps; it also includes Pell Grants and need-based financial aid. Similarly, “regulation” sometimes carries negative connotations, but Title IX can be thought of as “regulation” that ensures women equal access in higher education.

Photo from Annual White House Summit on Historically Black Colleges and Universities (whitehouse.gov).

Rose’s latest research focuses on the relationship between policy, citizenship and education, and her next book is about historically black colleges and universities (HBCUs). In her political history research, she found that Title III of the Higher Education Act is all about HBCUs, and the wording of the act suggests we as a nation ought to support and prize these institutions.

Rose wants to learn more about the role the US government has played in empowering HBCUs, and the role of HBCUs in restructuring political power — for example, 60 percent of black judges in the US have at least one degree from an HBCU.

At one point in history, the obstacle to higher education for second class citizens was access, then affordability, but have those two obstacles been completely overcome? What are new obstacles to higher education?

Rose believes that policies have the power to reshape politics by reshaping citizens, and we must keep finding and tackling obstacles to higher education.

By Victoria Priester

Understanding the Universe, Large and Small

From the minuscule particles underlying matter, to vast amounts of data from the far reaches of outer space, Chris Walter, a professor of physics at Duke, pursues research into the great mysteries of the universe, from the infinitesimal to the infinite.

Chris Walter is a professor of physics

As an undergraduate at the University of California at Santa Cruz, he thought he would become a theoretical physicist, but while continuing his education at the California Institute of Technology (Caltech), he found himself increasingly drawn to experimental physics, deriving knowledge of the universe by observing its phenomena.

Neutrinos — minuscule particles emitted during radioactive decay — captured his attention, and he began work with the KamiokaNDE (Kamioka Nucleon Decay Experiment, now typically written as Kamiokande) at the Kamioka Observatory in Hida, Japan. Buried deep underground in an abandoned mine to shield the detectors from cosmic rays, and filled with water, Kamiokande offered Walter an opportunity to test a long-supposed but still unproven hypothesis: that neutrinos were massless.

Recalling one of his most striking memories from his time in the lab, he described observing and finding answers in Cherenkov light, a ‘sonic boom’ of light. Sonic booms are created by breaking the sound barrier in air. Light, however, travels more slowly through a medium like water than through a vacuum, so a sufficiently energetic charged particle can move through water faster than light does there, emitting Cherenkov light. Walter described it as a ring of light bursting out of the darkness.
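For the curious, the textbook Cherenkov relations behind that ring are

$$v > \frac{c}{n}, \qquad \cos\theta_c = \frac{1}{n\beta}, \qquad \beta = \frac{v}{c},$$

and in water, with index of refraction $n \approx 1.33$, a particle with $\beta$ near 1 radiates at $\theta_c \approx 41^\circ$, a cone of light that the detector’s walls record as the ring Walter described.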

In his time at the Kamioka Observatory, he was part of groundbreaking research on the mass of neutrinos. Long thought to be massless, neutrinos were discovered by the Kamiokande collaboration to oscillate, changing from flavor to flavor, which indicated that, contrary to popular belief, they have mass. Seventeen years later, in 2015, the leader of his team, Takaaki Kajita, would be co-awarded the Nobel Prize in Physics, citing research from their collaboration.

Chris Walter (left) and his Duke physics collaborator and partner, Kate Scholberg (right), on a lift inside the Super-Kamiokande neutrino detector.

The neutrinos Walter studied originated from cosmic rays striking the atmosphere, but soon another mystery from the cosmos captured his attention.

“If you died and were given the chance to know the answer to just one question,” he said, “for me, it would be, ‘What is dark energy?’”

Observations made in the 1990s, as Walter was concluding his time at the Kamioka Observatory, found that the expansion of the universe was accelerating. The nature of the dark energy causing this accelerating expansion remained unknown to scientists, and it offered a new course of study in the field of astrophysics.

Walter has recently joined the Large Synoptic Survey Telescope (LSST) project, a 10-year, 3D survey of the entire visible sky that will gather over 20 terabytes of data nightly and detect thousands of changes in the night sky, observing asteroids, galaxies, supernovae, and other astronomical phenomena. With new machine learning techniques and supercomputing methods to process the vast quantities of data, the LSST offers incredible new opportunities for understanding the universe.
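For a sense of scale, here is a quick back-of-envelope total; the nightly figure comes from the paragraph above, while the number of observing nights per year is an assumption:

```python
# Rough total data volume for the survey. 20 TB/night is from the article;
# 300 usable observing nights per year is an assumption for illustration.
tb_per_night = 20
nights_per_year = 300
years = 10

total_pb = tb_per_night * nights_per_year * years / 1000
print(f"roughly {total_pb:.0f} petabytes over the decade-long survey")
```

Tens of petabytes is far more than astronomers can inspect by hand, which is why the machine learning and supercomputing methods mentioned above are central to the project.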

To Walter, this is the next big step for research into the nature of dark energy and the great questions of science.

A rendering of the Large Synoptic Survey Telescope. (Note the naked humans for scale)

Guest Post by Thomas Yang, NCSSM 2019

