Following the people and events that make up the research community at Duke

Category: Chemistry

“Do No Harm to Whom?” Challenge Trials & COVID-19

Illustration by Davide Bonazzi / Salzmanart

Imagine: you wake on a chilly November morning, alarm blaring, for your 8:30 am class. You toss aside the blankets and grab your phone. Shutting the alarm off reveals a Washington Post notification. But this isn’t your standard election headline. You almost drop your phone in shock. It can’t be, you think. This is too good to be true. It’s not — a second later, you get a text from the SymMon app, notifying you of your upcoming appointment in the Bryan Center.

A vaccine for COVID-19 is finally available, and you’re getting one.

This scenario could be less far-fetched than one might think: the Centers for Disease Control and Prevention has told officials to prepare for a vaccine as soon as November 1st. To a country foundering under the economic and social effects of COVID-19, this comes as incredible news, a bright spot on a bleak horizon. But to make a vaccine a reality that quickly, traditional phase 3 clinical trials may not be enough, and some researchers have proposed human challenge trials to speed things up. What are challenge trials? Should they be used? What’s at stake, and what are the ethical implications of the path we choose?

At Duke Science and Society’s “Coronavirus Conversations: The Science and Ethics of Human Challenge Trials for COVID-19” on Aug. 24, Kim Krawiec of the Duke School of Law posed these and other questions to three experts in medicine, public health, and bioethics.

Dr. Marc Lipsitch, Director of the Center for Communicable Disease Dynamics at the Harvard School of Public Health, began by comparing traditional phase 3 trials and challenge trials. 

In both kinds of trials, vaccines are tested for their “safety and ability to provoke an immune response” in phases 1 and 2. In phase 3 trials, large numbers (typically thousands or tens of thousands) of individuals are randomly assigned either the vaccine being tested or a placebo. Scientists observe how many vaccinated individuals become infected compared to participants who received a placebo. This information enables scientists to assess the efficacy — as well as rarer side effects — of the vaccine. 
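For a concrete sense of that comparison, here is a minimal sketch (not from the panel) of the standard efficacy arithmetic behind such a readout: efficacy is one minus the ratio of the attack rate in the vaccinated arm to the attack rate in the placebo arm. All of the counts below are hypothetical.

```python
# Minimal sketch of the standard phase 3 efficacy arithmetic:
# VE = 1 - (attack rate in vaccine arm) / (attack rate in placebo arm).
# All counts below are hypothetical, not from any trial discussed at the panel.

def vaccine_efficacy(cases_vaccine: int, n_vaccine: int,
                     cases_placebo: int, n_placebo: int) -> float:
    """Return estimated vaccine efficacy as a fraction."""
    attack_rate_vaccine = cases_vaccine / n_vaccine
    attack_rate_placebo = cases_placebo / n_placebo
    return 1 - attack_rate_vaccine / attack_rate_placebo

# Hypothetical 30,000-person trial split evenly between the two arms.
ve = vaccine_efficacy(cases_vaccine=10, n_vaccine=15_000,
                      cases_placebo=50, n_placebo=15_000)
print(f"Estimated vaccine efficacy: {ve:.0%}")  # prints "Estimated vaccine efficacy: 80%"
```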

Marc Lipsitch

In challenge trials, instead of waiting for participants to encounter the virus on their own, small numbers of low-risk individuals are deliberately infected in order to more directly study the efficacy of vaccine and treatment candidates. Though none are underway yet, the advocacy group 1Day Sooner has built a list of more than 35,000 volunteers willing to participate.

Dr. Cameron Wolfe, an Infectious Disease Specialist, Associate Professor of Medicine, and Clinical Expert in Respiratory and Infectious Disease at the Duke Medical School, provided an overview of the current vaccine landscape.

Cameron Wolfe

There are currently at least 150 potential vaccine candidates, ranging from preclinical to approved stages of development. Two vaccines, developed by Russia’s Gamaleya Research Institute and China’s CanSinoBIO, have skipped phase 3, but Dr. Wolfe considers them little more than an idiosyncrasy, as there is “minimal clarity about their safety and efficacy.” Three more vaccines of interest — Moderna’s mRNA vaccine, Pfizer’s mRNA vaccine, and Oxford and AstraZeneca’s adenovirus vaccine — are all in phase 3 trials with around 30,000 enrollees. Scientists will be watching for a “meaningful infection and a durable immune response.”

Dr. Nir Eyal, the Henry Rutgers Professor of Bioethics and Director of The Center for Population-Level Bioethics at Rutgers University, explained how challenge trials could fit into the vaccine roadmap.

According to Dr. Eyal, challenge trials would most likely be combined with phase 3 trials. One possibility is using challenge trials to weed out weaker vaccine candidates before they move on to more expensive phase 3 trials. Additionally, if a phase 3 trial fails to produce meaningful results about efficacy, a challenge trial could be used to obtain that information while safety data continue to be collected from the more comprehensive phase 3 trial.

Nir Eyal

Dr. Eyal emphasized the importance of challenge trials for expediting the arrival of a vaccine. According to his own calculations, getting a vaccine — and making it widely available — just one month sooner would avert the loss of 720,000 years of life and 40 million years of poverty, mostly concentrated in the developing world. (Dr. Eyal stressed that his estimate is extremely conservative, as it neglects many factors, including deaths caused when the pandemic disrupts childhood vaccination, cancer care and malaria treatment.) Therefore, speed is of “great humanitarian value.”

Dr. Wolfe added that because phase 3 trials rely on a lot of transmission, if the US gets better at mitigating the virus, “the distinction between protective efficacy and simple placebo will take longer to see.” A challenge study, however, is “always a well defined time period… you can anticipate when you’ll get results.” 

The panelists then discussed the ethics of challenge trials in the absence of effective treatment — as Krawiec put it, “making people sick without knowing if we can make them better.”

Dr. Wolfe pointed to the flu, citing challenge trials that have been conducted even though current treatments are not uniformly effective (“Tamiflu is no panacea”). He then conceded that the biggest challenge is not a lack of effective therapies, but the current inability to “say to a patient, ‘you will not have a severe outcome.’ It varies so much from person to person, I guess.” (See one troubling example of that variance.)

Dr. Eyal acknowledged the trouble of informed consent when the implications are scarcely known, but argued that “in extraordinary times, business as usual is no longer the standard.” He asserted that if people volunteer with full understanding of what they are committing to, there is no reason to assume they are less informed than when making other decisions where the outcome is as yet unknown. 

Dr. Lipsitch compared this to the military: “we are not cheating if we cannot provide a roadmap of future wars because they are not yet known to us.” Rather, we commend brave soldiers (and hope they come home safe). 

Furthermore, Dr. Eyal asserted that “informed consent is not a comprehensive understanding of the disease,” lest much of the epidemiological research from the 1970s be called into question too. Instead, volunteers should be considered informed as long as they comprehend questions like, “‘we can’t give you an exact figure yet; do you understand?’”

Agreeing, Dr. Wolfe said that when critics of challenge trials ask, “Isn’t your mission to do no harm?” he responds, “Do no harm in regards to whom?” “Who is in front of you matters,” Dr. Wolfe confirmed, “that’s why we put up safeguards. But as clinicians it can be problematic [to stop there]. It’s not just about the patient, but to do no harm in regards to the broader community.”

The experts then discussed what they’d like to see in challenge trials.

Dr. Wolfe said he’d like to see challenge trials carried out with a focus on immunology components, side effect profiles, and a “barrage” of biological safety and health standards for hospitals and facilities. 

Dr. Eyal stated the need for strict eligibility criteria (limiting enrollment to young adults, perhaps ages 20-25, with no risk factors), a “high high high” quality of informed consent ideally involving a third party, and access to therapies and critical care for all volunteers, even those without insurance.

Dr. Lipsitch stressed the scientific importance of assessing participants from a “virological, not symptom bent.” He mused that the issue of viral inoculum was a thorny one — should scientists “titrate down” to where many participants won’t get infected and more volunteers will be needed overall? Or should scientists keep it concentrated, and contend with the increased risk? 

Like many questions pondered during the hour — from the ideal viral strain to use to the safest way to collect information about high risk patients — this one remained unanswered. 

So don’t mark November 1st on your calendar just yet. But if you do get that life-changing notification, there’s a chance you’ll have human challenge trials to thank.

Post by Zella Hanson

Researchers created a tiny circuit through a single water molecule, and here’s what they found

Graphic by Limin Xiang, Arizona State University

Many university labs may have gone quiet amid coronavirus shutdowns, but faculty continue to analyze data, publish papers and write grants. In this guest post from Duke chemistry professor David Beratan and colleagues, the researchers describe a new study showing how water’s ability to shepherd electrons can change with subtle shifts in a water molecule’s 3-D structure:

Water, the humble combination of hydrogen and oxygen, is essential for life. Despite its central place in nature, relatively little is known about the role that single water molecules play in biology.

Researchers at Duke University, in collaboration with Arizona State University, Pennsylvania State University and the University of California-Davis, have studied how electrons flow through water molecules, a process crucial for the energy-generating machinery of living systems. The team discovered that the way water molecules cluster on solid surfaces enables them to be either strong or weak mediators of electron transfer, depending on their orientation. The team’s experiments show that water is able to adopt a higher- or a lower-conducting form, much like the electrical switch on your wall. They were able to shift between the two structures using large electric fields.

In a previous paper published fifteen years ago in the journal Science, Duke chemistry professor David Beratan predicted that water’s mediation properties in living systems would depend on how the water molecules are oriented.

Water assemblies and chains occur throughout biological systems. “If you know the conducting properties of the two forms for a single water molecule, then you can predict the conducting properties of a water chain,” said Limin Xiang, a postdoctoral scholar at University of California, Berkeley, and the first author of the paper.

“Just like the piling up of Lego bricks, you could also pile up a water chain with the two forms of water as the building blocks,” Xiang said.

In addition to discovering the two forms of water, the authors found that water can change its structure at high voltages: when the voltage is large, water switches from a high- to a low-conducting form. It may even be possible that this switching gates the flow of electron charge in living systems.

This study marks an important first step toward engineering synthetic water structures that could help make electrical contact between biomolecules and electrodes. In addition, the research may help reveal nature’s strategies for maintaining appropriate electron transport through water molecules and could shed light on diseases linked to oxidative damage.

The researchers dedicate this study to the memory of Prof. Nongjian (NJ) Tao.

CITATION: “Conductance and Configuration of Molecular Gold-Water-Gold Junctions Under Electric Fields,” Limin Xiang, Peng Zhang, Chaoren Liu, Xin He, Haipeng B. Li, Yueqi Li, Zixiao Wang, Joshua Hihath, Seong H. Kim, David N. Beratan and Nongjian Tao. Matter, April 20, 2020. DOI: 10.1016/j.matt.2020.03.023

Guest post by David Beratan and Limin Xiang

Big SMILES All Around for Polymer Chemists at Duke, MIT and Northwestern

Science is increasingly asking artificial intelligence machines to help us search and interpret huge collections of data, and it’s making a difference.

But unfortunately, polymer chemistry — the study of large, complex molecules — has been hampered in this effort because it lacks a crisp, coherent language to describe molecules that are not tidy and orderly.

Think nylon. Teflon. Silicone. Polyester. These and other polymers are what chemists call “stochastic”: they’re assembled from predictable building blocks and follow a finite set of attachment rules, but can be very different in the details from one strand to the next, even within the same polymer formulation.

Plastics, love ’em or hate ’em, they’re here to stay.
Photo: Mathias Cramer/temporealfoto.com

Chemistry’s old stick and ball models and shorthand chemical notations aren’t adequate for a long molecule that can best be described as a series of probabilities that one kind of piece might be in a given spot, or not.

Polymer chemists searching for new materials for medical treatments or plastics that won’t become an environmental burden have been somewhat hampered by using a written language that looks like long strings of consonants, equal signs, brackets, carets and parentheses. It’s also somewhat equivocal, so the polymer Nylon-6-6 ends up written like this: 

{<C(=O)CCCCC(=O)<,>NCCCCCCN>}

Or this,

{<C(=O)CCCCC(=O)NCCCCCCN>}

And when we get to something called ‘concatenation syntax,’ matters only get worse.  
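To get a feel for how machine-readable these strings are, here is a small illustrative sketch in Python (not part of any published BigSMILES toolchain) that pulls the repeat units out of a bracketed “stochastic object” like the Nylon-6-6 strings above. It assumes curly braces delimit the object, ‘<’ and ‘>’ mark connection points, and commas separate alternative repeat units, a simplification of the full grammar.

```python
import re

# Illustrative sketch only (not the published BigSMILES tools): extract the
# repeat-unit fragments from a stochastic object such as
# "{<C(=O)CCCCC(=O)<,>NCCCCCCN>}". Assumes curly braces delimit the stochastic
# object, "<" and ">" are bonding descriptors, and commas separate repeat
# units -- a simplification of the full grammar defined in the BigSMILES paper.

def repeat_units(stochastic_object: str) -> list:
    match = re.fullmatch(r"\{(.*)\}", stochastic_object.strip())
    if match is None:
        raise ValueError("expected a stochastic object wrapped in { }")
    fragments = match.group(1).split(",")
    # Strip the bonding descriptors from the ends of each fragment.
    return [frag.strip("<>") for frag in fragments if frag.strip("<>")]

print(repeat_units("{<C(=O)CCCCC(=O)<,>NCCCCCCN>}"))
# -> ['C(=O)CCCCC(=O)', 'NCCCCCCN']
```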

Stephen Craig, the William T. Miller Professor of Chemistry, has been a polymer chemist for almost two decades and he says the notation language above has some utility for polymers. But Craig, who now heads the National Science Foundation’s Center for the Chemistry of Molecularly Optimized Networks (MONET), and his MONET colleagues thought they could do better.

Stephen Craig

“Once you have that insight about how a polymer is grown, you need to define some symbols that say there’s a probability of this kind of structure occurring here, or some other structure occurring at that spot,” Craig says. “And then it’s reducing that to practice and sort of defining a set of symbols.”

Now he and his MONET colleagues at MIT and Northwestern University have done just that, resulting in a new language – BigSMILES – that’s an adaptation of the existing language called SMILES (simplified molecular-input line-entry system). They think it can reduce the hugely combinatorial problem of describing polymers down to something even a dumb computer can understand.

And that, Craig says, should enable computers to do all the stuff they’re good at – searching huge datasets for patterns and finding needles in haystacks.

The initial heavy lifting was done by MONET members Prof. Brad Olsen and his co-worker Tzyy-Shyang Lin at MIT, who conceived of the idea and developed the set of symbols and the syntax together. Now polymers, their constituent building blocks and their variety of linkages might be described like this:

Examples of bigSMILES symbols from the recent paper

It’s certainly not the best reading material for us and it would be terribly difficult to read aloud, but it becomes child’s play for a computer.

Members of MONET spent a couple of weeks trying to stump the new language with the weirdest polymers they could imagine, which turned up the need for a few more parts to the ‘alphabet.’ But by and large, it holds up, Craig says. They also threw a huge database of polymers at it and it translated them with ease.

“One of the things I’m excited about is how the data entry might eventually be tied directly to the synthetic methods used to make a particular polymer,” Craig says. “There’s an opportunity to actually capture and process more information about the molecules than is typically available from standard characterizations. If that can be done, it will enable all sorts of discoveries.”

BigSMILES was introduced to the polymer community by an article in ACS Central Science last week, and the MONET team is eager to see the response.

“Can other people use it and does it work for everything?” Craig asks. “Because polymer structure space is effectively infinite.” Which is just the kind of thing you need Big Data and machine learning to address. “This is an area where the intersection of chemistry and data science can have a huge impact,” Craig says.

Love at First Whiff

Many people turn to the Internet to find a Mr. or Ms. Right. But lemurs don’t have to cyberstalk potential love interests to find a good match — they just give them a sniff.

A study of lemur scents finds that an individual’s distinctive body odor reflects genetic differences in their immune system, and that other lemurs can detect these differences by smell.

Smell check: Fritz the ring-tailed lemur sniffs a tree for traces of other lemurs’ scents. Photo by David Haring, Duke Lemur Center.

From just one whiff, these primates are able to tell which prospective partners have immune genes different from their own. The ability to sniff out mates with different immune genes could make their offspring’s immune systems more diverse and able to fight more pathogens, said first author Kathleen Grogan, who did the research while working on her Ph.D. with professor Christine Drea at Duke University.

The results appeared online August 22 in the journal BMC Evolutionary Biology.

Lemurs advertise their presence by scent marking — rubbing stinky glands against trees to broadcast information about their sex, kin, and whether they are ready to mate.

Lemurs can tell whether a mate’s immune genes are a good genetic match by the scents they leave behind. Photo by David Haring, Duke Lemur Center

For the study, Grogan, Drea and colleagues collected scent secretions from roughly 60 lemurs at the Duke Lemur Center, the Indianapolis Zoo, and the Cincinnati Zoo. The team used a technique called gas chromatography-mass spectrometry to tease out the hundreds of compounds that make up each animal’s signature scent.

They also analyzed the lemurs’ DNA, looking for differences within a cluster of genes called MHC that help trigger the body’s defenses against foreign invaders such as bacteria and viruses.

Their tests reveal that the chemical cocktail lemurs emit varies depending on which MHC types they carry.

To see if potential mates can smell the difference, the researchers presented lemurs with pairs of wooden rods smeared with the bodily secretions of two unfamiliar mates and observed their responses. Within seconds, the animals were drawn to the smells wafting from the rods, engaging in a frenzy of licking, sniffing, or rubbing their own scents on top.

In 300 trials, the team found that females paid more attention to the scents of males whose immune genes differed from their own.

MHC genes code for proteins that help the immune system recognize foreign invaders and distinguish “friend” from “foe.” Since different genetic versions respond to different sets of foreign substances, Grogan said, sniffing out genetically dissimilar mates produces offspring more capable of fighting a broad range of pathogens.

Just because females spent more time checking out the scents of dissimilar males doesn’t necessarily mean they’re more likely to have kids together, Grogan said. Moving forward, she and her colleagues plan to use maternity and paternity DNA test results from wild lemurs living in the Beza Mahafaly Reserve in Madagascar to see if lemur couples are more different in their MHC types than would be expected by chance.

Similar results have been found in humans, but this is the first time the ability to sniff out partners based on their immune genes has been shown in such distant primate kin, said Grogan, who is currently a postdoctoral fellow at Pennsylvania State University.

“Growing evidence suggests that primates rely on olfactory cues way more than we thought they did,” Grogan said. “It’s possible that all primates can do this.”

This research was supported by the National Science Foundation (BCS #0409367, IOS #0719003), the National Institutes of Health (F32 GM123634–01), and the Duke University Center for Science Education.

CITATION: “Genetic Variation at MHC class II Loci Influences Both Olfactory Signals and Scent Discrimination in Ring-Tailed Lemurs,” Kathleen E. Grogan, Rachel L. Harris, Marylène Boulet, and Christine M. Drea. BMC Evolutionary Biology, August 22, 2019. DOI: 10.1186/s12862-019-1486-0

Post by Robin A. Smith

Nature Shows a U-Turn Path to Better Solar Cells

The technical-sounding category of “light-driven charge-transfer reactions” becomes more familiar to non-physicists when you just call it photosynthesis or solar electricity.

When a molecule (in a leaf or solar cell) is hit by an energetic photon of light, it first absorbs the little meteor’s energy, generating what chemists call an excited state. This excited state then almost immediately (within trillionths of a second) shuttles an electron away to a charge acceptor to lower its energy. That transfer of charge is what drives plant life and photovoltaic current.

A 20-megawatt solar farm (Aerial Innovations via Wikimedia Commons)

The energy of the excited state plays an important role in determining solar energy conversion efficiency. That is, the more of that photon’s energy that can be retained in the charge-separated state, the better. For most solar-electric devices, the excited state rapidly loses energy, resulting in less efficient devices.
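For a back-of-the-envelope sense of the stakes (this arithmetic is not taken from the paper): a green photon near the peak of the solar spectrum, with a wavelength of about 550 nanometers, carries an energy of

$$ E = \frac{hc}{\lambda} \approx \frac{1240\ \text{eV}\cdot\text{nm}}{550\ \text{nm}} \approx 2.3\ \text{eV}. $$

Every fraction of that 2.3 electron volts lost to excited-state relaxation before the charge separates is energy the device can never recover as photovoltage.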

But what if there were a way to create even more energetic excited states from that incoming photon?

Using a very efficient photosynthesizing bacterium as their inspiration, a team of Duke chemists that included graduate students Nick Polizzi and Ting Jiang, and faculty members David Beratan and Michael Therien, synthesized a “supermolecule” to help address this question.

“Nick and Ting discovered a really cool trick about electron transfer that we might be able to adapt to improving solar cells,” said Michael Therien, the William R. Kenan, Jr. Professor of Chemistry. “Biology figured this out eons ago,” he said.

“When molecules absorb light, they have more energy,” Therien said. “One of the things that these molecular excited states do is that they move charge. Generally speaking, most solar energy conversion structures that chemists design feature molecules that push electron density in the direction they want charge to move when a photon is absorbed. The solar-fueled microbe, Rhodobacter sphaeroides, however, does the opposite. What Nick and Ting demonstrated is that this could also be a winning strategy for solar cells.”

Ting Jiang
Nick Polizzi

The chemists devised a clever synthetic molecule that shows the advantages of an excited state that pushes electron density in the direction opposite to where charge flows. In effect, this allows more of the energy harvested from a photon to be used in a solar cell. 

“Nick and Ting’s work shows that there are huge advantages to pushing electron density in the exact opposite direction where you want charge to flow,” Therien said in his top-floor office of the French Family Science Center. “The biggest advantage of an excited state that pushes charge the wrong way is it stops a really critical pathway for excited state relaxation.”

“So, in many ways it’s a Rube Goldberg-like conception,” Therien said. “It is a design strategy that’s been maybe staring us in the face for several years, but no one’s connected the dots like Nick and Ting have here.”

In a July 2 commentary for the Proceedings of the National Academy of Sciences, Bowling Green State University chemist and photoscientist Malcolm D. E. Forbes calls this work “a great leap forward” and says it “should be regarded as one of the most beautiful experiments in physical chemistry in the 21st century.”

Here’s a schematic from the paper.
(Image by Nick Polizzi)

CITATION: “Engineering Opposite Electronic Polarization of Singlet and Triplet States Increases the Yield of High-Energy Photoproducts,” Nicholas Polizzi, Ting Jiang, David Beratan, Michael Therien. Proceedings of the National Academy of Sciences, June 10, 2019. DOI: 10.1073/pnas.1901752116 Online: https://www.pnas.org/content/early/2019/07/01/1908872116

Don’t Drink the Tap

Have you ever questioned the quality of the water you drink every day? Or worried that cooking with tap water might be dangerous? For most of us, the answer to these questions is probably no. However, students from a Bass Connections team at Duke say we may want to think otherwise.


From bottle refilling stations to the tap, drinking water is so habitual and commonplace that we often take it for granted. Only in moments of crisis do we start worrying about what’s in the water we drink daily. The reality is that safe drinking water isn’t accessible for a lot of people.

Pig waste discoloring lagoon water

Images like this hog farm lagoon motivated the Bass Connections project team DECIPHER to take a closer look at the quality of water in North Carolina. On April 16 at Motorco Music Hall, they presented concerning findings from three case studies examining lead contamination, coal ash impoundments, and aging infrastructure.

Motorco in Durham. The talk was inside, though.

Nadratun Chowdhury, a Ph.D. student in Civil and Environmental Engineering, investigated lead contamination in water. Lead is an abundant and corrosion-resistant material, making it appealing for use in things like paint, batteries, faucets and pipes. While we’ve successfully removed lead from paint and gasoline, a lot of old water pipes in use today are still fashioned from lead. That’s not good – lead is very toxic and can leach into the water.

Just how toxic is it? A blood lead concentration above fifty parts per billion – about fifty drops of water in a giant Olympic swimming pool – is considered dangerous. According to Duke graduate student Aaron Reuben, this much lead in one’s blood is correlated with downward social mobility, serious health problems, a diminished capacity to regulate thoughts and emotions, and hyperactivity. Lower-income and minority communities are more at risk because their residents are more likely to live in older, contaminated homes.

Rupanjali Karthik, a Master of Laws student, conducted research on the intersection of water and aging infrastructure in Orange County. Breaks in water pipes are common and can have serious consequences, like the loss of 9 million gallons of drinkable water. Sometimes it takes 8 or 9 months just to find the location of a broken pipe. In 2018, a water main break at UNC-Chapel Hill caused a huge shortage on campus and at the medical center.

Excess fluoridation is another issue caused by aging infrastructure. In February 2017, a combination of human and machine error sent excessive fluoride concentrations out of an Orange County water treatment plant. People were advised not to use their water even to shower, a UNC basketball game had to be relocated, and stores were completely swept of bottled water.

Another issue is that the chemicals used for fluoridation can contain arsenic, a known carcinogen. We definitely don’t want that in our drinking water. And fluoridation isn’t even that necessary these days, when toothpaste and mouthwash already support our dental health.

Tommy Lin, an undergraduate studying Chemistry and Computer Science, topped off the group’s presentation with findings surrounding coal ash in Belmont, NC. Coal ash, the residue after coal is burned in power plants, can pollute rivers and seep into ground water, affecting domestic wells of neighboring communities. This creates a cocktail of highly concentrated heavy metals and carcinogens. Drinking it can cause damage to your nervous system, cancer, and birth defects, among other things. Not so great.

The group’s presentation.

Forty-five plastic water bottles. That’s how much water it takes Laura, a Belmont resident, to cook Thanksgiving dinner for her mid-sized family. She knows that number because it’s been her family’s routine for the past three years. The Allen Steam Station is a big culprit in polluting the area’s water with coal ash. Many homes near the station, like Laura’s, have been told not to use their tap water. You can find these homes stockpiled with case after case of plastic water bottles.

These issues aren’t apparent to most people unless they have been directly affected. Lead, aging infrastructure and coal ash all pose real threats, but they are also largely invisible problems. Kathleen Burns, a Ph.D. student in English, notes that only in moments of crisis will people start to care, and by then it may be too late.

So, what can people do? Not much, according to the Bass Connections team. They noted that providing clean water is very much a structural problem that will require complex, systemic steps to solve. So, for now, you may want to go buy a Brita.

Post by Will Sheehan

Teaching a Machine to Spot a Crystal


Not all protein crystals exhibit the colorful iridescence of these crystals grown in space. But no matter their looks, all are important to scientists. Credit: NASA Marshall Space Flight Center (NASA-MSFC).

Protein crystals don’t usually display the glitz and glam of gemstones. But no matter their looks, each and every one is precious to scientists.

Patrick Charbonneau, a professor of chemistry and physics at Duke, along with a worldwide group of scientists, teamed up with researchers at Google Brain to use state-of-the-art machine learning algorithms to spot these rare and valuable crystals. Their work could accelerate drug discovery by making it easier for researchers to map the structures of proteins.

“Every time you miss a protein crystal, because they are so rare, you risk missing on an important biomedical discovery,” Charbonneau said.

Knowing the structure of proteins is key to understanding their function and possibly designing drugs that work with their specific shapes. But the traditional approach to determining these structures, called X-ray crystallography, requires that proteins be crystallized.

Crystallizing proteins is hard — really hard. Unlike the simple atoms and molecules that make up common crystals like salt and sugar, these big, bulky molecules, which can contain tens of thousands of atoms each, struggle to arrange themselves into the ordered arrays that form the basis of crystals.

“What allows an object like a protein to self-assemble into something like a crystal is a bit like magic,” Charbonneau said.

Even after decades of practice, scientists have to rely in part on trial and error to obtain protein crystals. After isolating a protein, they mix it with hundreds of different types of liquid solutions, hoping to find the right recipe that coaxes them to crystallize. They then look at droplets of each mixture under a microscope, hoping to spot the smallest speck of a growing crystal.

“You have to manually say, there is a crystal there, there is none there, there is one there, and usually it is none, none, none,” Charbonneau said. “Not only is it expensive to pay people to do this, but also people fail. They get tired and they get sloppy, and it detracts from their other work.”


The machine learning software searches for points and edges (left) to identify crystals in images of droplets of solution. It can also identify when non-crystalline solids have formed (middle) and when no solids have formed (right).

Charbonneau thought that deep learning software, which can now recognize individual faces in photographs even when they are blurry or caught from the side, might also be able to identify the points and edges that make up a crystal in solution.

Scientists from both academia and industry came together to collect half a million images of protein crystallization experiments into a database called MARCO. The data specify which of these protein cocktails led to crystallization, based on human evaluation.

The team then worked with a group led by Vincent Vanhoucke from Google Brain to apply the latest in artificial intelligence to help identify crystals in the images.

After “training” the deep learning software on a subset of the data, they unleashed it on the full database. The A.I. was able to accurately identify crystals about 95 percent of the time. Estimates show that humans spot crystals correctly only 85 percent of the time.

“And it does remarkably better than humans,” Charbonneau said. “We were a little surprised because most A.I. algorithms are made to recognize cats or dogs, not necessarily geometrical features like the edge of a crystal.”
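For a flavor of what such an image classifier looks like, here is a minimal sketch in PyTorch. It is not the model the Google Brain team built; the label set, image size, and network architecture are assumptions chosen purely for illustration.

```python
# Minimal, illustrative sketch of a crystal-image classifier (NOT the Google
# Brain model used in the study). The label set, image size, and architecture
# are assumptions for illustration only.
import torch
import torch.nn as nn

CLASSES = ["crystals", "precipitate", "clear", "other"]  # assumed label set

class CrystalClassifier(nn.Module):
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # pool each feature map to a single value
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: a batch of RGB droplet images, shape (N, 3, H, W)
        features = self.features(x).flatten(1)
        return self.classifier(features)

model = CrystalClassifier()
dummy_batch = torch.randn(8, 3, 128, 128)   # stand-in for microscope images
logits = model(dummy_batch)                 # shape (8, 4)
predicted = [CLASSES[i] for i in logits.argmax(dim=1).tolist()]
print(predicted)
```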

Other teams of researchers have already asked to use the A.I. model and the MARCO dataset to train their own machine learning algorithms to recognize crystals in protein crystallization experiments, Charbonneau said. These advances should allow researchers to focus more time on biomedical discoveries instead of squinting at samples.

Charbonneau plans to use the data to understand how exactly proteins self-assemble into crystals, so that researchers rely less on chance to get this “magic” to happen.

“We are trying to use this data to see if we can get more insight into the physical chemistry of self-assembly of proteins,” Charbonneau said.

CITATION: “Classification of crystallization outcomes using deep convolutional neural networks,” Andrew E. Bruno, et al. PLOS ONE, June 20, 2018. DOI: 10.1371/journal.pone.0198883

 

Post by Kara Manke

Looking at Cooking as a Science Experiment

From five-star restaurants to Grandma’s homemade cookies, cooking is an art that has transformed the way we taste food. But haven’t you ever wondered how cooking works? How in the world did people figure out how to make Dippin’ Dots or Jell-O?

Patrick Charbonneau is an Associate Professor of Chemistry here at Duke and last Friday he gave a delicious talk about the science of cooking (with samples!).

Patrick Charbonneau, Duke Chemist and Foodie

Around 10,000 years ago, humans discovered that fermenting milk turns it into yogurt, something that is more transportable, lasts longer and is easier to digest. In the 1600s, a new cooking apparatus called the “bone digester” (the pressure cooker) let cooks prepare food faster while enhancing its flavor. In the 1800s, a scientist named Eben Horsford discovered that combining an acid with sodium bicarbonate creates baking powder. Soon enough, scientific and culinary minds began to collaborate, and new creations appeared in the culinary world. As you can see, many of the fundamental cooking techniques and ingredients we use today are products of scientific discovery.

Old-school pressure cookers. Forerunners of the Instant Pot.

Whisked Toffee

Freezer toffee, AKA caramel

A huge part of cooking is controlling the transformation of matter, or “a change in phase.” Professor Charbonneau presented a very cool example demonstrating how controlling this phase shift can affect your experience eating something. He made the same toffee recipe twice, but he changed it slightly as the melted toffee mixture was cooling. One version you stick straight in the freezer; the other you whisk as it cools. The whisked version turns out crumbly and sweeter; the other one turns into a chewy, shiny caramel. The audience got samples, and I could easily tell how different each version looked and tasted.

Charbonneau explained that while both toffees have the same ingredients, most people prefer the crumbly one because it seems sweeter (I agreed). This is because the chewier one takes longer to dissolve onto your taste buds, so your brain registers it as less sweet.

I was fascinated to learn that a lot of food is mostly just water. It’s weird to think a solid thing could be made of water, yet some foods are up to 99% water and still elastic! We have polymers — long repeating patterns of atoms in a chain — to thank for that. In fact, you can turn almost any liquid into a gel. Polymers take up little space but play a vital role in not only foods but other everyday objects, like contact lenses.

Charbonneau also showed us a seemingly magical way to make cake. He took about half a Dixie cup of cake batter, stuck a whipping siphon charged with nitrous oxide inside it for a second, then threw it in the microwave for thirty seconds. Boom, easy as cake. Out came a cup full of some pretty darn good fluffy chocolate cake. The gas bubbles in the butter and egg batter expand when they are heated up, causing the batter to gel and form a solid network.

Professor Charbonneau is doing stuff like this in his class here at Duke, “The Chemistry and Physics of Cooking,” all the time.

In the past ten years, a surge of science-of-cooking classes has emerged. The experiments you can do in a kitchen-lab are so cool, and they can make science appealing to those who might normally shy away from it.

Another cool thing I learned at the stations outside Charbonneau’s talk was that Dippin’ Dots are made by dripping melted ice cream into a bowl of liquid nitrogen. The nitrogen is so cold that it flash-freezes each droplet into a little ball!

Post by Will Sheehan


Stretchable, Twistable Wires for Wearable Electronics

A new conductive “felt” carries electricity even when twisted, bent and stretched. Credit: Matthew Catenacci

The exercise-tracking power of a Fitbit may soon jump from your wrist and into your clothing.

Researchers are seeking to embed electronics such as fitness trackers and health monitors into our shirts, hats, and shoes. But no one wants stiff copper wires or silicon transistors deforming their clothing or poking into their skin.

Scientists in Benjamin Wiley’s lab at Duke have created a new conductive “felt” that can be easily patterned onto fabrics to create flexible wires. The felt, composed of silver-coated copper nanowires and silicone rubber, carries electricity even when bent, stretched and twisted, over and over again.

“We wanted to create wiring that is stretchable on the body,” said Matthew Catenacci, a graduate student in Wiley’s group.

The conductive felt is made of stacks of interwoven silver-coated copper nanowires filled with a stretchable silicone rubber (left). When stretched, felt made from more pliable rubber is more resilient to small tears and holes than felt made of stiffer rubber (middle). These tears can be seen as small cavities in the felt (right). Credit: Matthew Catenacci

To create a flexible wire, the team first sucks a solution of copper nanowires and water through a stencil, creating a stack of interwoven nanowires in the desired shape. The material is similar to the interwoven fibers that comprise fabric felt, but on a much smaller scale, said Wiley, an associate professor of chemistry at Duke.

“The way I think about the wires are like tiny sticks of uncooked spaghetti,” Wiley said. “The water passes through, and then you end up with this pile of sticks with a high porosity.”

The interwoven nanowires are heated to 300 F to melt the contacts together, and then silicone rubber is added to fill in the gaps between the wires.

To show the pliability of their new material, Catenacci patterned the nanowire felt into a variety of squiggly, snaking patterns. Stretching and twisting the wires up to 300 times did not degrade the conductivity.

The material maintains its conductivity when twisted and stretched. Credit: Matthew Catenacci

“On a larger scale you could take a whole shirt, put it over a vacuum filter, and with a stencil you could create whatever wire pattern you want,” Catenacci said. “After you add the silicone, so you will just have a patch of fabric that is able to stretch.”

Their felt is not the first conductive material that displays the agility of a gymnast; flexible wires made of silver microflakes exhibit a similar set of properties. But the new material has the best performance of any such material so far, and at a much lower cost.

“This material retains its conductivity after stretching better than any other material with this high of an initial conductivity. That is what separates it,” Wiley said.

CITATION: “Stretchable Conductive Composites from Cu-Ag Nanowire Felt,” Matthew J. Catenacci, Christopher Reyes, Mutya A. Cruz and Benjamin J. Wiley. ACS Nano, March 14, 2018. DOI: 10.1021/acsnano.8b00887

Post by Kara Manke

MRI Tags Stick to Molecules with Chemical “Velcro®”


In the new technique, MRI chemical tags attach to a target molecule and nothing else – kind of like how Velcro only sticks to itself. Credit: tanakawho, via Flickr.

Imagine attaching a beacon to a drug molecule and following its journey through our winding innards, tracking just where and how it interacts with the chemicals in our bodies to help treat illnesses.

Duke scientists may be closer to doing just that. They have developed a chemical tag that can be attached to molecules to make them light up under magnetic resonance imaging (MRI).

This tag or “lightbulb” changes its frequency when the molecule interacts with another molecule, potentially allowing researchers to both locate the molecule in the body and see how it is metabolized.

“MRI methods are very sensitive to small changes in the chemical structure, so you can actually use these tags to directly image chemical transformations,” said Thomas Theis, an assistant research professor in the chemistry department at Duke.

Chemical tags that light up under MRI are not new. In 2016, a Duke team from Warren S. Warren’s and Qiu Wang’s labs created molecular lightbulbs for MRI that burn brighter and longer than any previously discovered.


Junu Bae and Zijian Zhou, the co-first authors of the paper. Credit: Qiu Wang, Duke University.

In a study published March 9 in Science Advances, the researchers report a new method for attaching the tags indirectly, allowing them to label a broader range of molecules than they could before.

“The tags are like lightbulbs covered in Velcro,” said Junu Bae, a graduate student in Qiu Wang’s lab at Duke. “We attach the other side of the Velcro to the target molecule, and once they find each other they stick.”

This reaction is what researchers call bioorthogonal, which means that the tag will only stick to the molecular target and won’t react with any other molecules.

And the reaction was designed with another important feature in mind — it generates a rare form of nitrogen gas that also lights up under MRI.

“One could dream up a lot of potential applications for the nitrogen gas, but one that we have been thinking about is lung imaging,” Theis said.

Currently the best way to image the lungs is with xenon gas, but this method has the downside of putting patients to sleep. “Nitrogen gas would be perfectly safe to inhale because it is what you inhale in the air anyways,” Theis said.


In the new technique, a type of molecule called a tetrazine is hyperpolarized, making it “light up” under MRI (illustrated on the left). It is then attached to a target molecule through what is called a bioorthogonal reaction. The reaction also generates a rare form of nitrogen gas that can be spotted under MRI (illustrated on the right). Credit: Junu Bae and Seoyoung Cho, Duke University.

Other applications could include watching how air flows through porous materials or studying the nitrogen fixation process in plants.

One downside to the new tags is that they don’t shine as long or as brightly as other MRI molecular lightbulbs, said Zijian Zhou, a graduate student in Warren’s lab at Duke.

The team is tinkering with the formula for polarizing, or lighting up, the molecule tags to increase their lifetime and brilliance, and to make them more compatible with chemical conditions in the human body.

“We are now developing new techniques and new procedures which may be helpful for driving the polarization levels even higher, so we can have even better signal for these applications,” Zhou said.

CITATION: “15N4-1,2,4,5-tetrazines as potential molecular tags: Integrating bioorthogonal chemistry with hyperpolarization and unearthing para-N2,” Junu Bae, Zijian Zhou, Thomas Theis, Warren S. Warren and Qiu Wang. Science Advances, March 9, 2018. DOI: 10.1126/sciadv.aar2978

Post by Kara Manke

