Following the people and events that make up the research community at Duke

Author: Shariar Vaez-Ghaemi

Hidden in Plain Sight: The Growing Role of Computation in Science

One of downtown Durham’s most memorable landmarks, the Chesterfield Building looks like it was designed to maintain the country’s morale during World War II. On the former cigarette factory’s roof rests a brilliant red sign that’s visible from miles away.

But don’t mistake the building’s quaint exterior for antiquity: the Chesterfield Building is home to one of the nation’s most powerful quantum computers. Managed by the Duke Quantum Center, the computer is part of Duke’s effort to bolster the Scalable Quantum Computing Laboratory (SQLab).

On February 2nd, the lab’s director – Christopher Monroe – joined engineering professor Michael Reiter and English professor Charlotte Sussman in a Research Week panel to discuss the growing presence of computation at Duke and in research institutions across the country. (View the panel.)

Chris Monroe

Monroe opened by detailing the significance of quantum computing in the modern world. He explained that quantum mechanics is governed by two golden rules: first, quantum objects are waves and can exist in superposition; second, the first rule applies only when those objects are not being measured.

One direct consequence of quantum mechanics is that an electron can occupy two orbits at the same time, a property that revolutionizes computing. Quantum computers can factor numbers exponentially faster than classical computers, converge to more desirable solutions in optimization problems, and have been shown to bolster research in fields like biomolecular modeling.
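As a rough intuition for Monroe’s two rules, here is a toy classical simulation of a single qubit. This is purely an illustrative sketch: a real quantum computer does not work by sampling random numbers, and nothing here comes from the SQLab itself.

```python
import random

# Toy single-qubit simulator illustrating the two golden rules:
# (1) an unmeasured qubit is a wave and can sit in superposition,
# (2) measuring it forces a definite classical outcome.

SQRT_HALF = 2 ** -0.5

def hadamard(state):
    """Rule 1: rotate a qubit into (or out of) an equal superposition."""
    a, b = state
    return (SQRT_HALF * (a + b), SQRT_HALF * (a - b))

def measure(state, rng=random.random):
    """Rule 2: measurement collapses the wave to 0 or 1,
    with probability given by the squared amplitude."""
    a, _ = state
    return 0 if rng() < abs(a) ** 2 else 1

qubit = (1.0, 0.0)        # starts definitely in state |0>
qubit = hadamard(qubit)   # now a 50/50 superposition of |0> and |1>
outcome = measure(qubit)  # collapses to a single classical bit
```

Repeating the measurement on freshly prepared qubits yields 0 about half the time and 1 the other half, which is the behavior the unmeasured wave alone can never show.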

Still, Monroe insists that the future reach of quantum computing is beyond anyone’s current understanding. Says Monroe, “quantum computing is an entirely new way of dealing with information, so we don’t know all the application areas it will touch.” What we do know, he says, is that quantum computers are poised to take over where conventional computers and Moore’s Law leave off.

While Monroe discussed computing innovations, Michael Reiter – James B. Duke Professor of Computer Science and Electrical and Computer Engineering – demonstrated the importance of keeping computing systems safe. Pointing to the 2010 Stuxnet attack, which sabotaged Iranian nuclear centrifuges, and the 2017 Equifax data breach, which exposed the records of 148 million people, Reiter showed that modern data systems are vulnerable and attractive targets for cyber warfare.

Michael Reiter

To show the interdisciplinary responsibilities associated with the nation’s cybersecurity needs, Reiter posed two questions to the audience. First, what market interventions are appropriate to achieve more accountability for negligence in cybersecurity defenses? Second, what are the rules of war as they relate to cyber warfare and terrorism?

After Reiter’s presentation, Charlotte Sussman transitioned the conversation from the digital world to the maritime world. A professor of English at Duke, Sussman has always been interested in ways to both memorialize and understand the Middle Passage, the route slave-trading ships took across the Atlantic from Africa to the Americas. Through the University’s Bass Connections and Data+ research programs, she and a group of students were able to approach this problem through the unlikely lens of data science.

Sussman explained that her Data+ team used large databases to find which areas of the Atlantic Ocean had the highest mortality rates during the slave trade, while the Bass Connections team looked at a single journey to understand one young migrant’s path to the bottom of the sea.

Professor Sussman (second from right), and the Bass Connections/Data+ Team.

Monroe, Reiter, and Sussman all showed that the applications of computing are growing without bound. Both the responsibility to improve computing infrastructures and the ability to leverage computing resources are rapidly expanding to new fields, from medicine and optimization to cybersecurity and history.

With so many exciting paths for growth, one point about the future of computing is clear: it will exceed anyone’s wildest expectations. Be prepared to find computing in academia, business, government, and any other setting that depends on advanced information processing.

Many of these areas, like the Chesterfield Building, will probably see the impact of computing before you know it.

Post by Shariar Vaez-Ghaemi, Class of 2025

Opening the Black Box: Duke Researchers Discuss Bias in AI

Artificial intelligence has not only inherited many of the strongest capabilities of the human brain; it has also proven able to use them more efficiently and effectively. Object recognition, map navigation, and speech translation are just a few of the many skills that modern AI programs have mastered, and the list will not stop growing anytime soon.

Unfortunately, AI has also magnified one of humanity’s least desirable traits: bias. In recent years, algorithms influenced by bias have often caused more problems than they sought to fix.

When Google’s image recognition AI was found to be classifying some Black people as gorillas in 2015, the only consolation for those affected was that AI is improving at a rapid pace, and thus, incidents of bias would hopefully begin to disappear. Six years later, when Facebook’s AI made virtually the exact same mistake by labeling a video of Black men as “primates,” both tech fanatics and casual observers could see a fundamental flaw in the industry.

Jacky Alciné’s tweet exposing Google’s racist AI algorithm enraged thousands in 2015.


On November 17th, 2021, two hundred Duke alumni living in all corners of the world – from Pittsburgh to Istanbul and everywhere in between – assembled virtually to learn about the future of algorithms, AI, and bias. The webinar, which was hosted by the Duke Alumni Association’s Forever Learning Institute, gave four esteemed Duke professors a chance to discuss their views on bias in the artificial intelligence world.

Dr. Stacy Tantum, Bell-Rhodes Associate Professor of the Practice of Electrical and Computer Engineering, was the first to address instances of racial bias in image classification systems. According to Tantum, early facial recognition did not work well for people with darker skin tones because the underlying training data – the observations that inform the model’s learning process – did not broadly represent all skin tones. She also stressed the importance of model transparency, noting that if an engineer treats an AI as a “black box” – a decision-making process that does not need to be explained – then they cannot reasonably assert that the AI is unbiased.
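Tantum’s point about unrepresentative training data can be made concrete with a minimal audit sketch. The groups and numbers below are hypothetical, invented for illustration; the idea is simply that a model can look accurate overall while failing badly for one group, which only per-group reporting reveals.

```python
# Minimal bias-audit sketch: compare a model's accuracy across groups
# instead of reporting one aggregate number.

def accuracy(pairs):
    """Fraction of (predicted, actual) pairs that match."""
    return sum(p == a for p, a in pairs) / len(pairs)

def audit_by_group(results):
    """results maps group name -> list of (predicted, actual) pairs.
    Returns per-group accuracy so disparities are visible."""
    return {group: accuracy(pairs) for group, pairs in results.items()}

# Hypothetical predictions from a face-recognition model.
results = {
    "group_a": [(1, 1)] * 95 + [(0, 1)] * 5,   # model is right 95% of the time
    "group_b": [(1, 1)] * 60 + [(0, 1)] * 40,  # model is right only 60% of the time
}
report = audit_by_group(results)
```

Here the aggregate accuracy is 77.5%, which hides the fact that one group is served far worse, exactly the failure mode a “black box” view of the model would never surface.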

Stacy Tantum, who has introduced case studies on ethics to students in her Intro to Machine Learning class, stresses the importance of teaching about bias in AI classrooms.

While Tantum emphasized the importance of supervising how algorithms are built, Dr. David Hoffman – Steed Family Professor of the Practice of Cybersecurity Policy at the Sanford School of Public Policy – explored how algorithmic explainability intersects with privacy. He pointed to the emergence of regulatory legislation in other countries that ensures restrictions, accountability, and supervision over the use of personal data in cybersecurity applications. Said Hoffman, “If we can’t answer the privacy question, we can’t put appropriate controls and protections in place.”

Turning to the implications of blurry privacy regulations, Dr. Manju Puri – J.B. Fuqua Professor of Finance at the Fuqua School of Business – examined how the big data feeding modern AI algorithms impacts each person’s digital footprint. Puri noted that data about a person’s phone usage patterns can be used by banks to decide whether that person should receive a loan. “People who call their mother every day tend to default less, and people who walk the same path every day tend to default less.” She contends that the biggest question is how to behave in a digital world where every action can be used against us.

Dr. Philip Napoli has observed behaviors in the digital world for several years as James R. Shepley Professor of Public Policy at the Sanford School, focusing specifically on the self-reinforcing cycles of social media algorithms. He contends that Facebook’s algorithms, in particular, reward content that makes people angry, which motivates news organizations and political parties to post galvanizing content that will sweep through the feeds of millions. His work shows that AI algorithms can influence not only the behavior of individuals but also that of massive organizations.

At the end of the panel, there was one firm point of agreement among all the speakers: AI is tremendously powerful. Hoffman even contended that there is a risk associated with not using artificial intelligence, which has proven to be a revolutionary tool in healthcare, finance, and security, among other fields. However, while AI has proven immensely impactful, it is not guaranteed to do good in every use case; as shown by failed image recognition platforms and racist healthcare algorithms that affected millions of Black people, AI can also be incredibly harmful.

Thus, while many in the AI community dream of a world where algorithms can be an unquestionable force for good, the underlying technology has a long way to go. What stands between the status quo and that idealistic future is not more data or more code, but less bias in data and code.

Post by Shariar Vaez-Ghaemi, Class of 2025


Science Behind the Scenes: Get To Know a Zebrafish Lab Technician

It’s 7:30 on a Sunday morning, and Mark McDonough is making a very familiar journey by a very unfamiliar mode. With light rain pelting down on his gelled hair, he’s walking the two-mile trek from East Campus to West Campus. The C1 bus doesn’t run until 8:30 a.m. on weekends, and his job is simply too important to wait for Duke-provided transportation.

Since his third week as a freshman, Mark has held the position of Lab Technician at the Duke University School of Medicine Zebrafish Core Facilities. Through the job, which he earned via the university’s work-study program, Mark has had the opportunity to make his college experience more affordable while completing the behind-the-scenes work that keeps the university’s labs running.

Upon arriving at work each morning, Mark spends anywhere from thirty minutes to an hour cleaning the filters on the fish tanks, after which he removes waste and feeds the fish. These tasks are just a small slice of his duties as a lab technician, but without them, a majority of his assigned fish would die before their biological characteristics could be fully measured.

As a freshman, Mark McDonough (pictured) has had the opportunity to work in a lab that does cutting-edge research.

Mark’s day-to-day responsibilities are similar to those of many lab technicians. Hundreds of Duke’s affiliated research labs make use of living subjects and biological specimens, which must be sheltered, fed, and closely monitored. Without lab technicians, studies involving these subjects could yield invalid or misleading results.

Mark’s supervisor, Z-Core Facilities Manager Karina Olivieri, fully understands the importance of the three lab technicians in her five zebrafish facilities. Says Olivieri, technicians ensure the “health of the fish and quality of their water so that researchers can collect measurements and make valuable insights.” As the demand for zebrafish grows on Duke’s campus, she expects the number of lab technician roles to grow as well. This trend will likely not be unique to Duke.

The zebrafish’s see-through embryo, rapid life cycle, and well-documented genome make it a “model organism” for biological experiments.

Nationwide, demand for lab technicians has accelerated at many of the largest research corporations and academic institutions. According to the Foundation for Biomedical Research, almost every U.S. drug must pass through animal testing in order to reach FDA approval, meaning that larger numbers of living specimens must be maintained as the pharmaceutical industry grows. The rising presence of these experimental subjects may be why the Bureau of Labor Statistics reports that lab technician roles are growing at a rate of 11%, faster than the national average for STEM occupations.

Though lab technicians don’t present work at prestigious conferences or see their names printed at the top of cutting-edge research articles, their work is pivotal for ensuring that innovative research can be conducted at Duke and beyond. So in the unlikely event that you recognize a passing stranger as a lab technician, make sure to thank them for their service to the Duke community. They keep the university’s vibrant research scene – and its fish – thriving.

Post by Shariar Vaez-Ghaemi, Class of 2025

New Blogger Shariar Vaez-Ghaemi: Arts and Artificial Intelligence

Hi! My name is Shariar. My friends usually pronounce that as Shaw-Ree-Awr, and my parents pronounce it as Share-Ee-Awr, but feel free to mentally process my name as “Sher-Rye-Eer,” “Shor-yor-ior-ior-ior-ior,” or whatever phonetic concoction your heart desires. I always tell people that there’s no right way to interpret language, especially if you’re an AI (which you might be).

Speaking of AI, I’m excited to study statistics and mathematics at Duke! This dream was born out of my high school research internship with New York Times bestselling author Jonah Berger, through which I immersed myself in the applications of machine learning to the social sciences. Since Dr. Berger and I completed our ML-guided study of the social psychology of communicative language, I’ve injected statistical learning techniques into my investigations of political science, finance, and even fantasy football.

Unwinding in the orchestra room after a performance

When I’m not cramped behind a Jupyter Notebook or re-reading a particularly long research abstract for the fourth time, I’m often pursuing a completely different interest: the creative arts. I’m an orchestral clarinetist and quasi-jazz pianist by training, but my proudest artistic endeavors have involved cinema. During high school, I wrote and directed three short films, including a post-apocalyptic dystopian comedy and a silent rendition of the epic poem “Epopeya de la Gitana.”

I often get asked whether there’s any bridge between machine learning and the creative arts*, to which the answer is yes! In fact, as part of my entry project for the Duke-based developer team Apollo Endeavours, I created a statistical language model that writes original poetry. Wandering Mind, as I call the system, is just one example of the many ways that artificial intelligence can take on what we once considered exclusively human tasks. The program isn’t quite as talented as Frost or Dickinson, but it’s much better at writing poetry than I am.
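The simplest statistical language model of this kind is a first-order Markov chain: learn which words follow which, then take a random walk. The sketch below is illustrative only and is not the actual Wandering Mind code.

```python
import random

# A minimal Markov-chain text generator: the statistical backbone
# that a poetry generator like this could be built on.

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = {}
    for current, nxt in zip(words, words[1:]):
        chain.setdefault(current, []).append(nxt)
    return chain

def generate(chain, start, length=8, rng=random.choice):
    """Walk the chain from a seed word to produce a short line of 'poetry'."""
    word, line = start, [start]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break  # dead end: no word ever followed this one
        word = rng(followers)
        line.append(word)
    return " ".join(line)

corpus = "the woods are lovely dark and deep and the woods are quiet"
chain = build_chain(corpus)
poem = generate(chain, "the")
```

Each run wanders the corpus differently, which is where the charm (and the unintentional comedy) of statistical poetry comes from; larger corpora and longer word contexts make the output less derivative.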

In a movie production (I’m the one wearing a Totoro onesie)

I look forward to presenting invigorating research topics to blog readers for the next year or more. Though machine learning is my scientific expertise, my investigations could transcend all boundaries of discipline, so you may see me passionately explaining biology experiments, environmental studies, or even macroeconomic forecasts. Go Blue Devils!

(* In truth, I almost never get asked this question by real people unless I say, “You know, there’s actually a connection between machine learning and arts.”)

By Shariar Vaez-Ghaemi, Class of 2025
