Following the people and events that make up the research community at Duke

Students exploring the Innovation Co-Lab

Category: Computers/Technology

“Flipping the Bird”: What do Audubon’s Paintings and England’s Crown Jewels Have in Common?

On the first Friday of each month, Duke Libraries will hold a “Flipping the Bird” event where members of the public can watch exhibit curators flip the pages of two of Audubon’s original “Birds of America” books.

At 12:15 PM on the first Friday of each month, you can watch Duke Libraries curators crank open a half-ton case made by the same company that designed the storage system for England’s Crown Jewels. Inside, protected by elaborate security features and carefully controlled temperature and light conditions, is a different collection of valuable, colorful items: Audubon’s “Birds of America” paintings.

“People love these birds for lots of different reasons,” said Duke Libraries Head of Exhibitions Meg Brown. The Audubon exhibit preserves and displays the birds while also raising awareness of Audubon’s complicated legacy as both a very talented artist and a deeply flawed man.

John James Audubon is one of ornithology’s best-known and most controversial figures. He painted 489 bird species with precision and accuracy, part of an ambitious and unfinished quest to paint every bird in America.

He also owned enslaved people, stole human skulls from indigenous burial sites, and held staunchly racist and anti-abolitionist beliefs.

Even in his own lifetime, Audubon’s “Birds of America” paintings were very valuable. They were sold in “subscriptions” in which patrons would receive paintings periodically as loose sheets and then have them bound themselves. The frequency ranged from weeks to years depending on the speed of Audubon’s work. Today, more than a century and a half after Audubon’s death, his paintings remain subjects of fascination, value, and beauty. One reason the paintings are so valuable is that each set is unique. Audubon used between 20 and 40 colorists who applied color to each print by hand, meaning different copies of the same painting may have slightly different colors.

Approximately 120 complete “Birds of America” sets survive today. “Typically a ‘set’ is all four volumes,” said Aaron Welborn, Duke Libraries Director of Communications. Duke owns “one complete set of four volumes,” two of which are on display in the Mary Duke Biddle Room in Perkins. At the inaugural “Flipping the Bird” event earlier this month, Brown spoke to visitors about what it takes to preserve and flip these fragile birds.

Head of Exhibitions Meg Brown and Exhibition Intern Grace Zayobi flip the pages of one of the two Audubon books on display in Duke Libraries.

The books are stored in glass and metal cases that weigh more than 1000 pounds according to Brown. The company that made these cases also made the glass under which England’s Crown Jewels are stored. The standard for the glass strength was that it had to be able to withstand 18 minutes of someone actively trying to break through.

The paintings are protected by another unusual security feature as well: “These won’t fit through any of our doors,” Brown said. The cases were brought in during library renovation, and their assembly was completed inside the library.

Duke acquired the collection from Margaret L. Barber, an art and antique collector who loaned items from her private collection for an exhibition in the Women’s College Library in 1931. Duke later purchased the “Birds of America” paintings from her. Originally all four were on display, but for preservation reasons only two are displayed at one time today.

Strips of Mylar—a soft, inert plastic—keep the open pages in place. Curators avoid putting the strips directly on the paintings, instead positioning them closer to the edges of the paper.

Preserving paintings from two centuries ago requires special care. Curators keep sheets of paper between the pages to prevent pigment from transferring to adjacent pages over time. And since watercolor is very sensitive to light exposure, the library uses strategically placed lamps to illuminate the pages without exposing them to bright light. (Specifically, they aim to keep ambient light under 6 foot-candles.) UV light is particularly damaging. The exhibit is in an interior room that does not use UV lighting, but there is “one time of day, one time of year” when light streaming through the windows of Saladelia Cafe in Perkins Library can reach the Mary Duke Biddle Room, Brown explained, so the shades on the window facing the cafe are kept below the level that sunlight could reach.
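For a sense of how dim that is, foot-candles can be converted to the more familiar metric unit, lux. A quick sketch of the arithmetic (the 6 foot-candle ceiling comes from the exhibit; the conversion factor is standard):

```python
# 1 foot-candle = 1 lumen per square foot, which is about 10.764 lux
# (lumens per square meter).
FC_TO_LUX = 10.764

def foot_candles_to_lux(fc: float) -> float:
    """Convert an illuminance in foot-candles to lux."""
    return fc * FC_TO_LUX

# The exhibit's ~6 foot-candle ceiling works out to roughly 65 lux,
# far dimmer than the 300-500 lux typical of office lighting.
print(round(foot_candles_to_lux(6), 1))
```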

Exhibit curators also monitor temperature and humidity using sensors in the glass cases. The two volumes not on display are kept in the library’s closed stacks, where the temperature is colder to help preserve the paintings. Every couple of years the books on display are rotated out with those in the closed stacks.

From left: Yoon Kim, Senior Library Exhibition Technician; Meg Brown, Head of Exhibitions; and Grace Zayobi, Exhibition Intern.

Flipping such old and delicate pages is its own challenge. The display cases have a motorized system to lift the glass, allowing curators to flip the pages before sealing them inside again. The pages of the books on display are flipped once a month. The flipping used to happen when the exhibit was closed, but now any library visitor can witness the process themselves on the first Friday of each month, from 12:15-12:45 PM.

Yoon Kim and Grace Zayobi flipping the book from a page displaying raptors to a page showing the “Bachman’s Finch,” now known as the Bachman’s sparrow.

Though the Audubon exhibit is permanent, other exhibits in the space are temporary. A recent exhibit there has highlighted female scientific illustrators, including Maria Martin Bachman, who painted some of the floral backgrounds for Audubon’s birds. While that exhibit has been up, the library has been focused on “displaying pictures that [Martin] had a part in” rather than just flipping to the next page in order.

Bachman’s husband, Reverend John Bachman, was also a naturalist. He lived in South Carolina and collaborated with Audubon on a later collection of mammal paintings. Like Audubon, Bachman is also a controversial figure with multiple birds named after him. There is a theme here. Also like Audubon, the Bachmans owned enslaved people, some of whom were involved in the production of Audubon’s paintings. A man enslaved by the Bachman family, Thomas Skining, was very skilled at stuffing birds. “He became so good at it that he sort of became the main person who did it,” Brown said.

One of Audubon’s paintings depicts the Carolina parakeet, which he called the Carolina parrot. The species is now extinct.
Image courtesy of the John James Audubon Center at Mill Grove, Montgomery County Audubon Collection, and Zebra Publishing.

Several species in Audubon’s “Birds of America” have since gone extinct: the Carolina parakeet. The Labrador duck. The passenger pigeon. The great auk. In all likelihood the ivory-billed woodpecker and Bachman’s warbler are also extinct. The Eskimo curlew is either critically endangered or extinct as well, and the “pinnated grouse” is an extinct subspecies of the greater prairie-chicken. Many others face threats to their existence, including the Bachman’s sparrow, currently on display in one of the books in the library. (Audubon called it Bachman’s Finch, but the species is not a finch and has since been renamed.)

Light, temperature, and humidity conditions are carefully controlled to help preserve the paintings.

“These are here forever,” Brown said. Audubon’s paintings remain widely loved and influential, and they will remain on display for people to admire, ponder, and learn from. At the same time, the Audubon exhibit seeks to raise awareness of Audubon’s complicated legacy and of the individuals involved in his work whom he never fully credited in his lifetime. Context is important, Brown said, and “We never want to shy away from the truth and the history about the important stories that aren’t being told.”

You can view the “Birds of America” books in the Mary Duke Biddle Room, across from the main entrance to Perkins. The species on display this month are the Bachman’s sparrow on the right and mourning, blackburnian, and black-throated green warblers on the left. And at 12:15 on March 7 or the first Friday of any other month, you, too, can watch exhibit curators flip the birds. 

Post by Sophie Cox, Class of 2025

Making the Case for Data Privacy: Here’s What We’re Up Against


After bringing Data Privacy Day to campus seventeen years ago, Duke faculty Jolynn Dellinger and David Hoffman co-moderated this year’s event at the Law School on January 28. Seated between them were attorneys Joshua Stein and Carol Villegas, partners at law firms Boies Schiller Flexner LLP and Labaton Keller Sucharow, respectively. Both are in the midst of multiple lawsuits against corporate giants; Boies Schiller joined a lawsuit against Meta last year, while Labaton is currently involved in data privacy-related suits against Meta, Flo Health and Amazon, and Google.

Panelists at Data Privacy Day 2025 at the Duke School of Law

Villegas began by emphasizing the importance of legal action on these issues in light of inadequate legislation. She pointed to the confusion of senators at Mark Zuckerberg’s testimony during the Cambridge Analytica scandal, in which the data of over 50 million Facebook users was misused for political purposes. “They don’t understand it…You can’t expect a legislature like that to make any kind of laws [on data privacy], not to mention technology is just moving way too fast,” Villegas said.  

Facebook and most social media platforms generate revenue through advertisements. While many people are aware that these sites track their activity to better target users with ads, they may not know that these companies can collect data from outside of social apps. So, what does that look like?

Almost all apps are built using Software Development Kits (SDKs), which not only make it easier for developers to create apps but also track analytics. Tracking pixels serve a similar function on websites. These kits and pixels are often provided for free by companies like Google and Meta–and it’s not too difficult to guess why this might be an issue. “An SDK is almost like an information highway,” Villegas said. “They’re getting all of the data that you’re putting into an app. So every time you press a button in an app, you know you answer a survey in an app, buy something in an app, all of that information is making its way to Meta and Google to be used in their algorithm.”
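To make that “information highway” concrete, here is a minimal, hypothetical sketch of the kind of event payload an embedded analytics SDK might assemble when a user taps a button. The field names, app identifier, and helper function are invented for illustration; they are not the API of any real SDK.

```python
import json

def build_event(user_id: str, action: str, screen: str) -> str:
    """Assemble an analytics event the way an embedded SDK might (hypothetical)."""
    payload = {
        "user": user_id,    # often a device or advertising identifier
        "event": action,    # e.g. "button_press", "survey_answer", "purchase"
        "screen": screen,   # where in the app the action happened
        "app": "example.health.tracker",
    }
    # A real SDK would send this JSON to the analytics provider's servers,
    # where it can feed ad-targeting models.
    return json.dumps(payload)

print(build_event("device-123", "button_press", "checkout"))
```

The point of the sketch is that none of this data ever appears on a social platform; it flows directly from the app to the analytics company.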

So, there’s more at stake than just your data being tracked on Instagram; tracking pixels are often used by hospitals, raising concerns about sensitive health data being shared with third parties. The popular women’s health app Flo helps users track their fertility and menstrual cycles–information it promised to keep private. Yet in Frasco v. Flo Health, Labaton alleged the company broke that promise and violated the Confidentiality of Medical Information Act (CMIA), illegally transmitting data via Software Development Kits to companies like Google and Meta. Flo Health ended up settling with the Federal Trade Commission (FTC) without admitting wrongdoing, though Google failed to escape the case, which remains ongoing.

It’s not only lawyers who are instrumental to this process. In cases like the ones that Stein and Villegas work on, academics and researchers can play key roles as expert witnesses. From psychiatrists to computer scientists, these experts explain the technical aspects and provide a scientific basis to the judge and jury. Getting a great expert is costly and a significant challenge in itself–ideally, they’d be well-regarded in their field, have very specialized knowledge, and have some understanding of court proceedings. “There are really important ways your experts will get attacked for their credibility, for their analysis, for their conclusions, and their qualifications even,” said Stein, referring to Daubert challenges, which can result in expert testimony being excluded from trial.

The task of finding experts becomes even more daunting when going up against companies as colossal and profitable as Meta. “One issue that’s come up in AI cases, is finding an expert in AI that isn’t being paid by one of these large technology companies or have… grants or funding from one of these companies. And I got turned down by a lot of experts because of that issue,” Stein said. 

Ultimately, some users don’t care that much if their data is being shared, making it more difficult to address privacy and hold corporations accountable. The aforementioned cases filed by Labaton are class action lawsuits, meaning that a smaller group represents a much larger group of individuals–for example, all users of a certain app within a given timeframe. Yes, it may seem pointless to push for data privacy when even the best outcomes in these cases only entitle individuals to small sums of money, often no more than $30. However, these cases have an arguably more important consequence: when successful, they force companies to change their behavior, even if only through small changes to their services.

By Crystal Han, Class of 2028

You Don’t Have to Be a Hacker to Make an Impact in This 24-Hour Coding Sprint


Twenty-four hours full of brainstorming, debugging, and caffeine.

Coders of all skill levels came together from February 8-9 to participate in the Code for Good hackathon, an annual event hosted by student organization HackDuke. Fueled by pizza and energy drinks, teams of up to four vied for the chance to win prizes ranging from LEGO sets to Apple Watches. Most projects fell into one or more of the four tracks: Health, Finance, Sustainability, and Interactive Media.

This year’s event fittingly took place at the Fuqua School of Business, where giant flags line the walls.

Over the laughter of a Saturday night poker competition — one of the scheduled social activities for participants — I spoke to Rishi Rao, the lead organizer for HackDuke’s Technology team. “Historically, HackDuke has mainly been a Duke/UNC event, but this year we have people from all over the country,” said Rao, who attributes this year’s wide reach to advertising on social media. 

There’s a focus on making the event as open as possible to new coders, including students who don’t study computer science. “A lot of people here are beginners who haven’t been to a hackathon before so we try to encourage [finishing a product] by having a beginner track and having mentors… Speakers do workshops to help people gain the skills necessary,” Rao said. Hackers are also supplied with “beginner tech kits,” consisting of short tutorials and starter projects created by the HackDuke team.

It certainly seems plausible for first-timers to do well. Duke freshmen Alexis Fox, Phillip Lin, Eric Wang, and Siven Panda entered the competition together in the Health track, and took 2nd place in the category. Upon hearing that rescheduling appointments required tedious manual work in hospitals, the team decided to create an interface to automate the process–hence the name Linked Automated Rescheduling Interface (LARI), inspired by the surname of ambulance inventor Dominique-Jean Larrey.

The team created a diagram to display their process and division of tasks

In twenty-four hours, most groups only have time to develop a proof of concept. Team LARI noted that they had to manage their expectations for the final product, but also that practicing better time management could’ve allowed them to add more desired features. “We have to make a compromise between learning and perfectionism,” said Lin.

“I wanted to learn something here, so I wrote my [code] in a language I’ve never used before,” said Panda, adding that he would switch back into a familiar language if he didn’t finish parts by a self-imposed deadline.

In settings like hackathons, the short time frame and the limited experience of many competitors have made AI particularly relevant. Given the advancements in AI in just the last year, it’s no surprise that it’s taken on an outsized role; two of this year’s workshops focused on using it as a tool for coding. “It helps the more experienced teams come to a more complete product and it helps these beginners teams complete a product,” Rao said.

A quick recounting of an unproductive day results in suggested reading material. The app takes note that my entry is less positive than those previously typed by others.

Many also chose to integrate generative AI into their product. First-time participants Carlos and Elijah, a freshman and sophomore from MIT respectively, decided to create “filosof.ai”: a digital journal that analyzes entries for philosophical themes. They explained their product was aimed towards people just starting to think about philosophy, helping them further develop their interest by identifying the branches closest to their existing thoughts.

Like the aforementioned groups, Duke seniors Julia Hornstein, Owen Jennings, and Chinomnso Okechukwu were also first-time hackathon participants.

“I thought, why not, I don’t want to graduate without doing it,” said Hornstein, a computer science major.

They entered on the sustainability track, wanting to create something that would be realistically used. Okechukwu recalled being unable to find clothes for Duke events on short notice, while Hornstein also noted the amount of theme-specific clothing she would no longer have use for after senior year. Soon, their idea came to them: Campus Closet would provide a platform for students within universities to buy and sell clothes by theme. Instead of being bought from Amazon three days beforehand, worn twice, and then tossed away, clothing would remain within the community, reducing waste and fast fashion demand.

Though some enter the competition nervous, most come out feeling accomplished and more confident in their abilities. “This was such a good experience for me and I’m so inspired by the fact that we could do this in twenty four hours,” said Hornstein. “Meeting my team, and the team dynamic…I had so much fun with both of them, honestly.” The group plans to continue working on Campus Closet, and said they looked forward to hanging out both inside and outside of the project.

For the organizers, an ideal hackathon means not only generating high participation but seeing a high number of submissions when the 24 hours come to a close. After receiving the most applications and product submissions in the history of the event, it seems fair to call Code for Good 2025 a success.

By Crystal Han, Class of 2028

Farmers, Crops…and Computers?


In Hanjie the rules are simple. In this game of logic and creativity, the players, often working on medium-sized grids of 225 squares, use numbers on the rows and columns as clues to determine which boxes to shade. At first, the prospect of seeing a beautiful picture seems almost unfathomable. However, through patience and collaboration from every corner of the page, these small seemingly random squares gradually come together to reveal a masterpiece—one square at a time. 
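For readers unfamiliar with the puzzle, the core rule can be sketched in a few lines of code: each clue lists the lengths of the consecutive shaded runs in a row (or column), in order. This is a toy illustration of the rule, not a solver.

```python
def runs(row):
    """Return the lengths of consecutive shaded (True) runs in a row."""
    lengths, count = [], 0
    for cell in row:
        if cell:
            count += 1
        elif count:
            lengths.append(count)
            count = 0
    if count:
        lengths.append(count)
    return lengths

# The clue "3 1" means: a run of 3 shaded squares, at least one gap,
# then 1 shaded square.
row = [True, True, True, False, True, False]
print(runs(row))  # [3, 1]
```

A row is correctly shaded exactly when its run lengths match the clue; the whole picture emerges once every row and column satisfies its clue simultaneously.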

In a sense, the efforts of Duke’s Climate Commitment are no different. Climate change has proven to be a multifaceted issue, one in which many parties play a role. However, with initiatives such as Duke’s Forever Learning Institute, the path to tackling it becomes much clearer.

The logo of Duke’s Forever Learning Institute retrieved from their website.

Recently Duke’s Forever Learning Institute, an interdisciplinary educational program for Duke alumni, hosted Professors Norbert Wilson and Maiken Mikkelsen for a compelling session on the impact of climate change on food and agriculture. Wilson, an agricultural economist and the Director of the World Food Policy Center at Duke, specializes in addressing critical issues related to food access, nutrition, and food security. Mikkelsen, a distinguished expert in physics, electrical, and computer engineering, explores the potential of nanomaterials to revolutionize agricultural processes, paving the way for innovative solutions in the field. Together, they explained how advancements in nanomaterials can improve food security and sustainability.

Throughout the session, Wilson emphasized the concept of food security. He began by clarifying the difference between “food loss” and “food waste.” Food loss occurs at the agricultural level: it refers to food that is produced but never reaches consumers, often due to challenges such as poor harvesting seasons, labor shortages for harvesting, or other natural factors. He described how loss occurs across the board but disproportionately affects less developed countries. Wilson then explained that food waste occurs at the consumer level, and that it represents not just the waste of a product but also of the resources used to create it.

Picture of Professor Norbert Wilson. Photo retrieved from Duke Divinity School.

Wilson illustrated the significance of these issues by drawing out the larger issue of food insecurity. Food insecurity describes an inability to access food or concerns about accessing food. In the United States, 13.5 percent of citizens struggle with accessing food. This can lead to a number of negative health outcomes such as cardiovascular issues and diabetes. Food insecurity can also lead to behavioral and performance issues, particularly in young children.

Infographic about food insecurity retrieved from ECOMERGE.

This is where Mikkelsen comes in. She described a field known as precision agriculture, in which researchers observe and measure agricultural fields to determine which resources, such as water and fertilizer, are needed in each area. By analyzing the wavelengths of light reflected by crops, researchers can obtain a “spectral fingerprint” that supplies information about crop health. Mikkelsen described her interest in leveraging nanomaterials to create lightweight, cost-effective hyperspectral cameras capable of capturing these detailed spectral fingerprints. She hopes that such cameras can be deployed around the world, including in low-resource settings, to increase crop yields. The greatest roadblocks are price and the challenges of widespread application. Once applied, however, the technology could detect key characteristics such as nutrient deficiencies, water stress, or disease presence.
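As a concrete example of what a spectral fingerprint can reveal, remote sensing commonly uses the Normalized Difference Vegetation Index (NDVI), computed from red and near-infrared reflectance; healthy vegetation reflects strongly in the near-infrared, so a higher NDVI suggests healthier crops. (NDVI is a standard index in the field; the article does not specify which indices Mikkelsen’s cameras would compute, so this is only an illustration.)

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from two reflectance bands.

    nir and red are reflectances in [0, 1]; NDVI falls in [-1, 1],
    with dense healthy vegetation typically well above 0.5.
    """
    return (nir - red) / (nir + red)

healthy = ndvi(nir=0.50, red=0.08)   # dense, healthy canopy -> ~0.72
stressed = ndvi(nir=0.30, red=0.20)  # sparse or stressed vegetation -> 0.2
print(round(healthy, 2), round(stressed, 2))
```

A hyperspectral camera captures many such bands at once, so far richer fingerprints than this two-band index are possible.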

Duke Researchers working with Nanotechnology. Image retrieved from Duke Pratt School of Engineering.

Our world is profoundly affected by climate change. Climate change and agricultural production are deeply interdependent, and improving one side can help correct the other. This is what makes the work and research of scholars such as Wilson and Mikkelsen all the more important. Their efforts show how we can use technology not only to enact social change but also to address our climate issues. Their research highlights not only the urgency of addressing food security and agricultural sustainability but also the transformative potential of interdisciplinary approaches.

Just as the game of Hanjie reveals its masterpiece one square at a time, tackling climate change requires collective effort and patience. Each initiative, whether through advanced nanotechnology or policy-driven solutions, brings us closer to a sustainable future. Duke’s Forever Learning Institute serves as a platform to connect these ideas, inspiring action and innovation that can shape a better tomorrow—one step at a time.

Post by Gabrielle Douglas, Class of 2027

Navigating the Complex World of Social Media and Political Polarization: Insights from Duke’s Polarization Lab

This February, the U.S. Supreme Court heard arguments challenging laws in Florida and Texas that would regulate how social media companies like Facebook and X (formerly Twitter) control what posts can appear on their sites.

Given the legal challenges surrounding social media’s role in creating polarization, there is a need for further research to explore the issue. Enter Duke’s Polarization Lab, a multidisciplinary research hub designed to explore and mitigate the societal effects of online engagement.

In an April 17 seminar, Polarization Lab postdoc Max Allamong delved into the workings and discoveries of this innovative lab, which brings together experts from seven disciplines and various career stages, supported by twelve funders and partners, including five UNC affiliates.

Duke postdoctoral associate Max Allamong

Unless you’re okay with people stealing your data for their own research, conducting studies based on social media is next to impossible, Allamong explained.

In their attempt to conduct research ethically, the lab has developed a tool called “Discussit.” This platform enables users to see the partisanship of people they are communicating with online, aiming to reduce polarization by fostering dialogue across political divides. To put it simply, they’ll know if they’re talking to someone from the left or if they’re talking to someone from the right. Building on this, Allamong also introduced “Spark Social,” a social media simulator where researchers can adjust variables to study interactions under controlled conditions. This system not only allows for the modification of user interactions but also employs large language models (like those used in ChatGPT) to simulate realistic conversations.

Allamong highlighted a particularly revealing study from the lab, titled “Outnumbered Online,” which examined how individuals behave in partisan echo chambers versus balanced environments. The study placed users in forums where they were either in the majority or minority in terms of political alignment, revealing that being outnumbered led to increased self-censorship and perceptions of a toxic environment.

The lab’s ongoing work also explores the broader implications of polarization on political engagement. By manipulating the type of content users see, researchers are examining variables like believability and replicability of data generated by AI. This approach not only contributes to academic knowledge but also has practical implications for designing healthier online spaces.

As social media continues to shape political and social discourse, the work of Duke’s Polarization Lab and Allamong serves as a safe space to conduct ethical and meaningful research. The insights gained here will better equip us to analyze the polarization created by social media companies, and how that affects the political landscape of the country. The longstanding questions of the effects of echo chambers may soon be answered. This research will undoubtedly influence how we engage with and understand the digital world around us, making it a crucial endeavor for fostering a more informed and less polarized society.

Post by Noor Nazir, Class of 2027

Democracy Threatened: Can We Depolarize Digital Spaces?

“Israeli Mass Slaughter.” “Is Joe Biden Fit to be President?” Each time we log on to social media, potent headlines encircle us, as do the unwavering and charged opinions that fill the comment spaces. Each like, repost, or slight interaction we have with social media content is devoured by the “algorithm,” which tailors the space to our demonstrated beliefs.

So, where does this leave us? In our own personal “echo chamber,” claim the directors of Duke’s Political Polarization Lab in a recent panel.

Founded in 2018, the lab comprises 40 scholars conducting cutting-edge research on politics and social media. This unique intersection requires a diverse team, evident in its composition of seven different disciplines and career stages. The research has proven valuable: beneficiaries include government policy-makers, non-profit organizations, and social media companies.

The lab’s recent research project sought to probe the underlying mechanisms of our digital echo chambers: environments where we only connect with like-minded individuals. Do we have the power to shatter the glass and expand perspectives? Researchers used bots to generate social media content of opposing party views. The content was intermixed with subjects’ typical feeds, and participants were evaluated to see if their views would gradually moderate.

The results demonstrated that the more people paid attention to the bots, the more grounded in their viewpoints or polarized they became. 

Clicking the iconic Twitter bird or new “X” logo signifies a step onto the battlefield, where posts are ambushed by a flurry of rebuttals upon release.

Chris Bail, Professor of Political and Data Science, shared that 90% of these tweets are generated by a meager 6% of Twitter’s users. Those 6% identify as either very liberal or very conservative, rarely settling in a middle area. Their commitment to propagating their opinions is rewarded by the algorithm, which thrives on engagement. When reactive comments filter in, the post is boosted even more. The result is a distorted perception of social media’s community, when in truth the bulk of users are moderate and watching on the sidelines.

Graphic from the Political Polarization Lab presentation at Duke’s 2024 Research & Innovation Week

Can this be changed? Bail described the exploration of incentives for social media users. This means rewarding both sides, fighting off the “trolls” who wreak havoc on public forums. Enter a new strategy: using bots to retweet top content creators that receive engagement from both parties.

X’s (formerly Twitter’s) Community Notes feature allows users to annotate tweets that they find misleading. This strategy includes boosting notes that annotate bipartisan creators, after finding that notes tended towards the polarized tweets.

 The results were hard to ignore: misinformation decreased by 25-35%, said Bail, saving companies millions of dollars.

Social media is democracy’s public square

Christopher Bail

Instead of simply bashing younger generations’ fixation on social media, Bail urged the audience to consider the bigger picture.

“What do we want to get out of social media? What’s the point, and how can it be made more productive?”

On a mission to answer these questions, the Polarization Lab has set out to develop evidence-based social media by creating custom platforms. In order to test the platforms out, researchers prompted A.I. to create “digital twins” of real people, to simulate users. 

Co-Director Alex Volfovsky described the thought process that led to this idea: Running experiments on existing social media often requires dumping data into an A.I. system and interpreting results. But by building an engaging social network, researchers were able to manipulate conditions and observe causal effects.

How can the presence of a “like button” or “repost” feature affect our activity on platforms? On LinkedIn, even tweaking recommended users showed that people gain the most value from semi-distant connections.

In this exciting new field, unanswered questions ring loud. It can be frightening to place our trust in ambiguous algorithms for content moderation, especially when social media usage is at an all-time high.

After all, the media I consume has clearly trickled into my day-to-day decisions. I eat at restaurants I see on my Instagram feed, I purchase products that I see influencers promote, and I tend to read headlines that are spoon-fed to me. As a frequent social media user, I face the troubling reality of being susceptible to manipulation.

Amidst the fear, panelists stress that their research will help create a safer and more informed culture surrounding social media in pressing efforts to preserve democracy.

Post by Ana Lucia Ochoa, class of 2026

Your AI Survival Guide: Everything You Need to Know, According to an Expert

What comes to your mind when you hear the term ‘artificial intelligence’? Scary, sinister robots? Free help on assignments? Computers taking over the world?

Pictured: Media Architect Stephen Toback

Well, on January 24, Duke Media Architect Stephen Toback hosted a lively conversation on all things AI. An expert in the field of technology and media production, Toback discussed some of the practical applications of artificial intelligence in academic and professional settings.

According to Toback, enabling machines to think like humans is the essence of artificial intelligence. He views AI as a humanities discipline — an attempt to understand human intelligence. “AI is really a digital brain. You can’t digitize it unless you know how it actually works,” he began. Although AI has been around since 1956, the past year has seen an explosion in usage. ChatGPT, for example, became the fastest-growing user application in the world in less than 6 months. “One thing I always talk about is that AI is not gonna take your job, but someone using AI will.”

During his presentation, he referenced five dominant AI platforms on the market. The first is ChatGPT, created by OpenAI. Released to the public in November 2022, it has over 100 million users every single month. The second is Bard, created by Google in March 2023. Although newer to the market, the chatbot has gained significant traction online.

Pictured: Toback explaining the recent release of Meta’s AI “Characters.”

Next, we have Llama, owned by tech giant Meta. Last September, Meta launched AI ‘characters’ based on famous celebrities including Paris Hilton and Snoop Dogg, which users could chat with online. “They’ve already started commercializing AI,” Toback explained.

Then there’s Claude, by Anthropic. Claude is an AI assistant for a variety of digital tasks. “Writers tend to use Claude,” Toback said. “Its language models are more attuned to text.”

And finally on Toback’s list is Microsoft Copilot, which is changing the AI game. “It’s integrating ChatGPT into the apps that we use every day. And that’s the next step in this evolution of AI tools.” Described on Microsoft’s website as ‘AI for everything you do,’ Copilot embeds artificial intelligence models into the entire Microsoft 365 suite (which includes apps such as Word, Excel, PowerPoint, and Outlook). “I don’t have to copy and paste into ChatGPT and come back; it’s built right into the app.” It’s also the first AI tool on the market to integrate into a whole suite of applications instead of just one.

Pictured: A presentation created by Toback using Copilot in PowerPoint

He outlined several features of the software, such as: summarizing and responding to email threads on Outlook, creating intricate presentations from a simple text document in PowerPoint, and generating interview questions and resume comparisons in Word. “There’s a great example of using AI for something that I have to do… but now I can do it a little bit better and a little bit faster.”

Throughout his presentation, Toback also touched on the practical use of ChatGPT. “AI is not perfect,” he began. “If you just ask it a question, you’re like ‘Oh, that sounds reasonable,’ and it might not be right.” He cited the platform’s rapidly changing nature, inherent biases, and incorrect data as potential challenges for practical use.

“Rather than saying I don’t know, it acts a lot like a middle schooler and says it knows everything and gives you a very convincing answer.”

Stephen Toback

These challenges have been felt nationwide. In early 2023, for example, lawyers in a federal court case used ChatGPT to find previous claims in an attempt to show precedent. However, after the lawyers presented the claims to a judge, the court found that the cases didn’t actually exist. “It cited all of these fake cases that look like real citations, and then the judge considered sanctions,” said Toback. ‘AI hallucinations’ such as this one have caused national controversy over the use and accuracy of AI-generated content. “You need to be able to double-check and triple-check anything that you’re using through ChatGPT,” Toback said.

So how can we use ChatGPT more accurately? According to Toback, there are a variety of approaches, but the main one is called prompt engineering: the process of structuring text so that it can be understood by an AI model. “Prompts are really the key to all of this,” he revealed. “The better formed your question is, the more data you’re giving ChatGPT, the better the response you’re going to get.” Below is Toback’s 6-step template to make sure you are engineering prompts correctly for ChatGPT.

Pictured: Toback’s template for ChatGPT prompt engineering
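As an illustration of the idea (not Toback’s exact template, which appears only in the pictured slide), a structured prompt can be assembled from clearly labeled parts. The field names below are hypothetical:

```python
# Hypothetical structured-prompt builder illustrating prompt engineering.
# The section names (role, context, task, etc.) are illustrative, not
# Toback's actual 6-step template.

def build_prompt(role, context, task, output_format, constraints):
    """Assemble a well-formed prompt from labeled parts."""
    sections = [
        f"Role: {role}",
        f"Context: {context}",
        f"Task: {task}",
        f"Output format: {output_format}",
        f"Constraints: {constraints}",
    ]
    return "\n".join(sections)

prompt = build_prompt(
    role="You are an experienced hiring manager.",
    context="We are filling a junior data analyst position.",
    task="Draft five interview questions that probe SQL skills.",
    output_format="A numbered list, one question per line.",
    constraints="Avoid yes/no questions; keep each under 25 words.",
)
print(prompt)
```

The point is the structure, not the specific labels: the more well-formed context the prompt carries, the better the model’s response tends to be.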

So there you have it — your 2024 AI survival guide. It’s clear from the past few years that artificial intelligence is here to stay, and with that comes a need for improved understanding and use. As AI expert Oren Etzioni proclaims, “AI is a tool. The choice about how it gets deployed is ours.”

Have more questions about AI tools such as ChatGPT? Reach out to the Duke Office of Information Technology here.

Written by Skylar Hughes, Class of 2025

Computer Science Students Say: Let’s Talk About Microaggressions

Soon after taking a seat in her high-level computer science class, Duke student Kiara de Lande surveyed the room. The realization that she was one of only three women of color washed over her, leaving a pang of discomfort and confusion. In her gut, she knew she was capable of success. But then why were there so few students who looked like her? Doubt ensued: perhaps this was not a place for her.

de Lande was one of five members of the student advisory board for AiiCE (Alliance for Identity-Inclusive Computing Education) who reflected on their experiences as minority students in computer science in a virtual panel held Jan. 23.

As de Lande shared her story, undergraduate Kianna Bolante nodded in agreement. She too, felt that she had to “second-guess her sense of belonging and how she was perceived.” 

Berkeley ’24 graduate Bridget Agyare added that group work is crucial to success in CS classes, stressing the need for inclusion. The panel also raised the harm of peer microaggressions, emphasizing the danger of stifling minority voices: “When in groups of predominantly males,” de Lande said, “my voice is on the back-burner.”

To not feel heard is to feel isolated, compounding the weight of under-confidence. Small comments here and there. Anxiety trickling in when the professor announces a group project. Peers delegating the “front-end” or “design” aspects to you, leaving the more intricate back-end components for themselves. It’s subtle. Nothing feels glaring enough to call out. So you shove the feelings to the side.

“No one reaches this level of education by mistake,” said Duke CS graduate student Jabari Kwesi. But over time, these subtle slights chip away at the assurance in your capabilities. 

Kwesi remembers the first time he spoke to a Black female professional software engineer (SWE). “Finally,” he said, “someone who understands what you’re talking about for your experience in and outside academia.”

He made this connection in a Duke course structured to facilitate conversations between students and professionals in the technology industry. In similar efforts, the Duke organization DTech is devoted to supporting women and non-binary students in tech, providing peer advisors, social gatherings, and recruiter connections. It also offers access to a database of internships, guiding members through competitive job-hunting cycles.

As university support continues to grow, students have not shied away from taking action. Bolante, for example, created her own social computing curriculum focused on connecting students’ identities to the course material. The initiative reflects her personal realization of the value of her voice.

“My personal experiences, opinions, ideas are things no one can take away from me. My voice is my strongest asset and power,” she said. 

As I listened to the declaration, I felt the resilience behind her words. It was evident that the AiiCE panelists are united in their passion for an inclusive and action-driven community. 

Kwesi highlighted the concept of “intentionality.” A professor has to be conscious of the commitment to improvement, which includes being available to students and accepting feedback. Suggestions from the panel included “spotlights” on impactful minorities in CS and mandating a societal-impact section in every technical class. Technology does not exist in a vacuum: deployment affects real people. Algorithms power tools like resume scanners and medical evaluations, and they are susceptible to biases against certain groups. These are not just lines of code; people’s livelihoods are at stake. With the surge of developments in artificial intelligence, technology is advancing more rapidly than ever. To keep bias in check, assembling interdisciplinary teams can help ensure diverse perspectives.
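As a toy illustration of how such bias can be surfaced (not a depiction of any specific screening tool, and using made-up data), one simple audit compares a system’s selection rates across groups:

```python
# Toy audit of a hypothetical screening tool for group disparities.
# The decision data are invented; real audits use far richer methods.

def selection_rate(decisions):
    """Fraction of candidates selected (1 = advanced, 0 = rejected)."""
    return sum(decisions) / len(decisions)

# Hypothetical accept/reject outcomes for two demographic groups.
group_a = [1, 1, 0, 1, 1, 0, 1, 1]   # 75% advanced
group_b = [1, 0, 0, 0, 1, 0, 0, 1]   # 37.5% advanced

ratio = selection_rate(group_b) / selection_rate(group_a)
# The "four-fifths rule" heuristic flags ratios below 0.8 for review.
print(f"impact ratio: {ratio:.2f}")
```

Here the impact ratio of 0.5 falls well below the four-fifths threshold, which is exactly the kind of signal an interdisciplinary review team would investigate.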

Above all, we must be willing to continue this conversation. There is no singular curriculum or resource that will permanently correct inequities. Johns Hopkins ’25 graduate Rosa Gao reminded the audience that inclusivity efforts are “a practice,” and “a way of moving through space” for professors and peers alike.

It can be as simple as a quick self-evaluation. As a peer: “Am I being dismissive?” “Am I holding everyone’s opinions at an equal weight?” As a professor: “How can I create assignments that will leverage the student voice?”

Each individual experience must be valued, and even successful initiatives should continue to be reinvented. As minorities, to create positive change, we must take up space. As a greater community, we must continue to care, to discuss, and to evolve. 

By Ana Lucia Ochoa, Class of 2026

How Do Animals – Alone or in Groups – Get Where They’re Going?

Note: Each year, we partner with Dr. Amy Sheck’s students at the North Carolina School of Science and Math to profile some unsung heroes of the Duke research community. This is the fourth of eight posts.

In the intricate world of biology, where the mysteries of animal behavior unfold, Dr. Jesse Granger emerges as a passionate and curious scientist with a Ph.D. in biology and a penchant for unraveling the secrets of how animals navigate their surroundings.

Her journey began in high school when she posed a question to her biology teacher about the effect of eye color on night vision. Unable to find an answer, they embarked together on a series of experiments, igniting a passion that would shape Granger’s future in science.

Jesse Granger in her lab at Duke

Granger’s educational journey was marked by an honors thesis at the College of William & Mary that delved into the potential of diatoms, single-cell algae known for their efficiency in capturing light, to enhance solar panel efficiency. This early exploration of light structures paved the way for a deeper curiosity about electricity and magnetism, leading to her current research on how animals perceive and use the electromagnetic spectrum.

Currently, Granger is involved in projects exploring the dynamics of animal group navigation. She is investigating how animals travel in groups to find food, and how collective movement and decision-making shape that travel.

Among her many research endeavors, one project holds a special place in Granger’s heart. The study involved creating a computational model to explore the dynamics of group travel among animals. She found that agents (computational entities that mimic the behavior of an animal) are far better at getting where they are going as part of a group than when traveling alone.

Granger’s daily routine in the Sönke Johnsen Lab revolves around computational work. While it may not seem like a riveting adventure to an outsider, to her, the glow of computer screens harbors the key to unlocking the secrets of animal behavior. Coding becomes her toolkit, enabling her to analyze data, develop models, and embark on simulations that mimic the complexities of the natural world.

Granger’s expertise in coding extends to using R for data wrangling and NetLogo, an agent-based modeling program, for simulations. She describes the simulation process as akin to creating a miniature world where coded animals follow specific rules, giving rise to emergent properties and valuable insights into their behavior. This skill set seamlessly intertwined with her favorite project, where the exploration of group dynamics and navigation unfolded within the intricate landscapes of her simulated miniature world.
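The flavor of her result, that groups reach their destination more reliably than individuals, can be sketched in a few lines. This is a minimal illustration assuming each agent estimates the direction to a goal with independent noise and the group steers by averaging headings; it is not Granger’s actual NetLogo model:

```python
# Minimal sketch of why group navigation beats solo navigation,
# assuming each agent's compass reading is the true direction (0 rad)
# plus independent Gaussian noise, and the group averages headings.
import math
import random

def heading_error(n_agents, noise_sd=1.0, trials=2000, seed=42):
    """Mean absolute error (radians) of the group's averaged heading."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Each agent's noisy estimate of the true direction.
        estimates = [rng.gauss(0.0, noise_sd) for _ in range(n_agents)]
        # Average headings as unit vectors so angles wrap correctly.
        x = sum(math.cos(a) for a in estimates) / n_agents
        y = sum(math.sin(a) for a in estimates) / n_agents
        total += abs(math.atan2(y, x))
    return total / trials

solo = heading_error(n_agents=1)
group = heading_error(n_agents=10)
print(f"solo error: {solo:.3f} rad, group of 10: {group:.3f} rad")
```

Averaging ten noisy compasses cancels much of the individual error, which is the emergent property such agent-based simulations are built to reveal.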

In the tapestry of scientific exploration, Jesse Granger emerges as a weaver of knowledge, blending biology, physics, and computation to unravel the mysteries of animal navigation. Her journey, marked by curiosity and innovation, not only enriches our understanding of the natural world but also inspires the next generation of scientists to embark on their own scientific odysseys.

Guest Post by Mansi Malhotra, North Carolina School of Science and Math, Class of 2025.

Putting Stronger Guardrails Around AI

AI regulation is ramping up worldwide. Duke AI law and policy expert Lee Tiedrich discusses where we’ve been and where we’re going.

DURHAM, N.C. — It’s been a busy season for AI policy.

The rise of ChatGPT unleashed a frenzy of headlines around the promise and perils of artificial intelligence, and raised concerns about how AI could impact society without more rules in place.

Consequently, government intervention entered a new phase in recent weeks as well. On Oct. 30, the White House issued a sweeping executive order regulating artificial intelligence.

The order aims to establish new standards for AI safety and security, protect privacy and equity, stand up for workers and consumers, and promote innovation and competition. It’s the U.S. government’s strongest move yet to contain the risks of AI while maximizing the benefits.

“It’s a very bold, ambitious executive order,” said Duke executive-in-residence Lee Tiedrich, J.D., who is an expert in AI law and policy.

Tiedrich has been meeting with students to unpack these and other developments.

“The technology has advanced so much faster than the law,” Tiedrich told a packed room in Gross Hall at a Nov. 15 event hosted by Duke Science & Society.

“I don’t think it’s quite caught up, but in the last few weeks we’ve taken some major leaps and bounds forward.”

Countries around the world have been racing to establish their own guidelines, she explained.

The same day as the US-led AI pledge, leaders from the Group of Seven (G7) — which includes Canada, France, Germany, Italy, Japan, the United Kingdom and the United States — announced that they had reached agreement on a set of guiding principles on AI and a voluntary code of conduct for companies.

Both actions came just days before the first ever global summit on the risks associated with AI, held at Bletchley Park in the U.K., during which 28 countries including the U.S. and China pledged to cooperate on AI safety.

“It wasn’t a coincidence that all this happened at the same time,” Tiedrich said. “I’ve been practicing law in this area for over 30 years, and I have never seen things come out so fast and furiously.”

The stakes for people’s lives are high. AI algorithms do more than just determine what ads and movie recommendations we see. They help diagnose cancer, approve home loans, and recommend jail sentences. They filter job candidates and help determine who gets organ transplants.

Which is partly why we’re now seeing a shift in the U.S. from what has been a more hands-off approach to “Big Tech,” Tiedrich said.

Tiedrich presented Nov. 15 at an event hosted by Duke Science & Society.

In the 1990s when the internet went public, and again when social media started in the early 2000s, “many governments — the U.S. included — took a light touch to regulation,” Tiedrich said.

But this moment is different, she added.

“Now, governments around the world are looking at the potential risks with AI and saying, ‘We don’t want to do that again. We are going to have a seat at the table in developing the standards.’”

Power of the Purse

Biden’s AI executive order differs from laws enacted by Congress, Tiedrich acknowledged in a Nov. 3 meeting with students in Pratt’s Master of Engineering in AI program.

Congress continues to consider various AI legislative proposals, such as the recently introduced bipartisan Artificial Intelligence Research, Innovation and Accountability Act, “which creates a little more hope for Congress,” Tiedrich said.

What gives the administration’s executive order more force is that “the government is one of the big purchasers of technology,” Tiedrich said.

“They exercise the power of the purse, because any company that is contracting with the government is going to have to comply with those standards.”

“It will have a trickle-down effect throughout the supply chain,” Tiedrich said.

The other thing to keep in mind is “technology doesn’t stop at borders,” she added.

“Most tech companies aren’t limiting their market to one or two particular jurisdictions.”

“So even if the U.S. were to have a complete change of heart in 2024” and the next administration were to reverse the order, “a lot of this is getting traction internationally,” she said.

“If you’re a U.S. company, but you are providing services to people who live in Europe, you’re still subject to those laws and regulations.”

From Principles to Practice

Tiedrich said a lot of what’s happening today in terms of AI regulation can be traced back to a set of guidelines issued in 2019 by the Organization for Economic Cooperation and Development, where she serves as an AI expert.

These include commitments to transparency, inclusive growth, fairness, explainability and accountability.

For example, “we don’t want AI discriminating against people,” Tiedrich said. “And if somebody’s dealing with a bot, they ought to know that. Or if AI is involved in making a decision that adversely affects somebody, say if I’m denied a loan, I need to understand why and have an opportunity to appeal.”

“The OECD AI principles really are the North Star for many countries in terms of how they develop law,” Tiedrich said.

“The next step is figuring out how to get from principles to practice.”

“The executive order was a big step forward in terms of U.S. policy,” Tiedrich said. “But it’s really just the beginning. There’s a lot of work to be done.”

By Robin Smith

