Following the people and events that make up the research community at Duke


Stalking Elusive Ferns Down Under

Graduate student Karla Sosa (left) photographs and presses newly collected ferns for later analysis while Ashley Field (in truck) marks the GPS location of the find.

In Queensland, Australia, early March can be 96 degrees Fahrenheit. It’s summer in the Southern Hemisphere, but that’s still pretty hot.

Hot, dry Australia probably isn’t the first place you’d think to look for ferns, but that’s precisely why I’m here, and the sole reason we’ve hit the road at 6 a.m. Our plan for the day: drive as far south as we can while still being able to make it home by the end of the day.

My local colleague, Ashley Field, grew up just the next town over. A skinny, speedy man, he works at James Cook University in Cairns and knows most of northern Queensland like the back of his hand.

Cairns is on the coast at the upper right, where the little green airplane is.

The ferns I’m looking for today are interesting because some species can move from their original home in Australia to the tiny islands in the Pacific. But some cannot. Why? Understanding what makes them different could prove useful in making our crops more resilient to harsh weather, or preventing weeds from spreading.

We’ve been driving for four hours when we turn off onto a dirt road. If you haven’t been to Australia, it’s worth noting that four hours here is unlike any four hours I’ve experienced before. The roads are fairly empty, flat, and straight, so you can cover a lot of terrain, and Australia is so big that most of the time you’re travelling through unpopulated landscapes. It may be only four hours, but your mind feels the weight of the distance.

Here’s the one they were looking for!
Cheilanthes tenuifolia with lots of little spore babies on the undersides of its leaves.

The dirt road begins to climb into the mountains. We are leaving behind low scrub and big granite rocks that sit on the flat terrain. Ashley knows where we can find the ferns I’m looking for, but he’s never driven this road before. Instead, we’re trusting researchers who came before us. When they explored this area, they took samples of plants that were preserved and stored in museums and universities. By reviewing the carefully labelled collections at these institutions, we can know which places to revisit in hopes of finding the ferns.

Often, however, because the samples were collected before GPS existed, their location information is not very precise. Or the plants may no longer live there, or the area got turned into a parking lot, as happened to me in New Zealand. So, despite careful planning, you may drive five hours one way only to come up empty-handed.

As we move higher up the mountain, the soil turns redder and sparse eucalyptus forests begin to enclose us. We locate the previous collection’s coordinates, an area that seems suitable for ferns to grow. We park the truck on the side of the road and get out to look.

We comb 300 feet along the side of the road because these ferns like the edges of forest, and we find nothing. But as we trudge back to the truck, I spot one meager fern hiding behind a creeping vine! It’s high up off the road-cut and I try to scramble up but only manage to pull a muscle in my arm. Ashley is taller, so he climbs partway up a tree and manages to fetch the fern. It’s not the healthiest, only 6 inches tall for a plant that usually grows at least 12 to 14 inches. It’s also not fertile, making it less useful for research, and in pulling it out of the ground, Ashley broke one of its three leaves off. But it’s better than nothing!

This delicate beauty has no name yet. Karla has to compare it to other ferns in the area to know whether it’s just an odd-looking variant or possibly … a new species!

Ashley excels at being a field botanist because he is not one to give up. “We should keep looking,” he says despite the sweat dripping down our faces.

We pile back in and continue up the road. And who could have predicted that just around the bend we would find dozens of tall, healthy-looking ferns! There are easily fifty or so plants, each a deep green, the tallest around 12 inches. Many others are at earlier stages of growth, which can be very helpful for understanding how these plants develop. We take four or five plants: enough to leave a sample at the university in Cairns, with the rest to be shipped back to the US. One sample will be kept at Duke, and the others will be distributed amongst other museums and universities as a type of insurance.

The long hours, the uncertainty, and the harsh conditions become small things when you hit a jackpot like this. Plus, being out in remote wilderness has its own soothing charm, and chance also often allows us to spot cool animals, like the frilled lizard and wallaby we saw on this trip.

Funding for this type of fieldwork is becoming increasingly rare, so I am grateful to the National Geographic Society for seeing the value in this work and funding my three-week expedition. I was able to cover about 400 miles of Australia from north to south, visiting twenty-four different sites, including eight parks, and ranging from lush rainforest to dry, rocky scrub. We collected fifty-five samples, including some that may be new species, and took careful notes and photographs of how these plants grow in the wild, something you can’t tell from dried-up specimens.

Knowing what species are out there and how they exist within the environment is important not only because it may provide solutions to human problems, but also because understanding what biodiversity we have can help us take better care of it in the future.

Guest Post by graduate student Karla Sosa

Do DNA Tests Sell Rosy Ideas About Race for Profit?

Earlier this year, the online DNA testing company Ancestry.com faced a media firestorm and social media backlash after posting a controversial ad on its YouTube page.

The DNA testing company Ancestry.com took down its ad, “Inseparable,” in April 2019 in response to criticism that it romanticized slavery.

Titled “Inseparable,” the 30-second ad depicted a white man in the antebellum South asking an African-American woman to flee to the North with him. Before the woman can answer, the piece cuts to a tagline: “Only you can keep the story going. Uncover the lost chapters of your family history with Ancestry.” Many criticized the ad as historically inaccurate, a rosy portrayal of a complicated and painful past. To quell the backlash, Ancestry pulled the ad from its platforms entirely.

A recent Duke study of dozens of other ads across multiple DNA testing companies shows that this isn’t the only example of mixed messaging about race and identity from the world of genetic ancestry tests.

The tests are quite simple: order a kit, send off a saliva sample and receive an ethnicity estimate within weeks. A test taker’s ethnicity is broken down into percentages based on how their DNA matches a global reference database. Kits range in price from $79 to $400, and sales of DNA testing kits had reached 12 million people by 2017, as reported by ScienceLine.
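The percentage arithmetic behind such an estimate is simple. Here is a minimal sketch of the idea, with made-up population labels and match counts; it illustrates the concept only, not any company’s actual algorithm:

```python
# Toy illustration of an ethnicity estimate: tally which reference
# population each DNA segment matches best, then convert the tallies
# to percentages. All labels and counts here are invented.

def ethnicity_estimate(segment_matches):
    """segment_matches: one best-matching population label per DNA segment."""
    total = len(segment_matches)
    counts = {}
    for pop in segment_matches:
        counts[pop] = counts.get(pop, 0) + 1
    return {pop: round(100 * n / total, 1) for pop, n in counts.items()}

matches = ["Irish"] * 6 + ["German"] * 3 + ["Scottish"]
print(ethnicity_estimate(matches))
# → {'Irish': 60.0, 'German': 30.0, 'Scottish': 10.0}
```

Real estimates involve statistical models over hundreds of thousands of genetic markers matched against a proprietary reference panel, which is part of why different companies can return different percentages for the same person.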

As part of the six-week summer research program Story+, Duke students Dakota Douglas, Mona Tong and Madelyn Winchester analyzed the messaging in 90 video ads from the companies 23andMe, AncestryDNA and MyHeritageDNA to see what they promise consumers.

Many of the ads lured customers with promises of a newfound identity and possible family members, the team found. One Ancestry.com ad, entitled “Kyle,” depicts a customer whose childhood was steeped in German culture, but discovers as an adult that he is also part Scottish and Irish. He happily “traded in his lederhosen for a kilt,” completely forgoing his previous heritage and reducing a newly discovered culture to stereotypes.

“There were a lot of advertisements similar to that one,” said team member Mona Tong. “Many found a new identity, embracing it fully despite a lack of any cultural connections.”

“Kyle” illustrates a phenomenon described in a 2018 study from the University of British Columbia, which found that people tended to “cherry-pick” the results, identifying more with certain ethnicities and cultures to appear different. Whites were more likely to see their results as “transformational” than their nonwhite counterparts.

“It’s not a bad idea to test your genes for medical reasons,” said Patricia Bass, the team’s project mentor. “However, these ads can be misleading by assuming that someone’s cultural and racial heritage are determined by genes.”

While the majority of subjects featured in the ads were white, the few ads that featured people of color often glossed over the complicated history of someone’s lineage or conveniently left out difficult topics. Ancestry’s “Anthem” ad presented historical reenactments of an African tribal woman, prohibition gangsters, a man fleeing England for America and Native Americans somberly heading to a new land. An inspirational voiceover plays throughout, and the ad ends with a shot of a biracial woman.

In marketing the idea that we are all one, the ads fetishized mixed-race subjects, while ignoring the genocide and displacement of people, the team found.

The team hopes future research will further examine the impact of these ads on people’s views of identity. It would also be worth knowing whether the companies tested these ads with focus groups before release.

“It furthers the idea of colorblindness,” Tong said. “It assumes that relationships are contingent upon common ancestry and genes.”

“In a way, companies are trying to help by focusing on the interconnectivity and commonalities between people,” Tong said. “But it hurts more than it helps.”

Story+ is a six-week undergraduate research program offered through the John Hope Franklin Humanities Institute and Bass Connections, with support from the Duke University Libraries and Versatile Humanists at Duke.

By Deja Finch

Innocent Until Proven Guilty? Well, That Depends

This is the last of eight blog posts written by undergraduates in PSY102: Introduction to Cognitive Psychology, Summer Term I 2019.

In the criminal justice system, one might imagine that the more serious a crime is, the more extensive the evidence should be to support the verdict. However, a recent study conducted at Duke University finds that jurors’ assessment of guilt relies less on the type of evidence and more on the severity of the crime.

Mock jurors in the study were more likely to find someone charged with murder guilty than someone charged with robbery.

A still from the movie “Twelve Angry Men” (1954), a tense drama about jury deliberations.

Numerous scholars have looked at how flawed forensic evidence, mistaken eyewitness identifications and defendants’ prior criminal convictions can introduce errors in criminal prosecutions.

But John Pearson, an assistant professor in four Duke departments including neurobiology, and his colleagues in law wanted to know whether the type of crime can also lead to a greater chance of wrongful conviction. It may be that jurors use moral and emotional responses to various crimes as reasoning for the decisions they make regarding the defendant’s guilt.

The researchers aimed to understand the relationship between crime severity and confidence in guilt by seeing how mock jurors, practicing prosecutors, and other practicing lawyers weigh various types of evidence in order to make a decision on guilt.

John Pearson

Participants in the study were presented with about 30 crime scenarios, each paired with a random assortment of evidence types. After reading each scenario, they rated the strength of the case on a 0-100 scale, along with their emotional and moral responses to the crime.

It appeared that the more threat or outrage participants felt toward a type of crime, the more likely they were to find the defendant guilty.

The authors also tested how different types of evidence interact with people’s prior beliefs.

They found that DNA and non-DNA physical evidence had the strongest influence on participants, and that the difference between how participants weighed the two was small: jurors placed very similar, if not identical, weight on these two types of evidence when judging their confidence in guilt.

Pearson refers to this equal weighting of DNA and non-DNA evidence as the “CSI effect”: even though DNA evidence is far more reliable, jurors tend to give conclusions based on traditional forensic evidence just as much credence. This pattern of similar weighting held across all groups, regardless of participants’ professional backgrounds. As the study puts it, “subjects tend to overweight widely used types of forensic evidence, but give much less weight than expected to a defendant’s criminal history.”
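One way to picture how per-evidence-type weights can be estimated is an ordinary least-squares fit of case-strength ratings against which evidence types were present in each scenario. The sketch below uses invented data and a deliberately simplified model, not the statistical analysis the Duke team actually ran:

```python
import numpy as np

# Rows: scenarios; columns: whether each evidence type was present (1/0).
# Both the design matrix and the 0-100 ratings are invented for illustration.
evidence_types = ["DNA", "non-DNA physical", "eyewitness", "criminal history"]
X = np.array([
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
])
ratings = np.array([70.0, 55.0, 75.0, 40.0, 60.0, 50.0])

# Least-squares weights: how much each evidence type moves the rating.
weights, *_ = np.linalg.lstsq(X, ratings, rcond=None)
for name, w in zip(evidence_types, weights):
    print(f"{name}: {w:.1f}")
```

In terms of a model like this, the study’s finding would correspond to the fitted DNA and non-DNA weights coming out nearly equal while the criminal-history weight stays small.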

Along with these patterns between confidence in guilt and evidence type, the researchers also discovered a strong link between subjects’ confidence in guilt and the severity of the crime.

Notably for jurors, crime type strongly influenced confidence in guilt. The study showed a positive correlation between personal, emotional, and moral biases and “adjudicative bias,” or the likelihood of conviction.

And while the trend was strongest among jurors, practicing lawyers and prosecutors also exhibited a crime-type bias, though a much smaller one.

The study’s results show that ratings of punishment, outrage, and threat depend almost entirely on the type of crime and the scenario. In other words, regardless of how much evidence was presented, crime type alone made jurors more likely to find the defendant guilty.

(Bang) Guilty!

This could mean that regardless of how much evidence, or what type of evidence, is present, innocent people wrongly charged with crimes could be convicted more easily if the offense is severe.

These findings indicate how easy it is to reach wrongful convictions of severe crimes within the US criminal justice system.

Guest post by Casey M. Chanler

6-Month-Old Brains Are Categorically Brilliant

This is the seventh of eight blog posts written by undergraduates in PSY102: Introduction to Cognitive Psychology, Summer Term I 2019.

Let’s say you visit your grandmother later today and come across a bowl of unknown exotic berries that look and taste similar to a raspberry. Your grandmother tells you that they are called bayberries. How would your mind react to the new word “bayberry”?

Research shows that an adult brain would probably categorize the word “bayberry” into the category of berries, and draw connections between “bayberry” and other related berry names.

But how do you think an infant would deal with a word like “bayberry”? Would he or she categorize the word the same way you would?

Elika Bergelson, a developmental psychologist at Duke University, provided some possible answers to this question in a study published in the Proceedings of the National Academy of Sciences.

Six-month-old infants were shown two objects on a screen simultaneously while a speaker labeled one of the objects (e.g., “Look at the dog!”).

The thing on the right is a shoe, sweetie. We’re not sure about that other thing…

The two objects were either semantically related or unrelated. For example, the words nose and mouth are semantically related, since they both refer to body parts, while the words nose and boots are semantically unrelated.

As the babies were presented with these objects, their eye movements were tracked. The longer a baby stared at an object, the more confident he or she was presumed to be that the object matched the label. This acted as an indicator of how well the baby understood which object the label referred to.
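In looking-time studies of this kind, the core measure usually boils down to the proportion of a trial spent gazing at the labeled target. Here is a minimal sketch with hypothetical gaze samples; the study’s actual analysis pipeline is more involved:

```python
def proportion_target_looking(gaze_samples, target):
    """gaze_samples: object labels recorded at a fixed rate during a trial,
    with None for moments the baby looked at neither object."""
    looks = [g for g in gaze_samples if g is not None]
    if not looks:
        return 0.0
    return sum(1 for g in looks if g == target) / len(looks)

# Hypothetical trial after the baby hears "Look at the dog!":
trial = ["dog", "dog", "shoe", "dog", None, "dog", "shoe"]
print(round(proportion_target_looking(trial, "dog"), 3))
# → 0.667  (4 of 6 looks landed on the labeled object)
```

A proportion reliably above 0.5 across many trials and babies is what lets researchers conclude the infants understood which object the word picked out.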

If the infants categorized words into semantically related groups, then they’d be more likely to confuse objects that are related. This means that the infants would perform better at choosing the correct object when the objects are unrelated.

The results suggest that infants approach words no differently than adults. The babies correctly identified the labeled object more frequently when the two were unrelated than when the two objects were related. This indicates that babies have the mental representation of words categorized into semantically related groups. When encountering two unrelated objects, babies can quickly distinguish between the two objects because they do not belong to the same mental category.

Elika Bergelson

However, when the two objects are related, the infants often confuse them with each other because they belong to the same or closely related categories — while 6-month-olds have developed a general categorization of nouns, their categories remain broad and unrefined, which causes the boundaries between objects in the same category to be unclear.

So what do all these results mean? Well, back to the bayberry example, it means that a 6-month-old will place the word “bayberry” into his or her mental category of “berries.” He or she might not be able to distinguish bayberries from raspberries the next time you mention the word “bayberry,” but he or she will definitely not point to bayberries when you drop the word “milk” or “car.”

Toddler Rock

If the results of this study can be replicated, it means that the infant approach to language is much more similar to adults than researchers previously thought; the infants have already developed a deep understanding of semantics that resembles grown-ups much earlier than researchers previously speculated.

While the results are exciting, the study has limitations. In addition to the small sample size, the infants mainly came from upper-middle-class families with highly educated parents. Parents in these families tend to spend more time with their infants and expose them to more words than parents with lower socio-economic status, so the findings might not be representative of the entire infant population. Nevertheless, the study sheds light on how infants approach and acquire words. The finding could also become a new way to detect language delay in infants by the age of six months.

Guest post by Jing Liu, a psychology and neuroscience major, Trinity 2022.

A Mind at Rest Still Has Feelings

This is the sixth of eight blog posts written by undergraduates in PSY102: Introduction to Cognitive Psychology, Summer Term I 2019.

Emotions drive our everyday lives: They help us make decisions, they guide us into acting certain ways and they can even define who we are as people. But when we take a break from our busy lives and rest, does our brain do the same?

A 2016 study by Duke researchers tested whether neural models developed to categorize distinct emotions from MRI brain scans would also work on people in a resting state, when a person is not performing any physical or mental task.

An algorithm determined different patterns of brain activity that mapped to different emotional states.

When a person is active, emotions are usually a huge part of how they interact and the decisions they make. But this study, led by Kevin LaBar, a professor of psychology and neuroscience, asked whether distinct emotional states still arise when a person’s activity is at its minimum.

They used a tool called functional magnetic resonance imaging (fMRI), which allows scientists to measure brain activity by tracking blood flow to different areas of the brain. They were looking for universal emotions, feelings understood across cultures and societies as the same state of mind: contentment, amusement, surprise, fear, anger, sadness, and a neutral state.

Each emotion has been shown by fMRI to activate different portions of the brain. This matters because an injury or decreased activity in one region can change the way a person feels, acts, and interacts with others. It can also help explain why some people can better recall visual memories and other information, even in a sleeping or resting state.

This study consisted of two experiments. The first included a large number of students recruited for a larger study by Ahmad Hariri, a professor of psychology and neuroscience. These healthy, young adult university students have been assessed on a wide range of behavioral and biological traits. For this experiment, they were told to stare at a blank gray screen and rest, not thinking of anything in particular, while being scanned by the fMRI.

The second experiment used a smaller sample of just 22 participants. Before going into the fMRI, they rated how they felt emotionally in an unconstrained resting state. Once in the machine, they were told to rest, let their minds wander and think freely while watching a blank screen, occasionally pausing to rate their current emotional state. By the end of the experiment, they had completed 40 such ratings, choosing from 16 different emotions.

The researchers tried to quantify the occurrence of different spontaneous emotional states in resting brains.

At the end of both experiments, the researchers tested the brain scans with an algorithm that categorized emotional arousal and brain activity. They found distinct patterns of activity in these resting minds that seemed to match various emotional states the students had reported. Prior to this study, there had only been experiments which test to see how the brain is stimulated in active people in a non-resting state.
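Conceptually, pattern classification of this sort works like the toy nearest-centroid sketch below: learn an average activity pattern per emotion from labeled scans, then assign a new scan to the closest pattern. All numbers are invented, and this is an illustration of the idea, not the actual model LaBar’s team used:

```python
import numpy as np

# Toy data: each "scan" is a vector of activity in 5 brain regions.
# Labeled training scans (all values invented for illustration).
train = {
    "fear":        np.array([[0.9, 0.1, 0.4, 0.2, 0.7],
                             [0.8, 0.2, 0.5, 0.1, 0.6]]),
    "contentment": np.array([[0.1, 0.8, 0.2, 0.7, 0.1],
                             [0.2, 0.9, 0.1, 0.6, 0.2]]),
}

# Learn one centroid (average activity pattern) per emotion.
centroids = {emo: scans.mean(axis=0) for emo, scans in train.items()}

def classify(scan):
    """Assign a scan to the emotion whose centroid is nearest."""
    return min(centroids, key=lambda emo: np.linalg.norm(scan - centroids[emo]))

new_scan = np.array([0.85, 0.15, 0.45, 0.15, 0.65])
print(classify(new_scan))
# → fear
```

The interesting move in the study is applying patterns learned from task data to resting-state scans, where no emotion was deliberately induced.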

Although the experiments were successful and taught the researchers a great deal about the brain’s emotional states, there were limitations as well. One main bias of the self-report experiment was the high percentage of students reporting amusement (23.45%) and contentment (46.31%), which the researchers suspect reflects students putting forth a more positive image of themselves. Another possible bias is that brain patterns might vary depending on the emotional status of an individual, since emotional processes unfolding at both long and short time scales likely contribute to spontaneous brain activity.

This study holds important clinical implications. Being able to ‘see’ emotional states in a resting brain would help us understand how important the feelings we experience are. With refinement, fMRI could become useful for diagnosing personality or mood disorders by showing us the brain areas being stimulated during certain periods of sadness, anger, and anxiety. Such applications could help with identifying emotional experiences in individuals with impaired awareness or compromised ability to communicate.

Guest post by Brynne O’Shea.

Are People Stuck with Their Political Views?

This is the fifth of eight blog posts written by undergraduates in PSY102: Introduction to Cognitive Psychology, Summer Term I 2019.

Whether you cheered or cried when Donald Trump was elected President, or if you stood in the blazing heat marching for women’s rights, your position on socio-political issues is important to you.  Would you ever change it?

Psychologists have found that people tend to hold onto their views, even when presented with conflicting evidence. Is it ever worth your time to argue with the opposition, knowing that they will not budge from their stance?

A 2013 protest in Brussels. Picture by M0tty via wikimedia commons

Researchers from Duke University explored the idea that people stand with their positions on political and social matters, even when presented with affirming or conflicting evidence.

But they also offer hope that knowing that these cognitive biases exist and understanding how they work can help lead to more rational decision-making, open debate of socio-political issues, and fact-based discussion.

The stubbornness of people’s views rests on a couple of concepts. “Resistance to belief-change” is the idea that people will stand by their original views and are unwilling to change them. This can result from a cognitive bias known as “confirmation bias”: people favor evidence that supports their position and dismiss evidence that refutes it.

An example would be a supporter of Donald Trump rating an article about how well he is doing more favorably, while a non-supporter favors evidence showing he is doing a bad job. Whichever side they are on, people use evidence that supports their position and overlook conflicting evidence.

This was shown in a 2019 experiment performed by the Duke team, which was led by Felipe De Brigard, a Duke assistant professor of philosophy and member of the Duke Institute for Brain Sciences.

The experiment started with a group of individuals across the spectrum of socio-economic and political interests. They were presented with five different socio-political issues: fracking, animal testing, drone strikes, standardized testing, and the gold standard. Participants began by reading background on each issue, then reported any prior knowledge of it, to control for people favoring information they had previously encountered.

After reporting any prior knowledge, they made a decision on each issue and rated how confident they were in it. They were then tasked with evaluating only affirming evidence, only conflicting evidence, or evidence for both sides. After this, they reported their final stance on each issue.

The results showed very little change in people’s positions after they saw the evidence. On the topic of fracking, for example, only about two in one hundred people changed their position after being presented with affirming evidence, and only one in five changed their stance after being presented with conflicting evidence.

Similar changes were recorded for the other issues and sets of evidence. Conflicting evidence prompted the most change, but even then only a small percentage of people shifted their stance, which shows how resistant people are to changing their beliefs. Participants also rated evidence that affirmed their beliefs as more favorable than evidence that conflicted with them; they tended to use the affirming evidence to support their stance and overlook the rest, showing how cognitive biases like confirmation bias shape decision-making.

Cognitive bias affects how we make our decisions. More importantly, it entrenches our views and stops us from being open-minded. Understanding cognitive biases matters because they shape our choices and behavior: becoming aware of biases like resistance to belief-change and confirmation bias allows people to think independently and make decisions based on reason as well as emotion.

Well, do you?

We expect to act rationally, making decisions that are in our best interest, but this is often not true of humans. Having adequate information, including an understanding of how biases affect decision-making, can lead to better judgments. The next step in decision-making research is to understand how people can change their entrenched positions, so as to blunt biases like confirmation bias and bring more fact-based, open debate to socio-political issues.

To borrow from President Obama’s campaign slogan, is that change you can believe in?

Guest Post by Casey Holman, psychology major.

Move Your Eyes and Wiggle Your Ears

This is the fourth of eight blog posts written by undergraduates in PSY102: Introduction to Cognitive Psychology, Summer Term I 2019.

Research by Duke University neuroscientists has uncovered that the eardrums move when the eyes do. Even without sound, simply moving your eyes side to side causes the eardrums to vibrate.

Because the vibrations and eye movements start simultaneously, both processes appear to be controlled by the same parts of the brain, driven by the same motor commands, according to senior author Jennifer Groh of psychology and neuroscience.

A human ear.

Her team used rhesus monkeys and humans in an experiment that has given us new understanding of how the brain pairs hearing and seeing.

This research could help shed light on the brain’s role in experiencing outside stimuli, such as sounds or lights, or even in understanding hearing disorders. Scientists still don’t understand the purpose of eardrum movement, however.

The experiment fitted sixteen participants with microphones small enough to fit into the ear canal, yet sensitive enough to pick up the eardrum’s vibrations. The brain is known to exert control over the eardrum: these movements help regulate the influx of sound from outside and also produce small sounds called “otoacoustic emissions.” Measuring the vibrations thus provides a record of the eardrum’s movement.

LED lights were presented in front of the participants and they were asked to follow the lights with their eyes as they shifted side to side.

Rhesus monkeys move their eardrums too!

The experiment was also replicated in three rhesus monkeys, using five of their six ears. These trials were conducted in the same way as the human trials.

The researchers concluded that whenever the eyes move, the eardrums move with them, shifting focus toward the direction of sight. The vibrations began shortly before and lasted slightly after the eye movements, further suggesting the brain controls the two processes together. Bigger eye movements also produced larger vibrations.

These relationships fill an important void in previous research: because the eardrum vibrations begin about 10 milliseconds before the eye movement, the brain must be coordinating the two systems with the same motor commands. The information being sent to the eardrums therefore likely includes information about what the eyes are doing.

Perhaps immersive headphones or movie theaters could also take advantage of this by playing sounds linked to the movements of eyes and eardrums to create a more “realistic” experience.

While side-to-side eye movements have been analyzed for their effect on eardrum movement, the effect of up-and-down eye movements has yet to be studied. Hearing disorders, like being unable to focus on a specific sound when many are played at once, are also still being investigated. Scientists hope to further understand the brain’s relationship with the auditory and visual systems, and the relationship those systems have with each other.

Guest post by Benjamin Fiszel, Class of 2022.

Your Brain Likes YOU Most

This is the third of eight blog posts written by undergraduates in PSY102: Introduction to Cognitive Psychology, Summer Term I 2019.

Imagine you’re at a party. You have a few friends there, but the rest of the people you don’t know. You fiddle with the beaded bracelet around your wrist, take a breath, relax your arms, and walk in. You grab some pretzels and a drink, and talk to this guy named Richard. He has a daughter, or a niece, or something like that. His moustache looks weird.

Okay, now quick question: would you remember if he was wearing a bracelet or not? Odds are you wouldn’t unless he had a bracelet like yours. In fact, it turns out that we recall things far better when those things concern ourselves.

Research has shown that when it comes to what we notice, the quickest thing to grab our attention is something we relate to ourselves, such as a picture of our own face compared with a picture of any other face. What remains unknown, however, is to what extent this prioritization of self plays an internal role in our processes of memory and decision-making.

I am. Therefore I selfie.

To explore this, an international team of researchers led by Duke’s Tobias Egner analyzed the degree to which we prioritize self-related information by looking at how efficiently we encode and actively retrieve information we have deemed to concern ourselves.

They did this with a game. Research participants were shown three different colored circles that represented self, friend, and stranger. A pair of colored circles would appear in various locations on the screen, then vanish, followed by a black circle which appeared in the same or different location as one of the colored circles.

Participants were then asked if the black circle appeared at the same location where one of the colored circles had been. The responses were quite revealing.

People responded significantly quicker when the black circle was in the location of the circle labeled self, rather than friend or stranger. After variations of the experiment, the results still held. In one variation, the black circle would appear in the location of the self-circle only half as often as it did the others. But participants still responded the quickest when the black circle appeared where their self-circle had been.
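To make the procedure concrete, here is a minimal sketch of how one trial of this matching game could be generated. The 4x4 grid of locations, the 50/50 split between “same” and “different” probes, and all the names below are illustrative assumptions, not the researchers’ actual design.

```python
import random

# Labels assigned to the three colored circles in the paradigm described above.
LABELS = ["self", "friend", "stranger"]
# Assumed 4x4 grid of possible screen positions.
LOCATIONS = [(x, y) for x in range(4) for y in range(4)]

def make_trial(rng):
    """Build one trial: two labeled circles at random locations, then a
    black probe that reuses one of those locations half the time."""
    label_a, label_b = rng.sample(LABELS, 2)
    loc_a, loc_b = rng.sample(LOCATIONS, 2)
    if rng.random() < 0.5:
        # "Same" trial: the probe appears where a colored circle had been.
        probe_loc = rng.choice([loc_a, loc_b])
        answer = "same"
    else:
        # "Different" trial: the probe appears at a fresh location.
        probe_loc = rng.choice([l for l in LOCATIONS if l not in (loc_a, loc_b)])
        answer = "different"
    return {"circles": {label_a: loc_a, label_b: loc_b},
            "probe": probe_loc, "answer": answer}

trial = make_trial(random.Random(42))
```

In the real study, the participant’s reaction time to the probe would be recorded on each trial; the finding was that responses were fastest when the probed location had held the self-circle.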

If the light blue dot is “you,” will you get the answer quicker?

Because the task involved nothing but perception and reaction time, the results show that this is not a conscious decision we make, but an automatic response to information we consider our own.

The experiment demonstrated that when it comes to holding and retrieving information on demand, the self takes precedence. What makes this study interesting, however, is that the self-related stimulus was not a picture of the participant, or even a circle of their preferred color. It was simply the circle the researchers assigned to the participant as “self”; it had nothing to do with the participants themselves. The mere association of that circle with the self made the information more important and more readily available. To associate something with the self, it seems, is to bring its information closer.

The fact that we better recall things related to ourselves is not surprising. As creatures built mostly to look after our own well-being, this seems quite an intuitive response by our working memory. However, if there is anything to take away from this experiment, it’s the significance of the colored circle labeled self. It was no different from any of the other circles, but merely making it ‘self’ improved the brain’s ability to recall and retrieve relevant information.

Simply associating things with ourselves makes them more meaningful to us.

Guest post by Kenan Kaptanoglu, Class of 2020.

Putting Your Wandering Mind on a Leash

This is the second of eight blog posts written by undergraduates in PSY102: Introduction to Cognitive Psychology, Summer Term I 2019.

What should I eat for dinner? What do I need to do when I return home? What should I do this weekend? All three questions above are questions we frequently ask ourselves when we begin to mind-wander in class, at work, and even at home.

Mind-wandering has commonly been defined as the unconscious process of getting distracted from a task at hand. It has therefore garnered a fairly negative connotation, being viewed as an uncontrollable phenomenon. But what if I told you that recent research shows we can control our mind-wandering in anticipation of an upcoming task, and can do so on a moment-to-moment basis as well?

Illustration by Charlie Taylor @c.e.b.t. (http://www.mylittleplaceofcalm.com/the-wonderings-of-a-wandering-mind/)

And if we can indeed modulate and directly control our minds, can we find ways to mind-wander that would ultimately optimize our productivity? Could we potentially control our off-topic thoughts without seeing a loss in overall performance of a task?

To answer these questions, Harvard postdoc Paul Seli, who is now an assistant professor of psychology and neuroscience at Duke, and his team conducted a fascinating experiment. They knew from earlier work that our minds tend to wander more while completing easier tasks than difficult ones. Why? Because we simply need to use fewer executive resources to perform easy tasks and thus we can freely mind-wander without noticing a loss in performance. In fact, one could say that we are optimizing our executive functions and resources across a variety of different tasks instead of just one.

Seli hypothesized that people could control their mind-wandering based on their expectations of upcoming challenges in a task. To test this, he had research participants sit in front of a computer screen that showed a large analog clock. Researchers told each participant to click the spacebar every time the clock struck 12:00. Seems simple, right? Even simpler, the clock struck 12:00 every 20 seconds, so it was completely predictable. To incentivize the participants to click the spacebar on time, a bonus payment was awarded for every correct response.

Paul Seli studies…
What were we talking about?

During some of the 20-second intervals, the participants were presented with what are called “thought probes.” These popped up on the screen to ask the participants whether or not their mind had just been wandering.

The participants were assured that their responses did not affect their bonus payments and the probes were presented above a paused clock face so that the participants still saw where the hand of the clock was while answering the question. Participants could either respond by clicking “on task” (meaning that they were focusing on the clock), “intentionally mind-wandering” (meaning that they were purposely thinking about something off-topic), or “unintentionally mind-wandering.” After a response was given, the question disappeared, and the clock resumed.
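As a toy illustration of this probe logic, the sketch below drops thought probes at random moments in the 20-second cycle, with an assumed wandering probability that falls as the hand approaches 12:00. The linear falloff and the 0.6 ceiling are stand-in assumptions for illustration, not Seli’s model or data.

```python
import random

def p_wandering(t_in_cycle, cycle=20.0):
    """Assumed probability of mind-wandering: high early in the cycle,
    falling toward zero as the clock hand approaches 12:00."""
    return 0.6 * (1 - t_in_cycle / cycle)

def run_probes(n_probes, rng, cycle=20.0):
    """Present probes at random moments; record whether the simulated
    participant reports mind-wandering at each probe."""
    reports = []
    for _ in range(n_probes):
        t = rng.uniform(0, cycle)
        wandering = rng.random() < p_wandering(t)
        reports.append((t, wandering))
    return reports

rng = random.Random(1)
reports = run_probes(10_000, rng)
early = [w for t, w in reports if t < 5]    # probes early in the cycle
late = [w for t, w in reports if t > 15]    # probes just before 12:00
# Under this assumed model, wandering reports are more common early on,
# matching the pattern Seli observed in participants.
```

The point of the sketch is only the shape of the paradigm: probes sample the mind’s state at random times, and the rate of “wandering” reports can be compared across positions in the cycle.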

By using the thought probes to track the mind-wandering of participants on a second-by-second basis, Seli found that the participants tended to decrease their levels of mind-wandering as the clock approached 12:00. In other words, participants would freely mind-wander in the early stages of the hand’s rotation and then quickly refocus on the task at hand as the clock approached 12:00.

Seli showed that we have some ability to control a wandering mind. Instead of mind-wandering being solely dependent on the difficulty of the task, Seli found that we can control our mind-wandering on a moment-to-moment basis as the more difficult or pressing aspect of the task approaches.

Even if we are distracted, we have the ability to refocus when the task at hand becomes pressing. However, there is a time and place for mind-wandering and multitasking, and we should certainly not get too confident with our mind-wandering abilities.

Take mind-wandering and distracted driving, for example. Approximately nine Americans are killed each day due to distracted driving, and more than 1,000 people are injured. Just because you are very familiar with a task does not mean it isn’t crucial and demanding. We shouldn’t undervalue the executive resources and attention we need to stay focused and safe.

So, the next time you catch yourself thinking about your upcoming weekend, chances are that the task you’re completing isn’t too pressing, because if it were, you’d be using up all of your executive resources to focus.

Guest post by Jesse Lowey, Trinity 2021

Just The Way You Say It Can Make Something ‘True’

This is the first of eight blog posts written by undergraduates in PSY102: Introduction to Cognitive Psychology, Summer Term I 2019.

We’ve all accepted a lie that we’ve heard before. For example, “vitamin C prevents the common cold” is a statement that rings true for many people. However, there is only circumstantial evidence supporting this claim, and instead, many researchers agree that the evidence in fact reveals that vitamin C has no effect on the common cold.

So why do we end up believing things that are not true? One reason is known as the “illusory truth effect” which claims that the more “fluent” a statement is or feels, the more likely it is to be remembered as true.

Fluency in this case refers to how easily we can later recall information. Fluency can increase in a variety of ways: through the size of the text in which a fact was presented, through how many times we have heard the statement, or even through the color of the text we read. For example, if we were shown only the blue-text version of the four statements in the picture above, rather than the yellow-text version, the statements would be easier to recall later. Similarly, larger text or more frequent repetition makes information easier to recall.

This fluency can be useful if we are constantly told accurate facts. However, in our current day and age, truth and lies can become muddled, and if we end up hearing more lies than truths, this illusory truth effect can take over, and we soon begin to accept these falsehoods.

Vanderbilt University psychologist Lisa Fazio explored the illusory truth effect during her graduate work at Duke.

Eighty Duke undergraduates participated in her studies. For the first part, participants were shown factual statements — both true and false — and asked to rate how interesting they were.

For the second part, participants were shown statements — some of which came from the first part of the study — and told that some would be true and some false. They were then asked to rate how truthful the statements were, on a scale from one to six, with one being definitely false, and six being definitely true.
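A rough sketch of this two-phase procedure might look like the following. The statements, the fixed fluency “bump” for previously seen items, and the rating rule are all illustrative assumptions, not Fazio’s materials or model.

```python
import random

# A tiny illustrative pool of true and false statements.
statements = [
    ("A kilt is the skirt worn by Scottish people.", True),
    ("A sari is the skirt worn by Scottish people.", False),
    ("The Pacific is the largest ocean.", True),
    ("The Atlantic is the largest ocean.", False),
]

def rate_truth(statement, seen_before, rng):
    """Simulated 1-6 truth rating: repetition (fluency) nudges ratings up.
    The +1 bump is an assumed stand-in for the illusory truth effect."""
    base = rng.uniform(2.5, 4.5)
    if seen_before:
        base += 1.0
    return max(1, min(6, round(base)))

rng = random.Random(7)
# Phase 1: participants rate some statements for interest (mere exposure).
exposed = {s for s, _ in statements[:2]}
# Phase 2: all statements are rated for truth on the 1-6 scale.
ratings = {s: rate_truth(s, s in exposed, rng) for s, _ in statements}
```

Averaged over many simulated participants, previously seen statements would earn higher truth ratings than new ones, which is the qualitative pattern the real study measured.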

Fazio and her colleagues found that the illusory truth effect is not only a powerful mental mechanism, but that it is so powerful, it can override our personal knowledge.

For example, if presented with the question “what is the name of the skirt that Scottish people sometimes wear?” most people would correctly respond with “a kilt.” However, if you were shown the false statement “a sari is the skirt worn by Scottish people,” you would be more likely to later report this statement as being truthful, even though you knew the correct answer before reading that false statement.

Fazio’s paper also proposed a model for how fluency and knowledge may interact in this situation. Their model (shown below) suggests that fluency is the main deciding factor on the decisions that we make. If we cannot easily remember an answer, then we rely on our prior knowledge, and finally, if our knowledge fails, then we resort to a guess. This model makes an important distinction from their other model and the underlying hypothesis, which both suggest that knowledge comes first, and thus could override the illusory truth effect.

Lisa Fazio’s model of the illusory truth effect.

All of this research can seem scary at first glance. In a world where “fake news” is on the rise, and where we are surrounded by ads and propaganda, how can we make sure that the information we believe to be true is actually true? While the paper does not fully explore the effectiveness of different ways to train our brains to weaken the illusory truth effect, the authors do offer some suggestions.

The first is to place yourself in situations where you are going to rely more on your knowledge. Instead of being a passive consumer of information, actively fact-check the information you find. Similar to a reporter chasing down a story, someone who actively thinks about the things they hear is not as likely to fall victim to this effect.

The second suggestion is to train oneself. Providing training with trial-by-trial feedback in a situation similar to this study could help people understand where their gut reactions fall short, and when to avoid using them. The most important point to remember is that the illusory truth effect is not inherently bad. Instead, it can act as a useful tool to reduce mental work throughout one’s day. If ten people say one thing, and one person says another, many times the ten will be right and the one will be wrong. The real skill is learning when to trust the wisdom of the crowds, and when to reject it.

Guest post by Kevyn Smith, a third-year undergraduate majoring in Electrical and Computer Engineering and Computer Science, and minoring in Psychology.

