Duke Research Blog

Following the people and events that make up the research community at Duke.

Category: Computers/Technology

Heating Up the Summer, 3D Style

While some students like to spend their summer recovering from a long year of school work, others are working diligently in the Innovation Co-Lab in the Telcom building on West Campus.

They’re studying the impacts of dust and particulate matter (PM) pollution on solar panel performance, and developing new technologies that map the 3D volume of the ocean.

The Co-Lab is one of three 3D printing labs located on campus. It gives students and faculty the opportunity to creatively explore research through new and emerging technologies.

Third-year PhD candidate Michael Valerino said his long term research project focuses on how dust and air pollution impacts the performance of solar panels.

“I’ve been designing a low-cost prototype which will monitor the impact of dust and air pollution on solar panels,” said Valerino. “The device is going to be used to monitor the impacts of dust and particulate matter (PM) pollution on solar panel performance. This process is known as soiling. This is going to be a low-cost alternative (~$200) to other monitoring options that are at least $5,000.”

Most of the 3D printers come with standard polylactic acid (PLA) material for printing. However, because his first prototype completely melted in India’s heat, Valerino decided to switch to a black carbon-fiber-infused nylon.

“It really is a good fit for what I want to do,” he said. “These low-cost prototypes will be deployed in China, India, and the Arabian Peninsula to study global soiling impacts.”

In a step-by-step process, he applied acid-free glue to the base plate that holds the black carbon-fiber-infused nylon. He then placed the glass plate into the printer and closely examined how the thick carbon fiber holds his project together.

Michael Bergin, a professor of civil and environmental engineering at Duke, collaborated with the Indian Institute of Technology-Gandhinagar and the University of Wisconsin last summer on a study about soiling.

The study indicated that there was a decrease in solar energy as the panels became dirtier over time. The solar cells jumped 50 percent in efficiency after being cleaned for the first time in several weeks. Valerino’s device will be used to expand Bergin’s work.

As Valerino tackles his project, Duke student volunteers and high school interns are in another part of the Co-Lab developing technology to map the ocean floor.

The Blue Devil Ocean Engineering team will be competing in the Shell Ocean Discovery XPRIZE, a global technology competition challenging teams to advance deep-sea technologies for autonomous, fast and high-resolution ocean exploration. (Their mentor, Martin Brooke, was recently featured on Science Friday.)

The team is developing large, highly redundant carbon drones that are eight feet across. The drones will fly over the ocean and drop pods into the water that will sink to collect sonar data.

Tyler Bletsch, a professor of the practice in electrical and computer engineering, is working alongside the team. He describes the team as having the most creative approach in the competition.

“We have many parts of this working, but this summer is really when it needs to come together,” Bletsch said. “Last year, we made it through round one of the competition and secured $100,000 for the university. We’re now using that money for the final phase of the competition.”

The final phase of the competition is scheduled for fall 2018.

Though campus is slow this summer, the Innovation Co-Lab is keeping busy. You can keep up-to-date with their latest projects here.

Post by Alexis Owens


Duke Alumni Share Their SpaceX Experiences

It was 8 o’clock on a Monday night and Teer 203 was packed. A crowd of largely Pratt Engineering students had crammed into practically every chair in the room, as if for lecture. Only, there were no laptops out tonight. No one stood at the blackboard, teaching.

SpaceX’s Falcon Heavy and Dragon rockets in simultaneous liftoff

No, these students had given up their Monday evening for something more important. Tonight, engineering professor Rebecca Simmons was videoconferencing with six recent Duke grads—all of whom are employed at the legendary aerospace giant SpaceX, brainchild of tech messiah Elon Musk.

Eager to learn as much as possible about the mythic world of ultracompetitive engineering, the gathered students spent the next hour and fifteen minutes grilling Duke alumni Anny Ning (structures design engineering), Kevin Seybert (integration and test engineering), Matthew Pleatman and Daniel Lazowski (manufacturing engineering), and Zachary Loncar (supply chain) with as many questions as they could squeeze through.

Over the course of the conversation, Duke students seemed particularly interested in the overall culture of SpaceX: What was it like to actually work there? What do the employees think of the SpaceX environment, or the way the company approaches engineering?

One thing all of the alumni were quick to key in on was the powerful emphasis their company placed on flexibility and engagement.

“It’s much harder to find someone that says ‘no’ at SpaceX,” Pleatman said. “It’s way easier to find someone who says ‘yes.’ ”

SpaceX’s workflow, Seybert added, is relentlessly adaptive. There are no strict boundaries on what you can work on in your job, and the employee teams are made up of continually evolving combinations of specialists and polymaths.

“It’s extremely dynamic,” Seybert said. “Whatever the needs of the company are, we will shift people around from week to week to support that.”

“It’s crazy—there is no typical week,” Lazowski added. “Everything’s changing all the time.”

Launch of Hispasat 30W-6 Mission

Ning, for her part, focused a great deal on the flexibility SpaceX both offers and demands. New ideas and a willingness to question old ways of thinking are critical to this company’s approach to innovation, and Ning noted that one of the first things she had to learn was to be continuously on the lookout for ways her methods could be improved.

“You should never hear someone say, ‘Oh, we’re doing this because this is how we’ve always done it,’ ” she said.

The way SpaceX approaches engineering and innovation, Seybert explained, is vastly different from how traditional aerospace companies have tended to operate. SpaceX employees are there because of their passion for their work. They focus on the projects they want to focus on, they move between projects on a day-to-day basis, and they don’t expect to stay at any one engineering company for more than a few years. Everything is geared around putting out the best possible product, as quickly as humanly possible.

So now, the million dollar question: How do you get in?

“One thing that I think links us together is the ability to work hands-on,” Loncar offered.

Pleatman agreed. “If you want to get a job at SpaceX directly out of school, it’s really important to have an engineering project that you’ve worked on. It doesn’t matter what it is, but just something where you’ve really made a meaningful contribution, worked hard, and can really talk through the design from start to finish.”

Overall, passion, enthusiasm and flexibility were overarching themes. And honestly, that seems pretty understandable. We are talking about rockets, after all — what’s not to be excited about? These Duke alums are out engineering the frontier of tomorrow — bringing our species one step closer to its place among the stars.

As Ning put it, “I can’t really picture a future where we’re not out exploring space.”

Post by Daniel Egitto

Artificial Intelligence Knows How You Feel

Ever wondered how Siri works? Afraid that super smart robots might take over the world soon?

On April 3rd, researchers from Duke, NCSU and UNC came together for Triangle Machine Learning Day to spark everyone’s curiosity about the complex field of artificial intelligence. A.I. is an umbrella term for smart technologies, ranging from self-driving cars to targeted advertising.

One route to artificial intelligence is what’s known as “machine learning.” Instead of explicitly programming a machine with the basic capabilities we want it to have, we can write its code to be flexible, adapting based on the information it’s presented with. Its knowledge grows as we train it. In other words, we’re teaching a computer to learn.
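The “teaching a computer to learn” idea can be made concrete with a toy sketch. The single-layer perceptron below is a textbook algorithm, not anything presented at the event: it is never told the rule for logical AND, it only sees labeled examples and nudges its weights whenever it guesses wrong.

```python
# A minimal sketch of learning from data rather than hand-coding rules:
# a perceptron adjusts its weights toward labeled examples until it
# classifies them correctly. (Illustrative only -- real systems use far
# larger models and datasets.)

def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of ((x1, x2), label) pairs with label 0 or 1."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = label - pred          # zero when correct; nudge when wrong
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

# Teach it logical AND purely from examples -- no AND rule is ever written.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_perceptron(data)

def predict(x1, x2):
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

After a few passes the weights settle and the learned classifier reproduces every label: the rule emerged from the data rather than from the programmer.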

Matthew Philips is working with Kitware to get computers to “see,” also known as “machine vision.” By providing thousands and thousands of images, a computer with the right coding can learn to actually make sense of what an image is beyond different colored pixels.

Machine vision has numerous applications. An effective way to search satellite imagery for arbitrary objects could be huge for space technology – a satellite could potentially identify obscure objects or potential lifeforms that stick out in those images. This is something we humans can’t do ourselves simply because of the sheer amount of data there is to go through. Similarly, we could teach a machine to identify cancerous or malignant cells in an image, giving us a quick indication of whether someone is at risk of developing a disease.

The problem is, how do you teach a computer to see? Machines don’t easily understand things like similarity, depth or orientation — things that we as humans process automatically, without even thinking about them. That’s exactly the type of problem Kitware has been tackling.
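A toy sketch of why “similarity” is hard, assuming nothing about Kitware’s actual methods: compare images pixel by pixel, and a one-pixel shift looks just as different as an empty frame.

```python
# Why raw pixel comparison is a poor notion of "similarity": shift an image
# by one pixel and its pixel-wise distance to the original is as large as
# its distance to a blank frame, even though a person sees the same picture.
# (Toy 1-D "images" keep the arithmetic visible.)

def pixel_distance(a, b):
    """Sum of absolute per-pixel differences between two images."""
    return sum(abs(x - y) for x, y in zip(a, b))

image   = [0, 0, 255, 255, 0, 0]   # a bright bar on a dark background
shifted = [0, 0, 0, 255, 255, 0]   # the same bar, moved one pixel right
blank   = [0, 0, 0, 0, 0, 0]       # nothing at all

print(pixel_distance(image, shifted))  # 510
print(pixel_distance(image, blank))    # also 510 -- the blank frame scores
                                       # no "farther" than the shifted copy
```

To a pixel-by-pixel comparison, the shifted copy and the empty frame are equally “different,” which is exactly why learned representations are needed for machines to see.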

One hugely successful piece of Artificial Intelligence you may be familiar with is IBM’s Watson. Labeled as “A.I. for professionals,” Watson was featured on 60 Minutes and even played Jeopardy! on live television. Watson has visual recognition capabilities, can work as a translator, and can even understand things like tone, personality or emotional state. And obviously it can answer crazy hard questions. What’s even cooler is that it doesn’t matter how you ask the question – Watson will know what you mean. Watson is basically Siri on steroids, and the world got a taste of its power after watching it smoke its competitors on Jeopardy! However, Watson is not a physical supercomputer. It is a collection of technologies that can be used in many different ways, depending on how you train it. This is what makes Watson so astounding: through machine learning, its knowledge can adapt to the context it’s being used in.

Source: CBS News.

IBM has been able to develop such a powerful tool thanks to data. Stacy Joines from IBM noted, “Data has transformed every industry, profession, and domain.” From our smart phones to fitness devices, data is being collected about us as we speak (see: digital footprint). While it’s definitely pretty scary, the point is that a lot of data is out there. The more data you feed Watson, the smarter it is. IBM has utilized this abundance of data combined with machine learning to produce some of the most sophisticated AI out there.

Sure, it’s a little creepy how much data is being collected on us. Sure, there are tons of movies and theories out there about how intelligent robots in the future will outsmart humans and take over. But A.I. isn’t a thing to be scared of. It’s a beautiful creation that surpasses the capabilities of even the most advanced purely programmed model. It’s joining the health care system to save lives, advising businesses, and could potentially find a new habitable planet. What we choose to do with A.I. is entirely up to us.

Post by Will Sheehan


How a Museum Became a Lab

Encountering and creating art may be among humankind’s most complex experiences. Art, not just visual but also dance and song, requires the brain to understand an object or performance presented to it and then to associate it with memories, facts, and emotions.

A piece in Dario Robleto’s exhibit titled “The Heart’s Knowledge Will Decay” (2014)

In an ongoing experiment, Jose “Pepe” Contreras-Vidal and his team set up in artist Dario Robleto’s exhibit “The Boundary of Life Is Quietly Crossed” at the Menil Collection near downtown Houston. They then asked visitors if they were willing to have their trips through the museum and their brain activities recorded. Robleto’s work was displayed from August 16, 2014 to January 4, 2015. By engaging museum visitors, Contreras-Vidal and Robleto gathered brain activity data while also educating the public, combining research and outreach.

“We need to collect data in a more natural way, beyond the lab,” explained Contreras-Vidal, an engineering professor at the University of Houston, during a talk with Robleto sponsored by the Nasher Museum.

More than 3,000 people have participated in this experiment, and the number is growing.

To measure brain activity, the volunteers wear EEG caps which record the electrical impulses that the brain uses for communication. EEG caps are noninvasive because they are just pulled onto the head like swim caps. The caps allow the museum goers to move around freely so Contreras-Vidal can record their natural movements and interactions.

By watching individuals interact with art, Contreras-Vidal and his team can find patterns between their experiences and their brain activity. They also asked the volunteers to reflect on their visit, adding a first-person perspective to the experiment. Together, these three sources of data showed them what a young girl’s favorite painting was, how she moved and expressed her reaction to it, and how her brain activity reflected that opinion and reaction.

The volunteers can also watch the recordings of their brain signals, giving them an opportunity to ask questions and engage with the science community. For most participants, this is the first time they’ve seen recordings of their brain’s electrical signals. In one trip, these individuals learned about art, science, and how the two can interact. Throughout this entire process, every member of the audience forms a unique opinion and learns something about both the world and themselves as they interact with and make art.

Children with EEG caps explore art.

Contreras-Vidal is especially interested in the gestures people make when exposed to the various stimuli in a museum and hopes to apply this information to robotics. In the future, he wants someone with a robotic arm to not only be able to grab a cup but also to be able to caress it, grip it, or snatch it. For example, you probably can tell if your mom or your best friend is approaching you by their footsteps. Contreras-Vidal wants to restore this level of individuality to people who have prosthetics.

Contreras-Vidal thinks science can benefit art just as much as art can benefit science. Both he and Robleto hope that their research can reduce many artists’ distrust of science and help advance both fields through collaboration.

Post by Lydia Goff

Using Drones to Feed Billions

Drones revolutionizing farming

As our population continues its rapid growth, food is becoming increasingly scarce. By the year 2050, we will need to double our current food production to feed the estimated 9.6 billion mouths that will inhabit Earth.

Maggie Monast

Thankfully, introducing drones and other high-tech equipment to farmers could be the solution to keeping our bellies full.

Last week, Dr. Ramon G. Leon of North Carolina State University and Maggie Monast of the Environmental Defense Fund spoke at Duke’s monthly Science & Society Dialogue, sharing their knowledge of what’s known as “precision agriculture.” At its core, precision agriculture is integrating technology with farming in order to maximize production.

It is easy to see that farming has already changed as a result of precision agriculture. The old family-run plot of land with animals and diverse crops has given way to large-scale, single-crop operations. This transition was made possible by new technologies — tractors, irrigation, synthetic fertilizer, GMOs, pesticides — and is no doubt far more productive.

Dr. Ramon G. Leon

So while the concept of precision agriculture certainly isn’t new, in today’s context it incorporates some particularly advanced and unexpected tools meant to further optimize yield while also conserving resources.

Drones equipped with special cameras and sensors, for example, can be flown over thousands of acres and gather huge amounts of data. This data produces a map of things like pest damage, crop stress and yield. One image from a drone can easily help a farmer monitor what’s going on: where to cut back on resources, what needs more attention, and where to grow a certain type of crop. Some drones can even plant and water crops for you.

Blue River’s “See & Spray” focuses on cutting back herbicide use. Instead of spraying herbicide over an entire field and wasting most of it, this machine is trained to spray weeds directly, using 10% of the normal amount of herbicide.

Similarly, another machine called the Greenseeker can decide where, when and how much fertilizer should be applied based on the greenness of the crop. Fertilizing efficiently means saving money and emitting less ozone-depleting nitrous oxide.
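The resource savings behind tools like these are easy to sketch. The grid and dose below are hypothetical numbers chosen for illustration, not data from either product; they only show why treating flagged cells instead of the whole field cuts usage so sharply.

```python
# A toy illustration of spot-spraying's savings: blanket spraying covers
# every cell of a field grid, while targeted spraying treats only the
# cells flagged as containing weeds. (All numbers are hypothetical.)

field = [
    [0, 1, 0, 0, 0],
    [0, 0, 0, 1, 0],
    [1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],
]  # 1 = weed detected in that cell

dose_per_cell = 1.0                      # arbitrary herbicide units
cells = sum(len(row) for row in field)   # 20 cells in total
weeds = sum(sum(row) for row in field)   # 4 cells flagged

blanket = cells * dose_per_cell          # spray everything: 20.0 units
targeted = weeds * dose_per_cell         # spray only weeds:  4.0 units
print(f"targeted uses {targeted / blanket:.0%} of the blanket amount")
```

With four weedy cells out of twenty, targeted spraying uses a fifth of the herbicide; the sparser the weeds, the bigger the savings.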

As you can see, fancy toys like these are extremely beneficial, and there are more out there. They enable farmers to make faster, better decisions and understand their land on an unprecedented level. At the same time, farmers can cut back on their resource usage. This should eventually result in a huge productivity boom while helping out the environment. Nice.

One problem preventing these technologies from really taking off is teaching the farmers how to take advantage of them. As Dr. Leon put it, “we have all these toys, but nobody knows how to play with them.” However, this issue can be resolved with enough time. Some older farmers love messing around with the drones, and the next generations of farmers will have more exposure to this kind of technology growing up. Sooner or later, it may be no big deal to spot drones circling above fields of wheat as you road trip through the countryside.

A Greenseeker mounted on a Boom Sprayer

Precision agriculture is fundamental to the modern agricultural revolution. It increases efficiency and reduces waste, and farming could even become a highly profitable business again as the cost for these technologies goes down. Is it the solution to our environmental and production problems? I guess we’ll know by 2050!

Post by Will Sheehan

What is a Model?

When you think of the word “model,” what do you think?

As an Economics major, the first thing that comes to my mind is a statistical model, modeling phenomena such as the effect of class size on student test scores. A car connoisseur’s mind might go straight to a model of their favorite vintage Aston Martin. Someone studying fashion might even imagine a runway model. The point is, the term “model” is used in popular discourse incredibly frequently, but are we even sure what it implies?

Annabel Wharton, a professor of Art, Art History, and Visual Studies at Duke, gave a talk entitled “Defining Models” at the Visualization Friday Forum. The forum is a place “for faculty, staff and students from across the university (and beyond Duke) to share their research involving the development and/or application of visualization methodologies.” Wharton’s goal was to answer the complex question, “what is a model?”

Wharton began the talk by defining the term “model,” knowing that it can oftentimes be rather ambiguous. She observed that models are “a prolific class of things,” from architectural models, to video game models, to runway models. Some of these things seem unrelated, but throughout her talk Wharton pointed out the similarities between them and ultimately tied them together as all being models.

The word “model” itself has become a heavily loaded term. According to Wharton, the dictionary definition of “model” is 9 columns of text in length. Wharton then stressed that a model “is an autonomous agent.” This implies that models must be independent of the world and of theory, as well as independent of their makers and consumers. For example, architecture, once built, becomes independent of its architect.

Next, Wharton outlined different ways to model. They include modeling iconically, in which the model resembles the actual thing, such as how the video game Assassin’s Creed models historical architecture. Another way to model is indexically, in which parts of the model are always ordered the same, such as the order of utensils at a traditional place setting. The final way to model is symbolically, in which a model symbolizes the mechanism of what it is modeling, such as in a mathematical equation.

Wharton then discussed the difference between a “strong model” and a “weak model.” A strong model is defined as a model that determines its weak object, such as an architect’s model or a runway model. On the other hand, a “weak model” is a copy that is always less than its archetype, such as a toy car. These different classifications include examples we are all likely aware of, but weren’t able to explicitly classify or differentiate until now.

Wharton finally transitioned to one of her favorite models of all time: a model of Istanbul’s Hagia Sophia, a former Greek Orthodox church and later imperial mosque. She detailed how the model that provides the best sense of the building without being there is found in a surprising place, an Assassin’s Creed video game. This model not only closely resembles the actual Hagia Sophia, but is also experiential and immersive. Wharton joked that, even better, the model allows explorers to avoid tourists, unlike the actual Hagia Sophia.

Wharton described why the Assassin’s Creed model is a highly effective agent. Not only does the model closely resemble the actual architecture, but it also engages history by being surrounded by a historical fiction plot. Further, Wharton mentioned how the perceived freedom of the game is illusory, because the course of the game actually limits players’ autonomy with code and algorithms.

After Wharton’s talk, it’s clear that models are definitely “a prolific class of things.” My big takeaway is that so many things in our everyday lives are models, even if we don’t classify them as such. Duke’s East Campus is a model of the University of Virginia’s campus, subtraction is a model of the loss of an entity, and an academic class is a model of an actual phenomenon in the world. Leaving my first Friday Visualization Forum, I am even more positive that models are powerful, and stretch so far beyond the statistical models in my Economics classes.


By Nina Cervantes

Game-Changing App Explores Conservation’s Future

In the first week of February, students, experts and conservationists from across the country came together for the second annual Duke Blueprint symposium. Centered on the theme of “Nature and Progress,” the conference aimed to harness the power of diversity and interdisciplinary collaboration to develop solutions to some of the world’s most pressing environmental challenges.

Scott Loarie spoke at Duke’s Mary Duke Biddle Trent Semans Center.

One of the most exciting parts of this symposium’s first night was without a doubt its all-star cast of keynote speakers. The experiences and advice each of these researchers had to offer were far too diverse for any single blog post to capture, but one particularly interesting presentation (full video below) was that of National Geographic fellow Scott Loarie—co-director of the game-changing iNaturalist app.

iNat, as Loarie explained, is a collaborative citizen scientist network with aspirations of developing a comprehensive mapping of all terrestrial life. Any time they go outside, users of this app can photograph and upload pictures of any wildlife they encounter. A network of scientists and experts from around the world then helps the users identify their finds, generating data points on an interactive, user-generated map of various species’ ranges.

Simple, right? Multiply that by 500,000 users worldwide, though, and it’s easy to see why researchers like Loarie are excited by the possibilities an app like this can offer. The software first went live in 2008, and since then its user base has roughly doubled each year. This has meant the generation of over 8 million data points of 150,000 different species, including one-third of all known vertebrate species and 40% of all known species of mammal. Every day, the app catalogues around 15 new species.
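Those figures are consistent with simple compound growth. Starting from a hypothetical 2008 user base (the 500 below is an assumption chosen only for illustration), ten annual doublings multiply it roughly a thousandfold:

```python
# A user base that "roughly doubles each year" is exponential growth:
# ten years of doubling multiplies the start by 2**10 = 1024.
# (The starting value is hypothetical, picked to make the point visible.)

users = 500                      # hypothetical user count at the 2008 launch
for year in range(2008, 2018):   # ten annual doublings
    users *= 2
print(users)                     # 512000 -- in the neighborhood of 500,000
```

This is why researchers get excited about citizen-science apps: a modest community plus steady exponential growth quickly becomes a global sensor network.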

“We’re slowly ticking away at the tree of life,” Loarie said.

Through iNaturalist, researchers are able to analyze and connect to data in ways never before thought possible. Changes to environments and species’ distributions can be observed or modeled in real time and with unheard-of collaborative opportunities.

To demonstrate the power of this connectedness, Loarie recalled one instance of a citizen scientist in Vietnam who took a picture of a snail. This species had never been captured, never been photographed, and hadn’t been observed in over a century. One of iNat’s users recognized it anyway. How? He’d seen it in one of the journals from Captain James Cook’s 18th-century voyage to circumnavigate the globe.

It’s this kind of interconnectivity that demonstrates not just the potential of apps like iNaturalist, but also the power of collaboration and the possibilities symposia like Duke Blueprint offer. Bridging gaps, tearing down boundaries, building up bonds—these are the heart of conservationism’s future. Nature and Progress, working together, pulling us forward into a brighter world.

Post by Daniel Egitto


“I Heart Tech Fair” Showcases Cutting-Edge VR and More

Duke’s tech game is stronger than you might think.

OIT held an “I Love Tech Fair” in the Technology Engagement Center / Co-Lab on Feb. 6 that was open to anyone who wanted to come check out things like 3D printers and augmented reality while munching on some Chick-fil-A and cookies. There was a raffle for some sweet prizes, too.

I got a full demonstration of the 3D printing process—it’s so easy! It requires some really expensive software called Fusion 360, but thankfully Duke is awesome and students can get it for free. You can make some killer stuff with 3D printing; the technology is so advanced now. I’ve seen all kinds of things: models of my friend’s head, a doorstop made out of someone’s name … apparently one guy even made a working ukulele!

One of the cooler things at the fair was augmented reality books. These books look like ordinary picture books, but when you view a page through your phone’s camera, the image suddenly comes to life in 3D with tons of detail and color, seemingly floating above the book! All you have to do is download an app and get the right book. Augmented reality is only getting better as time goes on and will soon be a primary tool in education and gaming, which is why the Duke Digital Initiative (DDI) wanted to show it off.

By far my favorite exhibit at the tech fair was virtual reality. Throw on a bulky headset, grab a controller in each hand, and suddenly you’re in another world. The guy running the station, Mark McGill, had actually hand-built the machine that ran it all. Very impressive guy. He told me the machine is the most expensive and important part, since it accounts for how smooth the immersion is. The smoother the immersion, the more realistic the experience. And boy, was it smooth. A couple years ago I experienced virtual reality at my high school and thought it was cool (though I got a little nauseous), but after Mark set me up with the HTC Vive connected to his sophisticated machine, it blew me away (with no nausea, too).

I smiled the whole time playing “Superhot,” where I killed incoming waves of people in slow motion with ninja stars, guns, and rocks. Mark had tons of other games too, all downloaded from Steam, for both entertainment and educational purposes. One called “Organon” lets you examine human anatomy inside and out, and you can even upload your own MRIs. VR offers an unbelievable number of possibilities. You could conquer your fear of public speaking by being placed in front of a simulated crowd, or realistically tour “the VR Museum of Fine Art.” Games like these just aren’t the same were you to play them on, say, an Xbox, because it simply doesn’t have that key factor of feeling like you’re there. In Fallout 4, your heart pounds fast in your chest as you blast away Feral Ghouls and Super Mutants right in front of you. But in reality, you’re just standing in a green room with stupid-looking goggles on. Awesome!

There’s another place on campus — the Bolt VR in Edens residence hall — that also has a cutting-edge VR setup going. Mark explained to me that Duke wants people to get experience with VR, as it will soon be a huge part of our lives. Having exposure now could give Duke graduates a very valuable head start in their career (while also making Duke look good). Plus, it’s nice to have on campus for offering students a fun break from all the hard work we put in.

If you’re bummed you missed out, or even if you don’t “love tech,” I recommend checking out the Tech Fair next time — February 13, from 6-8pm. See you there.

Post by Will Sheehan

Researchers Get Superman’s X-ray Vision

X-ray vision just got cooler. A technique developed in recent years boosts researchers’ ability to see through the body and capture high-resolution images of animals inside and out.

This special type of 3-D scanning reveals not only bones, teeth and other hard tissues, but also muscles, blood vessels and other soft structures that are difficult to see using conventional X-ray techniques.

Researchers have been using the method, called diceCT, to visualize the internal anatomy of dozens of different species at Duke’s Shared Materials Instrumentation Facility (SMIF).

There, the specimens are stained with an iodine solution that helps soft tissues absorb X-rays, then placed in a micro-CT scanner, which takes thousands of X-ray images from different angles while the specimen spins around. A computer then stitches the scans into digital cross sections and stacks them, like slices of bread, to create a virtual 3-D model that can be rotated, dissected and measured as if by hand.
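The “slices of bread” step can be sketched in a few lines. This is a toy illustration with synthetic intensity values, not SMIF’s reconstruction software: pile the scanner’s 2-D cross sections into a 3-D array, then cut a section along a different axis to get a view no single X-ray image captured directly.

```python
# Stack 2-D cross sections into a 3-D volume, then re-slice it along a
# different axis -- the digital analogue of rotating and dissecting the
# specimen. (Tiny synthetic data; real micro-CT stacks are thousands of
# high-resolution images.)

depth, height, width = 4, 3, 3

# Fake scanner output: one 2-D cross section (height x width) per depth
# step, with intensity encoding which slice and row a voxel came from.
slices = [[[z * 10 + r for _ in range(width)] for r in range(height)]
          for z in range(depth)]

volume = slices  # stacking: volume[z][row][col] indexes the full 3-D model

# Cut a vertical section (fix one column) that the scanner never
# captured as a single image.
col = 1
vertical_section = [[volume[z][r][col] for r in range(height)]
                    for z in range(depth)]

print(vertical_section)
# [[0, 1, 2], [10, 11, 12], [20, 21, 22], [30, 31, 32]]
```

Once the slices live in one array, any plane through the specimen is just a different way of indexing it, which is what makes the virtual model so easy to rotate, dissect and measure.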

Here’s a look at some of the images they’ve taken:

See-through shrimp

If you get flushed after a workout, you’re not alone — the Caribbean anemone shrimp does too.

Recent Duke Ph.D. Laura Bagge was scuba diving off the coast of Belize when she noticed the transparent shrimp Ancylomenes pedersoni turn from clear to cloudy after rapidly flipping its tail.

To find out why exercise changes the shrimp’s complexion, Bagge, Duke professor Sönke Johnsen, and colleagues used diceCT to compare the animals’ internal anatomy before and after physical exertion.

In the shrimp cross sections in this video, blood vessels are colored blue-green, and muscle is orange-red. The researchers found that more blood flowed to the tail after exercise, presumably to deliver more oxygen-rich blood to working muscles. The increased blood flow between muscle fibers causes light to scatter or bounce in different directions, which is why the normally see-through shrimp lose their transparency.

Peer inside the leg of a mouse

Duke cardiologist Christopher Kontos, M.D., and MD/PhD student Hasan Abbas have been using the technique to visualize the inside of a mouse’s leg.

The researchers hope the images will shed light on changes in blood vessels in people, particularly those with peripheral artery disease, in which plaque buildup in the arteries reduces blood flow to the extremities such as the legs and feet.

The micro-CT scanner at Duke’s Shared Materials Instrumentation Facility made it possible for Abbas and Kontos to see structures as small as 13 microns, or a fraction of the width of a human hair, including muscle fibers and even small arteries and veins in 3-D.

Take a tour through a tree shrew

DiceCT imaging allows Heather Kristjanson at the Johns Hopkins School of Medicine to digitally dissect the chewing muscles of animals such as this tree shrew, a small mammal from Southeast Asia that looks like a cross between a mouse and a squirrel. By virtually zooming in and measuring muscle volume and the length of muscle fibers, she hopes to gauge how strong those muscles are. Studying such clues in modern mammals helps Kristjanson and colleagues reconstruct similar features in the earliest primates that lived millions of years ago.

Try it for yourself

Students and instructors who are interested in trying the technique in their research are eligible to apply for vouchers to cover SMIF fees. People at Duke University and elsewhere are encouraged to apply. For more information visit https://smif.pratt.duke.edu/Funding_Opportunities, or contact Dr. Mark Walters, Director of SMIF, via email at mark.walters@duke.edu.

Located on Duke’s West Campus in the Fitzpatrick Building, the SMIF is a shared use facility available to Duke researchers and educators as well as external users from other universities, government laboratories or industry through a partnership called the Research Triangle Nanotechnology Network. For more info visit http://smif.pratt.duke.edu/.

Post by Robin Smith, News and Communications

Farewell, Electrons: Future Electronics May Ride on New Three-in-One Particle

“Trion” may sound like the name of one of the theoretical particles blamed for mucking up operations aboard the Starship Enterprise.

But believe it or not, trions are real — and they may soon play a key role in electronic devices. Duke researchers have for the first time pinned down some of the behaviors of these one-of-a-kind particles, a first step towards putting them to work in electronics.

Three-in-one particles called trions — carrying charge, energy and spin — zoom through special polymer-wrapped carbon nanotubes at room temperature. Credit: Yusong Bai.

Trions are what scientists call “quasiparticles,” bundles of energy, electric charge and spin that zoom around inside semiconductors.

“Trions display unique properties that you won’t be able to find in conventional particles like electrons, holes (positive charges) and excitons (electron-hole pairs that are formed when light interacts with certain materials),” said Yusong Bai, a postdoctoral scholar in the chemistry department at Duke. “Because of their unique properties, trions could be used in new electronics such as photovoltaics, photodetectors, or in spintronics.”

Usually these properties – energy, charge and spin – are carried by separate particles. For example, excitons carry the light energy that powers solar cells, and electrons or holes carry the electric charge that drives electronic devices. But trions are essentially three-in-one particles, combining these elements together into a single entity – hence the “tri” in trion.

A trion is born when a particle called a polaron (top) marries an exciton (middle). Credit: Yusong Bai.

“A trion is this hybrid that involves a charge marrying an exciton to become a uniquely distinct particle,” said Michael Therien, the William R. Kenan, Jr. Professor of Chemistry at Duke. “And the reason why people are excited about trions is because they are a new way to manipulate spin, charge, and the energy of absorbed light, all simultaneously.”

Until recently, scientists hadn't given trions much attention because they could only be found in semiconductors at extremely low temperatures – around 2 Kelvin, or -271 degrees Celsius. A few years ago, researchers observed trions in carbon nanotubes at room temperature, opening up the potential to use them in real electronic devices.

Bai used a laser probing technique to study how trions behave in carefully engineered and highly uniform carbon nanotubes. He examined basic properties including how they are formed, how fast they move and how long they live.

He was surprised to find that under certain conditions, these unusual particles were actually quite easy to create and control.

“We found these particles are very stable in materials like carbon nanotubes, which can be used in a new generation of electronics,” Bai said. “This study is the first step in understanding how we might take advantage of their unique properties.”

The team published their results Jan. 8 in the Proceedings of the National Academy of Sciences.

“Dynamics of charged excitons in electronically and morphologically homogeneous single-walled carbon nanotubes,” Yusong Bai, Jean-Hubert Olivier, George Bullard, Chaoren Liu and Michael J. Therien. Proceedings of the National Academy of Sciences, Jan. 8, 2018 (online). DOI: 10.1073/pnas.1712971115

Post by Kara Manke