Duke Research Blog

Following the people and events that make up the research community at Duke.

Category: Engineering

Meet Dr. Sandra K. Johnson, Engineering “Hidden Figure”

When Dr. Sandra K. Johnson first tried her hand at electrical engineering during a summer institute in high school, she knew she was born to be an electrical engineer. Now, as the first African-American woman to receive a Ph.D. in computer engineering in the United States, Johnson visited Duke to share her story as a “hidden figure” and to inspire not just black women but all students not to be discouraged by the obstacles they may face in pursuit of their passions.

Though she discussed her achievements, Johnson’s talk made clear that it was opposition, more than success, that motivated her to persevere in electrical engineering. While pursuing a master’s degree at Stanford, she met Dr. William Shockley, who in his free time was conducting research he believed would prove that African Americans were intellectually inferior to other races. Johnson had originally planned to finish her program with a master’s and go into the workforce, but after hearing what Shockley was trying to prove, she decided to show him that she was capable of doing anything the non-black students in her program could do. She finished with a Ph.D. in electrical engineering, and she kept making the same declaration to anyone who doubted her: “before I leave this place, I will make a believer out of you.”

Dr. Johnson is the founder, CTO and CEO of Global Mobile Finance, Inc., a finance and tech startup based in Research Triangle Park, NC. Photo from BlackComputeHER.

While mapping out her own path, Johnson also firmly believed in making the way easier for other black people pursuing advanced degrees. Asked what the current generation of students could do to help themselves, she said to find mentors and to mentor others. She shared an anecdote about sitting in a lab at Stanford, waiting to begin an experiment, when a man walked up and told her she was in the wrong place. After several minutes of conversation in which she showed she knew even more about the subject than he did, she told him that the next time someone who looked like her walked into the lab, he shouldn’t be so sure of himself.

Johnson went on to become an IBM Fellow, an IEEE Fellow, and a member of the prestigious Academy of Electrical Engineers. At the end of her talk, she discussed what she believes is the best way to expedite change: having people of color as founders and CEOs of major corporations with the power to increase minority representation in their workforces. This is what she intends to do with her own company, Global Mobile Finance, Inc. If her current track record is any indication, there is no doubt her company will become a major corporation in the years to come, opening more doors for black women and other minorities pursuing their passions.

Post by Victoria Priester

Cracking the Code on Credit Cards at Datathon 2018

Anyone who has ever tried to formulate and answer their own research question knows that it means entering uncharted waters. This past weekend the hundreds of students in Duke Datathon 2018 did just that, using only their computer science prowess and a splash of innovation.

Here’s how it worked: the students were provided three data sets by Credit Sesame, a free credit score estimator, and given eight hours to use their insight and computer science knowledge to interpret the data and create as much value for the company as they could. Along the way, Duke Undergraduate Machine Learning (DUML), the organization hosting the event, provided mentors and workshops to help the participants find direction and achieve their goals. 

Datathon participants attempting to derive meaning from the Credit Sesame Data

This year was the first such ‘Datathon’ event to take place at Duke. The event attracted big-name sponsors such as Google and Pinterest and was made possible by the DUML executive team, headed by co-presidents Rohith Kuditipudi and Shrey Gupta (to see a full list of event sponsors, click here).

DUML faculty advisor Dr. Rebecca Steorts said that even the planning of the event transcended disciplines: one of her undergraduate students, DUML co-president Shrey Gupta, used statistics to predict how many people would attend. “It’s all about finding computational ways of combining disciplines to solve the problem,” Steorts said, and it’s apparent that her students have taken this to heart.

The winning team (Jie Cai, Catie Grasse, Feroze Mohideen) presenting on how they can best gauge which customers are most “valuable” to Credit Sesame

After more than an hour of deliberations, the eight top teams were selected and five finalists were asked to present their findings to the judges. The winning team (Jie Cai, Catie Grasse, Feroze Mohideen) proposed a way to gauge which customers who create trial accounts are most likely to be profitable, by using a computer filtering program to predict likely customer engagement based on customer-supplied data and their interaction with the free trial. Other top teams discussed similar topics with different variations on how Credit Sesame might best create this profile to determine who the “valuable” customers are likely to be.
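The winning team’s exact model isn’t described in the post, but a “filtering program” of this kind is often a simple learned classifier. Below is a minimal, purely illustrative sketch: a logistic-regression scorer trained on made-up trial-usage features (days active, pages viewed). Every feature, number, and name here is invented for illustration, not taken from the team’s work.

```python
import math

def sigmoid(z):
    if z < -60:  # avoid math.exp overflow on very negative inputs
        return 0.0
    return 1.0 / (1.0 + math.exp(-z))

# Toy training data: [days_active_in_trial, pages_viewed], label 1 = became a paying customer
X = [[1, 2], [0, 1], [6, 14], [7, 20], [2, 3], [5, 11]]
y = [0, 0, 1, 1, 0, 1]

w, b, lr = [0.0, 0.0], 0.0, 0.1

# Stochastic gradient descent on the logistic loss
for _ in range(2000):
    for xi, yi in zip(X, y):
        p = sigmoid(w[0] * xi[0] + w[1] * xi[1] + b)
        err = p - yi
        w[0] -= lr * err * xi[0]
        w[1] -= lr * err * xi[1]
        b -= lr * err

def engagement_score(days_active, pages_viewed):
    """Estimated probability that a trial user becomes 'valuable'."""
    return sigmoid(w[0] * days_active + w[1] * pages_viewed + b)

# A highly engaged trial user should score above a barely engaged one
print(engagement_score(6, 15) > engagement_score(1, 2))
```

The point is the shape of the approach, not the specifics: customer-supplied data and free-trial interactions become features, and the model ranks which trial accounts are most likely to be profitable.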

DUML hosts other events throughout the year to engage students such as their MLBytes Speaker Series and ECE Seminar Series. To learn more about Duke Undergraduate Machine Learning, click here.

by Rebecca Williamson


Drug Homing Method Helps Rethink Parkinson’s

The brain is the body’s most complex organ, and consequently the least understood. In fact, researchers like Michael Tadross, MD, PhD, wonder if the current research methods employed by neuroscientists are telling us as much as we think.

Michael Tadross is using novel approaches to tease out the causes of neuropsychiatric diseases at a cellular level.

Current methods such as gene editing and pharmacology can reveal how certain genes and drugs affect the cells in a given area of the brain, but they are limited in that they don’t account for differences among cell types. With his research, Tadross has tried to target specific cell types to better understand the mechanisms that cause neuropsychiatric disorders.

To do this, Tadross developed a method to ensure a drug injected into a region of the brain will only affect specific cell types. Tadross genetically engineered the cell type of interest so that a special receptor protein, called HaloTag, is expressed at the cell membrane. Additionally, the drug of interest is altered so that it is tethered to the molecule that binds the HaloTag receptor. By connecting the drug to the HaloTag ligand, and engineering only the cell type of interest to express the HaloTag receptor, Tadross effectively limited the cells affected by the drug to just one type. He calls this method “Drugs Acutely Restricted by Tethering,” or DART.

Tadross has been using the DART method to better understand the mechanisms underlying Parkinson’s disease. Parkinson’s is a neurological disease that affects a region of the brain called the striatum, causing tremors, slow movement, and rigid muscles, among other motor deficits.

Only cells expressing the HaloTag receptor can bind to the AMPA-repressing drug, ensuring virtually perfect cell-type specificity.

Patients with Parkinson’s show decreased levels of the neurotransmitter dopamine in the striatum. Consequently, treatments that involve restoring dopamine levels improve symptoms. For these reasons, Parkinson’s has long been regarded as a disease caused by a deficit in dopamine.

With his technique, Tadross is challenging this assumption. In addition to the death of dopaminergic neurons, Parkinson’s is associated with an increase in the strength of synapses, or connections, between neurons that express AMPA receptors, the most common excitatory receptors in the brain.

In order to simulate the effects of Parkinson’s, Tadross and his team induced the death of dopaminergic neurons in the striatum of mice. As expected, the mice displayed significant motor impairments consistent with Parkinson’s. However, in addition to inducing the death of these neurons, Tadross engineered the AMPA-expressing cells to produce the HaloTag protein.

Tadross then treated the striatum of these mice with a common AMPA receptor blocker tethered to the HaloTag ligand. Amazingly, blocking the activity of the AMPA-expressing neurons, even in the absence of the dopaminergic neurons, reversed the effects of Parkinson’s so that the previously affected mice moved normally.

Tadross’s findings with the Parkinson’s mice exemplify how little we know about cause and effect in the brain. The key to designing effective treatments for neuropsychiatric diseases, and possibly other diseases outside the nervous system, may lie in teasing out the relationship of specific cell types to symptoms and targeting the disease that way.

The ingenious work of researchers like Tadross will undoubtedly help bring us closer to understanding how the brain truly works.

Post by undergraduate blogger Sarah Haurin


Heating Up the Summer, 3D Style

While some students like to spend their summer recovering from a long year of school work, others are working diligently in the Innovation Co-Lab in the Telcom building on West Campus.

They’re working on the impacts of dust and particulate matter (PM) pollution on solar panel performance, and discovering new technologies that map out the 3D volume of the ocean.

The Co-Lab is one of three 3D printing labs located on campus. It allows students and faculty the opportunity to creatively explore research through the use of new and emerging technologies.

Third-year Ph.D. candidate Michael Valerino said his long-term research project focuses on how dust and air pollution impact the performance of solar panels.

“I’ve been designing a low-cost prototype which will monitor the impact of dust and air pollution on solar panels,” said Valerino. “The device is going to be used to monitor the impacts of dust and particulate matter (PM) pollution on solar panel performance. This process is known as soiling. This is going to be a low-cost alternative (~$200) to other monitoring options that are at least $5,000.”
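The post doesn’t describe how Valerino’s device works internally, but the quantity a soiling monitor typically tracks is simple: the output of a dust-exposed panel divided by the output of a matched clean reference. A hypothetical sketch, with all readings invented:

```python
def soiling_ratio(measured_power_w, clean_reference_power_w):
    """Fraction of expected clean-panel output actually produced.

    1.0 means no soiling loss; 0.8 means a fifth of the output is lost
    to dust. The clean reference is an identical, regularly cleaned
    panel under the same irradiance (a common low-cost comparison scheme).
    """
    return measured_power_w / clean_reference_power_w

# Daily peak readings (watts) from a hypothetical soiled/clean panel pair
soiled = [180, 176, 171, 165, 158]
clean = [200, 199, 201, 200, 198]

ratios = [soiling_ratio(s, c) for s, c in zip(soiled, clean)]
daily_loss_pct = [(1 - r) * 100 for r in ratios]
print([round(p, 1) for p in daily_loss_pct])  # losses grow as dust accumulates
```

Logging this ratio day after day is what lets a ~$200 device quantify how quickly dust accumulates at a site and how much energy a cleaning would recover.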

Most of the 3D printers come with standard polylactic acid (PLA) material for printing. However, because his first prototype completely melted in India’s heat, Valerino decided to switch to a black carbon-fiber-infused nylon.

“It really is a good fit for what I want to do,” he said. “These low-cost prototypes will be deployed in China, India, and the Arabian Peninsula to study global soiling impacts.”

In a step-by-step process, he applied acid-free glue to the base plate that holds the carbon-fiber-infused nylon. He then placed the glass plate into the printer and closely examined how the thick carbon fiber holds his project together.

Michael Bergin, a professor of civil and environmental engineering at Duke, collaborated with the Indian Institute of Technology-Gandhinagar and the University of Wisconsin last summer on a study about soiling.

The study indicated that there was a decrease in solar energy as the panels became dirtier over time. The solar cells jumped 50 percent in efficiency after being cleaned for the first time in several weeks. Valerino’s device will be used to expand Bergin’s work.

As Valerino tackles his project, Duke student volunteers and high school interns are in another part of the Co-Lab developing technology to map the ocean floor.

The Blue Devil Ocean Engineering team will be competing in the Shell Ocean Discovery XPRIZE, a global technology competition challenging teams to advance deep-sea technologies for autonomous, fast and high-resolution ocean exploration. (Their mentor, Martin Brooke, was recently featured on Science Friday.)

The team is developing large, highly redundant carbon drones that are eight feet across. The drones will fly over the ocean and drop pods into the water that will sink to collect sonar data.

Tyler Bletsch, a professor of the practice in electrical and computer engineering, is working alongside the team. He describes the team as having the most creative approach in the competition.

“We have many parts of this working, but this summer is really when it needs to come together,” Bletsch said. “Last year, we made it through round one of the competition and secured $100,000 for the university. We’re now using that money for the final phase of the competition.”

The final phase of the competition is scheduled to be held fall 2018.

Though campus is slow this summer, the Innovation Co-Lab is keeping busy. You can keep up-to-date with their latest projects here.

Post by Alexis Owens

 

Artificial Intelligence Knows How You Feel

Ever wondered how Siri works? Afraid that super smart robots might take over the world soon?

On April 3rd, researchers from Duke, NCSU and UNC came together for Triangle Machine Learning Day to spark everyone’s curiosity about the complex field that is artificial intelligence. A.I. is an overarching term for smart technologies, ranging from self-driving cars to targeted advertising. We can arrive at artificial intelligence through what’s known as “machine learning.” Instead of explicitly programming a machine with the capabilities we want it to have, we make its code flexible so that it adapts based on the information it’s presented with. Its knowledge grows as we train it. In other words, we’re teaching a computer to learn.
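That idea, code that adapts from examples rather than being explicitly programmed, can be shown in a few lines. The toy perceptron below is a generic textbook illustration, not any system discussed at the event: the decision rule starts as all zeros and is learned entirely from the labeled examples.

```python
# Labeled examples: two clusters of 2D points, class +1 and class -1
data = [([2.0, 1.0], 1), ([1.5, 2.0], 1), ([-1.0, -1.5], -1), ([-2.0, -0.5], -1)]

w, b = [0.0, 0.0], 0.0
for _ in range(20):                            # a few passes over the training set
    for x, label in data:
        pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1
        if pred != label:                      # adjust weights only on mistakes
            w[0] += label * x[0]
            w[1] += label * x[1]
            b += label

def classify(x):
    """Apply the learned decision rule to a new point."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1

print(classify([2.5, 1.5]), classify([-1.5, -1.0]))  # prints: 1 -1
```

Nothing in the code says where the boundary between the classes lies; that knowledge comes entirely from training, which is the whole point of machine learning.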

Matthew Philips is working with Kitware to get computers to “see,” also known as “machine vision.” By providing thousands and thousands of images, a computer with the right coding can learn to actually make sense of what an image is beyond different colored pixels.

Machine vision has numerous applications. An effective way to search satellite imagery for arbitrary objects could be huge in the advancement of space technology – a satellite could potentially identify obscure objects or potential lifeforms that stick out in those images. This is something we as humans can’t do ourselves just because of the sheer amount of data there is to go through. Similarly, we could teach a machine to identify cancerous or malignant cells in an image, thus giving us a quick diagnosis if someone is at risk of developing a disease.

The problem is, how do you teach a computer to see? Machines don’t easily understand things like similarity, depth or orientation — things we as humans register automatically, without even thinking. That’s exactly the type of problem Kitware has been tackling.

One hugely successful piece of artificial intelligence you may be familiar with is IBM’s Watson. Labeled as “A.I. for professionals,” Watson was featured on 60 Minutes and even played Jeopardy on live television. Watson has visual recognition capabilities, can work as a translator, and can even understand things like tone, personality or emotional state. And obviously it can answer crazy hard questions. What’s even cooler is that it doesn’t matter how you ask the question – Watson will know what you mean. Watson is basically Siri on steroids, and the world got a taste of its power after watching it smoke its competitors on Jeopardy. However, Watson is not to be thought of as a physical supercomputer. It is a collection of technologies that can be used in many different ways, depending on how you train it. This is what makes Watson so astounding – through machine learning, its knowledge can adapt to the context it’s being used in.

Source: CBS News.

IBM has been able to develop such a powerful tool thanks to data. Stacy Joines from IBM noted, “Data has transformed every industry, profession, and domain.” From our smart phones to fitness devices, data is being collected about us as we speak (see: digital footprint). While it’s definitely pretty scary, the point is that a lot of data is out there. The more data you feed Watson, the smarter it is. IBM has utilized this abundance of data combined with machine learning to produce some of the most sophisticated AI out there.

Sure, it’s a little creepy how much data is being collected on us. Sure, there are tons of movies and theories out there about how intelligent robots will one day outsmart humans and take over. But A.I. isn’t a thing to be scared of. It’s a remarkable creation that surpasses the capabilities of even the most advanced purely programmed model. It’s joining the health care system to save lives, advising businesses, and could potentially find a new habitable planet. What we choose to do with A.I. is entirely up to us.

Post by Will Sheehan


Stretchable, Twistable Wires for Wearable Electronics

A new conductive “felt” carries electricity even when twisted, bent and stretched. Credit: Matthew Catenacci

The exercise-tracking power of a Fitbit may soon jump from your wrist and into your clothing.

Researchers are seeking to embed electronics such as fitness trackers and health monitors into our shirts, hats, and shoes. But no one wants stiff copper wires or silicon transistors deforming their clothing or poking into their skin.

Scientists in Benjamin Wiley’s lab at Duke have created a new conductive “felt” that can be easily patterned onto fabrics to create flexible wires. The felt, composed of silver-coated copper nanowires and silicone rubber, carries electricity even when bent, stretched and twisted, over and over again.

“We wanted to create wiring that is stretchable on the body,” said Matthew Catenacci, a graduate student in Wiley’s group.

The conductive felt is made of stacks of interwoven silver-coated copper nanowires filled with a stretchable silicone rubber (left). When stretched, felt made from more pliable rubber is more resilient to small tears and holes than felt made of stiffer rubber (middle). These tears can be seen as small cavities in the felt (right). Credit: Matthew Catenacci

To create a flexible wire, the team first sucks a solution of copper nanowires and water through a stencil, creating a stack of interwoven nanowires in the desired shape. The material is similar to the interwoven fibers that comprise fabric felt, but on a much smaller scale, said Wiley, an associate professor of chemistry at Duke.

“The way I think about the wires are like tiny sticks of uncooked spaghetti,” Wiley said. “The water passes through, and then you end up with this pile of sticks with a high porosity.”

The interwoven nanowires are heated to 300 F to melt the contacts together, and then silicone rubber is added to fill in the gaps between the wires.

To show the pliability of their new material, Catenacci patterned the nanowire felt into a variety of squiggly, snaking patterns. Stretching and twisting the wires up to 300 times did not degrade the conductivity.

The material maintains its conductivity when twisted and stretched. Credit: Matthew Catenacci

“On a larger scale you could take a whole shirt, put it over a vacuum filter, and with a stencil you could create whatever wire pattern you want,” Catenacci said. “After you add the silicone, you will just have a patch of fabric that is able to stretch.”

Their felt is not the first conductive material to display the agility of a gymnast; flexible wires made of silver microflakes also exhibit this unique set of properties. But the new material has the best performance of any material so far, and at a much lower cost.

“This material retains its conductivity after stretching better than any other material with this high of an initial conductivity. That is what separates it,” Wiley said.

“Stretchable Conductive Composites from Cu-Ag Nanowire Felt,” Matthew J. Catenacci, Christopher Reyes, Mutya A. Cruz and Benjamin J. Wiley. ACS Nano, March 14, 2018. DOI: 10.1021/acsnano.8b00887

Post by Kara Manke

How a Museum Became a Lab

Encountering and creating art may be some of mankind’s most complex experiences. Art, not just visual but also dancing and singing, requires the brain to understand an object or performance presented to it and then to associate it with memories, facts, and emotions.

A piece in Dario Robleto’s exhibit titled “The Heart’s Knowledge Will Decay” (2014)

In an ongoing experiment, Jose “Pepe” Contreras-Vidal and his team set up in artist Dario Robleto’s exhibit “The Boundary of Life Is Quietly Crossed” at the Menil Collection near downtown Houston. They then asked visitors if they were willing to have their trips through the museum and their brain activities recorded. Robleto’s work was displayed from August 16, 2014 to January 4, 2015. By engaging museum visitors, Contreras-Vidal and Robleto gathered brain activity data while also educating the public, combining research and outreach.

“We need to collect data in a more natural way, beyond the lab,” explained Contreras-Vidal, an engineering professor at the University of Houston, during a talk with Robleto sponsored by the Nasher Museum.

More than 3,000 people have participated in this experiment, and the number is growing.

To measure brain activity, the volunteers wear EEG caps which record the electrical impulses that the brain uses for communication. EEG caps are noninvasive because they are just pulled onto the head like swim caps. The caps allow the museum goers to move around freely so Contreras-Vidal can record their natural movements and interactions.

By watching individuals interact with art, Contreras-Vidal and his team can find patterns between their experiences and their brain activity. They also asked the volunteers to reflect on their visit, adding a first person perspective to the experiment. These three sources of data showed them what a young girl’s favorite painting was, how she moved and expressed her reaction to this painting, and how her brain activity reflected this opinion and reaction.

The volunteers can also watch the recordings of their brain signals, giving them an opportunity to ask questions and engage with the science community. For most participants, this is the first time they’ve seen recordings of their brain’s electrical signals. In one trip, these individuals learned about art, science, and how the two can interact. Throughout this entire process, every member of the audience forms a unique opinion and learns something about both the world and themselves as they interact with and make art.

Children with EEG caps explore art.

Contreras-Vidal is especially interested in the gestures people make when exposed to the various stimuli in a museum and hopes to apply this information to robotics. In the future, he wants someone with a robotic arm to not only be able to grab a cup but also to be able to caress it, grip it, or snatch it. For example, you probably can tell if your mom or your best friend is approaching you by their footsteps. Contreras-Vidal wants to restore this level of individuality to people who have prosthetics.

Contreras-Vidal thinks science can benefit art just as much as art can benefit science. Both he and Robleto hope that their research can reduce many artists’ distrust of science and help advance both fields through collaboration.

Post by Lydia Goff

Using Drones to Feed Billions

Drones revolutionizing farming

As our population continues its rapid growth, food is becoming increasingly scarce. By the year 2050, we will need to double our current food production to feed the estimated 9.6 billion people who will inhabit Earth.

Maggie Monast

Thankfully, introducing drones and other high-tech equipment to farmers could be the solution to keeping our bellies full.

Last week, Dr. Ramon G. Leon of North Carolina State University and Maggie Monast of the Environmental Defense Fund spoke at Duke’s monthly Science & Society Dialogue, sharing their knowledge of what’s known as “precision agriculture.” At its core, precision agriculture is integrating technology with farming in order to maximize production.

It is easy to see that farming has already changed as a result of precision agriculture. The old family-run plot of land with animals and diverse crops has turned into large-scale, single-crop operations. This transition was made possible through the use of new technologies — tractors, irrigation, synthetic fertilizer, GMOs, pesticides — and is no doubt way more productive.

Dr. Ramon G. Leon

So while the concept of precision agriculture certainly isn’t new, in today’s context it incorporates some particularly advanced and unexpected tools meant to further optimize yield while also conserving resources.

Drones equipped with special cameras and sensors, for example, can be flown over thousands of acres and gather huge amounts of data. This data produces a map of things like pest damage, crop stress and yield. One image from a drone can easily help a farmer monitor what’s going on: where to cut back on resources, what needs more attention, and where to grow a certain type of crop. Some drones can even plant and water crops for you.
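The talk didn’t specify the software behind these maps, but crop-stress maps are commonly derived from a vegetation index such as NDVI, computed per pixel from the near-infrared and red bands a drone’s multispectral camera records. A minimal sketch, with made-up pixel values:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel.

    Healthy vegetation reflects strongly in near-infrared and absorbs red,
    so values near 1 suggest vigorous crops while values near 0 suggest
    bare soil or stressed plants.
    """
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# A tiny 2x3 "image": each pixel is (near-infrared, red) reflectance
frame = [
    [(0.50, 0.08), (0.48, 0.10), (0.20, 0.18)],
    [(0.52, 0.07), (0.22, 0.20), (0.19, 0.17)],
]

ndvi_map = [[round(ndvi(nir, red), 2) for nir, red in row] for row in frame]
for row in ndvi_map:
    print(row)  # high values = healthy patches, low values = stressed patches
```

Stitched across thousands of drone images, a per-pixel index like this is what turns raw camera data into the at-a-glance field map the farmer actually uses.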

Blue River’s “See & Spray” focuses on cutting back herbicide use. Instead of spraying herbicide over an entire field and wasting most of it, this machine is trained to spray weeds directly, using 10% of the normal amount of herbicide.

Similarly, another machine called the Greenseeker can decide where, when and how much fertilizer should be applied based on the greenness of the crop. Fertilizing efficiently means saving money and emitting less ozone-depleting nitrous oxide.

As you can see, fancy toys like these are extremely beneficial, and there are more out there. They enable farmers to make faster, better decisions and understand their land on an unprecedented level. At the same time, farmers can cut back on their resource usage. This should eventually result in a huge productivity boom while helping out the environment. Nice.

One problem preventing these technologies from really taking off is teaching farmers how to take advantage of them. As Dr. Leon put it, “we have all these toys, but nobody knows how to play with them.” However, this issue can be resolved with enough time. Some older farmers love messing around with the drones, and the next generation of farmers will have more exposure to this kind of technology growing up. Sooner or later, it may be no big deal to spot drones circling above fields of wheat as you road trip through the countryside.

A Greenseeker mounted on a Boom Sprayer

Precision agriculture is fundamental to the modern agricultural revolution. It increases efficiency and reduces waste, and farming could even become a highly profitable business again as the cost for these technologies goes down. Is it the solution to our environmental and production problems? I guess we’ll know by 2050!

Post by Will Sheehan

Student Ingenuity vs. Environmental Issues (like Cow Farts)

Lots of creative and potentially life-changing ideas filled the Fitzpatrick CIEMAS atrium last weekend. From devices meant to address critical environmental issues such as global warming and lionfish invasiveness, to apps that help you become more sustainable, Duke’s Blueprint tech ideation conference showcased some awesome, good ol’ student-led ingenuity.

These bright students from around Durham (mostly from Duke) competed in teams to create something that would positively impact the environment. The projects were judged for applicability, daringness, and feasibility, among other things. During the Project Expo, all teams briefly presented to viewers like a school science fair.

One of the projects I liked a lot was called Entropy—a website with your own personal plant (I named mine “Pete”) that grows or dies depending on your sustainable actions throughout the day. The user answers simple yes or no questions, such as, “did you turn off the lights today?”

You can also complete daily goals to get accessories like a hat or mustache for your plant. The website connects to Facebook, so you can track your friends’ progress and see how green they’re living. Ultimately it’s just a good, fun way to keep your sustainability in check. Pete was looking super-cute after I spammed the yes button.
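Entropy’s internals weren’t shown, but the behavior described, a plant that grows or dies with your answers, amounts to a health score nudged by each response. A purely speculative sketch of what such logic might look like (all point values, thresholds, and names below are invented):

```python
class Plant:
    """Toy model of a virtual plant that thrives or wilts with your habits."""

    def __init__(self, name):
        self.name = name
        self.health = 50                  # 0 = dead, 100 = flourishing

    def answer(self, was_sustainable):
        # Each yes/no answer nudges health up or down, clamped to [0, 100]
        self.health += 5 if was_sustainable else -8
        self.health = max(0, min(100, self.health))

    @property
    def mood(self):
        if self.health == 0:
            return "dead"
        if self.health >= 70:
            return "flourishing"
        return "growing" if self.health >= 40 else "wilting"

pete = Plant("Pete")
for answered_yes in [True, True, True, True, True]:   # spamming the yes button
    pete.answer(answered_yes)
print(pete.health, pete.mood)  # prints: 75 flourishing
```

Whatever the real app does, the appeal is the same: a single friendly number, dressed up as a plant, that makes daily sustainability choices visible.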

Another interesting innovation posed a solution to the difficulty of catching lionfish. Humans are a lionfish’s only predator, and we hunt them by spearfishing. Since lionfish are highly invasive, catching them en masse could seriously benefit the biodiversity of the ocean (plus, they taste delicious). So one team came up with a canopy-like contraption that attracts lionfish to hang out underneath it, and then snatches them all up at once like a net. Pretty neat idea, and if it were implemented on a large scale it could be a huge benefit to the Earth’s oceans (and restaurants)!

After the expo, the top seven teams were selected and given three minutes to present to the judges and audience as a whole.

Every project was astounding. “Collide-o-scope” came up with a simple Arduino-based device to transmit elephant seismic activity to train drivers nearby in order to reduce the number of train-elephant collisions in India and Sri Lanka — currently a huge problem for both humans and the elephant population.

Another team, “Manatee Marker,” proposed a system of solar powered buoys to detect manatees, with the hope of reducing frequent manatee-boat accidents. Considering that manatees are quiet, basically camouflaged, and thermally invisible, this was quite an ingenious task.

Perhaps my favorite project, “Algenie,” stole the show. Methane gas is a huge contributor to global warming — around twenty-five times more potent as a heat-trapping gas than carbon dioxide — and a lot of it comes from cow farts. However, researchers have recently discovered that putting seaweed in cow feed lowers methane emissions almost entirely! So this team came up with a vertical, three-dimensional way to grow algae — as opposed to growing it “two-dimensionally” across a pond — that would maximize production. Global warming is obviously a massive issue right now, and Algenie is looking to change that. They ended up taking first place, winning a prize of $1,000 along with GoPros for every team member.

Algenie’s prototype

At the end of the day, it wasn’t about the prize money. The competition was meant to generate creative and practical ideas while promoting making a difference. After attending the expo, I felt more aware of environmental issues and inspired to help out. Even if you don’t feel like spending the time drafting up a crazy manatee-detecting buoy system, you can still do your part by living sustainably day to day.

Blueprint has done an awesome job of spurring young, enthusiastic students towards helping this planet — one cow fart at a time.

Post by Will Sheehan. Pictures from Duke Conservation Tech.

“I Heart Tech Fair” Showcases Cutting-Edge VR and More

Duke’s tech game is stronger than you might think.

OIT held an “I Love Tech Fair” in the Technology Engagement Center / Co-Lab on Feb. 6 that was open to anyone to come check out things like 3D printers and augmented reality while munching on some Chick-fil-A and cookies. There was a raffle for some sweet prizes, too.

I got a full demonstration of the 3D printing process — it’s so easy! It requires some really expensive software called Fusion, but thankfully Duke is awesome and students can get it for free. You can make some killer stuff with 3D printing; the technology is so advanced now. I’ve seen all kinds of things: models of my friend’s head, a doorstop made out of someone’s name … one guy even made a working ukulele, apparently!

One of the cooler things at the fair was augmented reality books. These books look like ordinary picture books, but when you look at a page through your phone’s camera, the image suddenly comes to life in 3D with tons of detail and color, seemingly floating above the book! All you have to do is download an app and get the right book. Augmented reality is only getting better as time goes on and will soon be a primary tool in education and gaming, which is why the Duke Digital Initiative (DDI) wanted to show it off.

By far my favorite exhibit at the tech fair was virtual reality. Throw on a headset and some bulky goggles, grab a controller in each hand, and suddenly you’re in another world. The guy running the station, Mark McGill, had actually hand-built the machine that ran it all. Very impressive guy. He told me the machine is the most expensive and important part, since it accounts for how smooth the immersion is. The smoother the immersion, the more realistic the experience. And boy, was it smooth. A couple years ago I experienced virtual reality at my high school and thought it was cool (I did get a little nauseous), but after Mark set me up with the HTC Vive connected to his sophisticated machine, it blew me away (with no nausea, too).

I smiled the whole time playing “Super Hot,” where I killed incoming waves of people in slow motion with ninja stars, guns, and rocks. Mark had tons of other games too, all downloaded from Steam, for both entertainment and educational purposes. One called “Organon” lets you examine human anatomy inside and out, and you can even upload your own MRIs. There’s an unbelievable number of possibilities VR offers. You could conquer your fear of public speaking by being simulated in front of a crowd, or realistically tour “the VR Museum of Fine Art.” Games like these just aren’t the same on, say, an Xbox, because a console simply doesn’t have that key factor of feeling like you’re there. In Fallout 4, your heart pounds fast in your chest as you blast away Feral Ghouls and Super Mutants right in front of you. But in reality, you’re just standing in a green room with stupid-looking goggles on. Awesome!

There’s another place on campus — the Bolt VR in Edens residence hall — that also has a cutting-edge VR setup going. Mark explained to me that Duke wants people to get experience with VR, as it will soon be a huge part of our lives. Having exposure now could give Duke graduates a very valuable head start in their career (while also making Duke look good). Plus, it’s nice to have on campus for offering students a fun break from all the hard work we put in.

If you’re bummed you missed out, or even if you don’t “love tech,” I recommend checking out the Tech Fair next time — February 13, from 6-8pm. See you there.

Post by Will Sheehan
