Friday, November 15, 2024

Teens’ Tech Reduces Drowning and Fights Air Pollution

Drowning is the third leading cause of accidental deaths globally, according to the World Health Organization. The deaths disproportionately impact low- and middle-income communities, whose beaches tend to lack lifeguards because of limited funds. Of the 120 drownings reported in the United States last year, 104 occurred on unstaffed beaches.

That fueled Angelina Kim’s drone project. Kim is a senior at the Bishop’s School in La Jolla, Calif. Her Autonomous Unmanned Aerial Vehicle (UAV) System for Ocean Hazard Recognition and Rescue: Scout and Rescue UAV Prototype project was showcased in May at Regeneron’s International Science and Engineering Fair (ISEF) in Los Angeles.

Kim’s project took first place in this year’s IEEE Presidents’ Scholarship competition: a prize of US $10,000, payable over four years of undergraduate university study. The IEEE Foundation established the award to acknowledge a deserving student whose project demonstrates an understanding of electrical or electronics engineering, computer science, or another IEEE field of interest. The scholarship is administered by IEEE Educational Activities.

Kim has long been motivated to help those in need, especially after her mother’s illness when Kim was a young child.

“I’ve been determined to find ways to create a safer community using technology my whole life,” she says. “I realized as I got more into robotics that technology can be used to protect those in my community.”

Kim and the second- and third-place scholarship winners also received a complimentary IEEE student membership.

In addition, to mark the scholarship’s quarter century, IEEE established a 25th anniversary award to honor a project that is likely to make a difference. It also was given out at ISEF.

Drones that can save swimmers’ lives

Angelina Kim’s autonomous drone system, consisting of a scout drone and a rescue drone, aims to prevent drownings on beaches that have no lifeguards. Lynn Bowlby

The autonomous UAV lifeguard system consists of two types of drones: a scout craft and a rescue craft. The scout drone surveys approximately 1 kilometer of shoreline, taking photographs and analyzing them for rip currents, which can be deadly to swimmers.

The scout drone “implements a new version of differential frame displacement and a new depth-risk model,” Kim says. The displacement compares the previous image to the next one and notes in what direction a wave is moving. The algorithm detects rip currents and, using the depth-risk model, focuses on strong currents in the deep end—which are more dangerous. If the scout drone detects a swimmer caught in such a current, it summons the rescue drone. That drone drops a flotation device outfitted with a heaving rope and can pull the endangered swimmer to shore, Kim says.
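Kim has not published the details of her algorithm, but the core idea of differential frame displacement, comparing consecutive frames to estimate where the water is moving, can be sketched with simple block matching. The function and parameters below are illustrative assumptions, not Kim’s implementation:

```python
import numpy as np

def block_displacement(prev, curr, block=16, search=4):
    """Estimate per-block motion between two grayscale frames.

    For each block in `prev`, find the offset within +/-`search` pixels
    in `curr` that minimizes the sum of absolute differences (SAD).
    Returns an array of (dy, dx) motion vectors, one per block; a field
    of vectors pointing steadily seaward would flag a possible rip current.
    """
    h, w = prev.shape
    vectors = []
    for y in range(search, h - block - search, block):
        for x in range(search, w - block - search, block):
            ref = prev[y:y + block, x:x + block].astype(float)
            best, best_off = float("inf"), (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = curr[y + dy:y + dy + block,
                                x + dx:x + dx + block].astype(float)
                    sad = np.abs(ref - cand).sum()
                    if sad < best:
                        best, best_off = sad, (dy, dx)
            vectors.append(best_off)
    return np.array(vectors)
```

A production system would use a proper optical-flow estimator, but the displacement field this produces is the raw signal a rip-current detector would reason over.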

The rescue drones operate with roll-and-pitch tilt rotors, allowing them to fly in whatever direction and orientation they need to.

Kim says she was shocked and ecstatic to receive the award: “It felt like a dream come true because IEEE has been the organization I’ve always wanted to be a part of.”

She presented her project at two IEEE gatherings this year: the IEEE International Conference on Control and Automation and the IEEE International Conference on Automatic Control and Intelligent Systems.

Kim says some of the roadblocks she encountered with her project have made her a better engineer.

“If everything goes perfectly, then there will not be much to learn,” she says. “But if you fail—like accidentally flying your drone into your parents’ house, like I did—you get opportunities to learn.”

She plans to study electrical or mechanical engineering in college. Eventually, she says, she would like to build and mass-produce technologies that help communities at a low cost.

Spotting carbon dioxide in soil

Sahiti Bulusu’s Carboflux device measures and tracks carbon levels in soil to collect data for climatologists. Lynn Bowlby

Excessive carbon dioxide is a leading cause of global warming and climate change, according to NASA. Burning fossil fuels is a prominent source of carbon dioxide, but it also can be released from the ground.

“People don’t account for the large amounts of CO2 that come from soil,” second-place winner Sahiti Bulusu says. “Carbon flux is the rate at which CO2 is exchanged between the soil and the atmosphere, and it can account for 80 percent of net ecosystem carbon exchange.”

Bulusu is a senior at the Basis Independent high school, in Fremont, Calif.

Ecosystem carbon exchange is the transfer of carbon dioxide between the atmosphere and the physical environment. Because the exchange is seldom accounted for, there’s a gap in the data. To help fill it, Bulusu created the Carboflux Network.

The sensor node is designed to measure soil CO2 flux and contribute data to the Global Ecosystem Monitoring Network. The system contains three main subunits: a flux chamber, an under-soil sensor array, and a microcontroller with a data-transmission unit. The enclosed flux chamber houses sensors; as CO2 accumulates in the chamber, the linear increase in its concentration is used to calculate carbon flux. The second subunit uses the gradient method, which measures flux with Fick’s law: it takes the CO2 concentration gradient in the soil and multiplies it by the soil’s diffusivity coefficient. Sensors are placed in the ground at different depths to measure the gradient.

“Carboflux uses the gas concentration between the depths to predict what the soil carbon flux would be,” Bulusu says.
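The two measurement principles Bulusu describes, the chamber’s linear CO2 rise and the gradient method built on Fick’s first law (F = -Ds * dC/dz), can be sketched numerically. The units and parameter values below are illustrative assumptions, not Carboflux’s actual calibration:

```python
import numpy as np

def chamber_flux(times_s, co2_ppm, volume_m3, area_m2):
    """Chamber method: fit the linear CO2 rise over time and scale by
    the chamber's volume-to-area ratio (full unit conversion to
    mol m^-2 s^-1 omitted for brevity)."""
    slope = np.polyfit(times_s, co2_ppm, 1)[0]  # d[CO2]/dt, ppm per second
    return slope * volume_m3 / area_m2

def gradient_flux(c_shallow, c_deep, dz_m, diffusivity):
    """Gradient method (Fick's first law): F = -Ds * dC/dz, with the
    gradient taken between two sensor depths."""
    return -diffusivity * (c_shallow - c_deep) / dz_m
```

Note the sign convention: because CO2 concentration is usually higher deeper in the soil, the gradient is negative with depth, so the computed flux is positive, i.e., directed upward out of the soil.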

The system is automated and provides real-time data through microcontrollers and a modem. The device studies soil flux, both above and beneath the ground, she says.

“It is able to be scaled locally and globally,” she says, “helping to pinpoint local carbon sources and carbon sinks.” The data also can give scientists a large-scale global perspective.

Bulusu came up with her idea with help from mentors and professors Helen Dahlke and Elad Levintal.

“They told me about the lack of global carbon flux data, and the idea for a network of CO2 flux sensors,” Bulusu says. “This data is needed for climate modeling and mitigation strategies.

“I didn’t think I would be so passionate about a project. I love science, but I never thought I would be someone who would sit in the rain and cold for hours trying to figure out a problem. I was so invested in this project, and it has come so far.”

Bulusu plans to pursue a degree in computer science or environmental science. Whatever field she chooses, she says, she wants to improve the environment with the technology she creates.

She was awarded a $600 scholarship for her project.

Accessible communication for ALS patients

Gaze Link, developed by Xiangzhou Sun, uses a mobile app, camera, and artificial intelligence to help those with ALS communicate through their eye movements. Lynn Bowlby

Xiangzhou “Jonas” Sun volunteers to help people with amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease. After he and his family spent time helping ALS patients and assisting caretakers in Hong Kong and the United States, he was inspired to create a mobile app for them. He is a senior at the Webb School of California, in Claremont.

While volunteering, he noticed that ALS patients had trouble communicating because of the disease.

“ALS damages neurons in the body, and patients gradually lose the ability to walk, move their hands, and speak,” he says. “My objective with Gaze Link was to build a mobile application that could help ALS patients to type sentences with only their eyes and without external assistance.”

The low-cost smartphone app uses eye-gesture recognition, AI sentence generation, and text-to-speech to let people with ALS-related disabilities communicate through their phone’s front-facing camera.

Gaze Link now works with three languages: English, Mandarin, and Spanish.

Users’ eye gestures are mapped to words and pronunciations, and an AI next-word-prediction feature is incorporated.
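The article doesn’t specify Gaze Link’s gesture encoding or its prediction model, but the two ideas can be illustrated with a toy sketch: a hypothetical two-gesture code mapping eye movements to symbols, and a simple bigram model standing in for the app’s AI next-word prediction:

```python
from collections import defaultdict

# Hypothetical two-gesture code; Gaze Link's real encoding is not
# described in the article.
GESTURE_CODES = {
    ("up", "left"): "h", ("up", "right"): "i",
    ("down", "left"): " ", ("down", "right"): "e",
}

def decode(gestures):
    """Turn a flat list of eye gestures into text, two gestures per symbol."""
    pairs = zip(gestures[::2], gestures[1::2])
    return "".join(GESTURE_CODES.get(p, "?") for p in pairs)

class NextWordPredictor:
    """Toy bigram model: suggest the word most often seen after `word`."""
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, text):
        words = text.lower().split()
        for a, b in zip(words, words[1:]):
            self.counts[a][b] += 1

    def suggest(self, word):
        options = self.counts.get(word.lower())
        if not options:
            return None
        return max(options, key=options.get)
```

Even this crude predictor shows why next-word suggestion matters for an eye-typing interface: every accepted suggestion saves the user several deliberate eye gestures.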

Sun’s passion for his project is palpable.

“My favorite moment was when I brought a prototype of Gaze Link to a caretaker from the ALS Association, and they were overjoyed,” he says. “That moment was really emotional for me because I was motivated by these patients. It gave me a huge sense of achievement that I could finally help them.”

Sun received a $400 scholarship for placing third.

He says he plans to use his engineering know-how to help people with disabilities. He is passionate about design and visual arts as well, and says he hopes to combine them with his engineering skills.

Gaze Link is available through Google Play.

An inclusive way to code

Abhisek Shah’s AuralStudio provides a way for programmers with visual impairments to code using the Rattle language he wrote, which can be read aloud. Lynn Bowlby

The 25th anniversary award, in the amount of $1,000, was given to Abhisek Shah, a senior at Green Level High School, in Cary, N.C.

Last year Shah had a temporary vision issue, which made looking at a computer screen nearly impossible, even for a short time. After that, while visiting family back home in India, he got a chance to interact with some students at a school for blind girls.

During his interactions with the students, he says, he realized they were determined and driven and wanted to be financially independent. None had even considered a career in computer programming, however, believing it to be off-limits because of their visual impairment, he says.

He wondered if there was a solution that could help blind people code.

“How can we redesign and rethink the way we code today?” he asked himself before devising his AuralStudio. He realized he could put his passion for coding to good use and created a software application for those whose vision impairments are permanent.

His AuralStudio development environment allows programmers with a visual disability to write, build, run, and test prototypes.

“All this was built toward helping those with disabilities learn to code,” Shah says.

It eliminates the need for a keyboard and mouse in favor of a custom control pad. It includes a voice-only option for those who cannot use their hands.

The testbed uses Rattle, a programming language Shah created to be read aloud, by both the computer and the programmer. AuralStudio also uses acyclic digraphs to render code, making it easier and more intuitive to navigate. Shah wrote a two-part autocorrect algorithm that integrates AI into the application to prevent homophones and homonyms from causing errors.
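Shah’s two-part algorithm isn’t described in detail. One plausible sketch of homophone correction in a spoken-code setting is: first enumerate homophone candidates for a heard token, then prefer the candidate that is valid in the programming context. The homophone table and the Rattle-like keyword set below are hypothetical:

```python
# Part 1: candidate generation from a homophone table (hypothetical).
HOMOPHONES = {
    "four": ["four", "for", "fore"],
    "wile": ["wile", "while"],
    "brake": ["brake", "break"],
}

# Part 2: context check against a hypothetical keyword set and any
# identifiers already declared in the program.
KEYWORDS = {"for", "while", "break", "if", "return"}

def correct(token, known_identifiers=frozenset()):
    """Replace a heard token with the homophone that makes sense in code."""
    candidates = HOMOPHONES.get(token, [token])
    for cand in candidates:
        if cand in KEYWORDS or cand in known_identifiers:
            return cand
    return token
```

So a dictation of “four eye in range…” would resolve “four” to the keyword `for`, while a variable actually named `brake` would survive correction because it appears among the known identifiers.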

Shah used the programming languages Rust, C, JavaScript, and Python. Raspberry Pi was the primary hardware.

He worked with visually impaired students from the Governor Morehead School of Raleigh, N.C., teaching them programming basics.

“The students were so eager to learn,” he says. “Getting to see their eagerness, grit, and hard-working nature was so heartwarming and probably the best part of this project.”

Shah says he plans to study computer science and business.

“Anything I build will have the end goal of improving the lives of others,” he says.
