A color photo showing a child working on a computer at the BOOM showcase

Darren Key ’25 was tired of listening to the same lofi hip-hop songs over and over again while studying. “I love lofi music,” he said. “My thought process was, if I create an infinite radio of lofi music, it would help me study better.”

So, the computer science major made his own endless stream of AI-generated lofi hip-hop music by converting songs into arrays of numbers and applying an approach similar to Stable Diffusion, an AI text-to-image generator. He admits the music isn’t quite good enough for studying – he’s still tweaking the AI – but the project was impressive enough to earn him an award at the Bits On Our Minds (BOOM) technology showcase, held April 27 in the Duffield Hall atrium.

After a three-year pause due to COVID-19, BOOM returned to campus for its 25th anniversary. The event started in 1998 as a small showcase for computer science students, but is now hosted by the Cornell Ann S. Bowers College of Computing and Information Science and features cutting-edge technology projects from across the university. Nearly 100 students presented 32 projects in an atrium packed with visitors from Cornell and the broader Ithaca community.

Faculty from each department within Cornell Bowers CIS and representatives from seven corporate sponsors judged the projects. Selected teams received a trophy, certificate, and $750 prize; Kavita Bala, dean of Cornell Bowers CIS, announced the awards.

“BOOM is a unique opportunity for students to demonstrate their technology projects to everyone from school kids to industry experts,” said Danica Rickards, program coordinator in Undergraduate Student Services, and BOOM committee chair. “They have a chance to really hone their elevator pitches and get input from corporate sponsors.” 

A color photo showing two people working on a laptop computer with another person looking on

The projects included Ithaca Hunt, an app to help new students explore off-campus activities with friends; Tree Folio, a city planning tool that converts remote sensing data collected by airplanes into a digital map of each tree and the shade it provides; and The Bookkeeper, which looks like a stack of books, but locks away mobile devices for a set time to help users focus.

Ithaca High School’s Code Red Robotics team also attended the event with their 4-foot-tall robot, which lifts traffic cones and other objects and maneuvers around to deposit them on shelves.

One popular project was Cosmic Swing, a computer game created by Francisco Kyriacou ’24, Ankit Lakkapragada ’25, Emily Hong ’23, Matthew Karwan ’24, and Joseph Tung ’24 for their CIS 3152 Intro to Game Development class. The team brought the game to BOOM to get feedback from users before putting on the finishing touches. The main character is a Martian who swings through spinning worlds using a rope-like appendage to collect resources for their dying home planet.

“The special aspect of our game is that our world is rotating, so the players have to be making fast decisions to grab onto the rotating platforms,” said Hong, a design and environmental analysis major.

Ana Suppé ’23, an environmental science and sustainability major, and Griffin Blotner ’24, an information science major, came to BOOM to present their initiative to increase e-waste recycling on campus. Currently, there is one e-waste drop box at Barton Hall, and the campus recycles more than 100 tons each year, but that number has remained stagnant for more than a decade.

They hope to create greater awareness of the issue and to work with Cornell Bowers CIS, Cornell Engineering, the campus sustainability office, and Student and Campus Life to expand e-waste recycling options. “We’ve been talking with Facilities Management and other groups to get the ball rolling,” Suppé said.

All BOOM projects and descriptions are listed on the BOOM website. The awarded projects were:

Sponsor Awards (sponsors listed first)

·  Air Liquide: Tree Folio  

·  Boeing and LinkedIn: Xenophobia Meter

·  EY: Volume 

·  Goldman Sachs: Using VR to study food consumer behavior 

·  Pepsi and Sandia National Labs: AI-Learners  

Faculty Awards (departments listed first)

·  Computer Science: LoFi Hip-Hop 

·  Information Science: Hack4Impact Earth Law Center Project 

·  Statistics and Data Science: Tree Folio 

By Patricia Waldron, a writer for the Cornell Ann S. Bowers College of Computing and Information Science.

Date Posted: 5/05/2023
A color photo of a woman smiling for a photo

The Yang-Tan WorkABILITY Incubator, recently launched through the ILR School’s Center for Applied Research on Work (CAROW), will support innovative applied research projects and collaborations that bring together two or more parts of the university to address important societal issues linked to work.

Funded through the generosity of K. Lisa Yang ’74, the incubator will provide support both to early stage projects and larger initiatives.

“Through applied research and collaboration across Cornell to create tools that will translate into equity and impact for individuals, CAROW and the Yang-Tan WorkABILITY Incubator will enable the ILR School to truly advance the world of work,” Yang said.

The incubator has already launched the Initiative on Home Care and Home Health Care Workers. It will also be the new home of the Criminal Justice and Employment Initiative. Both initiatives build a community of scholars and researchers across Cornell’s campuses.

“The Yang-Tan WorkABILITY Incubator provides CAROW with an engine through which to tackle the big, consequential challenges of our day in the areas of work, employment and labor,” said Ariel Avgar, Ph.D. ’08, the director of CAROW. “The two inaugural initiatives are a perfect case in point. Focusing on the working conditions of low-wage workers in health care and the equitable access to employment opportunities for justice-involved individuals builds on Cornell expertise with the goal of guiding action based on applied research.

“We owe a great debt to Lisa Yang’s vision and generosity, which have made this effort and approach possible,” Avgar said.

The Initiative on Home Care and Home Health Care Workers will be directed by Weill Cornell Medicine’s Dr. Madeline Sterling ’08. Nicola Dell, associate professor at the Jacobs Technion-Cornell Institute at Cornell Tech and in the Cornell Ann S. Bowers College of Computing and Information Science, will serve as director of technical innovation.

“This new initiative will drive rigorous interdisciplinary research on the link between working conditions, the home care workforce and the delivery of high-quality patient care with the goal of influencing practice and policy,” said Avgar, ILR’s senior associate dean for outreach and sponsored research.

Sterling is an expert on home care and its impact on the health of patients. Her research focuses on examining how home care services impact the delivery of care and novel ways to leverage the home care workforce to improve both worker and patient outcomes.

Dell studies human-computer interactions, computer security and privacy, and information and communication technologies and development. Dell’s health care work examines the potential for designing technologies that enhance equity for home care workers.

ILR’s Criminal Justice and Employment Initiative will receive funding from the incubator, in addition to its state funding. Directed by Timothy McNutt with Jodi Anderson serving as technical innovation director and Matt Saleh as research director, the initiative provides training on criminal records and employment law to job seekers who have been involved in the criminal legal system. The program also assists employers in developing fair chance hiring, engages in research to study reentry practices and works with policymakers and legislators on criminal justice reform.

McNutt has a background in criminal law, litigation and policy to improve employment opportunities for people with criminal records. He has interacted with hundreds of incarcerated and newly paroled people in the past five years to help them access and correct their criminal records, and get jobs. McNutt broadened the outreach through the incubator to include the Restorative Record Project, which helps job candidates create non-traditional résumés that highlight core competencies and micro-credentials.

Anderson, a Cornell Prison Education Program alumnus who earned a master’s degree from Stanford University, is the developer of Rézme, an app created to support justice-involved job candidates.

Saleh is a senior research associate at the Yang-Tan Institute on Employment and Disability at the ILR School. His research focuses on career pathways for youth with disabilities and on employment barriers such as justice involvement.

By Julie Greco, a senior communications specialist for the ILR School.

This story was originally published in the Cornell Chronicle.

Date Posted: 4/28/2023
A black and white drawing showing the overhead view of a city

In higher education, universities have long been viewed as pipelines, preparing students for productive careers in specific fields. But when it comes to understanding how students actually make their way through college, the “pipeline” imagery fails to capture the twists and turns real people often take along the way. 

A group of scholars led by René Kizilcec, assistant professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, is calling for a new, data-informed model for the study of academic progress that leverages the trove of student data colleges and universities already possess. They urge researchers and policymakers to replace the “pipeline” metaphor with “pathways,” an updated, data-centric approach that accounts for the complexity of university curriculums and students’ journeys through them, providing critical information for researchers, university administrators, and students alike. 

In a paper published in the journal Science on April 28, professors from nine universities – including Stanford, Columbia, Texas A&M, and the University of Pennsylvania – said this shift toward a more analytical approach would open the "black box" of college to help administrators design effective curriculums and guide the students who navigate them.

“We are building a new science of academic progress that leverages ubiquitous student and course data with computational methods to understand sequences of choices in higher education,” Kizilcec said. “This will enable new ways to understand the choices students make in college, and provide more transparency and advice to students as they make these fateful decisions. It also provides insights for administrators as they make structural and curriculum changes.”

The problem with pipelines

“In science, metaphors guide our understanding of a problem — they shape our approach to observing the world and the way we communicate our findings,” said co-author Mitchell Stevens, a professor of education at Stanford. “The pipeline metaphor has been useful for many years, but it has come to limit our understanding of how academic progress unfolds.” 

If students enter college with one major in mind, but then switch, the pipeline metaphor treats that kind of departure as a “leak,” or a loss, rather than an entry onto another route to graduation. What’s more, a pipeline metaphor suggests a lack of agency on students’ part, the authors say, when in reality, students are making decisions throughout their education.

A switch to pathways could also help pave the way for interventions that promote equity, the authors write. 

"Pathways science can help demystify the college experience and shed light on the consequences of students’ choices," Kizilcec said. "This can especially benefit students who are first-generation or low-income, who may not have input from people with significant college experience."  

Applying new analytical techniques

In conjunction with a new conceptual model, recent developments in computational science make it possible to analyze complex data on academic progress, the authors write. 

Currently, colleges and universities have an often untapped trove of student data – grades, demographics, classes that students take or drop, and how long it takes to graduate. This data can be used to understand how students are making choices in a complex system and how the curriculum's structure could be adapted to accommodate student preferences.

The authors call for building a shared analytical framework and infrastructure, including a system for standardizing data across institutions and open-source analytic tools that can be shared and applied across schools and university systems.

This work can also lead to better resources for advising students. One such tool is Pathways, a platform developed previously by Kizilcec’s group, the Future of Learning Lab, that helps students navigate a university’s curriculum and make more informed choices when selecting classes and majors.

“Understanding the consequences of academic choices – picking courses, declaring majors – can be difficult," Kizilcec said. "A new approach that leverages available data along with machine learning and other tools can increase transparency in academic environments to help students make well-informed choices and help universities design curriculums that keep up with the future of work.”

The paper’s other authors are: Rachel B. Baker from the University of Pennsylvania; Elizabeth Bruch from the University of Michigan at Ann Arbor; Kalena E. Cortes from Texas A&M University; Laura T. Hamilton from the University of California at Merced; David Nathan Lang from Western Governors University; Zachary A. Pardos from the University of California at Berkeley, and Marissa E. Thompson from Columbia University. 

Adapted from materials provided by the Stanford Graduate School of Education.

Date Posted: 4/27/2023
A color graphic with a purple background and white microchip lines

Cornell’s American Indian and Indigenous Studies Program and the Redistributive Computing Systems Group (RCSG) will present a series of talks this Friday exploring the intersection of Indigenous worldviews and computational technologies.

“Indigenous Computing” will be held 1:30 to 4:30 p.m. Friday, April 28, in Gates Hall 114, with a virtual attendance option via Zoom. The event includes talks from Indigenous people working in computer science, information science, and genetics. Registration is encouraged.

“Indigenous people need technologies designed with, by, and in support of our unique lived experiences, identities, knowledge, beliefs, and politics,” said Marina Johnson-Zafiris, a member of the Mohawk Nation and a doctoral student in the field of information science. “As we will see through the speakers, this materializes in different ways – through our interventions, as models, as apps, as protocols – each of them representing our own understandings of Native sovereignty and Indigenous futures.”

Johnson-Zafiris will present “Computing Along the Two Row,” a reference to the treaty between the Haudenosaunee and European colonial settlers recorded in a purple and white beaded belt called the Two Row wampum belt.

Western science has been incredibly successful by adopting a universalist approach – traditionally, good scientists check their identity at the door, said Christopher Csíkszentmihályi, associate professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science and RCSG director.

“While this works well for physics, it creates many problems when science rubs up against the human realm,” he said. “Applied technology like computation benefits when diverse perspectives and experiences are mindfully brought to bear in its creation and implementation.”

“One of the central motivations in the creation of this event is to highlight Native people in the field of computing, a space in which we have largely been invisibilized,” Johnson-Zafiris said. “Our work provides critical interventions that center principles of relationship and accountability, principles that must come to the forefront of computer and information science research.”

The full schedule is as follows:

1:30 p.m. – Welcome to “Indigenous Computing” from Troy Richardson (Saponi/Tuscarora), associate professor of philosophy of education and American Indian and Indigenous Studies at Cornell

1:45 p.m. – “Computing Along the Two Row” by Marina Johnson-Zafiris (Mohawk), doctoral student in the field of information science at Cornell

2:15 p.m. – “Indigenous Language AI” by Daniela Ramos Ojeda (Nahua) ’25, a computer science major at Cornell

2:45 p.m. – “Developing a Nahuatl Language Translator” by Eduardo Lucero, professor at the Tecnologico Nacional de Mexico, Apizaco, and Sergio Khalil Bello García, senior iOS software engineer at Bitso

3:15 p.m. – “Cultivating Connection: Understanding Relatedness and Kin within Maize Quantitative Genetics” by Merritt Khaipho-Burch (Oglala), doctoral student in the field of plant breeding and genetics at Cornell

3:45 p.m. – “Modeling Dispossession, Youth Homelessness, and Integrated Climate Assessments with and for Indigenous Communities” by Mike Charles (Diné), Cornell Provost’s New Faculty Fellow and incoming assistant professor of biological and environmental engineering

4 p.m. – Closing comments on Indigenous protocol

4:20 p.m. – Coffee and extended discussions

Date Posted: 4/26/2023
A color photo of the trash barrel robot

How do New Yorkers, who are not known for their politeness, react to robots that approach them in public looking for trash? Surprisingly well, actually.

Cornell researchers built and remotely controlled two trash barrel robots – one for landfill waste and one for recycling – at a plaza in Manhattan to see how people would respond to the seemingly autonomous robots. Most people welcomed them and happily gave them trash, though a minority found them to be creepy. The researchers now have plans to see how other communities behave. If you’re a resident of New York City, these trash barrel robots may be coming soon to a borough near you.

A team led by Wendy Ju, associate professor at the Jacobs Technion-Cornell Institute at Cornell Tech and the Technion, and a member of the Department of Information Science in the Cornell Ann S. Bowers College of Computing and Information Science, constructed the robots from a blue or gray barrel mounted on recycled hoverboard parts. They equipped the robots with a 360-degree camera and operated them using a joystick. 

“The robots drew significant attention, promoting interactions with the systems and among members of the public," said co-author Frank Bu, a doctoral student in the field of computer science. "Strangers even instigated conversations about the robots and their implications.” 

Bu and Ilan Mandel, a doctoral student in the field of information science, presented the study, "Trash Barrel Robots in the City," in the video program at the ACM/IEEE International Conference on Human-Robot Interaction last month.

In the video footage and interviews, people expressed appreciation for the service the robots provided and were happy to help move them when they got stuck, or to clear away chairs and other obstacles. Some people summoned the robot when they had trash – waving their garbage like a treat for a dog – and others felt compelled to “feed” the robots waste when they approached.

However, several people voiced concerns about the cameras and public surveillance. Some raised middle fingers to the robots and one person even knocked one over.

People tended to assume that the robots were “buddies” who were working together, and some expected them to race each other for the trash. As a result, some people threw their trash into the wrong barrel.

This type of research, where a robot appears autonomous but people are controlling it from behind the scenes, is called a Wizard of Oz experiment. It’s helpful during prototype development because it can alert researchers to potential problems autonomous robots are likely to encounter when interacting with humans in the wild.

Ju had previously deployed a trash barrel robot on the Stanford University campus, where people had similarly positive interactions. For New York City, she had initially envisioned new types of mobile furniture, such as chairs and coffee tables.

“When we shared with them the trash barrel videos that we had done at Stanford, all discussions of the chairs and tables were suddenly off the table,” Ju said. “It’s New York! Trash is a huge problem!”

Now, Ju and her team are expanding their study to encompass other parts of the city. “Everyone is sure that their neighborhood behaves very differently,” Ju said. “So, the next thing that we're hoping to do is a five boroughs trash barrel robot study.” 

Michael Samuelian, director of the Urban Tech hub at Cornell Tech, has helped the team to make contact with key partners throughout the city for the next phase of the project.

Wen-Ying “Rei” Lee, a doctoral student in the field of mechanical and aerospace engineering, also contributed to the study.

By Patricia Waldron, a writer for the Cornell Ann S. Bowers College of Computing and Information Science.

Date Posted: 4/19/2023
A color photo showing young people using their cell phones

Youth in the United States are targets of cross-platform digital abuse from peers, strangers, offline acquaintances and even relatives, with threats ranging from harassment and sexual violence to financial fraud, according to a new collaborative study and call-to-action from Cornell and Google researchers.

Aided by firsthand accounts from 36 youths aged 10 to 17, as well as 65 parents, educators, social workers and other youth advocates, researchers identified the need for more resources to educate youth and parents on digital abuse. They call for better communication and coordination among adult stakeholders in implementing sound protective practices.

The study also calls for human-computer interaction (HCI) scholars to study and develop better tools to safeguard youth online, where nearly half of American teenagers experience some form of digital abuse, according to Pew Research.

“We really need to take a closer look at the types of things that young people are experiencing online, because these experiences are not just child problems anymore,” said Diana Freed, a doctoral student in the field of information science and lead author of “Understanding Digital-Safety Experiences of Youth in the U.S.,” which will be presented at the Association for Computing Machinery CHI Conference on Human Factors in Computing Systems in Hamburg, Germany this month. “Young people are experiencing what are typically thought of as adult issues, like financial fraud and sexual violence.”

Youth in the study reported being harassed online by peers, intimate partners, acquaintances and strangers. Harassment could involve fielding toxic comments or having fake social media accounts set up without their authorization, but could also shift into more serious forms of digital abuse – like receiving intimate images they didn’t request – or escalate into threats in the physical world.

“Once your nudes get sent out, you’re done. It’s going to spread,” a youth said in the study. “I’ve seen videos spread from state to state in literally five minutes.”

“She told me she needed money for her child,” said another, detailing a financial scam. “I gave out my bank card and also my online banking code. When I stopped, she started harassing and threatening me.”

Just as today’s youth live and seamlessly move between offline and online worlds, threats often follow them from platform to platform, said Natalie Bazarova, M.S. ’05, Ph.D. ’09, professor of communication in the College of Agriculture and Life Sciences and director of the Cornell Social Media Lab.

“The porousness of barriers between digital platforms and online and physical worlds underscores how easily threats can escalate by crossing social contexts and amplifying harms,” she said.

While kids navigate complex and sometimes risky digital lives, for parents and educators alike, there are few formal options for support and resources to educate themselves and kids on potential online harms, researchers found.

“Whether it was the teachers or the parents, they didn’t really understand exactly what social media applications young people were using, let alone how to address the problems,” Freed said.

In many instances, parents’ knowledge about the platforms their kids frequent was limited to information pulled from quick web searches or conversations with friends – far from ironclad sources, she added.

“Some parents would tell us, ‘Online gaming is very safe, but a particular social media app is not safe.’ But is there an open chat on the gaming platform? Can anyone join it? Do you know who your kids are communicating with?” Freed said. “Well-meaning parents can have a very difficult time understanding what questions to ask their kids to improve safety.”

Among their recommendations, researchers call for better educational resources, such as more robust digital safety educational programs in schools and more accessible, actionable resources, like Common Sense Education’s Digital Citizenship curriculum for educators and Social Media Test Drive, a Cornell-led project that Bazarova co-founded and directs. Other recommendations include engaging youth in app and platform design and improving digital-abuse reporting processes on the online apps and platforms young people frequent.

“We may assume, because they’re digital natives, that kids will just know how to protect themselves online,” Freed said. “That’s leaving a lot on young people, families and schools.”

Other co-authors are: Eunice Han ’21; Sunny Consolvo, Patrick Gage Kelley, and Kurt Thomas, all of Google; and Dan Cosley of the National Science Foundation.

This research was supported in part by the National Science Foundation and the USDA’s National Institute of Food and Agriculture.

By Louis DiPietro, a writer for the Cornell Ann S. Bowers College of Computing and Information Science.

Date Posted: 4/18/2023
A color photo of someone presenting a project to another person

After a three-year hiatus, Bits On Our Minds (BOOM), a showcase of cutting-edge digital technology projects created by Cornell students, returns to campus for its 25th anniversary. The event will be held 4-6 p.m. on Thursday, April 27, in the Duffield Hall atrium.

BOOM offers student teams and individuals the opportunity to showcase their projects, which will include games, robotics, autonomous vehicles, mobile phone apps, and more, to the Cornell community and beyond. The Cornell Ann S. Bowers College of Computing and Information Science is sponsoring the event, but BOOM is open to participants from across the university, and the wider Ithaca community is invited to attend.

The first BOOM took place in 1998, making this year the 25th anniversary of the event. Due to the COVID-19 pandemic, however, the event was paused from 2020 to 2022.

A color graphic showing the 2023 BOOM logo

BOOM provides the rare opportunity for students to network with representatives from industry and receive feedback on their work. Several corporate sponsors will be in attendance, and participants are encouraged to hone their elevator pitches and meet with sponsors in a reception following the event. This year, the sponsors include Boeing, Sandia National Labs, LinkedIn, Goldman Sachs, EY, Air Liquide and Pepsi.

Teams will also be competing for cash awards, with projects judged on their novelty, performance, the quality of engineering, project difficulty, social benefits and presentation. The selected teams will receive a commemorative trophy and $750.

BOOM is free and open to the public.

Patricia Waldron is a writer for the Cornell Ann S. Bowers College of Computing and Information Science.

This story was originally published in the Cornell Chronicle.

Date Posted: 4/17/2023
A color photo of an autonomous bus in Linköping, Sweden

The town of Linköping, Sweden, has a small fleet of autonomous electric buses that carry riders along a predetermined route. The bright vehicles, emblazoned with the tagline, “Ride the Future,” have one main problem: Pedestrians and cyclists regularly get too close, causing the buses to brake suddenly, and making riders late for work.

Researchers saw this problem as an opportunity to design new ways of using sound to help autonomous vehicles navigate complex social situations in traffic. Currently, sound is still underexplored as a tool to enable autonomous vehicles and robots to interact with humans and each other. 

The research team found that jingles and beeps effectively move people out of the way. But more importantly, they discovered it’s the timing of the sound – not the sound itself – that allows the bus to meaningfully communicate with people in traffic.

“If we want to create sounds for social engagement, it's really about shifting the focus from ‘what’ sound to ‘when’ sound,” said study co-author Malte Jung, associate professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science (Cornell Bowers CIS).

Lead author Hannah Pelikan, a recent visiting scholar in the Department of Information Science at Cornell Bowers CIS and doctoral student at Linköping University, presented their study, “Designing Robot Sound-In-Interaction: The Case of Autonomous Public Transport Shuttle Buses,” on March 15 at the 2023 ACM/IEEE International Conference on Human-Robot Interaction. The work received a nomination for the best design paper award.

The researchers designed potential bus sounds through an iterative process: They played sounds through a waterproof Bluetooth speaker on the outside of the bus, analyzed video recordings of the resulting interactions, and used that information to select new sounds to test. Either the researchers or a safety driver, who rides along in case the bus gets stuck, triggered the sounds to warn pedestrians and cyclists.

Initially, the researchers tried humming sounds that became louder as people got closer, but low-pitched humming blended into the road noise and a high-pitched version irritated the safety drivers. The repeated sound of a person saying “ahem” was also ineffective. 

They found that “The Wheels on the Bus” and a similar jingle successfully signaled cyclists to clear out before the brakes engaged. The song also elicited smiles and waves from pedestrians, possibly because it reminded them of an ice cream truck, and may be useful for attracting new riders, they concluded.

Standard vehicle noises – beeps and dings – also worked to grab people’s attention; repeating or speeding up the sounds communicated that pedestrians needed to move farther away.

In analyzing the videos, Pelikan and Jung saw that regardless of which sound they played, the timing and duration were most important for signaling the bus’ intentions – just as the honk of a car horn can be a warning or a greeting. A sound that is too late can become incomprehensible, and is ignored as a result.

These insights came from applying conversation analysis, an interdisciplinary approach influenced by sociology, anthropology, and interactional linguistics, which has not been used previously for robot sound design. By transcribing the pedestrians’ reactions in the video recordings in great detail, the researchers were able to see the moment-by-moment impact of the sounds during a traffic interaction.

“We looked very much at the interaction component,” Pelikan said. “How can sound help to make a robot, bus, or other machine explainable in some way, so you immediately understand?”

The study’s approach represents a new way of designing sound that is applicable to any autonomous system or robot, the researchers said. While most sound designers work in quiet labs and create sounds to convey specific meanings, this approach uses the bus as a laboratory to test how people will respond to the sounds in the wild.

“We’ve approached sound design all wrong in human-robot interaction for the past decades,” Jung said. “We wanted to really rethink this and bring in a new perspective.”

Pelikan and Jung said their findings also underline another important factor for autonomous vehicle design: Traffic is a social phenomenon. While societies may have established rules of the road, people are constantly communicating through their horns, headlights, turn signals and movements. Pelikan and Jung want to give autonomous vehicles a better way to participate in the conversation.

The research received funding from the Swedish Research Council and the National Science Foundation.

By Patricia Waldron, a writer for the Cornell Ann S. Bowers College of Computing and Information Science.

This story was originally published in the Cornell Chronicle.

Date Posted: 4/17/2023
A color photo of a man and woman standing outside

Social media companies need content moderation systems to keep users safe and prevent the spread of misinformation, but these systems are often based on Western norms, and unfairly penalize users in the Global South, according to new research at Cornell.

Farhana Shahid, a doctoral student in the field of information science in the Cornell Ann S. Bowers College of Computing and Information Science, who led the research, interviewed people from Bangladesh who had received penalties for violating Facebook’s community standards. Users said the content moderation system frequently misinterpreted their posts, removed content that was acceptable in their culture and operated in ways they felt were unfair, opaque and arbitrary.

Shahid said existing content moderation policies perpetuate historical power imbalances that existed under colonialism, when Western countries imposed their rules on countries in the Global South while extracting resources.

“Pick any social media platform and their biggest market will be somewhere in the East,” said co-author Aditya Vashistha, assistant professor of information science in Cornell Bowers CIS. “Facebook is profiting immensely from the labor of these users and the content and data they are generating. This is very exploitative in nature, when they are not designing for the users, and at the same time, they’re penalizing them and not giving them any explanations of why they are penalized.”

Shahid will present their work, “Decolonizing Content Moderation: Does Uniform Global Community Standard Resemble Utopian Equality or Western Power Hegemony?” in April at the Association for Computing Machinery (ACM) CHI Conference on Human Factors in Computing Systems.

Even though Bengali is the sixth most common language worldwide, Shahid and Vashistha found that content moderation algorithms performed poorly on Bengali posts. The moderation system flagged certain swear words in Bengali while allowing the same words in English. The system also repeatedly missed important context. When one student joked, “Who is willing to burn effigies of the semester?” after final exams, his post was removed because it might incite violence.

Another common complaint was the removal of posts that were acceptable in the local community but violated Western values. When a grandmother affectionately called a child with dark skin a “black diamond,” the post was flagged for racism, even though Bangladeshis do not share the American concept of race. In another instance, Facebook deleted a 90,000-member group that provides support during medical emergencies because members shared personal information – phone numbers and blood types – in emergency blood donation requests.

The researchers also found inconsistent moderation of religious posts. One user felt the removal of a photo of the Quran lying in the lap of a Hindu goddess with the words, “No religion teaches to disrespect the holy book of another religion,” was Islamophobic. But another user said he reported posts calling for violence against Hindus and was notified the content did not violate community standards.

The restrictions imposed by Facebook had real-life consequences. Several users were barred from their accounts – sometimes permanently – resulting in lost photos, messages and online connections. People who relied on Facebook to run their businesses lost income during the restrictions, and some activists were silenced when opponents maliciously and incorrectly reported their posts.

Participants reported feeling “harassed,” and frequently did not know which post violated the community guidelines, or why it was offensive. Facebook does employ some local human moderators to remove problematic content, but the arbitrary flagging led many users to assume that moderation was entirely automatic. Several users were embarrassed by the public punishment and angry that they could not appeal, or that their appeal was ignored.

“Obviously, moderation is needed, given the amount of bad content out there, but the effect isn’t equally distributed for all users,” Shahid said. “We envision a different type of content moderation system that doesn’t penalize people, and maybe takes a reformative approach to better educate the citizens on social media platforms.”

Instead of a universal set of Western standards, Shahid and Vashistha recommended that social media platforms consult with community representatives to incorporate local values, laws and norms into their moderation systems. They say users also deserve transparency regarding who or what is flagging their posts and more opportunities to appeal the penalties.

“When we’re looking at a global platform, we need to examine the global implications,” Vashistha said. “If we don’t do this, we’re doing grave injustice to users whose social and professional lives are dependent on these platforms.”

By Patricia Waldron, a writer for the Cornell Ann S. Bowers College of Computing and Information Science.

Date Posted: 4/13/2023
A color photo of a man smiling for a photo

Cornell Ann S. Bowers College of Computing and Information Science announced the appointment of David Mimno as chair of the Department of Information Science, effective January 1, 2024.  

“I am delighted to welcome David as the next chair,” said Kavita Bala, dean of Cornell Bowers CIS. “His dedication to the department and expertise make him the ideal candidate to continue the exciting trajectory of our growing IS department. I look forward to working with him to advance the missions of the department and college.”

Mimno has been a valued member of Cornell’s faculty for nearly a decade and has made numerous positive contributions to the academic community. He holds a Ph.D. from the University of Massachusetts, Amherst, and was previously the head programmer at the Perseus Project at Tufts University and a researcher at Princeton University. His machine learning research has been supported by the National Endowment for the Humanities and the National Science Foundation (NSF). In 2016, he was awarded the prestigious Sloan Research Fellowship from the Alfred P. Sloan Foundation, and in 2017, the NSF’s Faculty Early Career Development (CAREER) Award.

“I got into machine learning research because I saw how transformational computation could be in giving people new ways to connect with the world,” said Mimno. “I joined Cornell’s Department of Information Science because I found a community of people who shared this vision of technology and society. As computation becomes ever more present in every aspect of our lives, with all the good and bad effects that it could have, I can't imagine a more exciting and important time to lead this department.”

Mimno succeeds David Williamson, professor in the School of Operations Research and Information Engineering, who has held the role since July 2021 and is extending his appointment until the end of the year.

Date Posted: 4/13/2023