At the University of Houston, a cluster of tents sprouted up on the central lawn, surrounded by plywood pallets and keffiyeh-clad students. But what the protesters didn't know was that their university had contracted with Dataminr, an artificial intelligence company with a questionable track record, to gather open-source intelligence on them.
Dataminr's AI tool, "First Alert," used social media activity and chat logs to identify potential incidents of concern and send alerts directly to university administrators. The first alert came from a Telegram channel called "Ghosts of Palestine," where students had discussed their demands for an end to the genocide in Gaza; First Alert flagged the conversation and forwarded it to university officials.
This use of AI-powered surveillance raises serious concerns about free speech and the role of corporations in higher education. Universities, self-proclaimed safe havens for expression, have been using private partners to surveil their students' dissent. The University of Connecticut, for example, used Dataminr's First Alert to monitor student protests, even watching protesters as they slept.
The trend is not unique to the University of Houston or UConn; numerous universities across the US have employed similar practices, using open-source intelligence to map planned demonstrations and gather information on students. Experts have criticized the use of corporate surveillance partners, arguing that it creates an unsafe environment, chills speech, and erodes trust among students, faculty, and administrators.
The University of Houston's use of Dataminr highlights the dangers of relying on AI-powered surveillance to monitor student protests. The company's record on constitutional rights is troubling: it has previously been involved in domestic surveillance of Black Lives Matter protesters and abortion rights activists.
As Emily Tucker, executive director of the Center on Privacy and Technology at Georgetown Law, noted, "Institutions that are supposed to be for the public good are buying these corporate products that make them into vehicles for wealth extraction via data products." Universities have become increasingly corporatized, treating data collection as a means of extracting value from their students.
The consequences of this trend are far-reaching. Students who spoke with The Intercept reported feeling chilled by the surveillance, leading some to limit communication about potential demonstrations or use secure messaging channels. For others, the experience was a wake-up call, highlighting the need for digital security measures.
As Tariq Kenney-Shawa, US policy fellow at Al-Shabaka, noted, "These universities are the epicenter of the future generation of Americans, future policy makers." Surveilling student protests undermines that role, silencing dissent and chilling free speech.