Airmen from the 29th Aircraft Maintenance Unit check over an MQ-9 Reaper—an unmanned aerial vehicle, or drone. In 2023, U.S. deputy defense secretary Kathleen Hicks announced the Replicator Initiative—an ambitious two-year plan to counter China’s massive military with thousands of autonomous weapons systems across multiple domains.

U.S. Air Force photo by Airman First Class Autumn Vogt/public domain

The United States Army Futures Command is at work modernizing weapons and equipment while identifying, acquiring and developing next-generation military technologies. It is “the best example of our commitment to the future lethality of the force” and “probably one of the boldest reforms” the Army has pursued, Secretary of the Army (later Secretary of Defense) Mark Esper told Congress in 2018. The command makes its home not on an Army base but on the campus of the University of Texas at Austin. That choice was by design.

“It’s critical that we have access to talent,” Esper said at the time. “Talent that can help us think about the future strategic environment, thinking in the 2030s, 2040s, because that will inform, in many ways, the steps we take with regard to materiel … It’s proximity to innovation. It’s proximity to academia.”

UT Austin has a mission, too. It seeks to “transform lives for the benefit of society” and “to serve as a catalyst for positive change in Texas and beyond.” These human-centered ideals echo mission statements crafted by universities around the country that also lend expertise to the Pentagon.

Within the next few years, the United States is expected to possess fully autonomous lethal weapons systems—or “killer robots,” as they are known to opponents. Now, some people are asking whether the Defense Department’s massive higher education funding stream engages universities in supporting that work. While the Pentagon is clear about its lethal objectives, higher education is less so. Many universities welcome DOD funding with expressions of pride and altruism—and no mention of potential for harm.

“In the pursuit of talent, [the Defense Department] is like a parasitical creature that attaches itself to another entity to feed off of its energy and capabilities,” said Michael Klare, the Five College Professor Emeritus of Peace and World Security Studies at Hampshire College, who also serves on the board of the Arms Control Association.

But some, including Emelia Probasco, senior fellow at Georgetown University’s Center for Security and Emerging Technology, where she works on the military applications of artificial intelligence, suggest that a portrait of the Defense Department as overly “focused on the war machine” is simplistic. In higher ed–military collaborations, “there’s quite a bit of health research and business operations research that gets overlooked,” said Probasco, who served as a surface warfare officer in the U.S. Navy and deployed twice to the Indo-Pacific. She currently serves as a special government employee advising the Defense Innovation Unit.

Universities as ‘Agents of the State’

To counter China’s massive, asymmetric military advantage, the United States plans, through the Replicator Initiative, to field thousands of autonomous systems across land, air, sea, space and cyberspace within the next two years. Far from science fiction, Replicator’s real-world autonomous weapons systems are intended to deliver capabilities at “volume and velocity.”

“We are not taking our foot off the gas, and in fact we’re accelerating,” Deputy Defense Secretary Kathleen Hicks said in a speech last year.

The Pentagon relies on contracts with private companies such as Lockheed Martin, Raytheon, General Dynamics, Boeing and Northrop Grumman to help develop, manufacture and supply advanced military technology such as fighter aircraft and missiles.

But universities outshine defense contractors in at least two areas—expertise and research capacity, according to Margaret O’Mara, the Scott and Dorothy Bullitt Chair of American History at the University of Washington. O’Mara’s research connects the growth of America’s high-tech sector with its political history.

The U.S. government’s “mighty” higher education investment “essentially makes universities agents of the state, where they help achieve what the government wants to see happen in science but particularly in new military and space technology,” O’Mara said. “Some might call it a devil’s bargain.”

Nearly 50 universities help the United States government build nuclear weapons, according to the International Campaign to Abolish Nuclear Weapons (ICAN), winner of the 2017 Nobel Peace Prize. Lucrative contracts may offer an incentive for overlooking moral quandaries concerning weapons design and development, according to Alicia Sanders-Zakre, ICAN policy and research coordinator. Now, ICAN is concerned that the Defense Department may substitute artificial intelligence for human judgment in nuclear weapons use.

Governments that offer significant funding to universities or academic researchers who help develop weapons “threaten the remit of independent and curiosity-driven research,” a Nature editorial argued in 2018. “It breaks down the bonds of trust that connect scientists around the world and undermines the spirit of academic research.”

Not everyone sees the status quo in such stark terms. Faculty and students in Case Western Reserve University’s military ethics program, for example, devote time and expertise to understanding the ethical use of emerging military technologies, among other objectives. In doing so, they seek to support “long-term humanitarian goals, such as preventing unjust wars, decreasing incidents of war crimes, genocide, human rights abuses, and other atrocities produced by the dehumanizing effects of armed conflict.”

Either way, much of the Defense Department’s work—from managing weapons labs to training the next generation of weapons scientists—is classified for national security reasons. That makes gleaning information about university-military collaborations challenging, though not impossible.

“Look at the budget and follow the money,” Sanders-Zakre said.

The Money

The 2024 Defense Department budget offers rationales for its spending, including for funding streams to universities. The department seeks to “keep our nation safe while delivering a combat credible Joint Force that is the most lethal [emphasis added], resilient, agile, and responsive in the world,” according to Secretary of Defense Lloyd J. Austin III.

The U.S. Defense Department expects to spend $842 billion in 2024. (In contrast, the Education Department got $79.2 billion in the fiscal 2023 budget.) Defense funding is divided into three broad categories: operations and support (which accounts for nearly two-thirds of the budget); acquisition (which includes procurement as well as research, development, testing and evaluation, and accounts for 37 percent); and infrastructure (which includes military construction and family housing and accounts for 2 percent). Higher education receives only a fraction of the defense budget, much of it from the acquisition category.

But a fraction of hundreds of billions can be significant for individual universities or academic researchers. UT Austin, for example, received a five-year, $65 million contract to host the Army Futures Command headquarters.

In support of its “most lethal, resilient, agile, and responsive” goals, the Defense Department funds the highly competitive Multidisciplinary University Research Initiative (MURI) program. In this program, teams of university researchers “focus on [Defense Department]–specific hard science problems.” For this and other university research initiatives, the 2024 DOD budget allocates $347.3 million.

The institutional recipients of the 2024 MURI program have not yet been named. But in 2023, this initiative awarded $220 million to 31 teams at 61 U.S. academic institutions, or an average of $7.1 million per team.

Also in 2023, the Defense Department, through its Defense University Research Instrumentation Program, awarded $161 million to 281 universities to purchase equipment supporting defense-relevant research. This program seeks to foster innovation leading to “unprecedented military capabilities” in next-generation wars, according to Bindu Nair, director of basic research in the Office of the Under Secretary of Defense for Research and Engineering.

Scant Details on Potential Harm

The University of South Florida College of Engineering, in announcing its (relatively modest) $5 million Defense Department grant to study AI models for the U.S. military, highlighted that the work would “benefit society at large.” The work involves conducting research “related to AI and associated technology,” including “recognizing legitimate targets.” With this word choice, the work is cast as an academic exercise, not one with a potential human toll. (The computer science professor referenced in the news release did not respond to a request for comment.)

The University of Dayton’s poetically named Soaring Otter—an $88 million Air Force award—provides research and development “to advance, evaluate and mature Air Force autonomous capabilities,” according to the university. In the military, “autonomous capabilities” could have applications in lethal autonomous weapons.

But the institution’s press statement is vague about whether the work would advance AI agents that are capable of making decisions to kill without human input. (The principal investigator at the university referenced in the press release did not respond to a request for comment.)

“The [Defense Department] does not rule out any system ever,” said Probasco, who is not involved with Soaring Otter. “But in considering any future autonomous or AI-enabled weapon … they put in place what, in the technical world, we call a risk-management process.”

When Texas A&M University’s Engineering Experiment Station announced that it would lead a five-year Defense Department applied hypersonics project valued at $100 million, it spoke of “advancing innovation” and “nurturing the next generation of researchers.” The news release was opaque about whether the work would contribute to hypersonic weapons research. Hypersonic missiles, which fly faster than Mach 5, offer militaries a distinct advantage, as they can evade nearly all defense systems.

“It doesn’t matter what the threat is,” General John Hyten, the former vice chairman of the U.S. Joint Chiefs of Staff, said of hypersonic missiles’ significance, as reported in Voice of America. “If you can’t see it, you can’t defend against it.” (The engineering professor referenced in the university press release did not respond to a request for comment.)

A Blurry Line

Universities are not wrong when they suggest that DOD funding supports innovation, especially when the influx of cash amounts to billions. The modern internet, for example, was born from a project at the precursor to the Defense Advanced Research Projects Agency, or DARPA. Concerning the development of artificial intelligence, machine learning and autonomous systems, the U.S. government acknowledges its interest in defense applications. But it also observes that the research has applications in “fields as diverse as manufacturing, entertainment and education.”

Complex software systems underpin Defense Department operations, according to a Carnegie Mellon University press release announcing a renewed defense contract. The award, which was valued at $2.7 billion over five years, provides funds to operate the Software Engineering Institute.

In the statement, J. Michael McQuade, Carnegie Mellon’s vice president for research, championed the institution as “a high-tech anchor” and credited the contract with “supporting jobs.” Whether that software could be applied to lethal autonomous weapons systems was unclear. Software can be susceptible to algorithmic bias based on race, disability and gender, which would be especially problematic if it were used to target humans.

In 2019, Carnegie Mellon University president Farnam Jahanian, a computer scientist and entrepreneur, was asked whether he would endorse an autonomous weapon ban. The question was posed during a press conference with local media upon the expansion of Carnegie Mellon’s collaboration with the Army’s Artificial Intelligence Task Force. At that time, he declined to endorse a ban, which aligns with the position of the U.S. government.

U.S. Policy: Lethal Autonomous Weapons Are an Option

The Defense Department has dedicated time, attention and resources to understanding responsible AI. The 47-page “U.S. Department of Defense Responsible Artificial Intelligence Strategy and Implementation Pathway,” for example, offers evidence of that.

The department’s 2012 autonomous weapons policy directive also “assigns responsibilities for developing and using autonomous and semi-autonomous functions in weapon systems” and “establishes guidelines designed to minimize the probability and consequences of failure” in those systems.

Said differently, the Pentagon can imagine circumstances under which lethal autonomous weapons may be used. Some experts, including Probasco, agree that such circumstances exist. She, for example, “would much rather have a missile that’s better at hitting its intended target than just hoping … ‘we threw them as best we could.’”

The Defense Department “has terrible, terrible incidents in our history where people made decisions that honestly break my heart,” Probasco said. “But every time we get up, we try to get better, and we try to put in place rules and operating procedures and training and technologies that will prevent the harm but achieve the mission.”

But many universities that have accepted military funding appear to avoid conversations—nuanced or not—concerning whether campus research could contribute to destruction or death.

UT Austin, for example, did not respond to a request for comment about its mission statement. Also, the Massachusetts Institute of Technology’s AI Accelerator—funded with $15 million from the Air Force—does not post an email address for its director on its webpage. A staff member in the MIT media relations office said that the director was “traveling with a packed schedule” and had to decline to speak. The university also did not respond to a subsequent request to be put in touch with any one of the other 25 team members at the AI Accelerator.

“The vagueness works in the military’s favor,” Sanders-Zakre said. “Maybe university researchers believe—and maybe rightly so—that their research can have multiple applications and will not just be used for weapons. But that’s why the Defense Department funds the work.”

Equal Opportunity Defense Work

For a long time, the Defense Department appeared to favor predominantly white institutions in its funding. In 2020, for example, Johns Hopkins University, the Georgia Institute of Technology and Pennsylvania State University received more than half of the over $2.5 billion the department allocated for university science and engineering programs, while historically Black colleges and universities received less than $1 million, according to University Business.

But in 2023, DOD awarded $90 million to Howard University—a first for a historically Black institution—for its new Research Institute for Tactical Autonomy. Dr. Wayne Frederick, Howard’s president at the time, dubbed the new institute “historic” and “tremendous” for diversifying the pool of scientists contributing to national security. At the same time, he made no mention of potential harmful social impacts.

Howard’s new institute will have “a direct impact on my classmates and my wingmen,” Victor Fabritz Lugo, a Howard sophomore and Reserve Officers’ Training Corps member, told The Hilltop, Howard’s student newspaper. Evidence supports Lugo’s claim.

Student Interests, Shaped by the Pentagon

On-campus military research may normalize weapons work among students. For example, many soon-to-be graduates use Handshake—a career services platform, promoted by universities, that matches students with prospective employers—to find jobs. In 2023, three defense contractors cracked the top 10 searches on the platform, according to a Handshake analysis.

Student Handshake search interest in Raytheon (No. 1 on the list) was up 209 percent. Searches for Lockheed Martin (No. 4) were up 92 percent, and searches for Boeing (No. 8) were up 56 percent. These three companies also rank among the nation’s top 10 defense contractors. (The study’s random sample included nearly 1,000 college students at four-year institutions; the findings show correlation, not causation.)

No big tech companies—such as Meta, Google and Microsoft—made Handshake’s top 10 searches.

Handshake explained the finding in the context of layoffs and economic uncertainty. In 2023, college graduates “want a stable job that pays well, and they’re willing to flex other requirements—from company brand and growth rate to remote work options—to get it,” the company wrote.

Last year was “the worst 12 months for Silicon Valley since the dot-com crash of the early 2000s,” a period that saw some 260,000 jobs eliminated, according to NPR. In 2024, the carnage at Microsoft, Amazon, Alphabet, Meta and other big tech companies has continued: nearly 100 tech companies laid off close to 25,000 employees in the first month of the year, according to Layoffs.fyi.

Tech-savvy students are primed to consider alternatives to Silicon Valley. For those who came of age on campuses where administrators framed war technology in only positive terms, defense contractors may provide an appealing option—one that appears to promise security from layoffs and unexpected downsizing.

Bans and Other Resistance Efforts

The autonomous weapons era is already here, according to O’Mara.

“It’s less of an on-off switch and more of a slow layering on of one technological capacity over another,” O’Mara said. “It’s hard to put that genie back in the bottle.”

Some organizations, including Human Rights Watch, liken lethal autonomous weapons systems to antipersonnel land mines and cluster munitions, which cause indiscriminate harm. They cross an ethical line by dehumanizing violence, according to Bonnie Docherty, a lecturer at Harvard Law School’s International Human Rights Clinic.

That’s why many in industry, the nonprofit sector, foreign governments and the public have called for a new international treaty banning autonomous weapons systems. The European Parliament, for example, passed a resolution calling for a ban on lethal autonomous weapons systems. The United Nations, the Red Cross and some in big tech have also joined that chorus.

“Human control must be retained in life and death decisions,” UN secretary-general António Guterres and International Committee of the Red Cross president Mirjana Spoljaric said in a joint statement last year.

Google, responding to thousands of its employees’ concerns about helping the military develop AI for lethal weapons, declined to renew its Pentagon contract for Project Maven when it expired in 2019. The project used AI to interpret video images, which might have helped the military improve drone strikes.

But academics who believe that lethal autonomous weapons harm—rather than benefit—humanity say that international treaties are not the only path forward.

“At the university level, scientists and academic researchers can take a stand by adopting voluntary guidelines, codes of conduct or standards of their own against these weapons that raise legal, ethical and moral concerns,” Docherty said.

Higher education is currently engaged in intense debates over race, gender and justice. But academe has been largely quiet about the possibility of on-campus research contributing to the development and production of hypersonic missiles, lethal autonomous weapons systems and other technologies that can destroy or kill.

“In the war in Vietnam, students were drafted. It was personal,” O’Mara said. “Now, [the threat] is more ambiguous … [Students] may be experiencing and protesting anti-Black racism, but they don’t feel the danger of being killed in a drone attack.”
