One of the founders of Oculus, now called Meta Quest, has been keeping busy since being pushed out of the company in 2018, developing, um, military tech that may be involved in the Russian invasion of Ukraine.
In a frustratingly coy interview with Wired, Palmer Luckey skirted the question of whether technology from Anduril Industries, the military technology company he founded in 2017, is being used in Ukraine.
“There’s a few assumptions in that question, like we aren’t involved,” Luckey responded without saying whether that assumption was correct or not.
In a follow-up question where he was asked explicitly whether he and Anduril are involved in Ukraine, Luckey outright refused to confirm or deny this detail. He did, however, mention that Ukrainian President Volodymyr Zelenskyy “reached out” to Anduril in the interest of deterring conflict.
Kotaku reached out to Luckey and Anduril for comment but did not receive a response by the time of publication.
Anduril also struck a deal with the Trump administration in 2020 to install surveillance towers along the border between the U.S. and Mexico. Luckey himself was a vocal supporter of and donor to former President Donald Trump.
Among Anduril’s technologies is Lattice, a counter-drone system that detects hostile drones using AI-powered sentry towers, then deploys its own drones to knock them out of the air. A demo video boasts that Lattice operates autonomously with “computer vision, machine learning, and real-time data.” It’s already in development for the U.S., the U.K., and Australia, raising the question of whether Ukraine might join that list.
Luckey mentions in the Wired interview that working on weapons is “less sunny” than the “fun” he had in developing video games. Of course, that could be in part due to his unnerving stance towards AI weaponry. While Luckey acknowledges the controversy behind machine decision-making, his answer makes it a bit difficult to sleep at night. The military startup founder says that he doesn’t want to “make it impossible for these systems to ever be used in certain ways.”
Here’s his approach to AI in layman’s terms: Luckey doesn’t want to make it impossible for a weapon to fire on a target if an actual human isn’t manning the communications. His rationale is that the enemy could learn that shutting down communications is the key to disabling an entire defence system. Instead, he wants to ensure that “the responsibility for [weapons firing] always lands on a person,” rather than on the pulling of the trigger itself. A Republican donor thinking that the ethical ramifications of murder technology should be dictated by personal responsibility? Who could have possibly seen this coming?
Luckey talks big about the future of military technology and his good intentions, but he’s making good money from militarised conflict. The startup landed a billion-dollar contract from the Department of Defence in January. Its work on the border wall wasn’t cheap either: the five-year contract with the U.S. Customs and Border Protection agency was worth $US250 ($347) million, though it’s unclear whether that deal is still in place under the current Biden administration. Luckey also likes to send “mean tweets,” as he refers to them, about Anduril having more money than taxpayer-funded weapons manufacturers, since Anduril and other private companies aren’t tied only to public funds. But maybe it’s actually a bad thing for private companies to be incentivised by armed conflict.