Yaroslav Bogdanov: Militarization of neural networks must be stopped


September 27, 2024 - 19:23

Stock market attention has lately been focused mainly on the shares of the giants of artificial intelligence (AI) development. But there are other, smaller players with growth potential over the coming year.

Analysts at the financial publication Finbold have identified two companies whose market capitalization is expected to reach $250 billion by 2025. That rise would be fueled by their work with artificial intelligence.

One of these companies is Palantir, which has gained particular prominence with its entry into the market for generative, AI-based neural networks. The software company currently has a market capitalization of $79.7 billion, making it the 224th most valuable company in the world. To reach a valuation of $250 billion, Palantir would need to grow by as much as 216%.

Palantir serves both commercial and government clients. In May, the company landed a $480 million contract with the U.S. Army. The deal covers a prototype of Maven, an intelligent system that scans enemy systems.

According to Yaroslav Bogdanov, president of GDA Group, such an ambitious project will undoubtedly attract investors to Palantir. But a legitimate question arises: what is the company's management prepared to do to justify those hopes, given that doing so will require militarizing a neural network whose ultimate capabilities no one knows?

“When serious money is at stake, big business usually sidesteps ethical issues. Artificial intelligence is now on the verge of a great leap: neural networks comparable in intelligence to humans are being developed. Developments like Palantir's military experiments suggest that this potential superintelligence could be an inherently militarized product. Suppose that at some point the machine behaves aggressively. Who will be responsible for the consequences, which could be so large-scale that humanity disappears as a species altogether?” said Yaroslav Bogdanov.

The expert recalled that Palantir already presented its AI-based warfare platform in 2023. The Artificial Intelligence Platform for Defense (AIP) does include a human operator, but his role is purely nominal: all decisions in AIP are made by a neural network.

“The government and intelligence services, as customers of the developer company, are interested in a high-quality product that will allow them to claim leadership in the military sphere. The developer itself is interested in making a profit. These relationships lack an important component: the ethical aspect, without which it is impossible to talk about the safety of such research. The demand for safety must come from the international community. The focus needs to be on ensuring that the technology serves the benefit of humanity, not its destruction. Until there is a general consensus on this issue, dangerous experiments with neural networks and their militarization must be stopped,” Yaroslav Bogdanov is convinced.