What part will your country play in World War III?

By Larry Romanoff

The true origins of the two World Wars have been deleted from all our history books and replaced with mythology. Neither war was started (or desired) by Germany; both were begun at the instigation of a group of European Zionist Jews with the stated intent of the total destruction of Germany. The documentation is overwhelming and the evidence undeniable. (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11)

That history is being repeated today in a mass grooming of the Western world’s people (especially Americans) in preparation for World War III, which I believe is now imminent.



Saturday, December 28, 2019

As the US, China, and Russia Build New Nuclear Weapons Systems, How Will Artificial Intelligence (AI) be Built in? What are the Dangers?

Researchers in the United States and elsewhere are paying a lot of attention to the prospect that in the coming years new nuclear weapons—and the infrastructure built to operate them—will include greater levels of artificial intelligence and automation. Earlier this month, three prominent US defense experts published a comprehensive analysis of how automation is already involved in nuclear command and control systems and of what could go wrong if countries implement even riskier forms of it.
The working paper “A Stable Nuclear Future? The Impact of Autonomous Systems and Artificial Intelligence” by the team of Michael Horowitz, Paul Scharre, and Alexander Velez-Green comes on the heels of other scholarly takes on the impact artificial intelligence (AI) will have on strategies around using nuclear weapons. All this research reflects the fact that militaries around the world are incorporating more artificial intelligence into non-nuclear weaponry—and that several countries are overhauling their nuclear weapons programs. “We wanted to better understand both the potentially stabilizing and destabilizing effects of automation on nuclear stability,” Scharre, a senior fellow at the Center for a New American Security, told the Bulletin.
“In particular, as we see nations modernize their nuclear arsenals, there is both a risk and an opportunity in how they use automation in their nuclear operations.”

The report notes that nuclear weapons systems already include some automated functionality: For example, warning systems automatically alert nuclear weapons operators of an attack. After the Cold War, Russian missiles were programmed to automatically retarget themselves to hit US targets if they were launched without a flight plan. For its part, the United States at one point designed its entire missile arsenal so that it could be retargeted in seconds from its peacetime default of flying into the ocean. Even these forms of automation are risky, as an accidental launch could “spark a nuclear war,” the report says. But some countries, the report warns, might resort to riskier types of automation.
Those risks could come from a variety of different sources. Countries could develop unmanned vehicles carrying nuclear weapons; with no one on board and responsible for deploying a nuclear weapon, the systems could be hacked or otherwise “slip out of control,” the authors say. In fact, the report notes, Russia is already reportedly developing an autonomous nuclear torpedo. Horowitz, a University of Pennsylvania political science professor, told the Bulletin that the weapon, called Poseidon or Status-6, could be the start of a trend, though it’s not yet clear how or if AI will be included. “While so much about it is uncertain, Russia’s willingness to explore the notion of a long-duration, underwater, uninhabited nuclear delivery vehicle in Status-6 shows that fear of conventional or nuclear inferiority could create some incentives to pursue greater autonomy,” Horowitz said.
Countries might also build more artificial intelligence into the so-called early warning systems that indicate whether a nuclear attack is underway, or insert more powerful AI into the strategic decision support systems they use to keep tabs on other militaries and nuclear forces. Even simple forms of automation in such systems have, in the past, exacerbated nuclear tensions. The report cites a famous 1983 incident where a Soviet officer, Lt. Col. Stanislav Petrov, had to disregard automated audible and visual warnings that US nuclear missiles were inbound. Fortunately, Petrov chose not to trust what his systems were telling him and defied the powerful cognitive phenomenon known as automation bias.
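The Petrov incident also illustrates a base-rate problem that makes automation bias so dangerous in early warning: because a real surprise attack is extraordinarily rare, even a highly reliable sensor that raises an alarm is still, most likely, reporting a false positive. A toy Bayesian calculation makes the point; all numbers here are hypothetical assumptions, not parameters of any real warning system.

```python
# Illustrative only: why an automated warning can be almost certainly a
# false alarm even when the sensor itself is quite reliable.

def posterior_attack_probability(prior, hit_rate, false_alarm_rate):
    """P(attack | alert) via Bayes' rule."""
    p_alert = hit_rate * prior + false_alarm_rate * (1.0 - prior)
    return (hit_rate * prior) / p_alert

# Assume a surprise attack on any given day is extraordinarily unlikely
# (one in a million), and the warning system detects 99% of real attacks
# while raising a false alarm only 0.1% of the time.
p = posterior_attack_probability(prior=1e-6, hit_rate=0.99, false_alarm_rate=0.001)
print(f"P(attack | alert) = {p:.4%}")  # well under 1% despite the "reliable" sensor
```

Under these assumptions the alert alone raises the probability of a real attack to roughly a tenth of one percent, which is why a human in the loop who distrusts the machine can be the safety margin.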
Another problematic form of early automation was the Soviet strategic decision support system known as VRYAN. It was a computer program in place to warn Soviet leaders when the United States had achieved a level of military superiority that required Moscow to launch a nuclear attack. But Soviet intelligence agents were inputting information that often confirmed their pre-existing beliefs about US intentions. “This feedback loop amplified and intensified those perceived threats, rather than providing Soviet leaders with a clearer understanding of US intentions,” the report notes. There is evidence that countries including Russia and China are placing more emphasis on developing these sorts of so-called computational models for analyzing threats.
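The feedback loop described above can be sketched in a deliberately simple, hypothetical model (not a description of the actual Soviet system): if analysts forward reports that agree with the current threat estimate at full weight and discount reports that contradict it, a perfectly balanced stream of evidence still drives the estimate steadily upward.

```python
# Toy model of a confirmation-bias feedback loop in a threat-estimation
# pipeline. All parameters are invented for illustration.

def run_threat_estimate(reports, prior=0.6, bias=0.8, step=0.1):
    """reports: sequence of 1 (threatening) or 0 (benign) observations.
    Confirming reports are weighted by `bias`, disconfirming reports by
    1 - bias; bias=0.5 means no filtering at all."""
    threat = prior
    for r in reports:
        confirming = (r == 1) == (threat >= 0.5)
        weight = bias if confirming else 1.0 - bias
        threat += step * weight * (r - threat)  # weighted move toward the report
    return threat

balanced_stream = [1, 0] * 25  # equal numbers of threatening and benign reports
biased = run_threat_estimate(balanced_stream, bias=0.8)
neutral = run_threat_estimate(balanced_stream, bias=0.5)
print(f"biased filter: {biased:.2f}, neutral filter: {neutral:.2f}")
# On identical, balanced evidence the biased estimate climbs well above
# the neutral one -- the loop manufactures its own escalation.
```

The neutral filter hovers near the 0.5 baseline the evidence actually supports, while the biased filter converges toward a much higher perceived threat on the very same inputs.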


A Trident II D5 missile test. The US military, along with others around the world, is upgrading its nuclear weapons systems. Credit: US Navy/Mass Communication Specialist 1st Class Ronald Gutridge.
Despite all these drawbacks, however, the report’s authors believe there could be reasons to implement more AI and automation into nuclear weapons systems. They note how artificial intelligence systems could process more data and allow officials in charge of nuclear weapons greater situational awareness. Automation could also be useful in communicating commands in “highly contested electromagnetic environments,” as the report dryly puts it—perhaps, say, during a war. But, the report says, “many of these ways that autonomous systems could increase the resiliency and accuracy of [nuclear command and control systems] are speculative.”
The countries most likely to take on the risks of incorporating greater levels of artificial intelligence and automation in their nuclear weapons systems are the ones that are less certain of their ability to retaliate after an attack on their nuclear arsenal. As the report notes, that’s because the consequences of missing signs of an actual incoming attack—a false negative—would be relatively lower in more confident countries.
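This trade-off can be framed as a schematic Bayes-risk calculation (all costs and probabilities here are invented illustrative numbers): a state that fears losing its arsenal to a first strike assigns a far higher cost to a missed attack, which lowers the evidence threshold at which its warning system should sound the alarm.

```python
# Schematic decision rule for an early warning system. Nothing here
# describes any real system's parameters.

def alert_threshold(cost_false_alarm, cost_missed_attack, prior_attack):
    """Standard Bayes decision rule: raise the alarm when the sensor
    likelihood ratio P(evidence|attack) / P(evidence|no attack)
    exceeds this value."""
    return (cost_false_alarm * (1.0 - prior_attack)) / (cost_missed_attack * prior_attack)

prior = 1e-6  # assumed chance of a real attack on any given day
secure_state = alert_threshold(1.0, 1e4, prior)    # confident of retaliation
insecure_state = alert_threshold(1.0, 1e7, prior)  # fears a disarming first strike
print(f"secure: {secure_state:.1f}, insecure: {insecure_state:.3f}")
# The insecure state alarms on evidence 1000x weaker -- more false
# alarms, and more temptation to automate the response to them.
```

The point of the sketch is only the direction of the effect: the higher a state prices a false negative, the hair-trigger its optimal alerting becomes, and the more attractive risky automation looks.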
Horowitz believes that incorporating artificial intelligence in nuclear weapons systems themselves poses mostly low-probability risks. In fact, what concerns him most is how AI in non-nuclear military systems could affect nuclear weapons policies. “The risk I worry most about is how conventional military applications of AI, by increasing the speed of war, could place pressure on the early warning and launch doctrines of nuclear weapons states that fear decapitation in conventional war,” Horowitz told the Bulletin.
Or, as the report puts it, AI-induced time pressure could lead to a chain of decision-making that, in the worst cases, could result in a country launching a pre-emptive nuclear attack. “Fear of losing quickly could create incentives for more rapid escalation to the nuclear level.”
The report predicts that there’s a pretty strong likelihood that more automation will “creep its way” into nuclear operations over time—especially as nations modernize their nuclear forces. The United States has already embarked on a multi-decade, trillion-dollar-plus plan to upgrade its nuclear forces; Russia and China are similarly modernizing theirs.
“What is interesting, though, is that both the United States and Russia—and the Soviet Union before that—have had elements of automation in their nuclear operations, early warning, command-and-control, and delivery systems for decades,” Scharre said. “So it is an issue worthy of deeper exploration.”
Maybe that’s even a bit of an understatement.
Featured image: A US Air Force commander simulates launching a nuclear weapon during a test. Nuclear command and control systems already incorporate various forms of automation. As countries build new systems, will they insert more artificial intelligence? Credit: US Air Force/Staff Sgt. Christopher Ruano.


