

ref:topbtw-1865.html/ 26 December 2019/A



HUMANITY.

Artificial Intelligence..

A breath away from our end..
Scientists warn AI control of nukes could lead to 'Terminator-style' war..

N.Y.

The world may be inching closer to an era where a Terminator-style apocalyptic nuclear war could be possible due to yielding control over nuclear weapons to artificial intelligence (AI), according to publications by nuclear scientists and defense experts.

While numerous AI experts have told the Jerusalem Post over the years that people worried about AI turning on humanity as in the famous "Terminator" movies simply misunderstand the technology, the likelihood of AI making a catastrophic mistake with nuclear weapons is no fairytale.

A recent article in the Bulletin of the Atomic Scientists, a top group of nuclear scientists, along with other recent publications by defense experts, says that Russia may already be integrating AI into a new nuclear torpedo it is developing, known as the Poseidon, in order to make it autonomous.

According to the Atomic Scientists report, the US and China are also considering injecting AI deeper into their nuclear weapons programs as they modernize and overhaul their nuclear inventories.

There have been no explicit reports of Israel integrating AI into what, according to foreign reports, is an arsenal of between 80 and 200 nuclear weapons.

But there have been reports of the IDF integrating AI into conventional weapons, such as its Spice bombs carried by F-16s.

Part of the concern in the report was that integrating AI into nuclear weapons systems could become culturally inevitable once conventional weapons become more dominated by AI.

The nuclear holocaust risks that scientists and experts are writing about come not from a hostile takeover by AI, but from AI being hacked, slipping out of control through a technical error, or badly misjudging a situation.

Such risks could be magnified by unmanned vehicles carrying nuclear weapons, with no one on board responsible for making the final decision to deploy them.

As a secondary but still serious risk, integrating AI into early-warning systems could overwhelm human decision-makers, making them quicker on the nuclear trigger and more likely to yield to the technology despite any doubts their own judgment might raise.

Some studies have shown that AI, and automated evidence in general, can reinforce bubble-style thinking and make it more difficult for analysts to entertain alternative narratives about what might be occurring in murky and high-stress situations.



An example the article gives of the importance of human judgment is a 1983 incident in which a Soviet officer named Stanislav Petrov disregarded automated audible and visual warnings that US nuclear missiles were inbound.

The systems were wrong, and had Petrov trusted the technology over his own instincts, the world might have gone to nuclear war over a technological malfunction.

The article also points out potentially valuable aspects of AI in the nuclear weapons arena, such as gathering more accurate and comprehensive data so that decision-makers are guessing in the dark less often.

In addition, AI can get such key information to decision-makers much faster, whereas in the past it might have been stuck in the collection process and failed to reach leaders before they had to make a decision.

Moreover, the article noted that AI has been integrated into aspects of countries' nuclear programs for some time.

Even in earlier decades of the Cold War, both the US and Russia had certain capabilities programmed into some nuclear weapons to be able to quickly switch to targeting each other, as opposed to landing harmlessly at sea, should certain scenarios occur.

Overall, the greatest concern about AI in nuclear weapons is with the weaker side in a potential standoff.

A country like China, with much more limited nuclear or conventional weapons capabilities, might seek to integrate AI into its nuclear weapons program with the hope of accelerating deployment speed so that the US would be unable to knock it out of a war with a preemptive "first strike."

Some analysts believe this might be a reason that Russia is entertaining AI in its nuclear program, while others view Moscow as wanting to speed up its nuclear weapons deployment in order to be more offensive-minded, and not merely in self-defense.

Although such abilities would seem to be far away from what Iran can achieve, Tehran has had sudden jumps in nuclear technology in the past when given assistance by Russia, China, North Korea or Pakistan.


( Gagrule )


