What killer robots mean for the future of war

You might have heard of killer robots, slaughterbots or terminators (officially called lethal autonomous weapons, or LAWs) from films and books. The idea of super-intelligent weapons running rampant is still science fiction. But as AI weapons become increasingly sophisticated, public concern is growing over the lack of accountability and the risk of technical failure.
We have already seen how supposedly neutral AI systems have produced sexist algorithms and inept content moderation, largely because their creators did not understand the technology. But in war, these kinds of misunderstandings could kill civilians or wreck negotiations.
For example, a target recognition algorithm could be trained to identify tanks from satellite imagery. But what if every image used to train the system featured soldiers in formation around the tank? The algorithm might learn to associate soldiers with targets, and so mistake a civilian vehicle passing through a military blockade, where soldiers are also present, for a target.
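To make this failure mode concrete, here is a minimal sketch of that kind of "shortcut learning" on toy data. Everything here is invented for illustration: the two hand-picked features and a simple scikit-learn classifier stand in for a real imagery-based target recognition system.

```python
# A toy demonstration of shortcut learning (all data invented).
# In the flawed training set, every "tank" example also has soldiers
# nearby, so the model can score well by keying on the soldiers
# rather than on the vehicle itself.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
is_tank = rng.integers(0, 2, n)  # ground-truth labels

# Genuine but noisy cue: a tank-like visual signature, with heavy
# overlap between the two classes.
visual_signature = rng.normal(loc=is_tank.astype(float), scale=1.5)
# Spurious cue: soldiers appear in every training image of a tank.
soldiers_nearby = is_tank.astype(float)

X = np.column_stack([visual_signature, soldiers_nearby])
model = LogisticRegression().fit(X, is_tank)

# Deployment: a civilian lorry (weak tank signature) passes a military
# blockade, so soldiers are present. The shortcut feature dominates.
lorry_at_blockade = np.array([[0.2, 1.0]])
print(model.predict(lorry_at_blockade))  # likely [1], i.e. "tank"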
Civilians in many countries (such as Vietnam, Afghanistan and Yemen) have suffered because of the way global superpowers build and use increasingly advanced weapons. Many people would argue these weapons have done more harm than good, pointing most recently to the Russian invasion of Ukraine in early 2022.
In the other camp are people who say a country must be able to defend itself, which means keeping pace with other nations’ military technology. AI can already outsmart humans at chess and poker, and it outperforms humans in the real world too: Microsoft claims its speech recognition software has an error rate of 1%, compared with a human error rate of around 6%. So it is hardly surprising that armies are slowly handing algorithms the reins.
But how do we avoid adding killer robots to the long list of things we wish we had never invented? First of all: know thy enemy.
The US Department of Defense defines an autonomous weapon system as: “A weapon system that, once activated, can select and engage targets without further intervention by a human operator.”
Many combat systems already fit this criterion. The computers on drones and modern missiles have algorithms that can detect targets and fire at them with far more precision than a human operator. Israel’s Iron Dome is one of several active defence systems that can engage targets without human supervision.
While it is designed for missile defence, the Iron Dome could kill people by accident. But the risk is considered acceptable in international politics because the Iron Dome has a reliable record of protecting civilian lives.
There are AI-enabled weapons designed to attack people too, from robot sentries to the loitering kamikaze drones used in the Ukraine war. LAWs are already here. So, if we want to influence how LAWs are used, we need to understand the history of modern weapons.
International agreements, such as the Geneva Conventions, establish standards of conduct for the treatment of prisoners of war and civilians during conflict. They are one of the few tools we have to control how wars are fought. Unfortunately, the use of chemical weapons by the US in Vietnam, and by the Soviet Union in Afghanistan, is proof that these measures aren’t always successful.
Worse is when key players refuse to sign up. The International Campaign to Ban Landmines (ICBL) has been lobbying politicians since 1992 to ban mines and cluster munitions (which randomly scatter small bombs over a wide area). In 1997, 122 countries signed the Ottawa treaty, which banned anti-personnel landmines (cluster munitions were banned separately, in 2008). But the US, China and Russia didn’t buy in.
Landmines have injured and killed at least 5,000 soldiers and civilians per year since 2015 and as many as 9,440 people in 2017. The Landmine and Cluster Munition Monitor 2022 report said:
Casualties…have been disturbingly high for the past seven years, following more than a decade of historic reductions. The year 2021 was no exception. This trend is largely the result of increased conflict and contamination by improvised mines observed since 2015. Civilians represented most of the victims recorded, half of whom were children.
Despite the best efforts of the ICBL, there is evidence that both Russia and Ukraine (a signatory to the Ottawa treaty) have used landmines during the Russian invasion of Ukraine. Ukraine has also relied on drones to guide artillery strikes and, more recently, for “kamikaze attacks” on Russian infrastructure.
But what about more advanced AI-enabled weapons? The Campaign to Stop Killer Robots lists nine key problems with LAWs, focusing on the lack of accountability and the inherent dehumanisation of killing that comes with it.
While this criticism is valid, a full ban of LAWs is unrealistic for two reasons. First, much like with mines, Pandora’s box has already been opened. Second, the lines between autonomous weapons, LAWs and killer robots are so blurred that it is difficult to distinguish between them. Military leaders would always be able to find a loophole in the wording of a ban and sneak killer robots into service as defensive autonomous weapons. They might even do so unknowingly.
We will almost certainly see more AI-enabled weapons in the future. But this doesn’t mean we have to look the other way. More specific and nuanced prohibitions would help keep our politicians, data scientists and engineers accountable.
For example, by banning:
black box AI: systems where the user has no information about the algorithm beyond its inputs and outputs (see the sketch after this list)
unreliable AI: systems that have been poorly tested (such as in the military blockade example mentioned previously).
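To illustrate what “black box” means in practice, here is a hypothetical sketch (the class and all its names are invented): the operator can query the system, but nothing about how it decides is exposed, so there is nothing to audit when it gets a judgement wrong.

```python
# A toy illustration of the "black box" problem (all names invented):
# inputs go in, a verdict comes out, and the internals stay hidden.
import numpy as np

class BlackBoxTargeting:
    """Stand-in for an opaque model: sensor features in, verdict out."""

    def __init__(self, seed: int = 0):
        # Internals the operator never sees (here, arbitrary toy weights).
        self._w = np.random.default_rng(seed).normal(size=4)

    def engage_recommended(self, sensor_features: np.ndarray) -> bool:
        # Only the final verdict is exposed: no score breakdown,
        # no feature attribution, no rationale.
        return bool(sensor_features @ self._w > 0)

system = BlackBoxTargeting()
frame = np.array([0.4, 1.2, -0.3, 0.8])  # invented sensor reading
print(system.engage_recommended(frame))  # a bare True/False, with no "why"
```

A ban framed this way targets the interface, not the algorithm: whatever the model is, an operator (or investigator) must be able to ask why a decision was made.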
And you don’t have to be an expert in AI to have a view on LAWs. Stay aware of new military AI developments. When you read or hear about AI being used in combat, ask yourself: is it justified? Is it preserving civilian life? If not, engage with the communities that are working to control these systems. Together, we stand a chance of preventing AI from doing more harm than good.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Jonathan Erskine receives funding from UKRI and Thales Training and Simulation Ltd.
Miranda Mowbray is affiliated with the University of Bristol, where she gives lectures on AI ethics in the UKRI-funded Centre for Doctoral Training in Interactive AI. She is a member of the advisory council for the Open Rights Group.