
Terrorists are great believers in technology and have a historical track record of embracing it.

Most recently, terror groups have begun employing drones and Artificial Intelligence (AI). Both can be harnessed to respond effectively to multiple security challenges, but they also hold the potential to unleash a dystopian future envisioned by apocalyptic books and films. This piece outlines the challenges posed by terrorists’ use of emerging technologies, in particular drones and AI, and offers some policy measures to counter this threat.

Unmanned Aerial Vehicles (UAVs or drones)

Uncrewed aerial systems (UAS) have been identified as one of the key terrorist threats by the United Nations (UN) Security Council Counter-Terrorism Committee. UAS are remotely piloted, pre-programmed, or controlled airborne vehicles. They are also referred to as unmanned aerial vehicles (UAVs), or more commonly drones.

First used during World War I and further developed during the Cold War, drones saw their use expand greatly after the 9/11 terror attacks and the subsequent “Global War on Terror”. Within the military, drones are used for surveillance, reconnaissance, targeting support and direct or indirect attacks. From 2001-2002, strikes in Afghanistan and Yemen marked the start of increasingly drone-oriented military operations. Drones are currently the “weapon of choice” for tracking and striking insurgents and terrorists. The current Russia-Ukraine conflict has been described as the “first full-scale drone war”.

Today, state and non-state actors possess the ability to acquire drones and can assemble and operate commercial off-the-shelf (COTS) drone technology. There are 113 states with a military drone programme, and conservative estimates maintain that 65 non-state actors are now able to deploy drones. The market for civilian mini and micro drones weighing between 200 g and 50 kg has multiplied; by 2024 the drone market could reach nearly USD 43 billion.

Threats Posed by UAS

UAS are attractive for terrorists because they are affordable and require minimal training. Terrorists have deployed drones to attack state military assets, diplomatic sites, international trade, energy infrastructure, and civilian centres. State sponsorship of terrorist groups has also helped increase the number of drone attacks.

Drone strikes are being deployed by armed non-state actors for reconnaissance, lethal attacks and targeted assassinations, within and outside zones of armed conflict, with devastating humanitarian consequences for affected civilian populations. While high-technology armed military drones (e.g. the MQ-9 Reaper and RQ-4 Global Hawk) remain largely inaccessible to non-state actors, the ability to weaponize civilian drone technology provides terrorists with a limited air-based military capacity.

Deployment and Access to UAVs

Weaponised UAS are increasing in range and precision, allowing rogue actors to strike targets further away. Drones can travel up to 1,500 km, making them ideal for attacks on military targets deep within state territory. Civilian infrastructure located far from conflict zones is also increasingly vulnerable. Since 2020, energy infrastructure, international shipping, international airports and capital cities have all been targeted by drones.

On the horizon is the growing issue of saturation drone strikes. Working in partnership with other, unarmed UAVs, weaponised drones can be used to pinpoint and destroy air defence systems, opening the gates for an incoming volley of rockets, missiles and other armed drones. By combining downloadable software with online tutorials, drone users can launch rudimentary ‘swarms’ in which between five and ten drones are linked to a single controlling device.

Governments worldwide will also face a significant national security threat from the adversarial use of small unmanned aircraft systems (sUAS). The technology to produce swarms represents a multi-layered and, as yet, unmanageable new threat. Current US DOD strategy includes some means of confronting drones; however, it has no strategy for countering future armed drone swarms. To thwart sUAS swarm attacks, militaries worldwide must confront and reassess their respective technical, legal and doctrinal issues.

Terrorists’ use of UAS is facilitated by: (1) the unregulated civilian market and the ability to acquire technology on the dark web; (2) the availability of unsecured explosives used as payloads; (3) easy access to explosive precursors; and (4) access to technical expertise via the internet and social media.

Non-State Actor Use of UAS

Multiple non-state actors have used drones in combat, including Boko Haram, Hamas, Hezbollah, Houthi rebels, and ISIL. Smaller drones are used for intelligence, surveillance, and electronic warfare measures, and assist in target acquisition to increase precision and lethality from ground-based systems. Drones are used as decoys to distract while strikes are directed elsewhere. Drones also perpetuate the psychological dimension of terrorism by spreading fear.

ISIL employs UAVs in conflict zones to drop small, grenade-sized bombs. ISIL has conducted drone attacks to kill enemy combatants in northern Iraq and formed an “Unmanned Aircraft of the Mujahedeen” unit. In 2017, ISIL pinned down Iraqi security forces for 24 hours in Syria by executing 70 drone missions. In 2019, Houthi rebels made headlines when their targeted drone attacks on Saudi oil refineries knocked out nearly 6% of the world’s oil supply.

Reports from the Monitoring Team (MT) maintained that in 2021, affiliates of ISIL and al Qaida demonstrated a growing UAS capability in parts of West and East Africa. Al-Shabaab, in East Africa, uses drones for reconnaissance and surveillance and could conceivably launch attacks on civil aviation.

Drones can be armed with chemical, biological, radiological and nuclear (CBRN) agents. In 2015, a drone carrying radioactive sand landed on the roof of the Japanese Prime Minister’s office. In 2019, France’s UCLAT warned of a possible terrorist attack on a football stadium using a drone carrying lethal chemicals. The Chinese-designed Blowfish drones can throw grenades, launch mortars and fire machine guns while an AI-driven system determines the target.

Targeted assassinations

Drones are being used for targeted assassinations, mostly by state actors but increasingly by terrorists and criminals. In August 2018, Venezuelan President Nicolás Maduro was the target of a failed assassination attempt. In 2021, Iraqi Prime Minister Mustafa Al-Kadhimi escaped an assassination attempt. Also in 2021, a Taliban drone unit assassinated Piram Qul, an ethnic Uzbek warlord, and claimed that the strike had helped win the war.

Policy Response to UAS

Building an international counter-UAS community

In 2017, the Security Council requested that action be taken to mitigate the threat of UAS falling into the hands of terrorists. Multiple organizations cooperate on countering UAS, including the UN Counter-Terrorism Executive Directorate (UNCTED), UNODC, UNIDIR, IOM, INTERPOL and the European Commission, and the GCTF has published the ‘Berlin Memorandum on Good Practices for Countering Terrorist Use of Unmanned Aerial Systems’. A joint project by the UN Office of Counter-Terrorism and CAR is underway to assess global trends.

In 2022, the European Commission finalized the European Drone Strategy 2.0, which establishes a framework for operating drones and setting their technical requirements. Academics are also working on a robust Drone Technology Control Regime (DTCR).

National approaches

National drone management structures will need to keep pace with commercial advances. Managing UAS requires a whole-of-government strategy covering the procurement and/or deployment of counter-UAS systems, as well as partnerships with the private sector, academia and civil society.

The chart below outlines the multidimensional responses to threats posed by misuse of UAS. It highlights how states can reliably prevent terrorists from acquiring and using UAS. This model shows that by implementing effective upstream measures, fewer downstream measures will be required. This is further reinforced by a critical feedback loop through which downstream measures inform strengthened upstream measures.

The ubiquity of UAS will lead to a third drone age featuring uncrewed aerial, ground and underwater vehicles and the risk of increased swarm attacks. New UAS technological innovations will require greater safeguards, including upholding international law and human rights and limiting proliferation.

Artificial Intelligence and ChatGPT

Artificial Intelligence (AI) will have a profound impact on security in the future. It works by mimicking human cognitive processes, including learning and decision-making, and can operate under uncertainty, making decisions and predictions based on large datasets and multiple conditions.

AI will allow adversaries to act quickly with micro-precision, but at macro-scale. AI will also enhance cyber-attacks and digital disinformation campaigns. Because AI is dual use, adversaries need no skills to construct AI themselves; they only need to manipulate existing AI systems. Such systems are already in development, for example the European Union’s project to use AI-driven drone swarms to protect its borders.

OpenAI has recently launched ChatGPT, a conversational AI model capable of specific tasks such as question-answering and text summarization. Launched in November 2022, it had been used by an estimated 100 million people by January 2023. ChatGPT carries with it potential risks in the areas of disinformation, cybercrime and terrorism. It will empower low-skilled hackers to transform basic phishing schemes into professional attacks. ChatGPT could write malware capable of mutating its appearance to dodge detection and help terrorists to “industrialize the creation and personalization of malicious web pages and social engineering reliant scams.”
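As a minimal sketch of the benign summarization capability mentioned above, the short example below prompts a chat model to condense a passage of text. It assumes the pre-1.0 “openai” Python package, the gpt-3.5-turbo model and an API key supplied via the OPENAI_API_KEY environment variable; it is an illustration of how such a model is typically called, not a description of any specific operational use.

```python
# Minimal sketch: prompting a large language model to summarize a passage of text.
# Assumptions: pre-1.0 "openai" package, gpt-3.5-turbo model, key in OPENAI_API_KEY.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def summarize(text: str) -> str:
    """Ask the model for a one-paragraph summary of the supplied text."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "Summarize the user's text in one short paragraph."},
            {"role": "user", "content": text},
        ],
        temperature=0.2,  # low temperature keeps the summary close to the source
    )
    return response.choices[0].message["content"]

if __name__ == "__main__":
    print(summarize("Uncrewed aerial systems are remotely piloted, pre-programmed "
                    "or controlled airborne vehicles, also referred to as drones."))
```

The same few lines, pointed at a different prompt, could just as easily draft persuasive text for a scam, which is why the barrier to misuse is so low.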

DeepMind is creating Artificial General Intelligence (AGI) to help build machines that can think, learn and be set to solve humanity’s greatest problems. However, its CEO Demis Hassabis warns that with AI’s promise also comes peril, stressing the importance of building in guardrails, especially given that AI is on the cusp of being able to make tools that could be deeply damaging to human civilization.

A legal and regulatory framework for AI and AGI is urgently required from both public and private institutions. The European Union’s AI Act can provide some guidance.

AI has the potential to be abused by terrorists to recruit, spread hatred and support their insurgencies. Technology has already been shown to have long-term societal implications. As Patrikarakos argued in his book War in 140 Characters, today a single person empowered by social media can change the course of both the physical battle and the discourse around it.

Drones and AI will continue to evolve. Therefore, it is imperative that we deepen our understanding of how terrorists are harnessing technologies to increase their power in both the physical and psychological domains. The focus should not only be on disruption but also prevention, and the responsibility to act must be shared across government agencies, academic institutions and technology companies. We find ourselves in the trenches of an ever-widening digital war that we have not yet mastered how to escape, while the terrorist digitals march ahead.

This excerpt is taken from ‘Terrorist Digitals: Preventing Terrorists from Using Emerging Technologies’ by Dr. Christina Schori-Liang, Head of Terrorism and PVE, Geneva Centre for Security Policy and Visiting Professor, Paris School of International Affairs, Sciences Po, originally published in the Global Terrorism Index 2023 report.


AUTHOR


Dr. Christina Schori Liang

Head of Terrorism and PVE, Geneva Centre for Security Policy
