Defending The Gig Economy Against API Attacks
DeepSeek, the Chinese Large Language Model (LLM), has exploded onto the AI scene, disrupting a still-nascent market. The open-source technology makes the blueprint for training its models freely available and delivers generative AI (GenAI) computing at a fraction of its rivals' cost, thanks to lower power consumption.
The net effect of those differences is that DeepSeek will democratise GenAI, making it much easier for organisations to harness and benefit from the technology.
One of the sectors expected to benefit significantly from more accessible and affordable GenAI is the gig economy. Renowned for its disruptive start-ups, the sector is defined by the UK government as involving “the exchange of labour for money between individuals or companies via digital platforms that actively facilitate matching between providers and customers, on a short-term and payment-by-task basis”. GenAI could help these businesses become more intuitive and efficient by matching freelancers and clients, interpreting feedback and translating it into action, and generating content, allowing companies to find more opportunities and deliver better results to clientele.
However, LLMs are not infallible. DeepSeek succumbed to a ‘large-scale malicious attack’ that forced the company to temporarily suspend new registrations at the end of January, although it recovered quickly. The incident has refocused attention on the technology’s susceptibility to attack, and it comes hard on the heels of numerous other stories concerning skewed results, hallucinations and data leakage, all of which could prove crippling for businesses that come to rely on it heavily.
Attack Paths
These issues have been documented by the OWASP industry group, which has a project specifically devoted to LLM security and has just updated its Top 10 for LLM Applications for 2025. The list covers the most critical exploits and vulnerabilities associated with the technology, including sensitive information disclosure, supply chain attacks and unbounded consumption.
Many of these issues involve the abuse of an integral component of how GenAI works: the Application Programming Interface (API).
APIs are essential for LLMs to connect to one another and access data in lightning-quick time. They’re also the reason we have a gig economy in the first place, as they facilitate the provision of real-time services and the processing of payments, as well as connecting all of the ecosystem players.
So GenAI needs APIs, and so does the gig economy, which means both should be prioritising API security.
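To make that dependency concrete, the sketch below shows roughly what one such interaction looks like: a gig platform calling an LLM over HTTP to turn a client's rough brief into a polished job posting. The endpoint URL, model name and payload shape are illustrative assumptions rather than any particular vendor's API; the point is simply that every one of these exchanges is an API call that needs protecting.

```python
# Minimal sketch: a gig platform enriching a job posting via an LLM's HTTP API.
# The endpoint URL, model name and payload shape are hypothetical, not any
# specific vendor's real interface.
import os
import requests

API_URL = "https://llm.example.com/v1/chat/completions"  # hypothetical endpoint
API_KEY = os.environ["LLM_API_KEY"]  # secret stays server-side, never in client code


def draft_job_posting(raw_brief: str) -> str:
    """Ask the model to turn a client's rough brief into a concise job posting."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "example-model",
            "messages": [
                {"role": "system", "content": "You write concise gig-work job postings."},
                {"role": "user", "content": raw_brief},
            ],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```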
In addition, we need to remember that threat actors will also be leveraging GenAI technology to orchestrate attacks. The technology can allow the attacker to ‘humanise’ their assault, making it much more difficult for the business to detect rogue activity.
Effects On Gig Businesses
Such attacks against gig economy APIs could have devastating consequences. If we consider ride sharing and delivery platforms, for example, these use APIs to facilitate real-time matching between drivers and customers. Attacks against these via GenAI could see the use of advanced scraping techniques to extract pricing data, or AI-powered bots to simulate customer requests, overwhelming the platform’s systems.
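At its simplest, spotting that kind of volumetric abuse starts with per-client telemetry on the API itself. The sketch below is a minimal, hypothetical rate check on a fare-quote endpoint; the window size, threshold and client identifier are assumptions, and real platforms layer far richer behavioural signals on top, but it shows the level at which the defence has to operate.

```python
# Minimal sketch of a server-side volumetric check on a fare-quote endpoint.
# The window length, threshold and client-identifier scheme are illustrative
# assumptions; production systems combine many more behavioural signals.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_QUOTES_PER_WINDOW = 30  # far more quotes than a genuine rider ever requests

_recent_requests: dict[str, deque] = defaultdict(deque)


def looks_like_scraping(client_id: str, now: float | None = None) -> bool:
    """Return True if this client is pulling fare quotes at bot-like volume."""
    now = time.time() if now is None else now
    window = _recent_requests[client_id]
    window.append(now)
    # Evict requests that have aged out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > MAX_QUOTES_PER_WINDOW
```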
Similarly, job marketplace platforms that seek to match recruiters with candidates could see AI used to generate fake jobs and manipulate proposals or to automate scraping of sensitive freelancer information, enabling competitors to undercut prices or steal business.
Online staffing agencies could see GenAI technologies used to automate job application fraud or even hijack worker accounts, submitting fraudulent claims for job completions or manipulating availability slots. Tutoring platforms could be subjected to fake tutoring sessions, see payment structures manipulated or refund systems abused, all through APIs that handle transactions, communications, and scheduling. And content creation platforms could see attackers create massive bot networks to siphon off ad revenue or manipulate engagement metrics like views or likes.
Such attacks could damage customer trust, cause loss of revenue and cede market share, so it’s vital that gig economy businesses adopt comprehensive API security strategies. They need to be able to combat scraping, prevent account takeover, mitigate payment fraud, block business logic abuse and protect against the creation of fraudulent postings and interactions, all of it powered by GenAI.
Countering Attacks
The problem today is that many are using traditional application defence solutions, which are the wrong tool for the job for three reasons.
- Firstly, they rely on embedding code into end-user applications and devices, which lets attackers reverse engineer them or bypass them altogether using AI-generated scripts that mimic human behaviour.
- Secondly, they are designed for end-user interactions, not API calls, so they will struggle to detect AI-automated bots performing scraping or volumetric attacks.
- Finally, these solutions are reactive, slow and ineffective at recognising the complex patterns and subtle behaviours that are the main giveaway of a GenAI-enabled attack.
Countering these attacks against gig economy APIs will therefore require a more comprehensive, API-specific approach. These businesses will need advanced bot management that uses machine learning to detect abnormal scraping patterns and block them in real time, particularly GenAI-driven attacks that mimic human behaviour. They’ll need to leverage entity behaviour analytics to recognise suspicious login attempts and stop account takeovers. And they’ll need to monitor payment behaviours for anomalies, using machine learning to detect and block fraudulent activity so that only genuine requests get through.
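As an illustration of what that behaviour-led detection involves, here is a minimal, hypothetical sketch using an off-the-shelf anomaly detector (scikit-learn's IsolationForest) over per-client API telemetry. The features, the synthetic baseline and the contamination setting are assumptions made for the example; commercial bot management works on far richer, streaming signals, but the principle of profiling normal behaviour and flagging deviations is the same.

```python
# Minimal sketch of ML-based behaviour analytics over per-client API telemetry.
# Feature choices, the synthetic baseline and thresholds are illustrative
# assumptions, not a production configuration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic baseline of ordinary clients.
# Columns: [requests per minute, distinct endpoints hit, failed-login ratio]
normal_traffic = np.column_stack([
    rng.normal(3.0, 1.0, 500),    # humans make a handful of calls per minute
    rng.integers(1, 6, 500),      # and touch only a few endpoints
    rng.uniform(0.0, 0.05, 500),  # with almost no failed logins
])

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_traffic)

suspects = np.array([
    [3.2, 4, 0.02],   # indistinguishable from a genuine user
    [55.0, 1, 0.00],  # metronomic hammering of a single pricing endpoint
    [6.0, 12, 0.60],  # failed logins across many endpoints: credential stuffing
])

for row, verdict in zip(suspects, detector.predict(suspects)):
    action = "block or challenge" if verdict == -1 else "allow"
    print(f"{action:>18}: {row.tolist()}")
```

The important design point, given the weaknesses listed above, is that the verdict is derived server-side from how a client behaves across many API calls, not from anything embedded in the client application itself.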
There’s little doubt that the GenAI market is moving fast and likely to act as a major enabler for the gig economy. But it also has the power to be hugely destructive when that technology is used for malicious purposes.
Gig economy businesses and their ecosystem of partners will need to reappraise their business models in the wake of these forces and grapple with the threat posed to their APIs; otherwise they run the risk of being ill-equipped to fight off such attacks.
James Sherlow is Systems Engineering Director, EMEA, Cequence Security