Evidentiary Safeguards in State Surveillance: A Comparative Study of Singapore and the United Kingdom
By Sri Ranjana
Category Award: Public & Administrative Law, UKSLSS Essay Competition 2025
The recent prominence of digital evidence and surveillance technologies has made itself felt in criminal justice systems worldwide. Countries have amended their legislation governing evidence to encompass digital evidence in the myriad forms it takes. In Singapore, digital evidence is governed by an ever-evolving legal framework, balancing technological efficiency against civil safeguards.
Evidentiary Shift:
The Singapore Evidence Act 1893 provides the primary foundation for the admissibility of electronic records as evidence in court proceedings. The presumption of authenticity is found in S.116A of the Act, which establishes a rebuttable presumption that electronic records produced by a device or process “ordinarily used” for such purposes are accurate, unless evidence is adduced to raise doubt. Following the Evidence (Amendment) Bill, the Ministry of Law stated that electronic evidence will be “subject to the same rules of admission as all other types of evidence....”. However, doubt has been raised as to whether the traditional rules of admissibility will apply equally well to electronic evidence [1].
Surveillance and AI Oversight:
The state’s surveillance capabilities have widened considerably through the integration of Artificial Intelligence (AI) and other technologies. The Singapore Police Force has increasingly adopted Facial Recognition Technology to identify perpetrators of all kinds of crimes [2]. Furthermore, the government has introduced the Home Team AI Movement, which will focus on training law enforcement officers to use AI for tasks such as interpreting security footage and monitoring inmate health. New legislation, such as the Online Safety (Relief and Accountability) Bill, provides that the state can demand user data from platforms to identify perpetrators, whilst also establishing a dedicated Online Safety Commission to be operational by mid-2026.
To mitigate the risk of abuse, the government and judiciary have established specific oversight mechanisms. Judges retain the power to exclude evidence where its prejudicial effect outweighs its probative value. The introduction of the Online Safety Commission in 2026 provides a specialised body to handle the implications of online harms, theoretically reducing the risk of over-policing by law enforcement. The legislation governing the Commission includes rights for affected parties to appeal government directions to an independent tribunal.
The Online Criminal Harms Act 2023 [3] provides that directions issued by the state can be appealed to a Reviewing Tribunal consisting of a District Judge or a Magistrate. The courts also retain the ultimate authority to determine the weight of digital evidence and to ensure that definitions regarding wrongful loss or gain, such as those found in the Personal Data Protection Act, are not circumvented in the evidentiary process.
Conversely, in the United Kingdom, the landscape surrounding state surveillance and evidentiary safeguards is defined by a tension between expansive new powers and a strained human rights infrastructure. The Investigatory Powers (Amendment) Act 2024 significantly widened the state’s net, introducing an expanded regime for “Bulk Personal Datasets”, defined as large collections of personal data about many individuals. Alongside this, the controversial “Notification Notices” represent a shift under which the Home Office can effectively veto private sector security updates if they would impede lawful access. This undermines private and encrypted communication as a traditional safeguard against state overreach.
The United Kingdom is also facing a “disclosure crisis”, reflected in the Data (Use and Access) Act 2025 [4] and the Disclosure in the Digital Age review [5], which advocates for the use of AI to assist in evidence review. While this enhances efficiency, dependence on algorithms to identify “unused material” introduces an opaque decision-making process whose outcomes cannot readily be justified by reference to the technology utilised. The system is currently grappling with the forensic unreliability of these tools, as seen in recent parliamentary inquiries into the digital forensic sector [6]. The House of Lords Science and Technology Committee [7] conducted a short inquiry into forensic science, following up on its 2019 inquiry. The inquiry concluded on 7 January 2026, when Ministers were slated to be questioned regarding the persistent issues within the forensic science sector [8].
Comparative Safeguards:
Both jurisdictions rely on specialised oversight to prevent abuse:
In the UK, S.23 of the Investigatory Powers Act requires that intrusive warrants be approved by both a Secretary of State and an independent Judicial Commissioner from the Investigatory Powers Commissioner’s Office.
Singapore utilises specialised tribunals and appeal panels, such as the one established under the Online Safety (Relief and Accountability) Bill. While efficient, these are often perceived as more deferential to national security concerns than challenges under the UK’s Human Rights Act 1998, which frequently reach the Court of Appeal.
An example of such a challenge is Bridges v South Wales Police [2020] EWCA Civ 1058, the UK’s first ruling on a police force’s use of Automated Facial Recognition Technology, which found the South Wales Police’s deployment unlawful for breaching the right to privacy under Article 8 of the European Convention on Human Rights [9]. The technology was part of a pilot project first implemented in 2017 to experiment with automated facial recognition [10]. The case suggests that such technology should be integrated into legal frameworks through a trust-based approach, with mechanisms created to address the implications of such technological adoption [11].
In conclusion, the legal landscapes of both jurisdictions demonstrate a fundamental alteration to the equilibrium between the state and the individual. In Singapore, the risk lies in the institutionalisation of technological trust through rebuttable presumptions, which threatens to erode the ‘presumption of innocence’ into a ‘presumption of technical infallibility’, where the output of computer or technical systems is taken to be accurate unless proven otherwise. In the UK, the expansion of investigatory powers and the state’s burgeoning “veto” over encryption security risk hollowing out traditional privacy rights. To ensure that these technologies do not permanently dismantle traditional safeguards, both jurisdictions must move beyond mere administrative oversight. The challenge is to ensure that the “black box” of technology remains transparent enough to be held to the same standard of accountability as the state power it serves. This challenge must be addressed by strengthening judicial and technical scrutiny, whilst ensuring that the pursuit of digital efficiency does not come at the cost of procedural justice for citizens.
References
[1] Wendy Low, ‘A Commentary on the Amendments to the Electronic Evidence Provisions in the Singapore Evidence Act’ (July 2012) Singapore Law Gazette https://v1.lawgazette.com.sg/2012-07/468.htm accessed 16 January 2026.
[2] Ministry of Home Affairs, ‘Whether Police Deploy Facial Recognition Technology to Identify Perpetrators of Petty Crimes’ (Written Reply to Parliamentary Question, 22 September 2025) https://www.mha.gov.sg/mediaroom/parliamentary/whether-police-deploy-facial-recognition-technology-to-identify-perpetrators-of-petty-crimes/ accessed 16 January 2026.
[3] Online Criminal Harms (Reviewing Tribunals) Rules 2024, S 43/2024.
[4] Data (Use and Access) Act 2025 (DUAA).
[5] Home Office, ‘Disclosure in the Digital Age: Independent Review of Disclosure and Fraud Offences (Accessible)’ (GOV.UK, 25 March 2025) https://www.gov.uk/government/publications/independent-review-of-disclosure-and-fraud-offences/disclosure-in-the-digital-age-independent-review-of-disclosure-and-fraud-offences-accessible
[6] Shrey Jhalani and others, ‘UK parliamentary inquiry reports in forensic science – Plus ça change?’ (2024) 9 Forensic Science International: Synergy 100549, ScienceDirect https://doi.org/10.1016/j.fsisyn.2024.100549 accessed 16 January 2026.
[7] House of Lords Science and Technology Committee, ‘Forensic science follow-up inquiry launched by Lords Science and Technology Committee’ (UK Parliament, 7 November 2025) https://committees.parliament.uk/committee/193/science-and-technology-committee/news/210119/forensic-science-followup-inquired-launched-by-lords-science-and-technology-committee/ accessed 16 January 2026.
[8] House of Lords Science and Technology Committee, ‘Ministers to face questions over persistent issues within the forensic science sector’ (UK Parliament, 6 January 2026) https://www.parliament.uk/business/lords/media-centre/house-of-lords-media-notices/2026/jan-2026/ministers-to-face-questions-over-persistent-issues-within-the-forensic-science-sector/ accessed 16 January 2026.
[9] Simmons & Simmons, ‘UK Court of Appeal finds facial recognition technology unlawful’ (Simmons & Simmons, 2 September 2020) https://www.simmons-simmons.com/en/publications/ckelg1z7p8kjt09008l0bet7c/uk-court-of-appeal-finds-facial-recognition-technology-unlawful accessed 16 January 2026.
[10] Barrie J Gordon, ‘Automated Facial Recognition in Law Enforcement: The Queen (On Application of Edward Bridges) v The Chief Constable of South Wales Police’ (2021) PER/PELJ 24 https://ssrn.com/abstract=3890653 accessed 16 January 2026.
[11] Gary Kok Yew Chan, ‘Towards a Calibrated Trust-Based Approach to the Use of Facial Recognition Technology’ (Singapore Management University, 2021) https://ink.library.smu.edu.sg/cgi/viewcontent.cgi?article=5429&context=sol_research accessed 16 January 2026.


