Scope: Technologies and actors that must be covered
The legal framework should be technology-neutral and future-proof. It should be based on purpose and function: covering tools used for a policing or law enforcement purpose which identify, classify, match, monitor, infer information about, and/or profile individuals or groups.
Scope should extend beyond police to all public bodies engaged in law enforcement (including local authorities, HMRC, DWP), and to private actors deploying such technologies for law enforcement purposes in quasi-public spaces (e.g. shops, parks, leisure centres). JUSTICE also urges consideration of what legal requirements would be beneficial to place on providers through law, to ensure rights protection and consistency, rather than relying entirely on contracting and procurement practices.
Rights, Thresholds, and Safeguards: How intrusions should be assessed
JUSTICE proposes a restrictive, exclusive statutory basis: these technologies may not be used for policing or law enforcement unless they meet the framework’s criteria. Statutory requirements and red lines should include:
(a) overall necessity and proportionality;
(b) purpose limited to preventing/detecting serious harm (which cannot include conduct merely deemed unwanted by local authorities and criminalised through orders such as Public Space Protection Orders; serious harm must also be clearly specified for high-risk uses, akin to the thresholds in the EU AI Act);
(c) reasonable suspicion tied to the specific deployment context (important for “stop and scan” uses like operator-initiated facial recognition, and for live deployment locations and watchlists); and
(d) scientific validity of methods (to exclude pseudo-scientific inferential tools).
Factors for assessing privacy and other rights and freedoms (expression, assembly, non-discrimination) should include quality of intelligence, availability of less intrusive alternatives, geographic and population coverage, impacts on children and vulnerable individuals, accuracy and privacy by design, exposure of protected groups, and cumulative societal effects over time.
Procedurally, the law should mandate Human Rights Impact Assessments, transparent publication duties (registers, deployment records, evaluations, demographic performance), monitoring and evaluation duties, researcher access, and graduated oversight proportionate to risk, with independent pre-authorisation for the highest-risk deployments (e.g. live biometrics).
Access to population-scale government image databases (passport/immigration/DVLA) should be exceptional, last-resort and independently authorised, given grave risks of mass retrospective surveillance.
Oversight: Powers and infrastructure
JUSTICE supports an independent oversight body setting codes of practice, technical/governance standards, bias and discrimination rules and testing regimes, performance benchmarks, data/interface standards, and monitoring and evaluation requirements.
The body should have the funding it needs and the power to hold deployers of these technologies effectively to account, including powers to conduct investigations, compel information, issue compliance notices, seek injunctions, and publish determinations and annual reports.
JUSTICE stresses the importance of public participation mechanisms to influence the future use and oversight of these technologies and ensure public trust.
Final recommendations
JUSTICE’s final recommendations to secure legitimacy and public trust in the consultation process:
(a) pause the announced nationwide roll-out of 40 LFR vans pending the consultation outcome;
(b) remove from the Crime and Policing Bill clauses that pre-empt the consultation (identity-concealment offence at protests; police access to DVLA facial images);
(c) commit to meaningful public engagement and pre-legislative scrutiny of any draft legislation.
Read JUSTICE's full consultation response here.