
The Information Landscape.

What is happening right now, what it means, and how AGENCY can help.

Latest
POLICY

What does new guidance in the UK say about screen time for children?

The Guardian·March 2026
NEWS

Starmer vows to ‘fight’ social media firms to protect children from addiction

The Guardian·March 2026
NEWS

The Latest AI Documentary Asks: Just How Scared Should We Be?

Wired·March 2026
POLICY

Fake and misleading reviews: 5 businesses under CMA investigation

GOV.UK·March 2026
POLICY

New screen time guidance for parents of under-5s

The UK government has issued its first national screen time guidance for parents of under-5s, developed with experts including the Children's Commissioner. It recommends no screen time for under-2s except shared bonding activities, a one-hour daily limit for 2-5 year olds, and screen-free mealtimes and bedtimes. For organisations, this underscores the need to promote digital wellbeing in family-facing services and to align employee and stakeholder practices with evidence-based habits that mitigate early-childhood developmental risks from excessive screen use. No immediate regulatory changes are outlined, but the guidance accompanies pilots of teen social media restrictions and a consultation closing 26 May 2026, which could inform future online safety policy.[2]

GOV.UK·March 2026
POLICY

Wikipedia cracks down on the use of AI in article writing

Wikipedia has implemented a new policy prohibiting editors from using large language models to generate or rewrite article content, with limited exceptions permitting AI assistance only for copyediting suggestions on editors' own work subject to human review.[1] This policy change, which received strong majority support (40-2) in a community vote, represents a significant tightening of previous guidance and reflects growing concerns within Wikipedia's volunteer editor community about AI-generated content quality and accuracy.[1] For organisations managing information integrity and digital resilience, this development signals the need for clear governance frameworks around AI use in content creation, as even crowdsourced platforms with distributed editorial oversight are establishing explicit restrictions to maintain source credibility and prevent AI systems from introducing unsupported claims.[1]

TechCrunch·March 2026
NEWS

A major hacking tool has leaked online, putting millions of iPhones at risk. Here’s what you need to know.

Advanced hacking toolkits Coruna and DarkSword, capable of exploiting iPhones and iPads running iOS 13 through 18.7 to steal sensitive data such as messages, location history, and cryptocurrency wallets, have leaked online via GitHub. DarkSword poses an immediate threat because of its ease of use by malicious actors, including state-sponsored groups from China and Russia. The leak exposes hundreds of millions of unpatched Apple devices to drive-by attacks via compromised websites, heightening risks for organisations that rely on mobile endpoints for communications and data handling, with the potential for data breaches and operational disruption. No specific new policy or regulatory implications are detailed, but the leaks underscore the need for accelerated patching, vulnerability disclosure, and scrutiny of commercial exploit sales by defence contractors amid ongoing use by surveillance vendors.

TechCrunch·March 2026
NEWS

I’m a young woman, and people keep telling me the internet has ruined my brain. Is this helpful? | Isabel Brooks

The Guardian·March 2026
NEWS

Meta and YouTube designed addictive products that harmed young people, jury finds

The Guardian·March 2026
NEWS

Big tech reckoning: Meta ordered to pay $375m in landmark case

The Guardian·March 2026
NEWS

UK iPhone users face over-18 age check to use services after update

Apple has rolled out mandatory age verification for UK iPhone and iPad users via iOS 26.4 and iPadOS 26.4. Adults must confirm they are 18+ using a credit card or ID scan to access certain services, while unverified users and minors face web content filters or family sharing restrictions.[1][2][3] This matters for organisations because it signals escalating device-level enforcement of child online safety amid government pressure and recent fines on Meta and Google; Ofcom has endorsed the move as a "real win for children and families", and it may raise compliance burdens for app developers and service providers.[1][2][3] Policy implications include alignment with existing UK laws on adult sites and anticipation of broader regulation, such as an under-16s social media ban under government consultation closing 26 May.[2][3]

The Guardian·March 2026
RESEARCH

Hundreds of UK teenagers to trial six-week social media curbs for major study

The Guardian·March 2026
NEWS

Fake image of Jewish charity ambulances on fire shared online

Four ambulances belonging to the Jewish charity Hatzola Northwest were set on fire in London's Golders Green on 23 March 2026, in an arson attack that police are investigating as an antisemitic hate crime, with the inquiry led by counter-terror officers and an Islamist group claiming responsibility via Telegram.[1][5][6] An AI-generated fake image of the burning ambulances has circulated on social media, amplifying misinformation amid a near-doubling of UK antisemitic incidents to 3,700 in 2025.[1] Organisations face heightened risks of disinformation exacerbating hate crimes and community tensions, underscoring the need for robust digital verification and potentially stronger regulatory measures on AI-generated content to safeguard emergency services and public safety.[1]

Full Fact·March 2026
RESEARCH

Video shows queue for fuel in Spain, not UK

Full Fact·March 2026
Newsletter

THE STAND-UP

A fortnightly summary of the information landscape: the research, the policy, the risks, and the opportunities. Plus AGENCY commentary, new resources, and upcoming events.