Aspirations and applications of AI in social housing

Dr Simon Williams is Managing Director of Service Insights Ltd and is an Associate Faculty at Leeds University Business School. Dr Nicky Shaw is a Senior Lecturer at Leeds University Business School, with research interests in operations management, performance measurement, and innovation. Dr Emma Forsgren is a Lecturer in Information Management at Leeds University Business School. Her research focuses on how digital technologies change the nature of work and new modes of organising in the digital era. Stephen Blundell is an Associate Consultant at Service Insights Ltd.


Artificial Intelligence (AI) technology is developing at a rapid pace, yet many service sectors and industries in the UK remain at a relatively early stage of AI adoption.

Our research (a partnership between Service Insights Ltd and Leeds University Business School, sponsored by technology company BCN), explores the relatively new phenomenon of AI in the English social housing sector, with the aim of better understanding how providers are applying AI in practice and how they aspire to use it in future. 

These are especially important considerations for social housing organisations because service quality, consistency, and the approach to decision making are factors which can have a profound impact upon quality of life.  

It is important to understand more about how AI is currently being deployed in practice, what social housing providers aspire to achieve through its use, and, more fundamentally, how AI may help the social housing sector achieve its core aims.  

Key findings 

Drawing on survey responses from 220 employees and 50 in-depth interviews across 10 housing associations, we found the following: 

The current state of AI adoption 

  • Many staff seem unclear about what types of AI they have in place in their organisation 

  • Whilst 22% report AI tools being made available to staff in specific roles, more (around 31% of staff) are actually using AI in practice 

  • Those currently using AI in their working roles are overwhelmingly positive about the benefits of the technology (93%), suggesting a strong desire for use once deployed 

  • There remains a smaller group of employees who do not want to use AI, citing a lack of trust in the technology, concerns over accuracy, the deskilling of employees, and a belief in protecting the value of human interaction.  

Aspirations and applications of AI 

Our findings suggest that current applications of AI in social housing are predominantly focussed on the use and adoption of Large Language Models (LLMs) such as OpenAI’s ChatGPT for tasks such as writing letters to tenants, generating meeting minutes and, on occasion, even generating AI policy. Aspirations for future AI use centre predominantly on predictive models for core services, such as identifying preventative maintenance schedules. 

AI is not always being implemented with the values of social housing providers in mind  

Our study suggests that core sector values and strategic priorities such as equality, diversity, and inclusion may risk being overlooked. The excitement associated with this new technology could divert attention away from the fulfilment of social housing providers’ core aims and purpose. For example, replacing in-person or phone conversations with automated chatbots or digital portals reduces the opportunity for human interaction, and digital-only services can disadvantage those without online access.  

Policy and strategy are not keeping pace with practice 

To date, the impetus for AI technology adoption seems often to have come from middle management and front-line operational staff. Our study indicates that the use of AI is not always being treated as an explicit strategic opportunity or choice, meaning that organisations may lack a clear policy and governance framework for AI adoption.  

We suggest that this position of ‘policy catch up’ will become a persistent theme over the next decade, as social housing tries to adapt to newly created technologies in various forms.  

Organisations need to facilitate safe experimentation and innovation 

The pace of change in AI is much more rapid than previous technologies, with AI capabilities evolving all the time. In addition, its ‘general purpose’ nature requires organisations to problematise effectively before evaluating how AI might be useful to them. We argue that social housing providers need to become much better at experimenting with new AI technologies by testing and investing (or disinvesting) more quickly than before, and ultimately evolving and adapting to see what best helps their organisation.  

A safe and ethical environment to enable innovation and experimentation is crucial to the beneficial adoption of AI in social housing. Furthermore, it is critical that tenant needs, expectations, and experience are kept at the centre of service design. 

The impact of AI on decision-making could be profound 

One of the starkest findings from the research interviews was the view that decision-making processes will always require human oversight. However, other findings from literature and desktop research bring this into question – when technology speeds up processes and AI capabilities become embedded in applications, this can lead to reduced scrutiny of automated decisions.  

The reality is that it is often difficult to spot nuanced changes in large volumes of information. There is a need to ensure that efficiency doesn’t come at the cost of accuracy and fairness. We argue that there is an urgent need for debate around this topic. 

The use of AI may compromise critical thinking capabilities 

Our research suggests that the use of AI may act to reduce or limit critical thinking within service operations with continued use and increased familiarity. How this impacts service experience for tenants will emerge over time. We surface concerns about the ‘black box’ nature of some AI deployments, where the reasoning behind a decision or process outcome cannot easily be scrutinised, either at the point of use or in retrospect. This suggests that the use of AI is likely to have some unexpected and potentially undesirable consequences, such as maladministration adversely affecting outcomes for tenants following a prediction of rent arrears. 

Quality of data will influence the quality of AI 

In keeping with our 2024 research study exploring ‘Data Challenges in English Social Housing’, our 2025 research on AI finds a strong link between perceptions of data quality and the anticipated efficacy of AI deployment.  

A significant gap remains for social housing in ensuring that the quality and consistency of data are sufficient to enable accurate and reliable outputs from AI, so that the full capability and value of these technologies can be harnessed. Simply put – AI is only as good as the data it uses. Consideration needs to be given to the adoption of sector-wide data standards, and a data maturity model (i.e. a framework that helps organisations assess and improve how well they manage and use data). 

In summary, AI has clear potential to support and transform the work of social housing providers, but its adoption raises critical questions about reliability, governance, and tenant experience. To realise the benefits while avoiding unintended consequences, the sector must move forward with caution, curiosity, and a firm commitment to keeping people - not just technology - at the heart of housing services. 

Read the report. 

Contact us

If you would like to get in touch regarding any of these blog entries, please contact: research.lubs@leeds.ac.uk

You can repost this blog article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International licence.

The views expressed in this article are those of the authors and may not reflect the views of Leeds University Business School or the University of Leeds.