In an adapted version of his Housing Conference 2016 presentation on housing and data, our Chief Executive Matt Leach tries to pin down why the housing sector is so indifferent to the power of big data.
Whilst much of the chatter at last week’s CIH conference in Manchester was overshadowed by Brexit and its fallout, there was also a massive focus on business transformation – how housing providers can get their businesses ready for what is going to be a hugely challenging decade ahead.
Part of that is about getting our values base right as a sector, something HACT has always had at the heart of its work – and on Thursday we launched our Social Value Procurement Toolkit, enabling housing providers to more effectively manage the social value they generate from their procurement activities.
But we also need to fix our systems, and I was lucky enough to have the opportunity at Manchester to speak on the role of data in housing businesses. This blog is an adapted version of my presentation.
It being a late afternoon presentation, I got my heavy biblical analogy out of the way early on – the surprising extent to which housing provider IT systems appear to have been prefigured in the Old Testament Tower of Babel story…
Mankind once had a single language, grew more and more confident in its abilities and powers, and built the tower. God grew concerned that with one language and a culture of cooperation there might be no limit to what mankind could achieve, so he tore down the tower and scattered mankind in tribes across the globe, all speaking different languages.
And whilst this might also be a useful way of viewing the aftermath of Brexit (in the unlikely event you attribute divine powers to Boris Johnson and Nigel Farage) it is an even better one for the state of housing IT.
Fragmented systems, poor quality data, an inability to integrate information, and all of it seemingly designed to prevent housing providers reaching their full potential.
And these fall into three main areas:
- the problems with housing data
- the costs and risks falling onto our businesses and the opportunities we are missing as a result
- and the ways in which we might start to put that right over time.
But first of all, it's important when thinking about problems to be honest about your own failures. Big personal failures.
About three years ago, everyone was talking about big data. It was at the very top of the Gartner Hype Cycle. The private sector was starting to invest heavily in data analytics and claiming to see huge improvements in performance and profitability. Big data was the future. The new oil.
HACT was approached by Microsoft, who had just launched a sector leading suite of cloud and data analytic products. Housing providers had data. They were looking for ways to demonstrate the value of their new technology and social housing seemed to provide a great space to achieve that.
We jointly brainstormed a project. Why don't we take data from loads of housing providers, clean it up, map it, analyse it, and develop some great insight? Kind of like a turbo-charged precursor to HCA regression analysis.
And with those noble intentions we embarked on two and a half years of absolute misery.
Initially it all went well. We built – with Microsoft's help – a phenomenal technical infrastructure for collecting data, anonymizing it, encrypting it and generating insights from it.
We got the Information Commissioner's Office – after much effort – to sign off what would have been one of the first ever exercises in cross-sector data sharing.
We recruited housing providers representing five hundred thousand homes to share their data. They were very excited. We were excited. It was brilliant.
Then we hit problems.
Fairly straightforward data outputs from quite well organised housing providers turned out to be largely unusable. Where they weren't, we found that despite housing providers doing largely the same things in the same ways, the data they used to record them was barely comparable. And even when it was comparable, fairly simple records such as accurate rent and tenant profile data could take months or more to extract.
When we started the project, Microsoft assured us their phenomenal machines could deal with and make sense out of any sort of data. Structured. Semi-structured. Unstructured.
We – and they – learnt the hard way that there was in fact a fourth class of data previously unrecognized by the textbooks: housing data. And that it was hard to work with to the point of unintelligibility.
We did learn a few bits and pieces. Like the absence of any link between house age, household age and repairs demand. We also learnt that a lot of housing providers have tenants recorded as zero years old, because nobody bothered to fill in the field when they entered the property…
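The kind of basic sanity check that would have caught those zero-year-old tenants is simple to express in code. A minimal sketch in Python – with entirely hypothetical record and field names, since every housing management system export looks different – might be:

```python
from datetime import date

# Hypothetical tenant records as they might arrive from a housing
# management system export; field names here are illustrative only.
tenants = [
    {"tenant_id": "T001", "date_of_birth": date(1972, 4, 1)},
    {"tenant_id": "T002", "date_of_birth": date.today()},  # field never filled in
]

def flag_suspect_ages(records, min_age=16, max_age=110):
    """Return the IDs of records whose implied age is implausible."""
    today = date.today()
    suspect = []
    for r in records:
        age = (today - r["date_of_birth"]).days // 365
        if not (min_age <= age <= max_age):
            suspect.append(r["tenant_id"])
    return suspect

print(flag_suspect_ages(tenants))  # the zero-year-old tenant is flagged
```

A few lines like these, run routinely against core fields, is the sort of hygiene most other data-reliant sectors take for granted.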
Of course the problems we face are common to the sector as a whole. Unusable data. Poorly maintained and expensive to make sense of…
For a sector struggling with political turmoil, financial pressures and the return of regulation, what is astonishing is the extent to which data too often goes unexploited – or is incapable of being used – as a resource to improve provider businesses.
Housing data is a mess. Data is spread between multiple systems. Integration is rarely easy or cheap. Too often businesses can't get the information they want, or can't get it as fast as they want it.
It imposes costs on businesses that wouldn't be tolerated in any other sector. Potentially millions.
Last week I spoke to a housing provider looking to bring together 28 items of data on tenants from a number of different systems to put on their CRM to look at arrears risk. It proved impossible to do in any straightforward way.
They asked their housing management system provider to help out with cleaning it up and automatically downloading it, and were – initially – quoted a six-figure sum to put the necessary web services in place.
I spoke to another housing provider today – and not a small one – about how they had abandoned an effort to get relatively simple data together for the Experian Rental Exchange project (aimed at credit scoring residents) because the costs to produce it were so astronomical.
I spoke to another, frustrated that while most other businesses can integrate the cloud-based web services becoming ubiquitous elsewhere, in housing the lack of any clear data standard, and the penal costs imposed by housing management system providers for access to functional APIs (the bits of code that enable systems to talk to one another), act as a major block to innovation.
And last year, I met a number of housing providers spending £10m-plus on system integration exercises that would have cost a fraction of that had they started with systems and data that were consistently described across their businesses.
But the sector – fragmented as it is between multiple technology providers and data definitions, whilst largely all doing the same things – seems powerless to respond. At a strategic level, whilst other businesses are – finally – making the leap to a world of customized services, sophisticated risk mapping and customer analysis, too many housing providers just don't see it as a priority.
The truth is that proactive and positive use of data in the sector is nowhere near as good as it should be. The reliance on aggregated cost and performance data to provide VFM benchmarks is indicative of a sector that has little real interest in generating insight or value from its own data. Indeed, it's slightly shocking that it took an organisation like the HCA to lead the way with the introduction of basic regression analysis (only one of many ways of making sense of data). Why weren't we doing it for ourselves?
So what could we do if housing data got fixed?
We might be able to bring together data on tenants or assets quickly and easily and use it to drive decisions in our businesses at a time when doing that right is more important than ever.
We might be freed to move between technology platforms without the barriers and costs imposed by decades of broken data and systems incompatibility. Or integrate new systems and services on a lower cost basis. That might promote the emergence of a more diverse market in housing tech, and new and innovative insight and services.
We might be in a position to start to share data with others. It is interesting that no housing providers appear to be sitting around the table of GM Connect – the massive attempt to link up data across Manchester public services. My guess is that this is in part because the state of housing data is such that they would have little or nothing to talk about.
We might be able to generate our own performance insight with partners without the need for intermediaries to expensively collect and clean data and tell us what it means.
It might open the door to the internet of things. Connected homes are a huge potential new source of data and functionality. But live streaming data only makes sense in the context of decent quality static contextual data. And we don't really have that now.
Most of all, we might start to get a handle on driving value across our businesses in ways that haven’t been possible before – or not without huge effort.
HACT has been working for over a year with a number of housing providers on the link between community investment and bottom-line performance.
We found that – for one provider – there was a 25% difference (£750 versus £1,000) in reactive repairs costs, depending on whether there was an unemployed person in the house.
With another we found that there was no link between their financial inclusion activity and rent arrears.
With another, it appears that there may be a 28% difference in arrears outcomes depending on which of two different types of letter is used.
All of these required months of work on data collecting, data cleansing, data mapping. It’s the sort of thing that businesses ought to be able to do quickly and for themselves.
So can it be fixed? The answer is: I don't know. Particularly when there is so much else happening across the sector. But arguably, unless it's fixed, a lot of other issues will become a lot more difficult to solve.
We would undoubtedly benefit from some basic cross-sector data standards – not as a top-down imposition, but as something that could in time be migrated to.
It has been done in the Netherlands, where the CORA sector-wide standard reference architecture has been signed up to by most of the top 150 housing associations and all of the tech providers. It has taken five years and is still a work in progress. But they've made a start.
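To make the idea concrete: a data standard of this kind is essentially an agreed common shape for core records, which each provider's export is mapped into once, rather than afresh for every system-to-system integration. A minimal sketch in Python – with invented field names standing in for any real standard – shows the principle:

```python
# Two providers export the "same" rent record under different field
# names and units - the everyday reality described above. All field
# names here are invented for illustration.
provider_a = {"ref": "A-123", "weekly_rent_gbp": 95.50}
provider_b = {"tenancy_id": "B-456", "rent_pcm": 413.83}

# An agreed standard defines one canonical record shape, plus a
# per-provider mapping written once rather than once per integration.
def to_standard_a(rec):
    return {"tenancy_ref": rec["ref"], "weekly_rent": rec["weekly_rent_gbp"]}

def to_standard_b(rec):
    # Convert a calendar-month figure to a weekly one (12 months / 52 weeks).
    return {"tenancy_ref": rec["tenancy_id"],
            "weekly_rent": round(rec["rent_pcm"] * 12 / 52, 2)}

# Once mapped, records from different providers are directly comparable.
records = [to_standard_a(provider_a), to_standard_b(provider_b)]
print(records)
```

The point is not the code but the economics: with N providers and M systems, a shared standard replaces N×M bespoke mappings with N simple ones.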
But equally, we need effective digital leadership inside housing providers. At a time when the biggest change that will take place in housing providers will be technological, there are still precious few board-level CDOs, CIOs or CTOs.
And non-executive boards are almost entirely bereft of anyone with modern or business-relevant digital skills, with data seen as a “problem” and tech as a cost, not a driver of business transformation.
That attitude still dominates across much of the sector. When I spoke about all of this to one of the major housing tech providers this afternoon in the exhibition hall, he said: “Matt, it's all true. Data in housing is terrible. We could sort it collectively. But the truth is there is zero market demand to sort it out. No one is interested.”
It's astonishing, really.
We’ve had an ever increasing number of Housing Goes Digital conferences over recent years. Usually with Nick Atkin banging the drum for cool digital stuff. But housing hasn’t yet gone digital. We’re at best still managing the digitized versions of the paper ledgers of the 1970s, with similar issues making sense of the numbers they contain. In too many businesses, we haven’t even sorted out basic stuff like data integration and data quality.
If we want a positive, practical distraction from the challenges of Brexit, governmental collapse, housing crisis and rising social disaffection splitting the nation, perhaps this might be something for us all to take an interest in.