HR + IT = HRT: Algebra or Alchemy?

HR Magazine has been running an HR Technology series of special articles, of which the first – David Woods’ Should HR and IT be friends? – caught my eye, not least as the title was couched as a question (thereby quite firmly implying that these operational functions are either distant acquaintances or at daggers drawn). I had a vision of the organisation remodelled along the lines of the Big Brother house, with IT sulking in the smoking area and complaining that no-one understands them while HR lurks in the kitchen, asking everyone else why IT is so off-hand with them.

You can quite see it as a relationship that was probably never written in the stars, although the caricature of IT as all data crunching, clouds and code offended me less than the one of HR as ‘all about soft skills’. Some of the HR functions I’ve encountered in my time could teach their IT colleagues a thing or two about obsessive fascinations with checklists, regulations and permissions. I couldn’t help but feel that the problem was being painted in terms of IT imposing an ill-fitting rigid structure over the squishiness of human resources (or at least the ones being managed rather than doing the managing). When it comes to imposing a model of reality regardless of the closeness of fit, some of my former HR colleagues needed no additional lessons. If there is any tarring with stereotypical brushes to be done, both sides have clearly already had several primer coats applied in preparation.

The voice in the article series that I wanted to amplify was that of Kevin Streater, the Open University’s head of IT Industry Engagement, as it seemed to me to identify the crucial first issue:

“I work in an environment with HR. It uses a different set of language [from IT]. IT will know how technology is structured and the processes – these terms will often not make sense to HR.

IT professionals are experienced in developing technology. They just need a ‘currency converter’, so HR and other departments can understand the value of their resource. This development for IT will allow them to have better recognition in business.

HR and IT have to manage technological change together. They have to come to the table together with their respective expertise – look at what needs to be achieved from both sides, then work on their own implementations.”

Two functions that each fundamentally exist to support the business (unless we’re talking software companies or HR consultancies) need to find channels for dialogue that enable them to understand how to work together towards what should be their shared aim: driving their sponsoring businesses forward. Unwieldy, unhelpful or dangerously misleading systems are banes of organisational life, and can be largely attributed to avoidable causes:

  • Systems that were inadequately specified, reviewed or understood before they were implemented
  • Imposing a solution because it was either feasible or available rather than actually desirable
  • Implementation of systems without due regard to the use and value of the information that will come out of them (most organisations manage to be good at identifying the information they can put in, but ‘garbage in, garbage out’ isn’t a cliché of IT just because it’s witty.)

These are failings that speak of a lack of dialogue and mutual understanding of what it was hoped would be achieved. (They also speak of inadequate budgeting, which may in turn speak of a failure to make a strong enough business case for a better solution: both HR and IT need to fluently master the language of the budget holder as well as each other.)

In the third of the articles in HR Magazine’s series, Transforming data into business intelligence remains a stumbling block for HR technology strategy, David Wilkins, Taleo’s vice president of research, makes a point that illustrates at least two dilemmas I’ve seen all too often:

“When asked about performance and compensation metrics, top performer retention was considered the most important issue for HR pros, but top performer flight risk and top performers without a career plan were the two worst performing metrics, in terms of reliable access. The metric HRDs have the best access to is average performance review scores – which they consider to be the least important.”

The first is that you can’t expect a system to produce meaningful ‘information’ (scare quotes very intentional) without access to data that would support its calculations. The second is that some data is elusive. How exactly do you measure ‘top performer flight risk’? You can calculate average churn rates among people flagged as ‘top performers’ (if the system has such a flag, and even then the methods and reasons for triggering its application merit some substantial unpacking), but that shows you a pattern that may not be meaningful. There’s almost a ‘faith in the tea leaves’ moment here: a pattern may be created, but we might be finding meaning because we’re so eager to look for it.
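
Purely as an illustration – the ‘top_performer’ flag, field names and records below are all invented for the example – the arithmetic of such a churn rate is trivial; it’s everything around it that isn’t:

    # Illustrative sketch only: churn rate among employees flagged as 'top
    # performers'. How and why that flag gets applied is exactly the part
    # that merits unpacking -- the calculation itself can't tell you.

    def churn_rate(employees, flag="top_performer"):
        """Fraction of flagged employees who left in the period."""
        flagged = [e for e in employees if e.get(flag)]
        if not flagged:
            return None  # no flagged population, no meaningful rate
        leavers = [e for e in flagged if e.get("left_in_period")]
        return len(leavers) / len(flagged)

    staff = [
        {"name": "A", "top_performer": True,  "left_in_period": True},
        {"name": "B", "top_performer": True,  "left_in_period": False},
        {"name": "C", "top_performer": False, "left_in_period": True},
    ]
    print(churn_rate(staff))  # 0.5 -- a pattern, not necessarily a meaning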

Information, like tea leaves, can support a wide range of interpretations and readings. An IT department whose input to HR runs along the lines of ‘if we multiply field A by field C and deduct a weighted calculation of the values in fields L – P, we can then populate field G with the outcome’ can build a functioning system with data integrity, but if G isn’t really (A x C) – ((L + M + N + O + P)/n) then all that integrity counts for little. Indeed, there are two separate dilemmas here: first, that the system will be devalued by HR for containing meaningless calculations; second – and far more dangerously – that HR will take G seriously and base activities and decisions on it. Data integrity is not the same as information integrity: the total output by a system from its calculation of its inputs may be mathematically unimpeachable but have no ‘meaning’ in the sense that a human being would derive. And let’s not even start on wisdom integrity …
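
To make the distinction concrete, here is a minimal sketch using the invented fields above: the calculation has perfect data integrity, and nothing in it can tell you whether G means anything:

    # Data integrity vs information integrity, with the invented fields above.
    # The arithmetic is unimpeachable; whether G *means* anything depends
    # entirely on whether the business quantity really is (A x C) minus the
    # mean of L..P -- and no validation of the inputs can check that.

    def populate_g(record):
        """(A x C) minus the unweighted mean of L..P -- arithmetic only."""
        weights = [record[k] for k in ("L", "M", "N", "O", "P")]
        return record["A"] * record["C"] - sum(weights) / len(weights)

    row = {"A": 12.0, "C": 3.0, "L": 4, "M": 6, "N": 2, "O": 8, "P": 5}
    print(populate_g(row))  # 31.0: mathematically sound, semantically unverified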

In their discussion of IT plans, HR and IT would, I would hope, address this important issue: that the outputs of systems can be seen either as decisions (where it is entirely safe to do so – “income minus expenditure = overdrawn” is the kind of maths where the outcome requires action) or as suggestions or flags. Early in the life of ASK, we worked to provide support around the introduction of a lending decision analysis support system. Throughout the training, the emphasis was firmly that the system produced not answers but prompts to the human lender to consider different elements (succession planning, sustainable growth rates and so on) that needed to be factored into the lending decision. The system could indicate potential concerns, but it could not – in itself – explain them.
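
A deliberately simplified sketch of that design principle – not the actual system we supported, and the thresholds and field names here are invented – makes the contract explicit: the function returns prompts for a human to weigh, never an approve/decline answer:

    # Illustrative sketch: a system that raises flags rather than deciding.
    # Thresholds and field names are hypothetical; the point is the return
    # type -- a list of prompts for a human lender, not an answer.

    def lending_prompts(application):
        """Return prompts for a human lender, never an approve/decline."""
        concerns = []
        if application.get("sustainable_growth_rate", 0) < 0.02:
            concerns.append("Growth rate may be unsustainable - review projections")
        if not application.get("succession_plan"):
            concerns.append("No succession plan on file - discuss with applicant")
        return concerns  # the lender weighs these; the system cannot explain them

    for prompt in lending_prompts({"sustainable_growth_rate": 0.01}):
        print(prompt)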

My own career has also involved examination board meetings in universities. Each essay and exam question has been marked and the course regulations and marking scheme applied to give an initial set of figures. Universities could simply feed the outcome into the student records system, press ‘Print’ and watch the certificates roll off the printer. But they don’t. Academics both internal and external to the institution (the latter also employed as ‘blind markers’ of exam papers) review any students on the borderline of grades and may elect to move up students whose performance is felt (probably the best verb for the process here) to merit the higher grade, and also review any students where health or personal issues may have had an adverse impact on part or all of the course duration. Unless HE has moved on a long way since I left, there is no ‘a sprained ankle is worth 2 marks, a hernia 4, a divorce 10 …’ tariff. The decision is based on impressions of the impact of the event on the individual student (and in some cases discussion with them). There’s a recognition of a human issue that raw marks or weighting factors can’t necessarily accurately convey. They might be able to answer ‘What?’, but they can’t – although we may be strongly tempted to believe that they can – tell us why.

It’s a point made by David Wilkins in his own company’s blog, where he talks about “a strange disconnect between the subjective assessment of value and the objective ability to measure value”. Writing on the topic of company intelligence relating to talent, he observed that:

“The only two conclusions we could reasonably draw were that 1) respondents were lying about what they really think is important (perhaps to satisfy some sense of what they *should* value more highly) or 2) they are abjectly failing in their efforts to align talent data strategies with their ability to access such data. The reality is probably somewhere in the middle – we know we should aspire to track more meaningful data, but it’s a giant hairball of a challenge requiring expertise in areas where HR practitioners are typically weakest (analytics, technical integrations, data scrubbing, statistics, number crunching etc…) and where the current norm of silo’ed HR systems presents additional barriers.”

I’d throw in a third conclusion: that some data – raw data to support accurate estimation of ROI on training springs to mind – is difficult to identify and capture. The giant hairball of that challenge is not just one where HR practitioners are weak in the technical aspects of both IT and statistics, but also one where IT are likely to be weak in understanding the complex and potentially subjective nature of the input data.

As with so many things technical, I’d encourage any organisation not to under-estimate the time required for planning and assessing future system builds: the parties involved (and HR and IT should invite some other new friends, including a statistician and the people who will be informed by the new system’s outputs) should not attempt to answer ‘How?’ or ‘When?’ (or, indeed, ‘How Much?’) until they’re convinced they’re entirely clear about ‘Why?’ and ‘What?’. Positivity is a great thing, but resist the temptation to say ‘Sure, we can do that’ until everyone’s happy that whatever it might be is something that they should be doing. (Another old cliché deserves a mention here: people may join organisations, but I can name a few who’ve left not so much a manager as a system. Actions can have unexpected consequences.)

Yes, the way forward is dialogue. But the kind of dialogue matters. I’ve only hinted at a few of the topics I’d hope HR and IT would discuss here, and I hope that David Wilkins would agree. Because I can’t help but notice his hopes for future talent management systems rest partly on his company’s previous success with systems for sales management:

“It’s easy to get discouraged about the prospects of addressing this gap given the clear disconnect between current skills and required skills, but it wasn’t so long ago that marketing and sales were in similar positions. Sales people are by nature driven by interpersonal dynamics as well. Yet any reasonably professional sales team is driven today by deep sales analytics that measure pipeline to close ratios, sales funnel progression, anomalies in sales stage drop-off rates between teams, and many other critical metrics. Marketing has made a similar transition. Ten years ago, most marketing teams would have been hard-pressed to measure any business impact. They were free-wheeling creative types who were hip to the latest memes on the internet but couldn’t spell ROI if you asked them to. Today, marketing teams measure ROI on individual campaigns, specific events, the number of touches before a lead converts to a suspect, and even the value of social interactions in driving awareness and shaping market perception.”

All very true, but I’d venture the traditional failings in marketing and sales have been rather more along the lines of being aware of the data and information it would be valuable to have but lacking the technology and the systems to capture it. Or in Donald Rumsfeld terms, there were known unknowns that are now known knowns. (I’ve always known what ROI was; I just had no means of knowing if I was generating any. Mind you, on the bright side, I’m still hip to the latest web memes, I just have less time to keep myself up-to-date as I’m capturing and analysing all this blooming data …)

I’m not yet so convinced that the HR challenge is quite the same. The dynamics of individuals and their actions within an organisation are rather less cut and dried than ‘how many widgets did we sell last week, and how many of these were to people who saw the red version of the ad?’ The missing role feels like Research Methodologist – someone who can ask whether the intended raw data can be manipulated to provide meaningful proof. And that may be a new language for both IT and HR to learn.

