Is crime analysis at the heart of policing practice? A case study

ABSTRACT Following the introduction of the National Intelligence Model (NIM) in 2004, this paper explores whether crime analysis has been integrated into policing practice. Fieldwork was conducted in one UK police force with both analysts and police officers. Findings from the analysis of semi-structured interviews and focus group discussions suggest that while crime analysis is acknowledged as central to the business of everyday policing, police officers' general lack of understanding of how analysts work and of their capabilities leads to underutilisation of their skills. The research uncovered knowledge and process gaps, deepened by cultural constraints, budget cuts and resource reallocations, which inhibited incorporation of analysis into the heart of policing practice. Findings also indicated that analysts lack the resources, time and, sometimes, motivation to undertake sophisticated analysis, and often feel frustrated when officers' opinions override analytical wisdom. The paper argues that, at least in the UK, better training and cultural change are necessary for creative utilisation of analytical resources and for bridging the knowledge and process gaps in the organisation.


Introduction
Crime has been declining for the past 20 years in several Western nations (Van Dijk et al. 2012, Tonry 2014), with various explanations offered to account for it (see e.g. Levitt and Dubner 2005, Zimring 2007, Barker 2010, Farrell et al. 2014, Weisburd et al. 2014). Nevertheless, new and complex challenges in the form of terrorism, organised crime, cybercrime and internet-facilitated crimes have placed increased demands on finite (and often shrinking) police resources. Few police services are able to allocate resources to combat these myriad problems to the satisfaction of all (Ratcliffe 2004). The role of intelligence and strategic analysis therefore becomes central to most modern democratic police services, especially in an era of increasing accountability to the media, politicians and communities. It offers the hope that scientific analysis can provide an objective 'rhetoric of rationality' (Ericson and Shearing 1986) for resource allocation in the face of competing and multiple demands. Intelligence and strategic analysis therefore are, or theoretically should be, at the heart of modern policing (Innes et al. 2005). The aim of this research was to take stock of the role of analysis in policing as it currently exists in one police force in the UK, from the perspective of analysts, police officers and supervisors, using semi-structured interviews and focus group discussions. With some exceptions, the analysis presented here resonates with the findings of the existing literature, namely that the barriers to analytical work becoming core to policing practice are not only cultural but systemic, and stem from, amongst other factors, specific shortfalls in organisational knowledge about analysis and in the process of commissioning and using analytical products.
Launched in 2000, the National Intelligence Model (NIM) represents a standardised approach to intelligence gathering and analysis intended to inform strategic and tactical decision-making at all levels of police business. Since its introduction, crime analysis in the UK has been said to be the mainstay of policing business. However, previous research on the role of analysts in policing organisations (O'Shea and Nicholls 2003, Cope 2004, Innes et al. 2005), conducted using interviews and observations, revealed a dissonance between how intelligence analysis ought to inform police policy and practice and the way it actually does. Ratcliffe (2004) identified three main problems that prevent intelligence and strategic analysis from achieving their full potential in dictating resource allocation and determining policy. These problems stem from, firstly, a lack of adequate training for senior managers, resulting in poor tasking of police operations and improper interpretation of analytical products. Secondly, a lack of connectivity between decisions made at strategic levels and their relevance to street-level, day-to-day operational policing. And, finally, a lack of agreed methodologies, techniques and even definitions of what intelligence is amongst practitioners and analysts, which leads to confusion and misunderstanding amongst decision-makers. Adding to this, early work by Cope (2004) suggested a fundamental lack of understanding of the analyst's role in policing. Sissens' (2008) study, conducted several years later, came to the same conclusion. Additionally, it has been noted that the quality and objectivity of intelligence products are often suspect and have limited applicability in operational policing (Cope 2004, Innes et al. 2005).
The problem of crime analysis not reaching its full potential in crime prevention is not limited to the immediate factors identified above but can be traced to wider police cultural resistance to new values and styles of policing (Chan 2001). There is extensive literature on the enduring nature of police culture and its resistance to change (see e.g. Reiner 1992, Chan 1996, Paoline 2003, Loftus 2010, Crank 2014). For example, research concerned with the adoption of COMPSTAT 1 by some US police forces suggests that attempts to anchor crime prevention efforts in crime analysis have been impeded by a number of issues, including cultural factors. In particular, Willis et al.'s (2007) study indicates that policing continues to be dominated by traditional crime-fighting strategies; ignores innovative evidence-based approaches in favour of personal experiential wisdom; and is unwilling to jeopardise public or external support by introducing experimentation at the risk of failure. Willis et al. (2007) hypothesise that COMPSTAT is adopted by police forces in the US either for technical/rational reasons intended to improve efficiency and performance, and/or because of institutional pressure to make the agency appear progressive. In the former case, the adoption of COMPSTAT could work in one of two contradictory ways: bringing about a 'sea change' in a goal-driven, efficient, transformational police organisation, or reinforcing the traditional hierarchical structure of a military model of policing. In the latter case, adoption is driven not by demands for efficiency but by appearances of what the organisation 'should look like and what work they should be doing' (Willis et al. 2007, p. 15, italics in original).
Here, the organisation may adopt procedures and structures that conflict with its institutional sub-culture and lead it to effectively de-couple the routine structures (business as usual) from the imported structures of technical efficiency (crime analysis) (Maguire and Katz 2002).
Previous research has situated the barriers to acceptance of crime analysis in policing culture (Cope 2004, Sissens 2008, Atkinson 2016). While acknowledging the role of culture and sub-cultures as pivotal to the acceptance of analysis within policing, in this research we identified perceptible gaps, or shortfalls in knowledge and shortcomings in processes, which inhibited complete integration of crime analysis into policing practice. Knowledge gaps existed not only on the part of analysts, as individuals and collectively, but also in terms of what policy-makers and senior leaders knew about analysis and the analytical capability of the organisation. On the other hand, gaps in the process of commissioning analytical products and disseminating results undermined their contribution. These gaps were not discrete but often overlapped, with one gap contributing to the other, an example being training. While the unavailability of regular training courses can be viewed as a processual gap, lack of training also contributes to the knowledge gap. The reasons for the existence of these gaps or challenges are systemic, mainly anchored in police culture, but also, more prosaically, in poor communication, lack of adequate resources and frequent reorganisation. The paper explores these challenges as well as the reasons why they might exist, thus identifying specific areas for policy recommendations.

Methodology
The research reported here is a subset of a larger knowledge exchange project between academics and police practitioners. Interviews and focus group discussions were conducted with analysts and police officers to get a broad understanding of what analysts do and their contribution to wider policing operational and strategic decision-making processes. The identified police force is one of the larger police organisations in the UK, serving predominantly urban areas with a diverse population. The organisation has a progressive attitude towards research and was a willing partner in this project as part of its continued engagement with academics. At the time of the research, the force was undergoing major structural re-organisation of the analytical capability, partially as a result of the budget cuts following 2010, but equally in an attempt to improve efficiency and re-conceptualise analytical and supervisory roles given the available resources.
A qualitative research method approach was adopted in consultation with the police force. The first step involved drawing an organisational map of crime analysts to get a clearer idea of the number of analysts, where they sit in the hierarchy, and the varied roles they occupy. This task proved to be more difficult than envisaged, primarily because there appeared to be no central analytical unit that possessed the information readily. We discovered that analysts in different parts of the organisation reported to different section heads, who did not necessarily sit within the same part of the organisation. Further, there were different funding arrangements for different roles. For example, 'community safety analysts' were paid by the local council but were police analysts seconded to that role. There did not seem to be a clear and uniform chain of command for all analysts within the force: for example, performance analysts, who worked centrally, reported to one part of the senior command team, whereas analysts at the local police units were answerable to another section of the team. We were told that part of the reason for the lack of a clear picture was the reorganisation of analytical roles being undertaken at the time the research was conducted in 2014. The total number of police analysts (including those seconded to partnerships and related agencies) in the force was less than 70. Further, there appeared to be just two levels in the analytical hierarchy: the bulk made up of analysts, with six senior analysts.
Data collection involved semi-structured interviews and focus group discussions. A total of 24 semi-structured interviews were carried out by the first author with analysts (14, including 6 females), intelligence officers (4, including 1 female) and intelligence supervisors or managers (6, including 1 female). Both authors jointly conducted two separate focus group discussions, one with analysts (six, a mix of male and female) and one with senior police officers (five, all male). Two of the focus group discussants were analysts whom we had previously interviewed. 2 To provide complementary insights to the qualitative data collected, a small face-to-face survey was completed by the analysts interviewed to elicit in detail the percentage of time they perceived they spent on particular types of task and the various kinds of products they routinely produce. The survey was restricted to the analysts interviewed, and closed questions were used because of the detailed nature of the information to be collected. The survey information was not intended to provide a nationally (or force-wide) representative picture of analyst activity, but simply to provide additional context to the analyst interviews. As with the interviews, survey data were anonymised and in what follows they are summarised in the aggregate.
Once access was negotiated with senior police leaders, senior analytical staff helped arrange individual interviews. The analytical workforce was initially mapped to identify the range of roles and to ensure the sample was representative. The final sample included crime, intelligence and performance analysts from local police units, special units (e.g. for rape and violence against women, or serious and organised crime), community safety partnerships (CSP), the Criminal Investigation Department (CID), the Force Investigation Bureau (FIB) and the Confidential Unit. Although interviewees were initially selected by the senior analytical staff, which creates the possibility of inherent bias, interviewees were given every opportunity to refuse participation or recording (which some of them availed themselves of for parts of the interview). Further, the range of interviewee demographics, levels of experience, and opinions was fairly wide. The average analytical experience of the sample was 9.7 years, with the newest analyst in post for only one and a half years and the most senior analyst having over 17 years of experience. Several interviewees were quite critical of their own and organisational shortcomings. Thus, we can cautiously conclude that interviewees did not appear to have been specifically chosen to present a particularly positive image of the organisation. Intelligence officers were usually serving or retired police officers who worked alongside analysts and supported them. Interviews with supervisory staff included police officers at the rank of Sergeant and Inspector and one senior civilian staff member. Sampling was thus a combination of criterion (from all units) and purposeful (various levels of seniority and experience), but the actual choice of individuals interviewed was dictated by availability on the day.
Semi-structured interviews were conducted in 2014 at police stations or in a secluded room at police headquarters. Interviews were recorded with the permission of the interviewee and later transcribed. Notes were made during the interviews and during informal discussions, with the interviewee's permission. Interviewees were assured of anonymity and confidentiality. Participation was voluntary and interviewees were offered the option of withdrawing from the interview at any time. The content of the interviews were strictly within the limits of the Data Protection Act. Moreover, although questions were sensitive, they were not designed to cause personal discomfort or anxiety. As such, there were no major ethical dilemmas associated with the research.
The focus group discussion and interview data were coded and analysed using the qualitative software NVivo. The data were analysed thematically to understand the accounts of analysts and their supervisors. The data reflected the accounts of participants and included both pre-specified and emergent concepts (Ritchie and Spencer 1994, Silverman 2011). Category grouping under themes was generally clear-cut given the sequence of the interview. Transcripts, field notes and informal conversations with analysts and police officers gave rise to emergent themes during the coding and analysis phases.

Knowledge gaps
The research identified three main areas where knowledge gaps existed: understanding of analyst roles, the range of possible analytical products and whether analysts and/or their products were effective. These gaps existed because of a lack of clarity around role and job descriptions, lack of knowledge about analytical abilities and finally a lack of systematic frameworks to evaluate the effectiveness of analysts and their products.

Role description and content
In order to first understand what they actually do, we asked analysts and intelligence officers what their role or official job description was and what kinds of products they regularly produced.
Responses to the first question indicated some lack of clarity about whether their actual job content matched their official job description and title. Analysts' responses essentially described what they did on a day-to-day basis. Over half of the analysts interviewed confessed that they did not know their actual job description. Self-perception of roles ranged from the very general, such as 'a bit of jack of all trades' (I10) and 'to basically make sense of everything' (I2), to particular objectives such as 'identifying trends and patterns in crime' (I2); 'provide analytical and statistical reporting' (I4); 'identify threat, risk and vulnerability' (I9); 'problem solving' (I8, I24); and 'inference development and hypothesis testing' (Analyst, Focus Group). A few interviewees 3 described their job in terms of very specific tasks such as 'monitor crime rates' (I17), 'responsibility for major crimes' (I18), and 'writing tactical and strategic assessments' (I6, I19).
A focus group participant explained the more generic objectives:

It's about helping senior officers with their decision making, so it is helping to prioritise things for them and helping them to decide where they're going to put their resources. (Analyst, Focus Group)

This echoed findings from Evans and Kebbell's (2012) study, suggesting that the role of analysts is changing from providing mere technical expertise to being part of the support structure for decision-makers.
A universal theme running through most analysts' perception of their role and job content was one of role fluidity and responding to demands for whatever kind of analysis was currently deemed important. There was broad consensus that the volume of demands from senior officers for quick data analysis or status updates on particular situations meant that:

A lot of our job is just extracting data from force systems quickly. That's where the frustration lies at the moment in that there's no room for interpretation. (I5, Analyst)

This situation appears unchanged since Cope (2004) reported it in her study. One analyst said that resource cuts prevent them from completing the kinds of analysis they should and would like to do, meaning they focus on more mundane and administrative tasks like data collection.
When I first joined six years ago, the idea was we're fed information and we just focus on the analysis … But, unfortunately, in the current climate there's not enough people to go around anymore … So we do spend much more time doing administrative type duties. (I8, Analyst)

Amongst the myriad ways in which analytical roles have been described and delineated, the International Association of Crime Analysts (2014) recognises four types of analysis (crime intelligence, tactical, strategic and administrative crime analysis), differentiated on four dimensions: the degree of confidentiality of the product, regularity of production, importance of the offender to the analysis, and data sources used. 4 We explored analyst perceptions of different analyst roles and the analyses they conducted. The research revealed that although analysts were aware of the different types of products, understanding of the differences between the various roles identified in the literature was fuzzy. Admittedly, this could be partially explained by the fact that different police organisations conceive and operationalise analyst roles with some variations, and equally by the fact that analysts were either unaware of specific role descriptions or that the latter were not set in stone in the organisation.
In order to get a snapshot of their routine job content, analysts were asked to fill out a short survey during their interview, reporting the types of analysis they performed routinely in a typical month and estimating the number of hours they spent per week on each type of task. Nine different types of routine analytical products were identified in consultation with senior analysts during the planning stages of the research, and included different types of trend and pattern analyses, and special reports. These loosely match the nine analytical techniques specified by the NIM (NCIS 2000). 5 The results, based on the sample of 14 analysts interviewed, are presented in Figure 1. The figure shows a boxplot to illustrate the distribution of responses and the variation across the analysts sampled. The boxplot is rank ordered by the median values for each type of task to enable the reader to easily identify trends in the data.
It is apparent from Figure 1 that the analysis of crime trends was a relatively frequent activity for those in our sample, with analysts estimating that they spent about 20% of their time on this form of analysis. However, even for this category there was considerable variation across interviewees (ranging from zero to 80% of the time). Other activities were much less common, and depended upon the specific role of the analyst. For example, just as Local Policing Unit (LPU) analysts were involved in patrol route tasking and those in special units or at headquarters were not, similarly, the latter were more involved in producing in-depth and special reports than LPU analysts.
It was evident that analysts produced several products routinely. Only one analyst reported doing every kind of analysis, and only one reported spending most (70%) of their time on one type of activity (special reports).
Variation in perceptions of different analytical roles hinged on the focus, purpose and final consumer of the analysis, as well as the sources and kinds of data used. Analyst perceptions were often skewed by their personal analytical experience and suggested an incomplete understanding of other kinds of analysis. One reason for this mosaic of often clashing opinions may be that analytical roles and products differed significantly between various units, as did the expectations of managers and clients.
Despite some disagreements on the exact details, interviewees universally agreed that performance analysis was different from crime or intelligence analysis. However, the difference between crime and intelligence analysis was much less well understood. Previous research suggests the perceived difference between crime analysts and intelligence analysts was one of scope: the former more narrowly focused on problem-solving and resource allocation, while the latter used a wider array of data sources and specialist analytical skills to aid investigation, detection and prosecution (Taylor et al. 2007). Our research found that the difference between the roles was less clear-cut, suggesting that the division between crime analysis and intelligence analysis is becoming blurred. Some interviewees went so far as to say there was no difference whatsoever. Those who recognised them as separate roles disagreed on how they differed. Some considered crime analysis to be more immediate, focused and therefore mainly tactical, whereas intelligence analysis was envisaged as more remote, involving other data sources, and therefore more strategic.
On the other hand, some interviewees confessed to not knowing that the two types of analysis were different. One analyst with years of experience said:

But we were always called crime analysts and some call themselves intelligence analysts. I don't know. I don't know if it's just the title really. I don't know if there is any difference … We're all analysts, aren't we, looking at intelligence? (I21, Analyst)

The suspicion that this faux distinction is the result more of role titles than of actual job content was strengthened by the near unanimous opinion expressed by analysts that role boundaries were blurred, and that analysts often seemed to work on various kinds of products at both the tactical and strategic levels, with varying levels of confidentiality and using different data sources.
It could be argued that the lack of conceptual clarity around the exact role or job description of analysts, as well as the varied understanding of the main purpose of analysis, might explain why crime analysis is still in the process of trying to find its niche at the heart of policing practice.

Analytical capacity and contribution
The research explored whether police officers appreciated and understood the contribution that analysis did and could make to policing practice at different levels of the organisation. This was crucial for understanding how and what kinds of analytical products are commissioned. From their perspective, analysts felt that police officers often failed to appreciate that producing proper in-depth problem profiles can be time consuming, whereas officers demand quick, real-time solutions. One analyst in the focus group discussion observed:

I think culturally the police are very impatient … they see … they tick the box and go, yes, we'll have a problem profile … But we want an action plan and we want it solved by the end of next financial year. So quite often you write a document but it's redundant before you've printed it off. (Analyst, Focus Group)

Another interviewee expressed the opinion that police officers often do not match crime patterns with the intelligence coming in and often 'fail to join the dots' because they 'just work for that minute' (I21, Analyst). That is, their responses are reactive and driven by short-term goals. We were told that some officers did not want to be given descriptions of the complexity of the problem, but desired simple, direct solutions. As one analyst described:

Cops just want to go in very quick time. It's like we've got a problem there, let's go and sort it, we've done that, let's move on to the next thing. (I9, Analyst)

One analyst said that the demands made on them by senior police officers can sometimes be quite unrealistic in terms of time or resources.
We have a bit of a joke about the analytical button on our keyboards on the computer that everybody thinks that we have. You can just press it and … [the result appears]. (Analyst, Focus Group)

Officer expectations of analysts were varied. Officers in the focus group discussion expressed a generic need for analysts to identify and present enough information about a problem to enable decision-making. Sometimes they expected specific operational solutions to problems:

We're surrounded by tonnes of noise and we can go off trying to listen to all of it, or as an analogy, what we look for the analyst to do is understand what that noise is saying, give me an option but also actually something which I can work with. (Senior Police Officer, Focus Group)

Clearly this senior officer wanted definite solutions and operational alternatives to be clearly spelled out. In reality, situations can be more complex and nuanced, and demand greater involvement on the part of the officers working with analysts to find suitable solutions.
Paradoxically, analysts felt that officers do not really expect or adequately respect solutions that come from their analysis. This was because analysts acknowledged that they are not the 'experts'; do not have adequate knowledge of operational policing; and therefore are not really expected to provide solutions or action plans. A supervisory intelligence manager said that sometimes when analysts get creative and make original recommendations in their reports, they can get knocked back by senior officers, and this can be quite demoralising. This was confirmed by one analyst who said:

Depending on who you're talking to in the level, sometimes people will listen to you and then other times they won't, which can be frustrating, particularly sometimes when you put recommendations in and then you're told to take them out. (I1, Analyst)

Interviewees were asked whether, in their opinion, the intelligence products produced by police officers and civilian staff were treated with equal respect by the wider police organisation, since previous research indicated that this was not the case (MacVean and Harfield 2008). One police officer working as an intelligence officer replied that there was little difference in the way he and the analysts operated or were treated by other police officers, except perhaps that his experience helped in shaping recommendations that would be more suited to tasking. Analysts were aware that trust was pivotal in gaining any kind of respect from police officers, 'trust has to be gained … because you can lose that trust quite easily' (I24, Intelligence analyst), especially in making recommendations for frontline officers.
Cope (2004) reports that since most analysts in the UK are civilians, their products are often overlooked by police officers. Previous research indicated that analysts perceived officers' attitudes towards them as neither positive nor negative (Taylor et al. 2007). Our research suggests that things had moved on since then and that there was far more acceptance of analysts' skills and abilities, even though their recommendations might often be overlooked. One interviewee, with years of experience of working with and managing analysts, felt that over time the organisation had begun to appreciate the contribution made by analysts. However, this process had slightly reversed in the past couple of years.
But it's only quite recently that our organisation has started to understand the sophistication, shall we say, of what we can do as analysts and what they can do for the organisation. But in the last couple of years, the shift away from that position, which was quite dramatic, has beaten most of our analysts down and dumbed their skills down, dumbed their enthusiasm down. (I23, Police Officer)

Further, some officers interviewed admitted that they perhaps underutilised the capacities of analysts, mainly because they were unable to task them intelligently. One officer in our focus group explained:

I think a big frustration for the intelligence side of things as well is we sometimes ask questions without knowing what it is we want to know. We sometimes don't know what question we want answered … and there's the old garbage in, garbage out thing. (Senior Police Officer, Focus Group)

The comment is telling as it highlights the complexity of dealing with daily policing operations. An intelligence supervisor articulated the attitude of the police towards analysis and intelligence by saying, 'Everybody buys into the idea of the national intelligence model and the police should be intelligence led' (I20, Police Officer supervisor), but felt that instead of analysts being used to find a solution to the problem in conjunction with police officers, their involvement was often restricted to predicting problems and identifying trends.
One senior officer interviewed felt that analysts can be quite 'passive', to the extent that when working as a team to produce a larger document, like a tactical assessment, they sometimes have tunnel vision and are concerned only with the part of the product they produce, often not reading the document in its entirety. It was his opinion that analysts are probably 'overwhelmed with stuff that they're just wasting their time doing or shouldn't be doing' (I23, Police officer). The police officer admitted that perhaps analysts were not being asked to do challenging tasks, but on the other hand they were not volunteering either. The officer speculated that one of three explanations might account for this: analysts did not have the abilities and skills to undertake complicated tasks; they had the skills but had forgotten them; or they were not motivated to use them in innovative ways.
Analysts felt that often they did not stretch themselves for two reasons. Firstly, because they were too enmeshed in collecting and cleaning data:

Analysts have to step out from behind the data and the number crunching and the cleansing because it is their comfort zone, it is how they can justify not getting too far with their analyses and it is holding the profession back. (I4, Analyst)

Secondly, because a large portion of their time was spent on management talk and strategising, which left analysts with little time to actually do some original and in-depth analysis:

We spend too much time talking about strategies and delivery plans rather than actually doing something … I think we get too hung up in delivery plans, governance structures and quite a lot of strategic management type talk. (I9, Analyst)

A majority of interviewees (16 of the 24) shared the opinion that even when analysts might want to produce something new and original there is little scope to do so, and they are left producing very much the same document year after year. The interviews suggest that analysis was being treated very much as 'business as usual', a routine exercise to fulfil bureaucratic obligations, but seldom viewed as a tool to provide new and possibly creative solutions to existing problems.
Part of this dumbing down has been explained by Atkinson (2016) as the 'infantilisation' of analysis, whereby police officers de-professionalise analysts and inhibit agency by treating them as powerless, ignorant and immature, and analysts themselves contribute to the process by becoming complicit and accepting this role. This research found some evidence to support that argument. One officer (I23) explained that young analysts lacked the experience to understand the nuances of the problem on the ground and shied away from taking initiative. They did not undertake to learn first-hand from visiting an area or speaking to the stakeholders involved, who might give them a different perspective or put forward alternative hypotheses to explain the situation. Instead they looked at patterns and graphs and came up with limited solutions from that. However, another explanation for the 'dumbing down' provided by interviewees was the dramatic reduction in the number of analysts in the force since 2010, following the financial crunch. This meant that fewer analysts were doing the same work as was previously completed by greater numbers. It left them with little time to undertake more in-depth hypothesis testing and theoretically sophisticated analysis after they had delivered the standard, regular products and maps, a phenomenon not restricted to the UK (Wartell and Gallagher 2012).
The administrative changeover from higher or senior analysts supervising their junior colleagues to police officers becoming intelligence managers was fairly recent. When we asked whether the post of intelligence manager was sought after in the police organisation, a senior manager in the organisation (I3) felt it was not considered to be particularly career enhancing. On the positive side, the introduction of police officers as intelligence managers helped analysts understand police operational capacity and what measures could realistically be recommended as a solution to particular problems. One interviewee said he could help his analysts as a police officer because, I often get asked … from some of my staff about, can we do this? Can the police do this or can the police do that? So they're not always actually in a position to be able to make recommendations because truth be told, they actually don't know what tools the police have. (I20, Sergeant, supervisor) The flip side of police officers becoming intelligence managers without adequate training (as we shall see) was that they were heavily reliant on the analysts themselves to dictate what kind of analysis was possible and the time it would take. I've worked in intelligence for a little over two years now and one of my real issues is … even though I supervise intelligence analysts, I don't think I fully understand their … what they can do, what their full potential is. (I20, Sergeant, supervisor) A senior manager said that supervisors were often frightened of commenting on the quality of analytical products because of this lack of understanding.
Organisational change meant that police officers were selected to be intelligence managers based on willingness rather than skills or aptitude. It was suggested that officers were drawn to intelligence work as a pathway into other coveted jobs in the covert field or into other intelligence roles. However, the research indicated that the training provided to supervisors did not adequately equip them to make many decisions, and they continued to be dependent on analysts to tell them what they could or could not do. One interviewee reflected, 'I don't think the skill of managing analysts is recognised and it is not valued' (I23, Inspector).
The quality of the final product depended on how intelligence managers allocated and distributed the workload. However, the hierarchical nature of the organisation meant that intelligence managers at the level of Sergeant or Inspector sometimes found it difficult to refuse work that the analysts considered unhelpful if the client was of a senior rank.
You want to service the command team because you want to help the LPU and the community but sometimes you have to be honest and say, we can't do it. But they don't expect you to say you can't. (I11, Sergeant, supervisor) Analysts explained that sometimes even line managers could see that the analytical product would not be of much practical use but reacted differently. Some would 'fight their corner tooth and nail' whereas others would just expect analysts to 'get on with it regardless' (I16, Analyst).

Evaluation of analytical effectiveness
Evaluation of crime analysis has two aspects: are analysts effective, and are their analyses effective? Assessing whether analysts are effective involves evaluation of their ability and skills in developing and critically disseminating a product; in other words, not only their technical ability, but whether they have good written and oral communication skills (Evans and Kebbell 2012). On the other hand, evaluating whether analysis is effective involves outcome analysis, that is, assessing whether intended outcomes are achieved, be they crime prevention, investigation and detection, or rational resource allocation. Evaluation of analyst and analytical efficacy is seldom undertaken in police organisations (Boba Santos 2014). Our research indicated that managers were responsible for evaluating whether analysts were effective, but they were neither trained to do so nor were there specific set criteria against which they could measure efficacy. One intelligence supervisor said that he had to evaluate the performance of his analysts by way of a yearly appraisal report, noting that they had done well because they produced the products they were tasked with. He went on to add, I am quite blessed with my analysts particularly, I have received really positive comments off the end users, which I have been able to interpret into good results. But were it not for them coming to me to tell me that, I wouldn't have been able to say that personally. (I20, Sergeant, supervisor) Assessing the effectiveness of crime analysis on outcomes, and how this should be measured, is not straightforward. It requires a high degree of analytical ability to conduct 'robust and rigorous quantitative evaluations of crime reduction programmes', abilities that few police organisations possess (Ratcliffe 2016, p. 174). Clearly, analysis that achieves intended outcomes can be considered effective.
However, where crime reduction is the goal, success will depend upon the crime reduction strategies employed and how well they are implemented as well as the analysis that informed the approach taken (Boba Santos 2014, Ratcliffe 2016). Additionally, beat and management cops are more focused on tactical rather than strategic analysis and consequently the evaluation of strategies is rarely undertaken in police forces (Boba Santos and Taylor 2014). Our research indicated that officers had a tendency to demand analysis urgently, implement some prevention strategies, and then move on to the next crisis. Rarely was evaluation work of either outcomes or processes of implementation undertaken to measure whether and how particular strategies worked. A senior police officer said, What I can't afford to do is hit them with a big evaluation stick. The audience would go, oh that's horrible, and they'll just recoil from it. So you have to bring in that mind-set gently, but get them to think, actually, I've just spent £10,000 on this. If I go and do it again next year and the year after and the year after, how do I know it works? (I23, Senior police officer) However, evidence-based approaches seemed to be infusing slowly into senior management levels (as the second part of the above quote illustrates) although the resources for evaluation appeared to be limited.
One analyst interviewed (I17) suggested that as crime analysts it was not really their job to evaluate what worked and why, but that it was the domain of performance analysts. However, the interviewee felt that even if performance analysts did do this, they only looked at crime statistics, whether they went up or down and rarely analysed this in detail or examined why something worked and how. Managers felt that product efficacy was evaluated regularly in Task Meetings on the basis of whether crime went down. Evaluation, if done, was restricted to that level.
There was a sense that analytical products took time to produce but had a very short shelf life, mainly perhaps because in police work new challenges and new problems crop up all the time. Product efficacy was judged on the basis of how quickly a solution was presented rather than whether and how the solution worked.
One analyst commented that products worked only in so far as officers took cognisance of them, found them useful, or decided to act on them. He said, You usually find police officers want the easiest option with things … when we identify a problem … they either can't be bothered with because they've already looked at it previously or they don't perceive as being a problem. So there's a very big danger of them saying, yeah thanks for that, and just leaving it on the side. (I15, Analyst) Thus, our findings indicated that products had a short shelf life, were often produced in response to an immediate crisis and forgotten once the crisis was dealt with or a new one arose. There were neither well-established mechanisms nor processes in place to evaluate the contribution of analysis to crime reduction or resource allocation, or to evaluate the performance of individual analysts.

Process gaps
Analysts were also asked to estimate the number of hours spent each week on two main types of tasks: routine tasks and analytical tasks. The former included administrative tasks, attending meetings, and responding to telephone queries, while the latter referred to time spent doing actual analysis and report writing. Figure 2 is a boxplot indicating the number of hours spent per week (an average working week is 37.5 hours) on each type of task.
In terms of how analysts perceived their time to be divided between analytical tasks (including setting terms of reference, doing the actual analysis, report writing and dissemination) and other activities (including administrative tasks, attending meetings, responding to telephone queries, data cleaning and data collection), the findings suggest that most analysts (with the exception of two) felt that they routinely spent less time on analytical tasks than on other activities.
Only four analysts interviewed (interviewee numbers 2, 3, 6 and 13) thought that they spent more time on analytical as compared to non-analytical tasks, while two (4 and 9) perceived spending five hours or less on analytical tasks per week. It is significant that 10 of the analysts interviewed felt that the bulk of their working week was made up of tasks that might be useful, but were not strictly analysis. Most felt that they spent the least proportion of their time on the dissemination of products or clarifying the terms of reference, and only three interviewees said that they would attend meetings where their products were discussed or were asked to present their product to their client or a team.
Consistent with the findings from the interviews, these patterns suggest specific gaps in the way analysis was commissioned and its products disseminated, which would prevent full utilisation of analytical capacity within the organisation. Further, frequent re-organisation resulted in capacity being diverted to readjust and reinvent roles and responsibilities causing further fuzziness around who does what.

Setting terms of reference
Task definition, as explained by Nicholl (2004), 'is the process by which intelligence professionals ensure that the product requested is the product created'. Previous research has indicated that analysts are rarely given a clear steer about the purpose of their product, are culturally not in a position to challenge officers in a hierarchical organisation, and therefore end up producing reports that are more descriptive than explanatory (Cope 2004, Weisel 2005, Chainey 2012). There is often a disconnect between what practitioners expect and the analytical products they receive, which in most cases is the result of poor task definition (Nicholl 2004). Some of the disconnect between what officers think they are asking for and the product they receive arises because the terms of reference have been unclear or because one party has not understood the purpose and content of the tasking.
One of the most important factors to ensure a useful product is the setting of terms of reference between the analyst and the client. Its importance was described by one analyst, Sometimes … we've just been told to do this. You do what you're told to do and then it'll turn out it was completely wrong or it wasn't required because we didn't sit with the end customer to start with, to find out what it was they wanted. (I1, Analyst) Another analyst said that recently the trend was in favour of having verbal agreements on the terms of reference as against a few years earlier when they were written down and signed off by supervisors.
The intelligence supervisor or manager was the medium through which most negotiations took place. Police officers, as intelligence supervisors, were in a better position to negotiate terms of reference with the end clients. However, this sometimes became counterproductive for the analytical team in a hierarchical organisation where supervisors could not actually refuse demands for products that were deemed unnecessary or not useful. Good supervisors acted as 'buffers' between analysts and the officers commissioning a product, to ensure analysts were not tasked with unproductive analysis.
Analysts differed in their opinion on whether they had a say in negotiating the terms of reference or whether they had little choice, and this depended upon their role (analysts in CSP and LPUs had less of a say than those in specialist units), their manager or their individual personality. One analyst who was quite experienced and confident said he would firmly refuse to accept work that was not 'going to go anywhere': It's not about being obstructive, it's about saying, I've got the experience to know when I'm going to do something that's going to make a difference than do something that's not going to make a difference. (I12, Analyst) Experienced analysts had the confidence to help the client structure the terms of reference and the nature of the product when the client was not really sure of what was needed, although not all analysts could do so. The refrain, 'clients know what they want but not what they need', was echoed in several interviews.
However, when analysts had an active role in setting the terms of reference with the client, the results were perceptibly rewarding. One analyst described the process as iterative, working through the client's demands bit by bit: It's just about starting off sometimes and saying 'let's try and understand this part of it, and let me scope it out a bit and let's have another conversation', doing it by drip-feeding. It saves a lot more time in the long run and you end up with a happier customer. (I10, Higher Analyst) Analysts and managers identified that often the client could change their mind about what they wanted at the end of a task; or, if officers moved posts, sometimes the new incumbent was dissatisfied with what was originally commissioned. In the former instance, working closely with the client during the analytical process, so that the 'development of the product is organic' (I23, Police Officer, Manager), helped. In the case of the latter, analysts were realistic and accepted that the nature of the organisation meant they could often be overruled by the requirements of senior officers. A senior analyst had the final word on the terms of reference issue, We're quite robust in terms of at least trying to put up some kind of … 'no we shouldn't be doing that, this is what we should be doing', but you can only go as far as that because at the end of the day … if it's like the deputy constable said, 'we need that doing, that's a priority', then that's the priority. (I9, Higher Analyst)

Dissemination and feedback
Dissemination of the intelligence product is not a simple matter of posting a report to the client, but involves decisions about who needs to see the product, what level of security clearance is required, and who the secondary consumers of the product are (Mackay and Ratcliffe 2004). The research indicated that how a product was disseminated depended upon the kind of product and who the intended audience was. Specific products, commissioned by particular clients for a specific purpose, were disseminated directly, and it was up to the client how they would like to take them further within the organisation. More routine products on the LPU (the daily crime analysis, for example) were disseminated mainly by the intelligence manager/supervisor at appropriate management meetings. Action points arising from the analysis for field officers were disseminated in the briefing meetings for every shift.
While ideally analysts would like to be involved personally in presenting their end product to the customer, in reality the process was different. Often dissemination meant, 'Just literally send it to them, wait for them to read it, and then we sit down and have a meeting' (I9, Analyst). Not all analysts said that they had face-to-face meetings with their clients when the product was completed. According to our interviewees, often managers or supervisors acted as disseminating agents.
When asked whether their products were considered useful by frontline officers, one analyst frankly described the situation as not exactly ideal, mainly due to time constraints and communication gaps, We do … when we talk to the team locally, we're relying on them to give us that local knowledge and tell us what that area's like and give us all the detail … this has been flagged, what are you doing about it? Tell us what you're doing and then we can paint a good picture and say they're trying all these different things to make the problem go away. I know that sometimes they can see it as we're grassing them up and putting it in the document. (I17, Analyst) There were indications of possible misunderstandings between officers and analysts when communication channels were not regular and established and there was low trust in the analysts. The lack of interaction between analysts and police officers affected the capacity to develop actionable crime prevention strategies and convert crime analysis into 'actionable intelligence' (Taylor et al. 2007). Evans and Kebbell's (2012) research indicated that the ability to communicate well and to adapt in order to 'prioritise, guide and assist' (p. 209) decision-makers, by effectively disseminating the product to large groups and delivering briefings, is considered essential for an effective analyst. Our research found that 3 of the 14 analysts did not engage in any dissemination in a typical week, whereas the rest spent between 30 minutes and 2 hours per week at best. This was attributed mainly to lack of time on the part of both officers and analysts. However, on the few occasions they did get involved in the dissemination process, especially to senior officers, the experience was generally perceived to be very satisfying.
However, and while not every analyst had this reaction, one analyst expressed concern about the process, saying, Because you can come up with a raft of things that you believe should be looked at, in conjunction with everyone else of course, it will never be met by people who are just so stretched and so busy doing their own work and cannot see out of their own blinkers that that's why it falls on deaf ears. (I4, Analyst) Moreover, from the analysts' perspective, communication between the police and analysts could sometimes be less than desirable. One analyst said that they seldom received feedback from operational teams on the success or otherwise of the results of their analysis, and this could be very demoralising, It would have been really good if the team of us who volunteered were actually debriefed and said, yeah, because of the work you did, these are the arrests. We didn't get that, we just heard, there could have been some arrests. It is a bit disheartening because you just think, what's the point? (I17, Analyst) Another analyst added that the lack of feedback meant you had nothing to use in order to improve your product the next time around. The lack of any systematic mechanism to ensure analysts were given feedback on their products, to help them assess what was useful and what was not, meant that whether feedback was given, and its quality, depended upon individual officers. Analysts mentioned that in the rare instances when police officers expressed their thanks in person or via email, it went a long way towards boosting morale.

Training
As mentioned above, although the provision of training, or lack thereof, can be considered a process issue for the organisation, it also extends into the knowledge domain. The research indicated that training provision differed widely amongst analysts: the older, more experienced analysts received adequate training upon joining the job, but newer analysts had to wait a long time.
One analyst echoed the sentiment of other recent recruits, I've been in the department for a year and a half and next month I will be getting my analyst training to actually tell me how to do the job I've been doing for a year and a half. (I17, Analyst) The same analyst went on to confess that the only way to get on with the job without adequate training was to learn from colleagues and through experimentation on the job. Mentoring was recognised by senior analysts and managers as essential in developing good analytical skills; these did not just depend on attending courses but on applying them appropriately on the job, You can say you must do a basic analyst's training course but actually how you apply those skills when you come back, who helps you to apply those skills, how you develop it [that matters]. (I3, Senior Management) The interviewee went on to say that mentors were very useful, but the system was in decline, partly as a resourcing issue and partly because restructuring had resulted in a lack of suitable mentors being in place for new analysts. A police officer supervising analysts confessed, Obviously when I first came to intelligence I was given no training of what an analyst can do, no type of … particular type of job description. At the risk of sounding really bad I always thought an analyst … put dots on maps or draw on a map to show where offences were. … you wouldn't think of all the other things they can do to assist in problems because no one's ever told you. (I11, Police officer, supervisor) This speaks to the theme of officers not understanding what analysts do in general, and supervisors not receiving the training to know what their analysts are capable of. A possible danger was for supervisors to rely more than is healthy on an analyst's own estimation of workloads and resources required, resulting in a lack of adequate support when analysts were struggling to cope.
One supervisory officer admitted, I have actually sat them down and said, I am largely reliant upon you telling me whether something is too much to do or whether you're at capacity in terms of your workload or whether you shouldn't be doing something … As a supervisor, that's not a healthy relationship because, as I said before, if I get the wrong individual, somebody who might take advantage of that situation or somebody who's just lazy or is actually struggling but refuses to admit that, then I place that faith in them and I'm not supporting them as I should be. (I20, Police officer, Supervisor) An experienced analyst raised a further aspect of training, linking it to the impact of budget cuts on job security and the resulting lack of retention of analysts. One consequence for civilian staff was an increased rate of migration from the police to the private sector over the past three years. In this analyst's opinion, this in turn increased reluctance to invest resources in training.
In the old days … you would have a secure job and you'd have a good pension at the end. Well obviously that's been attacked over the last 12, 18 months and what we find now is almost like … not a mass exodus, but a lot of people who are here for the duration are not … they're here as a training opportunity. Because the police will train you two, three, four years, and then thanks very much, go into the private sector fully trained at the cost to the Civil Service because there's no retention of your trained staff. So there's a reluctance then to train new people in case they clear off. (I14, Analyst) Thus, a lacuna in the process of supporting analysts to become more proficient, and supervisors to elicit the best from their supervisees, via specialised training led to gaps in knowledge about what analysts can and should be doing to better support operational policing.

Conclusion
A recent national survey of law enforcement agencies in the USA indicated that crime analysis is weakly integrated into police operations and that only particular kinds of policing approaches rely on, and are effective as a result of, crime analysis (Boba Santos and Taylor 2014). Our research indicated that while crime analysis was ubiquitous and essential to police operations, it was perhaps being underutilised and often used as a tick-box exercise, or to support an identified task, rather than optimally guiding police responses and strategy. The police, as an institutional organisation (Crank 2003), seem to have accepted the 'structure' of crime analysis as central to guiding police practice, even though in actual practice this relationship is not well established. This is mainly because the technical environment of the police continues to be weak compared to its institutional environment (Mastrofski and Uchida 1996). The research revealed that although the technical/rational justification for the incorporation of analysis in police work has been theoretically accepted by senior police officers, managers and the force at large, senior officers nevertheless felt that the quality of analytical products was often less than desirable. Several factors were identified as being responsible for this situation. Firstly, most police officers were not aware of the kinds of more sophisticated analysis that can be undertaken and therefore rarely commissioned such products. Considering that the NIM has been operational for over a decade, it is a matter of concern that officers do not possess this knowledge and that not enough is being done organisationally to address this lacuna. Secondly, the resource crunch meant that there were fewer analysts, and therefore most of their time was spent doing routine, mundane analysis, which is useful but underrated and unexciting.
Additionally, even if they did have capacity, analysts were sometimes content to produce routine descriptive products and not take the initiative to be more creative, thus contributing to what Atkinson (2016) describes as the 'infantilisation' of crime analysis by the police. Finally, evidence indicates that there was sub-cultural resistance to accepting analytical expertise when it might contradict operational experience or organisational priorities in parts of the organisation.
The need to understand police culture is paramount to the project of introducing reform (Crank 1997), but in and of itself may not be enough to push through the required changes. Change needs to be introduced by 'winning hearts and minds' of police officers (Crank 2014). Although introducing mandatory procedures and models of working like the NIM may set the change process in motion, true progress will only be achieved when people at all levels of the organisation are convinced that the change is good for them. Our research indicated that such a change process is underway in the police force studied, even though this feeling has not yet diffused through the entire force. Research findings revealed gaps: a lack of shared understanding of what analysis is and of the different kinds of analytical tasks, within the analytical world itself and more widely within the organisation. This means that products can often be commissioned unimaginatively or in partial ignorance of analytical capability.
As expected, officer attitudes towards the utility of crime and intelligence analysis varied according to rank and between individuals (Paoline 2003). The relative lack of power and status in a hierarchical organisation meant that unless analysts were experienced and/or confident enough to negotiate appropriate terms of reference before creating a product, analysis often remained mired in the routine and the mundane. This is not to say that routine analysis lacks utility, but to accept that it has limited capacity to offer new insights into problems or engender creative solutions.
Identification of specific knowledge and process gaps through the research generated possible solutions. These centred on analysts having a greater role in setting terms of reference as well as in disseminating the product. At the same time, better training for analysts and, more importantly, for managers was identified as key to improving police-analyst co-ordination. Moreover, it was clear that there are currently few, if any, systematic ways of evaluating the effectiveness of analysts and analytical products in order to demonstrate their contribution to policing practice. Setting aside resources to systematically evaluate the contribution of analysis to crime reduction or rational decision-making would provide the evidence that is at the core of evidence-based policing practice, but currently lacking. Such an endeavour might draw on Ratcliffe's 3-I model to evaluate whether analysts are effective (Ratcliffe 2008).
On the one hand, previous research has indicated that the weak integration of crime analysis in policing has resulted from cultural resistance to change (Boba Santos and Taylor 2014); on the other, our research indicates that, in our case study, crime analysis is now squarely accepted by the force for both rational and institutional reasons, but still has some way to go before it becomes central to guiding police operations. This is mainly due to certain structural gaps that we term knowledge and process gaps. In line with Willis et al.'s (2007) findings regarding the adoption of COMPSTAT, Boba Santos and Taylor's (2014) research, based on a national survey of police agencies in the USA, demonstrates that the adoption and implementation of crime analysis is rooted more in the institutional than in the technical/rational approach. This means work processes are governed more by the cultural environment and are judged more on how they are seen to be responding, rather than being truly accountable through the adoption of more rationalised and streamlined work processes. The use of analysis as a tick-box exercise, and in order to fulfil the obligations under the NIM, indicates the objective of 'being seen to be responding'. On the basis of our findings in one force, the conclusion remains, perversely, that while the police may have accepted the change in organisational practice imposed by the NIM, they have not changed their existing policing or management styles in order to fully capitalise on it (Chan 2001).