Friday 3 December 2010

Basic Intro to Statistics for Analysts

Don’t panic.

Before I was lucky enough to complete the National Intelligence Analyst Training Course at Greater Manchester Police’s excellent Sedgley Park training facility, I didn’t know my statistics from my elbow. (I fully recommend taking at least the Crime Pattern Analysis course, by the way.)

I hadn’t done any proper maths in years, and what little statistics I was exposed to in my university degree went completely over my head.

During this course I learned 2 important things.

1.     Statistics doesn’t have to be complicated. Once you understand the very basics, you’re pretty much up to speed with what you need to know day to day as a crime analyst.

2.     Statistics can really help you to tell a worrying trend from a natural “blip”. We all know that crime levels go up and down each month. There is a level of natural variation that would occur even if we did everything this month exactly the same as we did last month. Statistics allows you to tell which increases mean that something important has changed.

What I want to do in this blog is to introduce a very simple statistical tool that I use regularly. Hopefully this will convince those of you who don’t currently use any statistical analysis that there is merit in getting to grips with it a little bit.

I responded to a query from a colleague in the IACA recently by telling her about the technique, and got lots of positive feedback from other analysts, so I hope it can be of help to you also.

Hypothetical Situation

So you are in your office one day and the divisional commander sticks their head through the door. You haven’t seen them early enough, so you haven’t had a chance to hide under the desk.

“We’re taking a look at burglary across the division, and we need to know if any of our areas are showing different trends to the division and need attention” they say, before strolling off.



Put really simply, as analysts we know that we need to plot the number of burglaries for each area across time, and see if the lines are behaving in the same way. Simple!

The problem we have is that the lines might not be comparable on the same graph. One area may naturally suffer a higher number every month, or the local areas will have much smaller totals than the divisional figure, so the division’s line will sit really high on the graph while the rest sit lower down. This will be even more of a problem if we want to compare against regional and national trends. I’ve put together a simple example dataset below. The division is Oldham, and Werneth, Glodwick and Coldhurst are three towns within the division (the actual figures are made up).


If I just drop these onto a graph, I get:






This is technically right, but it’s also about as useful as a hole in a bucket.


But what’s that I hear you say? Excel lets you plot series on a secondary axis? By crikey I think you’re right!

As the figures for the towns are quite similar I could simply chart these figures and move the three towns over to a secondary axis. I could then play about with the scale for the secondary axis to get the lines placed somewhere similar on the chart and end up with something like this:


Much better! Or is it?


The problems with this are:

1.      I have manipulated the data to make it look like it does. Whilst this is simply for the purpose of making the lines easy to compare, it is hardly the “scientific method”, and it carries the risk of either misinterpreting the data because of the way it has been manipulated or deliberately making it look a particular way to strengthen a particular argument. Oh dear!

2.     If I wanted to add another data series that was dissimilar to both of these (for instance the regional or national rates), I wouldn’t be able to. I’ve run out of axes!

What we need is some way of changing these numbers so that they are all on the same “scale”. That way, we can just put them all on one axis on one graph.

The Statistical Bit

Ok, I hope you’re still there, and you’re about ready to do the statistical bit. The next bit isn’t complicated, but it’s important that the concept in the next few lines sinks in. Take a break if you need to. Have a cuppa.

The technique we are going to use is called “Z-Score” or “Standardised Score” (different names, same thing. Z-Score sounds cooler though, and is the more widely used term).

The Z-Score allows you to chart different series of data with wildly different figures on the same chart using only one axis.

This is because instead of showing the actual "counts", it instead shows how far each point in a series is away from the average for that series.

Ok, that’s the important bit, so I’m going to say it in a different way in case you didn’t get it.

Here’s the data again:


Take October 2009 as an example. When we do a Z-score, what we will be doing is replacing these numbers with a number showing how far each number is from the average for that series. So instead of 1569 being the number in the Oldham column, it will be a number which represents the distance 1569 is from the average of all of Oldham’s results. And instead of 38 in Werneth, it will be a number which represents the distance 38 is from the average of all of Werneth’s results, and so on.

Now the number isn’t as simple as taking the figure away from the average. That wouldn’t help, because the number for Oldham would still be on a different scale from the other three towns. The measurement of distance that we use is something called the Standard Deviation.

There are some very complicated descriptions of what a standard deviation is. The simplest description I can give you is that it is a number which shows you how spread out your numbers are.

Do you remember the phrases “normal distribution” and “bell shaped curve” from school? They probably took the heights of everyone in the class and plotted them on a graph. You end up with something that looks like this shape:



The Standard Deviation tells you how spread out this curve is. In this example, the average is 100, and the Standard Deviation is 15. In a normal distribution (a graph that looks this shape), 68% of the results will be within +/- 1 standard deviation of the average, and 95% of the results will be within +/- 2 standard deviations.

What the Z-score does is tell you how many Standard Deviations each result is from the average for that series.

This means that every result is presented on the same scale as every other result, including the results in the other series, because they are all measured in “standard deviations” (hence standardised score!).

The benefits of doing it this way are:

1.     You can put as many different types of data series on the graph as you like, instead of being limited to two

2.     Simply using a secondary axis can sometimes be misleading, as the scales for the axes can be manipulated to move the lines up and down and to change their "depth" in relation to each other. Using a z-score graph makes all the data series consistent.


Instead, I calculate the z-scores for each data series.

The first thing to do is calculate the Mean and Standard Deviation for each of my 4 series. The easiest way to do this is to use the AVERAGE and STDEV functions in Excel.

For each point, I then subtract the Mean and divide the difference by the Standard Deviation. In other words: z = (value − average) ÷ standard deviation.
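If you would rather script the calculation than do it in Excel, here is a minimal sketch in Python. The counts below are made up purely for illustration, not the figures from my dataset:

```python
from statistics import mean, stdev

# Hypothetical monthly burglary counts for one series (say, Werneth).
# Substitute your own figures.
counts = [38, 42, 35, 51, 40, 37, 44, 39, 48, 36, 41, 45]

avg = mean(counts)   # the average for the series
sd = stdev(counts)   # the sample standard deviation (what Excel's STDEV gives)

# The z-score for each point: how many standard deviations it sits
# from the series average.
z_scores = [(x - avg) / sd for x in counts]

print([round(z, 2) for z in z_scores])
```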


Doing this for each point gives me:









Which, when I chart the Z-scores gives me:


This gives me a much more robust graph that clearly shows the relationships between the different data series using one axis. It’s also a much more scientific and auditable method for comparing the series.
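For those who prefer code to Excel charts, here is a rough sketch of the same one-axis chart using Python with the matplotlib library. All of the figures are placeholders I have invented, not the dataset from this post:

```python
import matplotlib.pyplot as plt
from statistics import mean, stdev

# Invented monthly counts; note how different the raw scales are.
series = {
    "Oldham":    [1569, 1488, 1602, 1540, 1495, 1623],
    "Werneth":   [38, 42, 35, 51, 40, 37],
    "Glodwick":  [29, 33, 45, 28, 31, 30],
    "Coldhurst": [22, 25, 19, 27, 24, 21],
}

def z_scores(values):
    avg, sd = mean(values), stdev(values)
    return [(x - avg) / sd for x in values]

# Standardising every series puts them all on one shared scale.
for name, counts in series.items():
    plt.plot(z_scores(counts), label=name)

plt.axhline(0, color="grey", linewidth=0.5)  # zero marks each series' own average
plt.ylabel("Standard deviations from the series average")
plt.legend()
plt.show()
```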

Also, because the y axis gives me the standard deviations, I can draw out the points along each series where statistically significant counts have been recorded.
Remember when I said that in a normal distribution “95% of the results will be within +/- 2 Standard Deviations”? Well, that also means that only 5% of the results will be outside these limits. Therefore, there is a less than 5% chance that a result will be more than 2 standard deviations from the average. In crime analysis, this is an acceptable level of significance (some scientific disciplines, such as medicine, require a 1% level of significance, meaning that only results beyond roughly +/- 2.6 standard deviations count as unusual).
If we find a result that is outside these boundaries, then we say that it is “Statistically Significant”. As a rule of thumb, we should never say that we have had “significant” results unless we can show they are “statistically significant”. It’s poor scientific practice. Use “exceptional”, “unusual”, “meaningful” or something else instead.
So I know from this graph that in June 2009 there was a significantly high level of offending in Glodwick (maybe a very active offender moved into the area), and in January we had near-significant low levels of offending across all areas (maybe we ran an operation that month). I can also say that Werneth should be our priority, as it is heading towards recording significantly high levels of burglary.
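To automate that kind of check, a short sketch like this would flag any month that passes the +/- 2 standard deviation threshold. Again, the months and counts are invented for illustration:

```python
from statistics import mean, stdev

# Twelve months of made-up counts with a deliberate spike in June.
months = ["Oct", "Nov", "Dec", "Jan", "Feb", "Mar",
          "Apr", "May", "Jun", "Jul", "Aug", "Sep"]
counts = [29, 33, 31, 28, 30, 27, 32, 29, 75, 31, 28, 30]

avg, sd = mean(counts), stdev(counts)

for month, x in zip(months, counts):
    score = (x - avg) / sd
    flag = "  <- statistically significant" if abs(score) >= 2 else ""
    print(f"{month}: z = {score:+.2f}{flag}")
```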

Hopefully all this makes sense. If you can explain it more simply, share it here! (email me and I will post it).

Believe me when I tell you that it isn’t complicated. If you can get this technique to click in your head, then you have it licked! I find loads of uses for this technique. I compare local trends to national trends; I compare staffing levels to the number of crimes in each area to make sure we have the right number of officers in the right place; I compare trends in incidents to trends in crimes. There are loads of uses that I’m sure you can think of.
As always, please feel free to leave a comment or email me, and have an awesome weekend.


Tuesday 30 November 2010

Keeping yourself informed

I was having a conversation recently with a colleague from another area, and we got round to discussing “keeping yourself in the know”.
I had happened to mention how I had been following a journalist from a local paper as he tweeted his experiences in our town centre on a typical Friday night. He was one of several journalists across the area that were spending the evening reporting on the “real” level of alcohol related violence in our town and city centres in this region.
Predictably, things were quiet in our town. There used to be a real problem with alcohol related violence and hospital attendances, but a proactive policing presence, stricter licensing enforcement and a general downturn in the number of bars as a result of the economy have meant that Friday and Saturday nights are now much safer, and quieter, than they were previously. The record of this journalist’s tweets will make an interesting appendix at our next monthly performance meeting, and will help to put a face on our night time economy that is sometimes not seen by the people who attend these meetings.
My colleague basically asked how I kept myself informed about things like this. Do I spend all my time surfing the internet when I should be working?!
The answer is that, actually, I used to surf the internet a lot for news stories relating to crime and disorder in my area (not at work though, at home!). Over time, however, I started to find some tools that really help me to do it more quickly and easily. These are the basic tools that I think are valuable for all analysts, and they can really help you to start keeping plugged into what is happening in your area. Have a think about how you can use them to increase your local knowledge.
Get a good RSS reader.
RSS readers are essentially tools for keeping on top of websites that you regularly look at. Think of them as websites that tell you about other websites! They will keep track of “feeds” that you specify, and will collate anything new that appears there. Almost all news websites (including local papers) publish RSS feeds. All you have to do is tell your RSS reader the web address and it will find the feed for you. I personally use Google Reader, and find it reliable and easy to use, but there are a range of others out there. I use them to follow local and national papers, blogs that interest me, my force’s news (and recruitment!) pages, some government departments, a few university departments’ public journal archives and, of course, the daily Dilbert cartoon (hey, all work and no play..!). I can quickly scan all these sources of info, and within about ten minutes every morning know about any breaking news, new policies, or relevant articles of interest that have been published online. If I were to scan each website individually, I reckon it would take about two hours.
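As an aside, if you ever want to pull a feed programmatically rather than through a reader, the third-party Python library feedparser can do it in a few lines. The feed URL below is only a placeholder; swap in the real feed address of a site you follow:

```python
import feedparser  # third-party: pip install feedparser

# Placeholder URL: substitute the RSS feed of your local paper.
feed = feedparser.parse("https://www.example.com/news/rss")

for entry in feed.entries[:10]:  # the ten newest items
    print(entry.title)
    print("  ", entry.link)
```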
Set up some sensible Google Alerts.
Ever tried Googling the area you work in and adding “crime, police, antisocial behaviour, disorder”? It can get pretty interesting. If you did it every day, you’d learn a lot about that area that you might miss even if you work in the police station. Well, with Google Alerts you can set up a search that will run automatically (there are options for how often), and it will email you the most relevant new results to whatever email address you specify. I have about four different alerts set up, all of which relate to crime / disorder in my area or general crime analysis / mapping techniques. This means that every day I might get 2 or 3 emails about crime in my area, and any new books / articles relating to crime analysis that I can read in my spare time. Talk about making Continuous Professional Development easy!
Don’t be afraid of using social media at work.
Now let’s make this clear, I don’t mean use Facebook at work to whinge about your job. Doing that from home is dodgy enough without doing it from your desk! What you can do though, is to use the more popular social media sites to keep track of what’s happening in your area. There are a range of “hyperlocal” sites cropping up all over the place, and keeping your finger on the pulse can really help you keep track of what is important to your residents and local groups.
The two that I think are the most helpful at the minute are Facebook and Twitter. I have “work accounts” for both. They are separate from my personal accounts, and contain no personal info beyond my name and role. In both accounts, I have links to local groups of interest (such as residents’ associations, local community groups, neighbourhood watches, local councillors etc) and use them to pick up on the issues that start to crop up that I need to keep on top of. I was lucky in that the team I work for understands the value of doing this, and believe me when I say it has paid dividends. It has allowed us to respond very rapidly to hate groups that have sprung up, potentially violent incidents (young people will often schedule fights on Facebook!), keep up to date with community projects and activities that can help us keep our young people occupied and out of trouble, and has generally helped us to improve our standing in the community by showing that we are modern and sensible in how we communicate with the public. We even have our own pages and feeds for some operations and projects we run throughout the year.
Don’t be afraid to speak to your manager and IT department about why you need access. If it is for a genuine reason, and you make the argument in the right way, most forces will “unblock” these sites if they aren’t already. Remember, your CID department will almost certainly have access for the purpose of running investigations. This is just having similar access for another, equally legitimate, policing purpose.

There are, of course, plenty of other tools that I probably haven’t discovered yet, and plenty of websites, associations and online resources that are really helpful. I think there’s probably another post in the wings about some of the sites I feel have helped me. Feel free to make other suggestions in the comments box below (if you can’t see it, click on the title for this post and it should bring it up). If you don’t think the resources above will help you, but have other ways of keeping yourself “plugged in” to your area, share your experience! Above all, the question you should be asking yourself is “do I have a wider understanding of my area than just what the crime data tells me? How can I expand my horizons?”

Friday 26 November 2010

The 5x5x5 system

In my last post I mentioned the 5x5x5 system, which is the standard method of assessing the usefulness of a piece of information in UK policing.
In this post we will discuss the system in more detail, and remind ourselves of how we, as analysts, can use it to help us make appropriate conclusions.
The 5x5x5 system was an evaluation process introduced under the National Intelligence Model to replace the “rule of thumb” evaluation process, and to make the storage and use of intelligence auditable. It is detailed in the Management of Police Information Guidelines published by the NPIA via Centrex in 2006. The benefits of having this formalised system are that it allows us to identify credible information more easily; it means that the information is assessed in a consistent way; it facilitates the sharing of information between intelligence officers; and it provides operational officers with a stronger basis for initiating action.
The way in which the 5x5x5 system does this is by providing an assessment of the intelligence in three areas. These are:
1.       An Evaluation of the Source
2.       An Evaluation of the Data Validity and
3.       An Evaluation of Handling Sensitivity
As you have probably guessed, there are 5 different possible gradings for each area. We will discuss each grading in detail below.
Source Evaluation

The evaluation of the source refers to the assessment given to the person, agency or technical equipment that provides the information. It is important that this evaluation is made separately from an evaluation of the information itself, and, as with the other areas, it is also important that the evaluation is not influenced by personal feelings or bias. The source characteristics which may influence the grading given can include qualities such as the source’s personal characteristics / circumstances, the quality of the information recorded by an electronic device, or the reliability / professional capabilities of the agency providing the information.
The possible gradings for this part of the assessment are:
A.      Always Reliable – This is only used when there is no doubt as to the authenticity, competence and reliability of the source. Usually, this grading will only be used where the information source is “technical” rather than human (such as surveillance video, DNA evidence etc)
B.      Mostly Reliable – This grading is the most commonly used and relates to a source which has, in the majority of instances, proved reliable. This may refer to police officers, regular informants who have proven reliable previously or witnesses.
C.      Sometimes Reliable – This grading is used where the source has proved to be reliable on occasion but also unreliable at times. This may include information from the media, or from an informant who is inconsistently reliable. As analysts, we should exercise caution when using information that is grade C, and should always seek to gain corroboration before trusting this intelligence.
D.      Unreliable – This is not often used, but often refers to information which is known to be maliciously false or from an informant with a likely ulterior motive. As analysts, we should use extreme caution if presented with grade D information. It may still be possible to make use of the information, however, if we know it to be false or it helps us to understand criminal relationships.
E.       Untested Source – Sometimes it will not be possible to make an informed decision as to a source’s reliability. Information from Crimestoppers, for example, is second hand and anonymous, and so will often be graded E. This does not mean that it cannot be used, but it should be treated with caution and corroborated with more reliable information if possible.
Remember, even though this grading system allows us to make an informed choice about how reliable the source is, we should never accept our information at face value. Always seek corroboration where possible.
Data Validity

The evaluation of data validity refers to an assessment of the circumstances in which the data was collected, which helps us to understand how “true” it may be.
1.       Known to be true without reservation – This information will often be collected by technical surveillance or witnessed first hand by a law enforcement officer. Remember, however, that even though the record of the information may be known to be true, it does not automatically mean that the information itself will be true. For instance, an officer may report information that is hearsay. The record may be graded 1, as we know the officer really did report it, but the information itself may not be true.
2.       Known personally to the source but not to the officer – This grade is used where the information is second hand and usually comes from a non-law-enforcement source, such as a witness.
3.       Not known personally to the source but corroborated by information already recorded – This is information that has been passed to a source from a third party, but is backed up by other information (such as CCTV footage).
4.       Not known personally to source and cannot be corroborated – This coding would be used if the source has received the information, but there is no way of cross referencing it to other information. The reliability of this information cannot be judged, and must be treated with caution.
5.       Suspected to be false or malicious – This information is known or suspected to be deliberately untrue. Any information with this code should be corroborated with other information, and should be handled with extreme caution.
Handling Sensitivity

The final area provides an initial risk assessment prior to disseminating the information. The handling codes allow the intelligence officer to decide whether or not to disseminate the information and, if so, to whom.
The first handling code permits dissemination to UK Police Services and other law enforcement agencies as specified. The use of this code permits dissemination to a wide range of police and law enforcement agencies, but only those agencies with a specific need to know the information will receive it.
The second handling code permits dissemination to UK non-prosecuting parties. This code allows for the dissemination of information to partner agencies. The information can be disclosed in full or just certain selections from it.
The third code permits dissemination to non-EU foreign law enforcement agencies. This dissemination is handled by the Serious Organised Crime Agency.
The fourth code restricts the dissemination of the intelligence to within the originating force. Information with this code must be reviewed to ensure that wider dissemination can take place at the earliest possible point. This code should never be used as the default handling code.
The final handling code permits dissemination, but requires the receiving agency to observe specified conditions of handling and dissemination. This code should not be overused, and should only really be considered if there is a clear risk of harm to the source, operation or technique. Again, this code should be constantly reviewed to allow more widespread dissemination.

Advice on Handling Graded Information

As a general rule, it is advisable to restrict our analysis to information graded B2 or above. By this, I mean that the first two codes are A or B and 1 or 2. Information below this grading may help us provide a fuller picture, but the bulk of the information should come from the more reliable gradings.
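If your collated information ever arrives as structured data rather than documents, the “B2 or above” rule is simple to apply in code. This sketch is purely illustrative; the record layout and field names are my own invention, not any force system’s format:

```python
# Hypothetical intelligence records; adapt the field names to however
# your own system exports its gradings.
records = [
    {"summary": "Vehicle seen at location", "source": "B", "validity": 2},
    {"summary": "Anonymous tip on suspect", "source": "E", "validity": 4},
    {"summary": "Officer observation",      "source": "B", "validity": 1},
]

def reliable(record):
    # "B2 or above": first code A or B, second code 1 or 2.
    return record["source"] in ("A", "B") and record["validity"] in (1, 2)

core = [r for r in records if reliable(r)]
context = [r for r in records if not reliable(r)]  # use with extra caution

print(f"{len(core)} records to base the analysis on, "
      f"{len(context)} for context only")
```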

Remember to always check the grading of any information you use in your analysis. The grading can, in itself, be of use to us as analysts. For instance, the reliability of the information can be represented in association charts by representing the links as confirmed, unconfirmed or tentative based on the grading of the information.

The question I pose in this entry is “are you always cautious when reviewing information, and do you always check the grading of information when collating it for your analysis?”.

Thursday 25 November 2010

The intelligence cycle

As far back as 1971 the Godfrey & Harris book “Basic Elements of Intelligence” outlined the principles of the intelligence cycle. This is the cycle by which we still employ intelligence in the course of policing. There are five basic steps, and it is worth recapping them here. If you find yourself tasked with a piece of analysis, and don’t know where to start, then this process can help to get the juices flowing!

1.      Collect the information: In the modern world of policing, this is done through a variety of means. Information can come through a variety of sources, such as informants (or Covert Human Intelligence Sources [CHIS], to give them their proper title); surveillance; victims or witnesses; forensic information; the course of normal police business (stop & search records); or any variety of other means. As analysts, this is often done for us where police data is concerned, but always be thinking about other data sources, such as those from partner agencies, which may be able to help us. The key is to think about what data will be helpful to explain the problem we are looking at.

Note that at this point we refer to our data as information and not intelligence. It is only after the evaluation and interpretation stages that we refer to it as intelligence.

2.      Evaluate the information: In the UK we have a standard evaluation system by which we evaluate the “trustworthiness” of police information. This is commonly referred to as the 5x5x5 system. The information is scored as to its Source (the first 5), its Data Validity (the second 5) and its Handling Sensitivity (the final 5). I will cover the interpretation and detail of the 5x5x5 system in a future post. For data from other sources, it is important that you discuss the intricacies of the data with someone who understands it, so that you can make an educated assessment of how useful it is.

3.      Collate the information: Collation involves the storage of information with cross references for retrieval. By collating in the correct way, the analyst brings together separate pieces of information for comparison, and organises them in a way that makes the next stage easier.

4.      Analyse the information: This is, obviously, the key stage for the analyst. The analysis is the systematic interpretation of the information in order to provide an evidence based understanding of the criminal problem. The analysis stage is, in itself, a series of separate steps.
a.      Data description: The goal of this stage is to assemble and integrate the available relevant information so that its meaning becomes clearer. As crime analysts we use a variety of methods to do this, such as link analysis; event charting; statistical analysis etc.
b.      Inductive reasoning: The next stage is the application of inductive logic to the information presented in the data description. Inductive logic is the mental reasoning process used to infer meaning from specifics and details. By this we mean evaluating each of the constituent known parts of our problem, evaluating their meaning, identifying the ways in which they are related and using this to “go beyond the facts” to explain the criminal problem. This is often the hardest stage, as it involves drawing conclusions that are based on the available data but possibly not directly supported by it.
c.       Hypothesis development: Following the application of inductive logic, it should be possible to suggest one or more hypotheses regarding what is actually “happening” in our crime problem. A hypothesis is simply a tentative but plausible explanation of what is happening. A good hypothesis exists only to be confirmed or denied through further testing, and provides a theory that can focus further data collection to help test it.
d.      Hypothesis testing: After defining our hypothesis, we are usually required to do more data collection to test whether it is correct, or to select from competing hypotheses. We gather whatever further information is necessary to prove or disprove each hypothesis until only one is left. This final, proven, hypothesis is known as the Inference and is our stated explanation for the existence of the crime problem.
e.       Statement of Inference: The final stage is to construct our inference so that we are presenting a concise and clear explanation of the problem. A good inference should contain all of “Kipling’s 6 wise men”, namely Who, What, When, Where, Why and How. The order itself is not important, as long as what is presented is a clear, easily understood and memorable statement.

5.      Disseminate the final analysis: The dissemination of the completed intelligence analysis from analyst to user is a key stage of the process. The best intelligence in the world is useless if its significance and meaning are lost at this stage. I would encourage analysts to consider how they present their information. Too often we fall back into the mould of presenting a Word document that looks the same as every other analytical product, and which we know will lie unread on a desk somewhere. Consider holding a presentation, or giving a verbal briefing with a written supporting document. Where possible, hand over the document in a prearranged meeting, as this will allow you to explain the analysis and answer any questions, as well as ensuring the content of the document is communicated. Always remember that the dissemination should be tailored to the needs of the audience and the case in hand.

This is a very brief description of the full intelligence analysis process. The question you should ask yourself is "Do I follow this process when I am producing analytical products, and if not, why not?".

Tuesday 23 November 2010

The role of the analyst

It is only comparatively recently that analysis has had a recognised role in policing. Policing itself goes back hundreds of years, and police forces can be traced back to 1667 Paris (or the Peelers in 1829 London, depending on who you ask!). Crime analysis in the UK has only been approached in a structured way since the early 1970s, however.

After 40 years, crime analysis is still finding its place in policing. We are still faced with colleagues who don’t quite understand what analysis is, and requests for work that reflect a lack of understanding of what analysis can provide.

We often hear phrases such as “intelligence led approach” and “evidence based policing”, but we don’t often hear a good explanation of what these mean. Too often, they are phrases that are used to promote work which, in fact, has little or no proper analysis underpinning it.

Central to the philosophy of “intelligence led” or “evidence based” policing is the notion that decisions should only be made following a good assessment of what has happened. This allows correct, informed decisions to be made, which will have a real impact on the crime problem.

The role of the analyst, therefore, is to undertake that assessment. The analyst provides a clear understanding of the problem, by way of systematic analysis, which allows decision makers to design an appropriate response.

Please note that what I am not saying is that the analyst recommends the response. Too often the analyst is expected to design the solution to a problem based on their analysis. This is beyond the scope of an analyst’s role, and in many cases the analyst will not be qualified enough, experienced enough or expert enough to suggest the correct response. The decision as to what should be done in response to the problem is the responsibility of others, based on the understanding provided by the analyst.

In order to do this right, of course, the analyst and the decision maker need to understand each other’s role. Each will have expectations of the other. The decision maker will expect the analyst to provide a clear understanding of the picture, and to get beyond the data to provide an explanation as to “why” the crime problem exists. They will also expect the analysis to be timely, focussed, forward looking, understandable and aimed at solutions rather than just describing the issue.

The analyst, on the other hand, will expect to be provided with clear direction, quality data, access to appropriate resources and systems, support and appropriate time to produce the work.

Too often, however, these expectations are not met. The main frustration for analysts is to be vaguely commissioned, with very short deadlines and often without access to appropriate or quality data. The result is that the decision maker is presented with a document that doesn’t tell them anything they don’t already know and doesn’t help them to make an “intelligent” decision as to the response.

The question you should be asking yourself is “Is my role clear, and am I commissioned / commissioning properly? If not, what should I be doing about it?”.  

Sunday 21 November 2010

Hello!

Why Hello there!

First of all, thank you for visiting my blog. If you have found yourself here, then you are in all likelihood a crime analyst, a student of crime analysis or a manager of crime analysts.

If so, then I hope this blog can help you to understand what crime analysis is, what it should be, and what it can do to help policing in the United Kingdom.

Secondly, it's important I get a big thank you out to Scott Dickson, author of the original Crime Analysts Blog. His advice and support have been greatly appreciated and I fully recommend reading his blog at http://www.crimeanalystblog.net/

The reasons for me writing this blog are essentially twofold.

Firstly, I have always been aware that crime analysts in UK police forces are generally not well understood, particularly by some of the people who commission "analysis". The work that an analyst is tasked to do is often not analysis, but rather a description of facts and figures that simply state what is already known. I feel that as analysts, we need to be training the people who use our analysis in what we can do for them. As I always say, and you will hear me say over and over again if you continue to read this blog, an analyst’s job is to provide an understanding of the problem so that the decision makers can design an appropriate response. This holds true if you are a partnership analyst, strategic analyst, CSP analyst, intelligence analyst or any of the other myriad types of analyst that have sprung up across the country over the last few years.

My second reason for writing this blog is to do with the challenges being faced by police forces all over the UK today. Over the next few years a lot of people you know will lose their jobs. Police forces are having to take a hard look at what they do, and decide which members of their policing family are indispensable (and, by default, which roles can go). Analysts are not immune from these cuts. I have already seen reductions in the senior analytical team at two forces, and the loss of the head of analysis at one.

As analysts, one of the things we do not do well is communicate to managers the value of our input. We need to get better at providing products and advice that help our managers provide a better service. In order to do this, we need to get back to basics. I believe that a lot of analysts deskill quite quickly after initial training (if you are lucky enough to get some!). This isn’t any comment on the skill of the analyst, but it is the natural outcome if we don’t constantly practice and utilise the tools at our disposal. This blog, therefore, will provide a recap of basic techniques, theories and concepts so that as analysts we can remind ourselves how best to provide good, well rounded analysis.

I am aiming to write at least twice a week, and welcome any suggestions for content or comments that you have.  To begin with, this week I will start by looking at the role of the analyst and basic concepts in analysis.
Until then, the question you should be asking yourself after reading this is “do I use all the skills at my disposal, or are there techniques I should brush up on?”.
Rory