Guest blog by Oliver Walker, Head of Conversion Rate Optimisation at Periscopix:

I finished my undergraduate degree in Psychology without much of a clue about what to do next. However, with a keen interest in understanding why people behave the way they do, I completed a master’s in Consumer Psychology and Business, with a dissertation on the effects of advertising. This helped me to realise that my passion lay in understanding what factors affect a person’s propensity to buy – and then determining how to influence that (in the least sinister-sounding way possible!)

Initially I wanted to put this into practice in physical stores, focusing on how the layout of a store, or the music played within it, would affect sales (it really does!) However, this being the ‘noughties’, it swiftly became clear that the future was digital, and that I would be better off swapping stores for sites, and window displays for web analytics. I was fortunate enough to join Periscopix, a digital agency, when it was tiny, and with a like-minded colleague I set up a team dedicated to helping businesses understand what was happening on their websites in order to improve them.

We work with organisations of every discipline and size imaginable, from the smallest local charity to the largest retailers and brands in the UK, such as AutoTrader, Zoopla and Tesco. It is with the largest of these businesses that we get into the world of Big Data: with monthly website volumes creeping up to half a billion pageviews, it’s no simple job to analyse what is and isn’t performing well. However, as Head of Conversion Rate Optimisation, our team of Web Analysts and I face these challenges on a weekly basis.

My current role

I head up one of two divisions within the Web Analytics team. Whilst one team focuses on creating the data, my team focuses on using the data. This could be analysing the performance of websites and apps, looking to group different types of users into segments to market to, or analysing the sequence of marketing channels a user interacts with to better understand their performance. In this sense, the data we work with might range from interpreting simple tables, to exporting millions or billions of rows of data to interrogate. Our overall objective is to use data to help our clients improve what they’re doing.

The work environment

Within our team of 26 people, and the wider company of 150, there’s a huge culture of knowledge sharing. We have weekly team meetings where we share our results of projects, tips and new tools, and monthly company-wide webinars where each department picks some useful examples of their work. The other arm of the knowledge-sharing is the fostering of an environment where we are constantly challenged to develop, learn and iterate in order to improve ourselves and the work we do. As above, this could be trying new techniques or new tools and then sharing with the wider team. Both the knowledge-sharing and the culture of learning lead to a company that’s committed to continually doing the best work we can for our clients, which has helped us grow from the ten people we were when I started, to the 150 we are now!

Tips and recommendations

Based on my experience, there are some simple tips and recommendations I can pass on to help you overcome the obstacles involved in tackling big data…

Be inquisitive

The most important recommendation I can offer when faced with big data is to be inquisitive! You might not know how to use SPSS, R or Tableau right now, but starting with a curious and creative attitude towards what you want to achieve with the data will stand you in the best stead in the long run. Everything else can be learned, so these are the traits we most often look for when hiring new graduates, for example.

Group and segment data

Another recommendation is to group and segment data where it makes sense to do so, breaking it down into more manageable, and more powerful, chunks. In my role, that could mean taking different page URLs on a website and grouping them into sections – for example, comparing the performance of Shoe pages versus Jacket pages – looking to understand whether certain types of marketing channels behave differently on a website, or trying to understand whether one version of a page layout works better than another.
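To make the grouping idea concrete, here is a minimal sketch in Python with pandas (a stand-in for whichever tool you use – the URLs and figures below are invented purely for illustration):

```python
import pandas as pd

# Hypothetical pageview data: each row is a URL with its traffic and sales.
df = pd.DataFrame({
    "url": ["/shoes/trainers", "/shoes/boots", "/jackets/parkas",
            "/jackets/denim", "/shoes/sandals"],
    "pageviews": [1200, 800, 950, 400, 650],
    "conversions": [60, 32, 19, 8, 26],
})

# Derive a section from the first path segment of each URL.
df["section"] = df["url"].str.split("/").str[1]

# Aggregate to section level and compare conversion rates.
by_section = df.groupby("section")[["pageviews", "conversions"]].sum()
by_section["conv_rate"] = by_section["conversions"] / by_section["pageviews"]
print(by_section)
```

A handful of section-level rates like these are far easier to act on than thousands of individual URLs.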

Compare data

If you’re comparing time-series data, e.g. to understand which version of a page led to more sales, then it makes sense to test that data using statistical tools like SPSS. This software used to have me in cold sweats back at university, but employing it helps to clarify whether the effect you’ve observed is down to chance or to the change you made (fingers crossed!)
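The same chance-versus-change question can be asked with a standard two-proportion z-test – here is a small sketch in plain Python rather than SPSS, using made-up A/B figures:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: version A made 120 sales from 5,000 visitors,
# version B made 165 sales from 5,000 visitors.
z, p = two_proportion_z_test(120, 5000, 165, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the difference between the two versions is unlikely to be down to chance alone.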

Look for patterns

Equally important is taking your data sets and looking for patterns amongst them. Again, was a change observed because there was a correlation between the two sets of figures, or was it sheer luck? We use a program called R to explore these relationships, for example looking at whether TV advertising correlates with website traffic, or trying to understand whether a shift in global stock market prices affects job applications to finance companies.


Plan ahead

Finally, the most important point is to plan. Taking a file with billions of rows of data and hoping to spot something useful is virtually impossible. Taking the time to plan what you want to look at, why you want to look at it, and how you’re going to visualise it is imperative if you’re to avoid sinking in the data. In particular, visualising the result of your analysis is arguably the single most important part – you might have the best piece of insight ever, but without effective visualisation it might be hard to get across. There are loads of great tools and languages (Tableau, Google Fusion Tables, Java) and examples of great visualisations of big data out there. A couple of my favourites are the UK government visualising how users navigate across their website and The Guardian’s mapped “London life”.