How to perform an audit of your data
By Simon Hollingworth

In a recent blog, ‘Your data not big data’, we looked at the importance of not getting swept away by buzzwords such as ‘big data’ and of keeping your focus on your own data and releasing its value.

In this blog we are going to delve in more detail into the practical steps we touched on, and highlight specific actions that may help resolve existing data challenges.

Three key steps to undertaking a data audit

1. Where is your data?

In many organisations, processes have evolved over the years: new data being collected, additional users needing to store and access data or, indeed, entire new databases and systems being introduced via acquisition. Whatever the circumstances, organisations often find themselves with multiple data sources stored in multiple places and in multiple ways.

To understand your data in the best possible way, it’s important you can gain one central view of it. Consolidating multiple data sources and creating a central management reporting system is something we helped Clarks do. The result? The project delivered not only time savings but also cost savings, giving management access to the information they need to negotiate better deals on materials and production costs.

2. Who uses your data?

When you are undertaking your data audit and working through this question, don’t forget to capture how your users are accessing the data too. You may well have people accessing it from a central office, from home or whilst out on the road; what’s more, it’s likely being accessed via a mixture of desktop and mobile devices. This is all important information.

If you are really looking to unlock the true value of your data, gaining user buy-in from the start will be essential. A great user experience, regardless of the device being used and where the system is being accessed from, is a great way to achieve this. Hear how an offline mobile app we developed for RSA has provided a real-time view of data even for Risk Managers on the move.

3. How easy is it to get to the data you need?

If you need to access, download and interpret your organisation’s data quickly, are you able to? Does it involve clicking a ‘run report’ button, or hours of downloading and manipulating spreadsheets? If it is the latter, there is no question it is making informed business decisions harder and slower than they should be.

Being able to access and interpret your data gives organisations a competitive edge. It allows you to spot new opportunities, influence business growth, make better decisions, improve customer experience and much more.

Knowing where to start when faced with a plethora of data can be daunting. Discover how Save the Children turned to Software Solved to help them collect and visualise data worldwide. The result? Instant access to data has helped improve treatment outcomes.

If you’re unsure where to start to unlock the potential of your organisation’s data, get in touch. We’d love to help.

Mobile overtakes desktop – are you prepared?
By Simon Hollingworth

How many times do you check your mobile device in a day? Whether you’re accessing emails, text messages, your organisation’s CRM, social media, the camera or something else, it quickly adds up. In fact, it adds up so much that here in the UK we are estimated to check our mobile devices an average of 50 times a day.

What’s more, this use of mobile devices has grown so much that in October 2016, for the first time ever, mobile and tablet internet usage overtook desktop usage worldwide. The difference may only be marginal in percentage terms, but it without question marks a shift that has been building for some time.

Mobile usage overtakes desktop


With this shifting usage in mind, the big question is: are your customer- and/or staff-facing systems prepared for mobile and tablet?

The reality is that if your organisation is not mobile friendly, or quickly getting there, a competitor or two likely will be.

Consider this: when you access a site via a mobile device to make a booking, check a balance, look up some details, get a quote or something else, what do you do if it is hard to use, slow and anything but intuitive? The likelihood is you look elsewhere.

Ensuring your systems provide the best user experience, regardless of the device your visitors or staff are using, is a key component of retaining a competitive edge and a great customer experience.

Key components for making your systems mobile friendly

  • Strategy – what will the mobile systems help you achieve? Increased revenue, improved customer experience, increased staff productivity, competitive edge or something else?
  • Integration – making sure that new mobile front end interfaces speak to existing back-end systems is vital. Becoming mobile friendly shouldn’t see the development of new disparate systems.
  • Design – whether the system is for staff or customers the user experience should be a priority and the design and layout will help make this happen.
  • Technology – there are many technology options, and working with experts will ensure you get the right solution, built in the technology that makes your requirements a reality.

Hear how our team developed an innovative offline web app for Save the Children that allows data to be collected efficiently, worldwide.

Big data or your data?
By Simon Hollingworth

When you hear the term ‘big data’, what is your reaction? Does it leave you wondering what it means for your organisation? Does it instil fear and avoidance? Or do you ask, ‘is it relevant to us at all?’

With the tech industry swamped with acronyms and buzzwords, ‘big data’ is unquestionably another one, but one that it is important organisations don’t get swept away with unnecessarily.

A 2016 report from IBM stated that 90% of the world’s data had been created in the previous two years. Pretty staggering when you think about it, but what has driven this vast increase? Predominantly, it can be put down to the rise of both mobile devices and social media.

When you stop and think about the volume of data individuals put on Facebook before their profile is even live, and then multiply that by the total number of users, you can see why and how the term ‘big data’ comes into play. And that’s just one part of one social media channel.

Although the world’s data is set to continue growing in the coming years, it is important that organisations remain focused on their own data and the value it does, or can, provide. Getting hung up on ‘big data’ could prove costly and result in the creation of a plethora of data that is irrelevant and adds no value to your organisation.

Understand your data

The data you have collected, and plan to collect, is what’s important to your business, and it should be the driving force and central focus of your data strategy. Data grows quickly, and when it does it more often than not ends up residing in multiple standalone spreadsheets or systems across the organisation. In this ‘silo’ state it can be hard to analyse accurately and in real time, but the businesses that do manage it are without question better armed to make better decisions, influence business growth and spot new opportunities.

How to identify ‘your data’

Your data, not big data, has the power to improve the performance of your business and your customer relationships.

Conducting a data audit is a good place to start to get a better understanding of your data and the steps that need to be taken to get the most out of it.

  • Where is your data? Is your data currently in multiple spreadsheets? Is it stored centrally or locally? Is it in a data warehouse or central database?
  • Who uses your data? What do they use the data to do? Does the data educate or inform decisions they are making? How do they access the data, on the move or in the office?
  • How easy is it to get the data you need? Are you able to quickly access and download data you need? Is the process automated or manual and laborious?

These key questions may appear oversimplified, but answering them is crucial to ensuring that data can be analysed and utilised in a way that adds value and uncovers new opportunities for your business.

We’ll be covering more about these key steps in the coming weeks.

Discover how Software Solved worked with Clarks to consolidate raw materials data globally to help save £millions in its supply chain.

Monitoring .Net Applications for Performance and Health
By Andrew Page


As part of the support service that we provide to our customers, we undertake to regularly monitor the health and performance of our applications. This way, we can proactively resolve any problems that may be impairing the quality of the user’s experience.

For applications that are deployed to the cloud using managed services, a host of monitoring facilities are commonly provided as part of the underlying service from the hosting provider. For example, various health and performance monitoring widgets are available on the Windows Azure Portal to allow us to monitor the services we have subscribed to (including App Services for our websites).

However, for applications that have not been deployed to cloud managed services, it is necessary for us to configure the monitoring tools ourselves. Examples of such deployments include an application deployed to on-premises resources (such as a set of servers in an in-house data centre) or an application deployed to a set of virtual machines in the cloud.

The technology provided by Microsoft for such a task is ‘System Center Operations Manager’ (SCOM). We have recently configured the 2012 version to monitor an enterprise application, with good results. During this process we formed the impression that the current documentation unfortunately fails to address certain points, as we were obliged to find the answers to various questions through our own experimentation.

The primary source of documentation that we have used has been ‘System Center 2012 Operations Manager Unleashed’ by Kerrie Meyler and Cameron Fuller. Whilst this is a generally well-written book, we have found that certain essential pieces of guidance seem to be missing. I would recommend using this book to provide the fundamental knowledge required; however, we have written this blog to provide answers to the outstanding questions. This blog does not attempt to replace the book, as it covers nowhere near the same breadth of scope, but it should provide you with the essentials for the specific case of configuring the technology to monitor an ASP.Net application (from the perspective of the server). More specialised or elaborate monitoring scenarios may be set up, but these are outside the scope of this blog.

The assumption has been made that SCOM 2012 has been installed on a server residing in the same Active Directory domain as the server(s) to be monitored. Please ensure that the management pack ‘.NET Application Performance Monitoring’ (APM) has been installed, as this will be essential to the following sections.

Basic Configuration of the Monitoring Agents

With the way that we intend to use SCOM 2012, monitoring agents will be deployed to each target server. The monitoring agents will be used to collect the health and performance data that we will be analysing.

To deploy or manage the agents, open the Operations Console and navigate to Administration / Device Management / Agent Managed on the menu. The grid in the central section of the screen should list the agents currently deployed. If the server in question does not appear in the list, use the ‘Discovery Wizard’ to search for it and add it to the list. The wizard may be found by opening the context menu on the ‘Agent Managed’ node.

Once added, your agent should appear with a green (healthy) tick against it in the grid. If the tick appears in grey, it may be necessary to invoke ‘Repair’ by opening the context menu on the row that corresponds to the agent in the grid.

The installed agent should appear as two Windows Services with the names below on the target server:

  • Microsoft Monitoring Agent
  • Microsoft Monitoring Agent APM

Configuration of the Monitor

Please read chapter 15, ‘Monitoring .NET Applications’, of the book referred to above for an overview of the process.

To configure the APM monitor, navigate to Authoring / Management Pack Templates / .NET Application Performance Monitoring

Select ‘Add Monitoring Wizard’. Follow the steps detailed in the above chapter of the book to configure the new monitor.

Note that the ‘Performance event threshold’ figure will need to be finely tuned over time in order to strike the correct balance between over- and under-reporting events.

Analysing System Performance Alerts

With the monitor configured, any system performance alerts that are recorded will be accessible under the following menu item:

Monitoring / Application Monitoring / .NET Monitoring / <monitor name> / <application name> / Overall Component Health

The individual alert may be selected, with the details shown at the foot of the screen.

Automatic notifications of alerts may be configured so that the details can be delivered via a medium such as email to the relevant addresses.

Reporting System Load Measurements

With the monitor configured, it will be possible to access data pertaining to system load levels via the following menu item:

Monitoring / Application Monitoring / .NET Monitoring / <monitor name> / <application name> / All Performance Data

Select the required counter; we recommend ‘Monitored requests per sec’.

Measuring System Response Times

To measure the response times of a web application reliably and periodically, it is necessary to configure SCOM 2012 to play a defined web session repeatedly on a scheduled basis. In our case, the recorded web session included the following steps:

  • Log in to the application
  • Navigate to a particular search function
  • Perform the search
  • Log out of the application

SCOM 2012 was configured to play the above steps every two minutes, and to record the overall time taken.

Please read chapter 17, ‘Using Synthetic Transactions’, of the book referred to above for an overview of the process.

Follow these steps for the configuration:

  1. Record the web session

Navigate to Authoring / Management Pack Templates / Web Application Transaction Monitoring

Invoke ‘Add Monitoring Wizard’

Select ‘Web Application Transaction Monitoring’, click ‘Next’

Enter a name for the web application, and select ‘Default Management Pack’, click ‘Next’

Enter the URL of the web application, click ‘Next’

Select the server upon which the process will be executed (we recommend selecting the server that SCOM 2012 is running on), click ‘Next’

Select ‘Configure advanced options or record a browser session’, click ‘Create’

Select ‘Start Capture’

At this point, the default browser should appear with the ‘Web Recorder’ plug-in visible and running (on the left of the window). If the Web Recorder is not present, then

  1. You may need to temporarily disable ‘IE Enhanced Security Configuration’ (if using Internet Explorer on a server). Follow this article for details on how to do this.
  2. You may also need to make SCOM 2012 open the correct version (32 bit or 64 bit) of IE. Follow this article for details on how to do this.

Record the web session, and then click ‘Stop’

2. Adjust the recorded web session to operate with the authentication mechanism of your website

If your website utilises Active Directory authentication, then follow the steps in the previously referenced chapter 17 to configure this.

If (as in our case), your website uses Forms Authentication, then you will need to follow the additional configuration steps below:

You will need to add to your web application a method by which to obtain the session and authentication IDs. In the case of our ASP.Net MVC application (which holds these IDs in cookies), we added a new Action Method that returns these values.
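By way of illustration only, such an action method might look like the sketch below. The controller and action names here are assumptions for the example, not the original code; it simply reads the relevant cookies and returns their values as plain text so that a recorded web session can capture them.

```csharp
using System.Web;
using System.Web.Mvc;
using System.Web.Security;

public class DiagnosticsController : Controller
{
    // Illustrative action: returns the ASP.NET session ID and the
    // Forms Authentication ticket value for the current request.
    [Authorize]
    public ActionResult SessionInfo()
    {
        HttpCookie sessionCookie = Request.Cookies["ASP.NET_SessionId"];
        HttpCookie authCookie = Request.Cookies[FormsAuthentication.FormsCookieName];

        string sessionId = sessionCookie != null ? sessionCookie.Value : string.Empty;
        string authId = authCookie != null ? authCookie.Value : string.Empty;

        return Content("SessionId=" + sessionId + "; AuthId=" + authId, "text/plain");
    }
}
```

Note that exposing session and authentication identifiers is a security risk, so in practice access to an endpoint like this should be tightly restricted.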

How to comply with GDPR
By Simon Hollingworth

In our previous blog, What is GDPR?, we talked about what GDPR is and gave an overview of what it means for your business or charity. In this article, we provide some practical steps for getting your data processes up to scratch before GDPR comes into force in May 2018.

What should you do next to prepare your business or charity for GDPR?

1. Consolidate data

It will be much easier to ensure compliance if you know where all your data is. Document everything you hold across all departments and locations. Creating data maps through data workshops is a useful technique to get started.

2. Understand users

Are your users making copies of customer databases to work remotely, stored on desktops, on mobiles or in the cloud? This data should be subject to the same data protection compliance; alternatively, you could create systems that give access to central databases on the go.

3. Review consent

Consent cannot be inferred. Do you need to replace pre-ticked boxes on your websites or user interfaces? Are you recording how consent was given and providing the right to opt out?

4. Data extraction

Be ready for data requests. How easy is it to comply with information requests from individuals? Review your processes and systems to ensure that data extraction and updates are intuitive and quick.

5. Privacy by design

Ensure that any new systems developed or adopted comply with privacy from the outset. This is particularly the case with automation or integration software projects.

For help getting your data processes and software ready for GDPR, get in touch with a data expert today.

Software Solved 2017 tech trends
By Mark Reed

When I was asked to put together a blog I naturally had a quick browse of the net to see what the ‘experts’ were predicting. In many cases I saw the usual suspects: AI, VR, Machine Learning and IoT still making an appearance. But I also came across many buzzword titles where the text often did not actually match the heading, i.e. live up to the claim. Here I have included the top five common themes for software in 2017 and pointed out where the terminology may be a bit awry.

5 technology and software trends for 2017

User experience/user-centred design

The software development community has, in the past, been slow to realise that users are king. Their appreciation (or lack thereof) defines the reputation of a software solution. By making the user’s experience central to the design and evolution of a system, giving them the smoothest possible path to complete the tasks they need to accomplish, you can achieve far greater success. This success may be measured as competitive advantage, increased longevity of the solution or just a happier user base. This is not a new concept, but the prominence of this principle has been increasing in recent years, and I expect that to continue into 2017, with it becoming routine across the industry.

Big Data (or just big value from your data)

Big data is one of those buzzwords I was referring to in my intro. It is mentioned a lot as a coming trend, and in most cases the author is not referring to the strict meaning of the term. I suspect ‘Big Data’ in its true sense will always be quite niche, in that relatively few organisations want or need to process the volumes of data that truly fall into this category. However, what many people are driving at when they talk about ‘Big Data’ is being able to consolidate data from a number of disparate sources and spot patterns, extracting meaningful information to drive business strategy. The good news is that all of this is perfectly possible without adopting a costly big data platform, and it is an area of considerable growth.
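As a minimal sketch of that kind of consolidation (the sources, names and figures below are invented purely for illustration), a LINQ query in C# can join two exported data sets on a shared key and aggregate them into a single consolidated view:

```csharp
using System;
using System.Linq;

class Consolidation
{
    static void Main()
    {
        // Two 'disparate sources': a CRM export and a finance export.
        var crm = new[]
        {
            new { CustomerId = 1, Name = "Acme" },
            new { CustomerId = 2, Name = "Globex" }
        };
        var invoices = new[]
        {
            new { CustomerId = 1, Amount = 1200m },
            new { CustomerId = 1, Amount = 300m },
            new { CustomerId = 2, Amount = 950m }
        };

        // Join on the shared key and aggregate into one view per customer.
        var consolidated =
            from c in crm
            join i in invoices on c.CustomerId equals i.CustomerId into inv
            select new { c.Name, Total = inv.Sum(x => x.Amount) };

        foreach (var row in consolidated)
            Console.WriteLine(row.Name + ": " + row.Total);
    }
}
```

In practice the sources would be databases or spreadsheets rather than in-memory arrays, but the principle of joining on a common key and aggregating is the same.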

More moves to the cloud

It is over 10 years since Amazon launched AWS, and the cloud computing marketplace now has an air of normality and stability that it may have lacked in the past. Where once there was fear of leaping into the unknown with the cloud’s “Platform-as-a-Service” offerings, serverless computing is now a reality, and you are more likely to be challenged as to why a solution should not be deployed to the cloud. This is likely to increase further in 2017: with Microsoft now offering most services from their two new UK datacentres, the justifications for ignoring the cloud are getting fewer and fewer.


Chat-based collaboration tools

The way we communicate with our colleagues is changing. With remote working on the increase and the burgeoning popularity of Slack and other chat-based collaboration tools, system-generated email notifications could soon be seen as old hat. Microsoft has recently unveiled Teams as a new tool within Office365 as a direct competitor in this space, and we expect to see increasing requirements to integrate with these types of tools.

Shadow IT

Shadow IT has long been the scourge of IT departments: the potential downsides of direct engagement between IT suppliers and the business, and the ramifications of ‘citizen developers’, are easily identified. However, attitudes toward shadow IT appear to be gradually changing, partly along the same theme highlighted under the user-centred design heading. The assumption that ‘citizen developers’ are incapable of writing passable tools, or that departments engaging directly with IT suppliers will make poor decisions without hand-holding, is arguably condescending. With resources stretched in many internal IT departments, 2017 could be the year we see more organisations embrace the power of shadow IT.

For further guidance about how technology trends and developments could improve your business, get in touch today.