Technology companies are often guilty of constantly releasing new innovations and promoting the ‘latest and greatest’ from their offering, while forgetting that their recent ‘latest’ technologies may still only be just catching on with their customers. This is certainly demonstrated in the wider traction and implementation of artificial intelligence (AI), Internet of Things (IoT), and machine learning (ML), which are being deployed in production IT environments as businesses look for ways to improve productivity and efficiency, and better serve their customers. A good example of this is Enterprise Resource Planning software, which can incorporate IoT data, AI, and ML to help with all manner of business processes from monthly financial closure to streamlining production lines.
To take advantage of AI, IoT, and ML, an organisation’s data has to be in top quality condition. However, the disruption caused by COVID-19 is likely to have thrown even the most diligent data managers and data management strategies off course. Now is the time to get to know your data, and reap the benefits that accelerated digitisation can bring to your business.
The need to provision employees with the ability to work from home at short notice, and at varying times depending on their function or location, has not only generated more company data but also driven the proliferation of local data stores. Organisations and IT teams that do not know the scale of this proliferation need to find out as a matter of urgency. Data is being generated even faster than before, as the pandemic has driven a rise in the use of cloud solutions, growth in the number and speed of digitisation projects, and an increase in customers' desire to purchase and communicate online.
Research from McKinsey clearly demonstrates the pace of change, with findings showing that digitisation was implemented 20 to 25 times faster during the pandemic than firms would have expected before COVID-19. The adoption of advanced technologies in operations was 25 times faster, moving assets to the cloud was 24 times faster, and the use of advanced technologies for business decision-making also rose.
McKinsey also found that customer interactions have become more digitised than ever, with customer demand for online purchasing and services increasing 27 times faster than firms would expect under more usual circumstances, and customer needs and expectations accelerating 24 times faster.
Together, these factors create a perfect storm of data proliferation. More customers online means more data, which means more raw material for your AI and ML systems to turn into powerful information. The accelerated growth of both cloud and digitisation projects means organisations must now get to know and harness their data.
Out of sight, out of mind
Data has exploded in volume and has been scattered across a myriad of locations including public cloud environments, data centres, remote offices and the edge, often with little global oversight. At each of these locations, data is isolated in specialised infrastructure for functions such as backup, disaster recovery, audit/compliance, network storage, archiving, dev/test, and analytics, often from multiple vendors.
To make matters worse, there are likely to be silos within silos. For example, a single backup solution can require several dedicated infrastructure components, such as backup software, master and media servers, target storage, deduplication appliances, and cloud gateways, each of which may hold a copy of a given data source. Moreover, each component may come from a different vendor with its own user interface and support contracts. It is not unusual to find four or more separate configurations simply to perform backup for different data sources.
These infrastructure silos have a knock-on impact on operational efficiency. There is typically no sharing of data between functions, so storage tends to be over-provisioned for each silo rather than pooled. Likewise, multiple copies of the same data are propagated between silos, taking up unnecessary storage space. According to IDC, 60% of storage budgets go towards storing copies of data alone.
Getting to know your data
On the flipside, organisations with strong data management solutions or strategies in place may already be gaining more insight into customer behaviour than before the pandemic, as a result of their improved data management systems and increased volume of data. These organisations will also be ahead of the rest when implementing communication and customer retention strategies. For everyone else – the 'legacy laggards' – an action plan built firmly around maximising the use of data is needed.
The first step is to take stock of your data. Establish key details such as: what has changed since before the pandemic struck; where your data is stored; its quality and relevance; whether there is duplication; who has access to what; whether strategies are in place for collection and retention; and whether the correct policies and protocols are applied. All of this information can inform what happens next when managing your data. Consider where your organisation needs to be in a year's time, or even five years' time, when it comes to data management. How will data, AI, ML, or even IoT (if relevant), be leveraged? Defining the data and cloud-based needs of your organisation is also necessary.
Armed with an understanding of where the organisation currently is and where it needs to be, you will be able to chart the course towards your goals. This is where data silos get eliminated, the choice between cloud and on-premises storage becomes clear, and you are able to reduce the possibility of duplicate data existing. The route map may also need to account for the specific nuances of your organisation, including its technology. Workgroups that have traditionally handled their own datasets, or that were free to develop this approach during the pandemic, will need to become more attuned to a centralised approach.
Data is fundamental to innovation. However, few organisations truly use their data as a strategic asset. Many IT teams struggle simply to meet basic service-level agreements for protection and availability, let alone leveraging their data for competitive advantage. The challenges brought about by the pandemic have provided those on the front foot with gains and opportunities, while the rest simply go through one fire drill after another.
Undoubtedly, some organisations will be better placed than others when it comes to knowing their data. Without this knowledge, however, the data-led analytics and innovation that underpin AI, ML, and IoT will be less effective. Many organisations have been given the gift of data during the pandemic, and now is the time to get to know it. Otherwise it will simply remain an organisational headache, instead of becoming a powerful enabler.