The Challenges of Seamless Data Management

This article is sponsored by Lenovo.

The past year has been an extraordinary time for business. Old business models have been upended, and new ways of doing business have been constructed. For so many enterprises, five- to seven-year blueprints for digital transformation were squeezed into a couple of months. With all that, organisations are finding that their data now resides in multiple locations, and is growing at a rapid rate. How can enterprises effectively manage their data? What are some security and storage challenges that have arisen?

In a virtual webinar hosted by Lenovo and organised by Jicara Media, senior IT executives from Lenovo, Veeam Software, Liberty Insurance, and Frost & Sullivan discussed some helpful ways to achieve seamless data management.

The forum explored the aspects of data management, protection, and security that need to be modernised so that enterprises can transition from legacy infrastructure to a unified, platform-centric future. The panelists discussed how businesses have sought to not only survive but thrive and increase productivity in this era of chaos, in which old ways were taken apart and reformulated according to new needs and business models – all supported by the backbone of technology.

Kashish Karnick, Senior Product Manager, Data Management, Lenovo Asia Pacific, began the discussion by talking about the current circumstances and Lenovo’s unique approach to technologically enabling enterprises.

“A lot of the old rules have been set aside because of the pandemic; everything has accelerated on the technology side. At Lenovo, we’ve always had the approach of adapting our portfolio to the customer’s IT strategy, rather than telling them what to do – approaching problems at a stack level rather than delving into products, understanding and listening to where they want to go and how we can help them get there,” he said.

“Lenovo’s brand is strong – but the company’s capabilities extend beyond PCs, tablets, and laptops into the realm of AI, telco, cloud, and edge computing. We’re working with a lot of telco vendors to also expand our solution base with them. On the data centre side, we have the Infrastructure Services Group, and storage is just one part of that. We’ve got edge and core solutions, we’ve got the entire compute layer, and networking as well. The company is creating solutions that incorporate the best of its full line of products – to create solutions for the future,” he revealed.

How the pandemic has affected enterprises

Krishna Baidya, Director – Information & Communications Technologies, APAC, Frost & Sullivan, highlighted three IT trends brought about by the pandemic.

The first, he said, is that organisations are compelled to do more with less. When budgets do not flow as freely as before, companies are challenged to deliver business value, beyond just maintaining business continuity.

Secondly, enterprises have been forced to rethink how the business is managed. The pandemic has forced businesses to reimagine and rethink business models. Once organisations gain a foothold in their market during the disruption, the crucial thing for them is to innovate to achieve a competitive advantage.

Thirdly, he explained, companies have had to adopt a digital-first strategy. The envelope is being pushed faster today than any time in history. Converting to digital has been a lifeline for many organisations, even for small and medium-sized businesses.

Such change, according to Baidya, has brought about a host of challenges for enterprises.

He explained, “The question for many companies has been: how? Many organisations lack the expertise, the knowledge internally, both from a technology and also from a process perspective. And the question for them has been how to pivot on existing business models. In a nutshell, they’re also having to manage the arduous task of balancing experience, security, and cost.”

For Sourabh Chitrachar, Regional VP (Asia) – IT Transformation and Strategy, Liberty Insurance, the pandemic has meant a new focus on digital customer experience, which has made data extremely important.

He said, “People are really paying attention to data strategy, digital strategy, and ways of making the experience seamless for customers, partners, and employees. There is a lot more focus on a customer-experience, outside-in view rather than a purely inside-out view. Earlier, in insurance, the distribution team would need to physically meet prospective clients and partners in order to enhance business growth. That’s not the case now, as most of the business is being done remotely.”

Highlighting the importance of building partnerships, Karnick said, “We too have seen significant growth in requirements from customers around digital transformation projects. While the pandemic accelerated this growth, we also saw a large number of requests coming from our customers to improve the security of their systems. More of our customers started to implement ransomware protection – and 256-bit encryption on their existing data – and to buy systems from us that could deliver that.”

Securing data against ransomware and hardware failures

With most employees working from home, enterprises have had to scale up their infrastructure rapidly to cater to the new normal. Meanwhile, cybercriminals have pounced on the opportunity and increased cyberattacks, targeting unsuspecting employees. Ransomware, in particular, has become a pernicious threat.

Mark Bentkower, Alliances Technical Director, APJ, Veeam Software, elaborated: “When we all went home, IT departments were faced with the issue overnight of dealing with 100% remote workers. That blew up everybody’s security models and we all engaged in a very interesting security practicum that maybe we didn’t want to. We were set up to have maybe 20% of our staff working remotely, and overnight it was 100%. And most of us didn’t have the infrastructure to be able to do that.”

“A lot of companies made the decision to say that it’s more important to get people working than it is to worry about the security – we’ll patch that up afterwards. The bad guys were ready to pounce and take advantage of that. We’re seeing this uptick in ransomware. Previously, the security model for many companies had been to keep everything inside the secure hard shell behind the perimeter, and let things be a little bit loose on the inside of the perimeter. With so many people now working remotely, that rigid corporate perimeter just doesn’t exist anymore in a lot of cases.”

Karnick advised that the first step to dealing with ransomware, especially in a remote working environment, is to have open conversations within the organisation and with vendors.

He said, “What’s important is to educate people at both the administrator and user ends on what threats they face – because most of the threats come in human form. Very few of the threats come from brute-force technology. Enabling your team, trusting the team members to do the right things, and having multi-factor authentication on almost every layer of data are critical. Adding in encryption across sites puts that extra layer of protection as well (especially in the cloud world).”
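Multi-factor authentication of the kind Karnick describes is most often implemented with time-based one-time passwords. As an illustration only (the panel did not name a specific mechanism), here is a minimal generator following RFCs 4226 and 6238, using only the Python standard library; the secret below is the published RFC test value, not a real credential:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): HOTP over a time counter."""
    return hotp(secret, int(time.time()) // interval, digits)

# The RFC 4226 test secret produces the documented first code.
print(hotp(b"12345678901234567890", 0))  # → 755224
```

An authenticator app and the server each compute `totp()` from a shared secret; a login succeeds only when the user's code matches, so a stolen password alone is not enough.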

Bentkower shared a tip on how to keep data safe: “If you follow the 3-2-1 rule, you should never lose your data to a black swan event, to ransomware, to a hardware failure, or to anything else.”

The idea is to always have three copies of data on at least two different pieces of media, one of which is off-site. “This can be achieved with a combination of hardware, where you’ve got hardware replicating from site to site, or a combination of hardware and cloud, or tape, which is an excellent way to have one copy offsite. That way if you get hit by ransomware, you still have a second copy of that data somewhere and you have an offsite copy,” Bentkower said.
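The 3-2-1 rule is simple enough to check mechanically. A minimal sketch of such a check (the `BackupCopy` record and media names are illustrative, not part of any vendor tool):

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    media: str       # e.g. "disk", "tape", "cloud"
    offsite: bool    # stored away from the primary site?

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    """Three copies, on at least two different media, at least one off-site."""
    return (
        len(copies) >= 3
        and len({c.media for c in copies}) >= 2
        and any(c.offsite for c in copies)
    )

copies = [
    BackupCopy("disk", offsite=False),   # primary copy
    BackupCopy("disk", offsite=False),   # on-site replica
    BackupCopy("tape", offsite=True),    # off-site tape, as Bentkower suggests
]
print(satisfies_3_2_1(copies))  # → True
```

Dropping the tape copy, or keeping all three copies on the same disk array, makes the check fail – which is exactly the exposure the rule is meant to prevent.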

Strategies for data management

The data deluge engulfing many enterprises has necessitated proper data management, without which the data in the vaults of the enterprise is rendered almost useless. With so many types of data – structured, unstructured, and different workloads on-premises, in the cloud, or in hybrid mode – how can enterprises turn the data into useful insights?

The key, said Karnick, is to focus on organising the data before deciding on the infrastructure needed to store it.

He explained that while both structured and unstructured data can provide useful insights to the business, companies keep a lot of ‘junk’ data – such as historical data that is difficult to integrate with existing systems. The first step is to structure, order, and organise it.

“Everyone’s ERP system is different, so how do we organise the data so that it is compatible with what you’re currently running? What are the future needs, future formats you’re going to be looking at?” Karnick said. “The next step would be where to store that data, and how to make it accessible to different applications.”

Bentkower suggested that one way to organise data is to look at its value over time. For a start, think about the value of a specific set of data to the company. This value might in fact change over time.

“Maybe if you’re a retail company and you’ve taken in an order – the value of the data for that order in the first 30 days, while the order is being fulfilled and being paid for, is probably higher than it is 90 days or 120 days out. Data is going to have to be mobile – able to move to a different tier of storage at a cost commensurate with the data’s value to the organisation throughout its life cycle,” said Bentkower.
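Bentkower's retail example maps naturally onto age-based tiering. A minimal sketch, with the 30- and 120-day thresholds taken from his example and the tier names purely illustrative:

```python
def storage_tier(age_days: int) -> str:
    """Map order data to a storage tier by age (thresholds are illustrative)."""
    if age_days <= 30:        # order still being fulfilled and paid for
        return "hot"          # fast, expensive primary storage
    if age_days <= 120:       # recent history, still queried occasionally
        return "warm"
    return "cold"             # archive tier, e.g. object storage or tape

print([storage_tier(d) for d in (10, 90, 365)])  # → ['hot', 'warm', 'cold']
```

Real deployments would add more signals than age (access frequency, compliance holds), but the principle is the same: storage cost tracks the data's current value.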

Chitrachar added that categorising and understanding an enterprise’s data can help balance cost, flexibility, and performance, especially if an organisation is able to identify exactly which data sets need to be real time. He asserted that on-demand or real-time data access is not necessary for every segment of business users.

The first step is to categorise the data or the user community. This is usually the front end, which needs on-demand access at the tip of the fingers. “When I look at the insurance industry, there are only a couple of areas that have to be real time – when you’re providing a quote to a customer, or when you have a claim and claim surveyors have to do a survey remotely. In those cases, the flow of data has to be almost real time. Once we have an understanding of those different kinds of users and their needs, I think it’s easier to design a solution. It’s not about IT designing a solution on their own, but working in conjunction with the business,” Chitrachar said.

Verticalising and adapting

Karnick explained that each industry and each company has its own set of challenges and requirements, and the approach he takes is to listen closely to what the company is trying to achieve.

“We look at three major factors – compliance, encryption, and age. Data Compliance: take healthcare as an example, where HIPAA compliance is important to protect patient data. Data storage systems need to be able to address these needs (which we do) in addition to the technical features.”

“Data Encryption: in the case of many governments, 256-bit encryption is necessary, and it is available across all of Lenovo’s storage and compute systems. Data Age: for companies that need data to be stored for longer periods – 7-8 years or more – the cost of electricity and cloud storage can be prohibitive. In this case, moving the data to tape storage would be advisable,” said Karnick.

Beyond that, telcos have a different way of working altogether because of their specific set of applications. “Like military-grade specs, they have telecom-grade specs. We have edge servers today that are no bigger than your laptop and are perfect for that industry. Then you have the automobile sector. They’re looking at getting a lot of computing down at the edge – in the car itself – and then transmitting the data, so that they don’t depend on the internet as much, and that’s where edge services and edge systems come into play,” he stated.

“Lenovo’s approach is to take the use case and adapt to it. We’re able to do multi-cloud today – in other words, move data from Amazon to Google to Microsoft, back onto premises, without you having to spend any money on any of the public cloud vendors, giving you true cloud mobility on the data that could be sitting even on bare metal servers,” he revealed.

The future of data storage

Because there are now so many formats and locations in which to store data, it is becoming increasingly challenging to find and retrieve what is relevant. And as the volume of data is set to grow exponentially, storage architecture needs to be designed to meet future needs.

Bentkower considers robust APIs critical for storage architecture. For storage systems to be future-ready, they need to interact with other software and with each other. Data movement cannot be left to administrators alone – it needs to be driven by policies that can be automatically implemented and audited, with alerts and reports. Being future-ready means being able to talk at an API level with the various hardware and software that facilitate the movement of data, allowing storage to act differently depending on where the data is in its life cycle.
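The policy-driven, auditable automation Bentkower describes can be sketched as a simple tiering pass: rules, not administrators, decide where data lives, and every decision is recorded. The rule thresholds, tier names, and object records below are illustrative, not any vendor's API:

```python
import datetime

# Age thresholds (days) mapped to target tiers – illustrative policy only.
RULES = [(30, "primary"), (365, "nearline"), (float("inf"), "archive")]

def plan_moves(objects, today):
    """objects: list of (obj_id, created_date, current_tier).
    Returns the tier moves to issue plus an audit log entry for each."""
    moves, audit = [], []
    for obj_id, created, tier in objects:
        age = (today - created).days
        target = next(t for max_age, t in RULES if age <= max_age)
        if target != tier:
            moves.append((obj_id, target))
            audit.append(f"move {obj_id}: {tier} -> {target} (age {age} days)")
    return moves, audit

today = datetime.date(2021, 6, 1)
objs = [("orders-q1", datetime.date(2021, 1, 4), "primary"),
        ("orders-may", datetime.date(2021, 5, 20), "primary")]
moves, audit = plan_moves(objs, today)
print(moves)  # → [('orders-q1', 'nearline')]
```

In a real system, the moves would be issued through the storage platform's management API and the audit entries shipped to a logging or SIEM tool, which is what makes the policy reviewable after the fact.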

“Something that is a primary piece of storage today might, in a few years from now, still be very valuable as a slower piece of storage further down the line. And if it’s got the ability to move data around and be part of that life cycle, it can last longer and be valuable longer,” he said.

Baidya predicted that the future of storage will centre around a hybrid multi-cloud environment.

He concluded: “Balancing cost and efficiency is going to be key in the planning of the data storage infrastructure. Businesses must use different infrastructure to meet the needs of a diverse set of applications, services, data, and of course, end users. For some of these applications and services, the stability and the performance of an on-premise environment might be the best, but in some other cases they might need the flexibility offered in the cloud. In my opinion, that’s eventually going to be a hybrid multi-cloud environment.”