The data centre market in Singapore is growing rapidly and is expected to reach US$5.7 billion by 2027, according to a report by Arizton. With over 70 operational data centres in Singapore and counting, more companies will move to cloud systems to gain a competitive edge. It is crucial that businesses understand that transferring data to the cloud does not absolve them of the responsibility to safeguard that data in the event of a cyberattack.
To what extent should companies be accountable, and what are the obligations of service providers? How can organisations be prepared in the face of unexpected data disruptions?
Local data services offer businesses significant opportunities to improve operations through cost savings, faster data transmission, and better application accessibility. Where data is stored is also crucial: proper data storage allows companies in Singapore to satisfy internal policies and external regulations under Singapore's Personal Data Protection Act (PDPA).
Instead of purchasing and maintaining their own servers for data storage, companies can leverage ready-to-use services. In 2021, Singapore launched a new common data infrastructure – the Singapore Trade Data Exchange (SGTraDex) – to support data sharing across supply chain ecosystems, both locally and globally, through a public-private partnership involving maritime stakeholders including PSA International and Jurong Port, amongst others. Yet even though local data centre infrastructure is widely available in Singapore, companies using the cloud must still operate under the “shared responsibility model.”
A lack of data protection opens the door to security risks
Many companies assume that using the cloud exempts them from the responsibility of backing up and safeguarding their data. This is a common misconception that exposes businesses to significant risks, such as irreversible data loss or losing access to their data altogether.
Basic cloud tools typically retain data for only a few months, while numerous legal regulations require data to be stored for several years. Consequently, recovering data that has not been properly archived becomes a challenging task. The unavailability or loss of data can result in substantial financial losses for businesses, including downtime costs and customer attrition.
Given the rapid pace at which data is created and modified, it must be protected systematically, with backups taken as often as every few minutes. Companies should prioritise data availability by implementing reliable backup policies tailored to their needs to support business continuity.
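As a rough illustration of what "tailored to their needs" can mean in practice, the sketch below checks a backup policy's interval and retention against a tolerable data-loss window and a regulatory retention requirement. The `BackupPolicy` fields, thresholds, and `check_policy` helper are illustrative assumptions, not any vendor's API:

```python
from dataclasses import dataclass
from datetime import timedelta

@dataclass
class BackupPolicy:
    interval: timedelta   # how often backups are taken
    retention: timedelta  # how long each backup is kept

def check_policy(policy: BackupPolicy,
                 max_acceptable_loss: timedelta,
                 required_retention: timedelta) -> list[str]:
    """Flag gaps between the policy and business/regulatory requirements."""
    issues = []
    # Worst case, everything written since the last backup is lost, so the
    # backup interval is effectively the recovery point objective (RPO).
    if policy.interval > max_acceptable_loss:
        issues.append(f"Backup interval {policy.interval} exceeds the "
                      f"acceptable data-loss window of {max_acceptable_loss}.")
    # Default cloud retention (often a few months) can fall short of
    # regulations that require data to be kept for years.
    if policy.retention < required_retention:
        issues.append(f"Retention {policy.retention} is shorter than the "
                      f"required {required_retention}.")
    return issues

# Example: 5-minute backups retained for 90 days, against a requirement to
# lose no more than 15 minutes of data and keep records for 5 years.
policy = BackupPolicy(interval=timedelta(minutes=5),
                      retention=timedelta(days=90))
for issue in check_policy(policy,
                          max_acceptable_loss=timedelta(minutes=15),
                          required_retention=timedelta(days=5 * 365)):
    print(issue)
```

In practice, the thresholds come from the organisation's recovery objectives and the regulations it operates under; the point is that both backup frequency and retention need to be deliberate choices, not provider defaults.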
Who is responsible for data handling?
Despite the critical impact data loss can have on businesses, many organisations that use cloud services and no longer store data in their own physical infrastructure remain unaware of the irreplaceable role they play in data management and security.
Cloud service providers offer service-level agreements that guarantee high availability of applications, server operations, and infrastructure, often with a 99.99% uptime guarantee. However, this assurance does not extend to the protection of customer data. Instead, data protection falls under the “shared responsibility model,” a framework that delineates the provider's responsibility for maintaining the security and availability of the service, and the customer's responsibility for using the service securely.
With basic services, companies face restrictions and are typically limited to adding or removing users, which significantly reduces the responsibility the user bears. Advanced services, such as the creation and management of virtual machines, place complete responsibility for data security on the data owner. As the level of user flexibility increases, so does the level of responsibility the user bears.
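The split is often summarised by service tier. As a loose illustration (the tier names and category labels below are simplified assumptions, and the exact split varies by provider and contract), note that the customer remains responsible for the data itself at every tier:

```python
# Illustrative responsibility split under the shared responsibility model.
# Tier names and categories are simplified; the exact split varies by
# provider and contract.
RESPONSIBILITY = {
    "SaaS (basic services)": {
        "provider": ["physical infrastructure", "platform", "application"],
        "customer": ["user access", "data"],
    },
    "PaaS": {
        "provider": ["physical infrastructure", "platform"],
        "customer": ["application configuration", "user access", "data"],
    },
    "IaaS (e.g. self-managed virtual machines)": {
        "provider": ["physical infrastructure"],
        "customer": ["operating system", "application", "user access", "data"],
    },
}

for tier, split in RESPONSIBILITY.items():
    print(f"{tier}: customer secures {', '.join(split['customer'])}")
```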
Anticipating critical roadblocks
An organisation that lacks a robust, tested backup strategy risks losing control of and access to its data. To avoid this, companies must conduct regular workload backups and implement a well-managed, modern data protection strategy in their cloud. To strengthen their defences further and prepare adequately for such incidents, companies must proactively consider worst-case scenarios, primarily by running simulations of potential outages or cyberattacks.
Another important element in building ransomware resilience is the 3-2-1-1-0 backup rule. This ensures that companies keep at least three copies of critical data on two different media, with at least one copy kept off-site and another maintained as an immutable offline, or air-gapped, copy. The final “0” stands for zero errors: testing the data recovery procedure is equally important to guarantee error-free backups and execution according to plan.
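A minimal sketch of what checking that rule might look like, assuming a simple inventory of backup copies (the `BackupCopy` fields and the `satisfies_3_2_1_1_0` helper are invented for illustration, not a real product's API):

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    medium: str      # e.g. "disk", "tape", "object-storage"
    offsite: bool    # stored away from the primary site?
    immutable: bool  # offline / air-gapped / write-once?
    verified: bool   # has a test restore succeeded without errors?

def satisfies_3_2_1_1_0(copies: list[BackupCopy]) -> bool:
    """Check a set of backup copies against the 3-2-1-1-0 rule."""
    return (
        len(copies) >= 3                          # 3 copies of the data
        and len({c.medium for c in copies}) >= 2  # on 2 different media
        and any(c.offsite for c in copies)        # 1 copy off-site
        and any(c.immutable for c in copies)      # 1 immutable/air-gapped copy
        and all(c.verified for c in copies)       # 0 errors after recovery testing
    )

copies = [
    BackupCopy("disk", offsite=False, immutable=False, verified=True),
    BackupCopy("object-storage", offsite=True, immutable=False, verified=True),
    BackupCopy("tape", offsite=True, immutable=True, verified=True),
]
print(satisfies_3_2_1_1_0(copies))  # True
```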
To facilitate the recovery process, backups should be accompanied by snapshots and/or replication. A snapshot captures the system's state at a specific point in time, making it possible to identify the resources that need to be recovered. Replication, on the other hand, systematically copies information between different servers. Combining these measures ensures that data remains available should an incident occur, while increasing the accuracy and speed of data recovery.
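To make the distinction concrete, here is a toy sketch using an in-memory key-value store invented purely for illustration: replication copies every change to other servers as it happens, while a snapshot preserves a point-in-time state that can be restored after an incident such as a ransomware attack.

```python
import copy
from datetime import datetime

class DataStore:
    """A toy key-value store illustrating snapshots and replication."""

    def __init__(self) -> None:
        self.data: dict[str, str] = {}
        self.snapshots: dict[datetime, dict[str, str]] = {}
        self.replicas: list["DataStore"] = []

    def write(self, key: str, value: str) -> None:
        self.data[key] = value
        # Replication: every change is copied to the replica servers.
        for replica in self.replicas:
            replica.data[key] = value

    def snapshot(self) -> datetime:
        # Snapshot: capture the system's state at a point in time.
        taken_at = datetime.now()
        self.snapshots[taken_at] = copy.deepcopy(self.data)
        return taken_at

    def restore(self, taken_at: datetime) -> None:
        # Roll back to a known-good point after an incident.
        self.data = copy.deepcopy(self.snapshots[taken_at])

primary = DataStore()
primary.replicas.append(DataStore())

primary.write("invoice-1", "paid")
good = primary.snapshot()
primary.write("invoice-1", "encrypted-by-ransomware")
primary.restore(good)
print(primary.data)  # {'invoice-1': 'paid'}
```

Note the complementary roles: the replica keeps data available elsewhere if the primary server fails, while the snapshot is what lets the system roll back to a known-good state.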
Data breaches and cyberattacks are inevitable. The most effective strategy for protecting a business against such incidents is to develop and maintain a solid data recovery plan. The goal is to minimise losses to the business and strengthen its data resilience in the process.