
Running Your Own Servers? Take Regular Backups



Companies that run on-premises servers need to take the backing up of their data seriously. Regular backups protect a company's data and ensure that it can be recovered in the event of a disaster or other unexpected incident. Key things to keep in mind:


  • Schedule regular backups: Back up at fixed intervals, such as daily or weekly, so that no data sits unprotected for long; a simple cron-driven job (see the sketch after this list) is usually enough to automate this.

  • Store backups offsite: Keep copies offsite, for example in a secure cloud location, so that they survive a disaster or other event that affects the on-premises server.

  • Test backups regularly: Verify that backups complete successfully and that data can actually be restored from them; an untested backup is little more than a hope. (The sketch after this list includes a basic integrity check.)

  • Keep backups up to date: Make sure the most recent version of the company's data is always available for recovery; restoring from a stale backup can still mean losing weeks of work.

  • Use multiple backup methods: Combine methods, such as local and cloud backups, so that the data is protected in more than one location. This reduces the risk of losing everything in a single disaster.
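
To make the scheduling and testing points concrete, here is a minimal sketch of a nightly backup job in Python, intended to be run from cron. The paths, the retention period, and the post-backup integrity check are illustrative assumptions, not a prescription:

    #!/usr/bin/env python3
    # Nightly full backup sketch, intended to be run from cron, e.g.
    #   0 2 * * * /usr/local/bin/nightly_backup.py
    # Paths and the retention period are illustrative assumptions.
    import tarfile
    import time
    from pathlib import Path

    DATA_DIR = Path("/srv/app/data")   # assumed application data directory
    BACKUP_DIR = Path("/backups")      # assumed backup destination
    RETENTION_DAYS = 30                # assumed retention policy

    def run_backup() -> Path:
        """Write a timestamped, compressed full backup of DATA_DIR."""
        BACKUP_DIR.mkdir(parents=True, exist_ok=True)
        stamp = time.strftime("%Y%m%d-%H%M%S")
        archive = BACKUP_DIR / f"data-{stamp}.tar.gz"
        with tarfile.open(archive, "w:gz") as tar:
            tar.add(str(DATA_DIR), arcname=DATA_DIR.name)
        return archive

    def verify_backup(archive: Path) -> int:
        """Basic integrity check: re-read the archive and count its members."""
        with tarfile.open(archive, "r:gz") as tar:
            return len(tar.getmembers())

    def prune_old_backups() -> None:
        """Delete archives older than the retention period."""
        cutoff = time.time() - RETENTION_DAYS * 86400
        for old in BACKUP_DIR.glob("data-*.tar.gz"):
            if old.stat().st_mtime < cutoff:
                old.unlink()

    if __name__ == "__main__":
        archive = run_backup()
        print(f"{archive}: {verify_backup(archive)} files archived")
        prune_old_backups()

Re-reading the archive only catches truncated or corrupt files; a periodic restore into a scratch directory is still the real test.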

The risks of not backing up:

  • Viruses can destroy important files

  • Ransomware can encrypt entire servers and PCs

  • Incorrectly applied updates can corrupt the OS

  • The server can crash

  • Power cuts may lead to data corruption

  • User mistakes may delete important files

  • Natural disasters may affect your data centre operations


Types of backups:

  • Regular Full Backup

    • All the data/documents are backed up.

    • This backup is typically run every day, or sometimes every 2-3 days.

    • Quicker to restore.

    • Takes up a large amount of disk space.

    • Can lead to loss of data generated between backup runs (with daily backups, up to a full day's work).

  • Incremental Backup

    • A full backup of all the data/documents is taken once a week or once a month.

    • Then, every day, only the files that have changed are backed up.

    • The frequency of the incremental backups can be set to multiple times a day, thus reducing the gap between backups.

    • Takes up less disk space than regular full backups.

    • Slower to restore, as the last full backup and every subsequent increment must be combined to reconstruct the exact state of the files. (The hard-link snapshot sketch after this list is one way around this.)

  • Continuous Backup

    • All files are replicated as soon as they change.

    • This is the approach typically followed by cloud storage services such as Amazon S3.

    • Replication typically happens across multiple geographical locations.

    • Hard to achieve in on-premises setups; the file-watcher sketch after this list shows the basic idea.

    • Little to no data loss, as every change is backed up as soon as it is made.

    • Very quick to restore - with no noticeable time lag.
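
To make the incremental approach concrete, here is a minimal Python sketch that drives rsync with its --link-dest option. Unchanged files are hard-linked to the previous snapshot, so each run stores only what changed, yet every snapshot directory presents a complete copy that restores as quickly as a full backup. The paths are illustrative assumptions:

    # Hard-link snapshot sketch: incremental storage cost, full-backup
    # restore speed. Assumes rsync is installed; paths are illustrative.
    import subprocess
    import time
    from pathlib import Path

    SOURCE = Path("/srv/app/data")          # assumed application data directory
    SNAPSHOTS = Path("/backups/snapshots")  # assumed snapshot destination

    def take_snapshot() -> Path:
        SNAPSHOTS.mkdir(parents=True, exist_ok=True)
        previous = sorted(SNAPSHOTS.iterdir())  # timestamped names sort oldest-first
        target = SNAPSHOTS / time.strftime("%Y%m%d-%H%M%S")
        cmd = ["rsync", "-a", "--delete"]
        if previous:
            # Hard-link files that are unchanged since the latest snapshot.
            cmd.append(f"--link-dest={previous[-1]}")
        cmd += [f"{SOURCE}/", str(target)]
        subprocess.run(cmd, check=True)
        return target

    if __name__ == "__main__":
        print(f"Snapshot created at {take_snapshot()}")

Run from cron several times a day, this narrows the gap between backups, while each snapshot remains a plain directory tree that can be copied straight back.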
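
And for a rough feel of continuous backup, the sketch below watches a directory with the third-party watchdog package (pip install watchdog) and copies each changed file to a mirror as soon as it is written. The paths are illustrative assumptions, and a real continuous backup service (such as S3 replication) also handles deletions, conflicts, and copies across geographical locations:

    # File-watcher sketch: replicate each file to a mirror directory the
    # moment it changes. Paths are illustrative assumptions.
    import shutil
    import time
    from pathlib import Path

    from watchdog.events import FileSystemEventHandler
    from watchdog.observers import Observer

    SOURCE = Path("/srv/app/data")     # assumed watched directory
    MIRROR = Path("/backups/mirror")   # assumed mirror destination

    class Replicator(FileSystemEventHandler):
        def on_modified(self, event):
            if event.is_directory:
                return
            src = Path(event.src_path)
            dest = MIRROR / src.relative_to(SOURCE)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # replicate the changed file immediately

        on_created = on_modified     # treat new files the same way

    if __name__ == "__main__":
        observer = Observer()
        observer.schedule(Replicator(), str(SOURCE), recursive=True)
        observer.start()
        try:
            while True:
                time.sleep(1)
        except KeyboardInterrupt:
            observer.stop()
        observer.join()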


Location of backups:

  • Same disk on the same server as the main application

    • Very risky and absolutely not recommended.

    • Cheapest location for backups.

    • Speed of backups is quickest.

    • But if the hard disk crashes, or if a virus infects the server, both the application and its backups are lost.

    • No protection from malware, OS corruptions, disk corruptions or natural disasters.

  • External hard disk or RAID or NAS attached to the main application server

    • Slightly better option - if the main hard disk crashes, you have a backup on a separate disk.

    • Does not protect from virus and ransomware attacks - because the disk is permanently connected to the main server, malware can spread to the backup disk as well.

    • Speed of backups is still fast.

    • Can protect from data loss in case of OS corruption or main disk corruption.

    • No protection from malware, power cuts, or natural disasters.

  • Separate FTP Server on a separate network

    • Separate server in the same data centre

    • But not connected to the rest of the servers via shared drives / Samba / NAS etc.

    • Only accessible via FTP/SCP from main servers.

    • Reasonably good option: if anything happens to the main server, the data remains safe on the FTP server. (See the SFTP push sketch after this list.)

    • Speed of backups is slightly slower.

    • Can protect in case of OS corruption, main disk corruption, and malware. Note that the malware protection holds only if the FTP server is properly isolated from the main server's network and hardened for security.

    • No protection from power cuts or natural disasters.

  • Separate physical location in a data centre in another city or state.

    • Cloud services like AWS can also be used for this.

    • By far the safest backup location.

    • Data is copied using safe protocols and tools such as rsync, scp, or the AWS API. (See the S3 upload sketch after this list.)

    • Speed of backups is slightly slower.

    • Can protect in case of OS corruption, main disk corruption, malware, power cuts and natural disasters.

    • Can be the most expensive option.
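
For the separate-server option, a push over SFTP is the typical mechanism. A minimal Python sketch using the third-party paramiko library (pip install paramiko) might look like this; the hostname, user, key file, and remote directory are all illustrative assumptions:

    # SFTP push sketch: copy a backup archive to a separate, isolated
    # backup server. Host, user, key and paths are illustrative assumptions.
    from pathlib import Path

    import paramiko

    BACKUP_HOST = "backup01.example.com"     # assumed backup server
    BACKUP_USER = "backup"                   # assumed upload-only account
    KEY_FILE = "/root/.ssh/backup_ed25519"   # assumed SSH private key
    REMOTE_DIR = "/backups/incoming"         # assumed remote directory

    def push_backup(archive: Path) -> None:
        client = paramiko.SSHClient()
        # For a real deployment, pin the server's host key rather than
        # auto-accepting whatever is presented.
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(BACKUP_HOST, username=BACKUP_USER, key_filename=KEY_FILE)
        try:
            sftp = client.open_sftp()
            sftp.put(str(archive), f"{REMOTE_DIR}/{archive.name}")
            sftp.close()
        finally:
            client.close()

    if __name__ == "__main__":
        push_backup(Path("/backups/data-20240101-020000.tar.gz"))

The ransomware protection comes from the isolation: the backup server accepts uploads but never mounts shares from the main network, and ideally the upload account cannot delete or overwrite existing backups.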
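
For the offsite option, a common pattern is to copy each archive to object storage such as Amazon S3 via the AWS API. A minimal sketch with the boto3 library (pip install boto3) follows; the bucket name and key prefix are illustrative assumptions, and credentials come from the standard AWS configuration:

    # S3 upload sketch: copy a backup archive offsite. Bucket and prefix
    # are illustrative assumptions; credentials come from the environment,
    # ~/.aws, or an attached IAM role.
    from pathlib import Path

    import boto3

    BUCKET = "example-company-backups"   # assumed bucket in another region
    PREFIX = "app-server-01"             # assumed per-server key prefix

    def upload_offsite(archive: Path) -> None:
        s3 = boto3.client("s3")
        s3.upload_file(str(archive), BUCKET, f"{PREFIX}/{archive.name}")

    if __name__ == "__main__":
        upload_offsite(Path("/backups/data-20240101-020000.tar.gz"))

Turning on S3 versioning for the bucket adds a further safety net: even if malware on the main server overwrites an upload, the earlier versions remain recoverable.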
