Backup/Restore for large data (hundreds of TB)

My prospect wants to back up a large genomic dataset (about 500 TB), mainly to protect against operational mistakes (accidental deletes, bad loads, and the like).
The data is not updated frequently. Is the usual approach to back it up to shared volumes with vbr.py? What would you recommend at this scale?
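
For context, here is roughly the setup I had in mind, assuming this is Vertica's vbr.py utility backing up to NFS-mounted shared volumes. This is only a minimal sketch; the database name, snapshot name, hosts, and paths below are placeholders, not from any real configuration:

    [Misc]
    snapshotName = genomic_backup
    restorePointLimit = 3
    tempDir = /tmp/vbr

    [Database]
    dbName = genomicsdb
    dbUser = dbadmin

    [Mapping]
    ; one entry per node, each pointing at a shared backup volume
    v_genomicsdb_node0001 = backuphost01:/mnt/shared_backup/node0001
    v_genomicsdb_node0002 = backuphost02:/mnt/shared_backup/node0002

The backup itself would then be run with something like `vbr.py --task backup --config-file genomic_backup.ini`. My understanding is that, because the data changes infrequently, only the first run transfers the full 500 TB; later runs to the same backup location should only copy new or changed storage containers. Please correct me if that approach does not hold at this scale.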
