About archives for small studios

Hi folks! I have a hardware question: how do you handle archives for small studios?
I’ve been thinking about the issue and I wonder if the following simple setup is good enough.

It consists of 3 copies of any archived project:

Copy 1 is on a small NAS connected to the network in read-only mode (but we can still access it easily if needed).
The other two, copies 2 and 3, are on disconnected hard drives; one of them is off-site.
If possible, avoid hard drives from the same series (for example, by using two different brands).

Copy 1 gets an automated integrity check (run by daemons) every week or so; copies 2 and 3 every three to six months.
For all of them, the check covers both the hard drive itself (a full SMART test) and the data. If a drive fails, it is immediately replaced.
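For the drive side, here is a minimal sketch of what such a daemon could run, assuming smartmontools is installed and the script runs as root; the device paths are placeholders for your own drives:

```python
#!/usr/bin/env python3
"""Sketch of a periodic drive check using smartctl (smartmontools)."""
import subprocess

# Hypothetical device list: adjust to your NAS / archive drives.
DEVICES = ["/dev/sda", "/dev/sdb"]

def health_ok(device: str) -> bool:
    # "smartctl -H" prints the drive's overall health self-assessment;
    # ATA drives report "PASSED" when healthy.
    result = subprocess.run(
        ["smartctl", "-H", device],
        capture_output=True, text=True,
    )
    return "PASSED" in result.stdout

def start_long_test(device: str) -> None:
    # Kicks off an extended (full-surface) SMART self-test. The test runs
    # in the drive's own firmware; results are read later from the
    # self-test log ("smartctl -l selftest").
    subprocess.run(["smartctl", "-t", "long", device], check=True)

if __name__ == "__main__":
    for dev in DEVICES:
        if not health_ok(dev):
            print(f"WARNING: {dev} failed SMART health check, replace it!")
        start_long_test(dev)
```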

I’m considering writing a small script that saves the metadata of every file (size, cdate, mdate, path, checksum) in JSON files (and also produces a full report log), so that when running integrity checks you can detect corrupted data by comparing against the JSON.
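For what it’s worth, a minimal sketch of that script, assuming SHA-256 as the checksum and a create/verify command pair (the names and file layout are just my guesses at this point):

```python
#!/usr/bin/env python3
"""Record size, cdate, mdate, path and checksum for every file in a JSON
manifest, then verify files against it during later integrity checks."""
import hashlib
import json
import os
import sys

def sha256(path, bufsize=1 << 20):
    # Stream the file in chunks so large archive files don't fill RAM.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root):
    manifest = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            manifest[os.path.relpath(path, root)] = {
                "size": st.st_size,
                # Note: on Linux, st_ctime is the inode change time,
                # not a true creation date.
                "cdate": st.st_ctime,
                "mdate": st.st_mtime,
                "checksum": sha256(path),
            }
    return manifest

def verify(root, manifest):
    # Report every file that is missing or whose checksum changed.
    for relpath, meta in manifest.items():
        path = os.path.join(root, relpath)
        if not os.path.exists(path):
            print(f"MISSING  {relpath}")
        elif sha256(path) != meta["checksum"]:
            print(f"CORRUPT  {relpath}")

if __name__ == "__main__":
    # Usage: manifest.py create|verify /path/to/archive manifest.json
    mode, root, out = sys.argv[1:4]
    if mode == "create":
        with open(out, "w") as f:
            json.dump(build_manifest(root), f, indent=2)
    else:
        with open(out) as f:
            verify(root, json.load(f))
```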

What do you think about that setup?
Any experience to share?


One approach that has only recently become possible is to utilise the “unlimited space” offerings from some providers, such as a Team account on Google Drive. I know of one studio that utilises this for backups of simply everything. Because there is no space restriction, little thought has to go into optimising what actually gets uploaded.

I think that could possibly substitute for your copies 2 and 3, but having an optimised copy is equally if not more important if you want to actually make use of it on a regular basis. It’s no fun integrity-checking or recovering hundreds of terabytes when all you need is one project.


@marcus’s solution is interesting. One other thing you can do is set up replication to another site and enable RAID on your main hard drives.

Hi @marcus, thanks for the reply.

I thought about online archives, like Amazon Glacier.
In France we have OVH, a big provider with some pretty interesting services.
I calculated the cost of 4 TB online at roughly 100 euros per year, with one upload and one download.

The detailed price is €0.01/GB for transfer (the initial upload, and the download when you need to recover the data),
and then €0.002/GB/month for storage.
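Spelling out the math (taking 4 TB ≈ 4000 GB): storage comes to 4000 × €0.002 = €8/month, so about €96/year, and one upload plus one download add 4000 × €0.01 × 2 = €80 one time. Spread over a few years of retention, that works out to roughly the 100 euros per year above.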
It supports plenty of protocols (sftp, scp, rsync, https…).

At least it’s a pretty secure option (they guarantee 100% file consistency).
But I wouldn’t go with the Google Drive option. It’s useful for a lot of situations, but archiving a project with hundreds of thousands of files and terabytes of data? Did it work well for upload and download?

Yes, I could also go with two copies:
copy one on a RAID 1 setup
copy two: offline and off-site

Less work to handle, I guess?
