
Other Bit of assistance with data protection regime please

Discussion in 'Software' started by Mister_Tad, 30 Sep 2014.

  1. Mister_Tad

    Mister_Tad Will work for nuts Super Moderator

    Joined:
    27 Dec 2002
    Posts:
    14,085
    Likes Received:
    2,451
    I used to be right on top of data protection, with RAID, replication and a series of rotating external disks, but I seem to have become a bit lazy and have swapped some kit out, leaving me feeling fairly exposed (though I have far less data to protect these days). Amusingly, I do data management consultancy for a living, but alas I don't have the budget to implement the sort of thing I advise on, and have little knowledge of the more consumer-oriented stuff.

    So what I have in terms of kit is...

    W2012R2 server, 4TB & 2TB storage drives
    2TB single disk NAS - fairly basic but does have some copy services and allows root CLI access
    1TB and 500GB external disks.

    - ~100GB of data I simply cannot lose
    - ~600GB of data which would present a serious inconvenience if I were to lose it
    - ~500GB of data which I'd be mildly miffed to lose, but wouldn't lose sleep over
    - 1x W7 Ultimate desktop and 1x W8.1 Pro laptop which I don't fancy ever needing a fresh install on, for whatever reason
    - A mild desire to rip ~500 DVDs at some point in the future, and only once

    I'm thinking of using StableBit DrivePool for a no-RAID solution to protect the principal data against drive failure in the server, leaving the remainder of the 4TB disk's capacity for backups of the desktop/laptop.

    A few things that I'm not quite sure the best way to approach:

    I'd like something like VSS on the server for certain data, but at a more granular level than the entire volume, so that bulk changes to large files I don't care much about don't wipe out the snapshot history for things I care about a lot more.

    I'd like to have the ability to perform de-duped incremental weekly-ish backups to the NAS for the 700GB of "important" data to get the longest retention possible out of the available capacity.

    I'd like to make the process of dumping the "very important" data to the external disks fairly simple and automated. The ideal would be: the server mails me saying "Oi, I've not done a dump to disk for a while", I plug a disk into the server, the server knows what it is and dumps the appropriate data, then mails me when it's done; I take the disk to an offsite location, swap it for the other, and the whole thing starts again.
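    A rough sketch of that workflow in Python, purely illustrative - the marker-file scheme, set names and 14-day threshold are all invented, not anything that exists:

```python
# Hypothetical sketch of the "plug a disk in and it knows what to do" idea.
# Each rotation disk carries a small marker file naming its copy set; a
# scheduled task could call dump_is_stale() nightly and send the nag mail.
# Everything here (marker scheme, names, 14-day threshold) is invented.
import json
import shutil
import time
from pathlib import Path

STALE_AFTER_DAYS = 14  # assumed rotation interval

def dump_if_disk_present(disk_root: Path, source: Path) -> bool:
    """If a known rotation disk is mounted at disk_root, dump source onto it."""
    marker = disk_root / "backup_marker.json"
    if not marker.exists():
        return False                      # not one of our disks
    meta = json.loads(marker.read_text())
    dest = disk_root / meta["set_name"]
    if dest.exists():
        shutil.rmtree(dest)               # crude full refresh; real tools do incrementals
    shutil.copytree(source, dest)
    meta["last_dump"] = time.time()       # record when this disk last took a dump
    marker.write_text(json.dumps(meta))
    return True

def dump_is_stale(disk_roots, now=None) -> bool:
    """True if no rotation disk has taken a dump recently - time for the nag mail."""
    now = now or time.time()
    latest = 0.0
    for root in disk_roots:
        marker = Path(root) / "backup_marker.json"
        if marker.exists():
            latest = max(latest, json.loads(marker.read_text()).get("last_dump", 0.0))
    return (now - latest) > STALE_AFTER_DAYS * 86400
```

    The mailing and disk-detection plumbing would be Windows-specific, but the marker-file trick is what lets the server know which disk was plugged in regardless of the drive letter it lands on.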

    I don't think I need to buy any more hardware (I have ~10TB of disk for just over 1TB of data after all) but don't mind some mild expenditure on software provided that it ticks the boxes.

    Despite living in suburbia and all of the neighbouring streets having access to Fibre/Cable, I'm still stuck in the 90s for some reason with ~3Mb/600kb internet, so online storage really isn't much of an option at this point.

    Any ideas would be greatly appreciated.
     
    Last edited: 30 Sep 2014
  2. Mister_Tad

    Mister_Tad Will work for nuts Super Moderator

    No recommendations at all? :sigh:
     
  3. jinq-sea

    jinq-sea 'write that down in your copy book' Super Moderator

    Joined:
    15 Oct 2012
    Posts:
    8,823
    Likes Received:
    721
    I face a similar dilemma, which I am giving some thought to at the moment. I'll post back here when I have had time to 'digest' it a little more.
     
  4. Votick

    Votick My CPU's hot but my core runs cold.

    Joined:
    21 May 2009
    Posts:
    2,321
    Likes Received:
    109
    Why StableBit?
    You do know you have Storage Spaces in 2012.

    I run the following:
    Win Server 2012:
    OS: 128GB SSD (backed up to the 3TB backup drive)
    Data: 6x 2TB HDDs in a parity Storage Space, making (if I remember right) a 9TB redundant virtual disk, formatted with ReFS, plus VSS for quick file restores.
    Backup: 3TB HDD used only for OS disk backups with the built-in Windows Backup. I've no need to back up the data drives, as a) they're backed up per below, and b) should I have to replace my OS disk I can restore or do a fresh install, re-mount the storage space, and boom, it's all back.

    I then back up only the data I can't afford to lose, to a 1-bay NAS drive on the network and to online storage.
     
  5. Margo Baggins

    Margo Baggins I'm good at Soldering Super Moderator

    Joined:
    28 May 2010
    Posts:
    5,649
    Likes Received:
    268
    I would echo Votick on storage spaces.

    For the backup to external disk - can't you do something simple like a robocopy script? You could make it a scheduled task, and just make sure you plug the disk in when it's expecting it - not as plug-and-play as your solution, but entirely free.
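    The mirror idea, sketched in Python so you can see what it's doing (a single robocopy /MIR line in a batch file does this natively on Windows; paths here are made up):

```python
# A minimal Python mirror, equivalent in spirit to "robocopy /MIR" run from
# a scheduled task: copy new/changed files across, and delete anything at
# the destination that no longer exists at the source. Paths are placeholders.
import filecmp
import shutil
from pathlib import Path

def mirror(src: Path, dst: Path) -> None:
    """Make dst an exact copy of src."""
    dst.mkdir(parents=True, exist_ok=True)
    src_names = {p.name for p in src.iterdir()}
    for p in src.iterdir():
        target = dst / p.name
        if p.is_dir():
            mirror(p, target)             # recurse into subdirectories
        elif not target.exists() or not filecmp.cmp(p, target, shallow=True):
            shutil.copy2(p, target)       # copy new or changed files, keep timestamps
    for p in dst.iterdir():               # purge strays, as /MIR would
        if p.name not in src_names:
            shutil.rmtree(p) if p.is_dir() else p.unlink()
```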
     
  6. Mister_Tad

    Mister_Tad Will work for nuts Super Moderator

    I wasn't keen on Storage Spaces. There are a few nice-to-haves from the likes of DrivePool (e.g. SSD landing zone, folder-based duplication policies), but what put me off Storage Spaces the most was its closed nature - i.e. if something pops in the server with DrivePool I can just pull the drives out and read the files from any Windows PC; not so with Storage Spaces.

    The potential speed issue also put me off storage spaces, it's not exactly a grunty server I'm running (J1900, 8GB) and DrivePool seems to have far less an overhead.

    I'm willing to be convinced, and free is nice, but DrivePool is only £12.50, so there's not really a lot in it.

    For the backups, this is basically what I used to do, and it's fine for a while, until it isn't. From experience I know that a) I'll forget to do it for longer and longer intervals, and b) a year down the line I'll decide I want to change something and have to work out what I did in the first place before I can tweak it.

    I'm quite happy to pay a sensible amount for software that's going to take the reins and sort these things out, and mean that I don't have to consciously keep on top of it.
     
  7. Votick

    Votick My CPU's hot but my core runs cold.

    Well, I used to use Drive Bender and StableBit and they never balanced properly.
    Performance-wise Storage Spaces is fast, and in some cases faster than NFS.

    As for not being able to read the data on the drives - I can't remember if you can or not, as I've never even bothered to try.
    I have, however, been through 3 re-installs, due to it not liking the move from HDD to SSD for the OS, and from Server 2012 R2 Essentials back down to Server 2012 as Office 365 integration wasn't working.
    Each time: pop open PowerShell, mount and claim ownership of the pool, and 60 seconds later it's all re-mounted and the shares are online.

    Only downside is you need to know how it works before you add drives.
    Drive quantity matters.
    Example:

    I had 3x 2TB drives and couldn't add in a single additional drive, because I created the pool with 3 disks - only with another 3 disks could I add in more storage. If you created the pool with 5 disks, you again need another 5 connected to expand the pool.
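    Taking that rule at face value (I haven't checked it against the Storage Spaces docs), the arithmetic is simply:

```python
# Illustrating the expansion rule described above: a pool created with N
# disks can only grow N disks at a time, so this is the shortfall to buy.
def disks_needed_to_expand(column_count: int, spare_disks: int) -> int:
    """Disks still needed before the pool can be expanded at all."""
    return max(0, column_count - spare_disks)
```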
     
    Mister_Tad likes this.
  8. Mister_Tad

    Mister_Tad Will work for nuts Super Moderator

    You can re-add pools to a 2012R2 system with no problems at all, but it has to be 2012R2 - i.e. I wouldn't be able to rip out a drive and read the files in 7/8.1.

    There's a re-balancing plugin for DrivePool I've seen, but it's not something that's relevant to me. The principal data will be duplicated across two disks; the backup data from the PCs will be on one disk (outside the remit of DrivePool).

    Your last comment probably highlights best that Storage Spaces isn't for me. I have two drives, one "main" and one "copy", and may add another in future, may replace the copy disk with a larger one, may or may not want to duplicate my DVD rips if I ever get around to that, and may want to do something else entirely. Flexibility, basically.

    If I had way more data, and as such way more disks, then I'd likely be jumping straight on Storage Spaces not least for the efficiency of parity pools.

    I am going to do a couple of tests and see if/how de-dupe and DrivePool work together. I suspect this would mean I'd be back to only being able to read the drives under 2012R2, but that might be a trade-off I'm willing to make if I get decent savings - however, the majority of the content is compressed media, so I'm not expecting the earth.

    So my perusing has brought me to Bvckup for this task - https://bvckup2.com/

    Not strictly backup, but an incremental copy engine. The key difference between this and the multitude of free file-syncing tools is that you can assign copy sets to devices (as opposed to drive letters) and set them to real time, so when a known drive is plugged in it says "Hai! I know you!" and updates from the last copy set - pretty slick, or so it appears. Going to see how the trial goes. $20 if it does what I think it does.
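    The gist of the incremental side as I understand it, sketched in Python - this is just the "copy only what's new or changed" concept, not Bvckup's actual engine or API, and the device-binding part is out of scope:

```python
# The "update from the last copy set" idea in miniature: only files that
# are new, or whose size/mtime changed since the previous run, get copied.
import shutil
from pathlib import Path

def incremental_update(src: Path, dst: Path) -> list:
    """Copy changed/new files from src to dst; return relative paths copied."""
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for p in sorted(src.rglob("*")):      # parents sort before their children
        target = dst / p.relative_to(src)
        if p.is_dir():
            target.mkdir(exist_ok=True)
            continue
        st = p.stat()
        tt = target.stat() if target.exists() else None
        if tt is None or st.st_size != tt.st_size or st.st_mtime > tt.st_mtime:
            shutil.copy2(p, target)       # copy2 preserves mtime, so reruns skip it
            copied.append(str(p.relative_to(src)))
    return copied
```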

    For the remaining backup tasks I was thinking Acronis TrueImage 2015 would do the trick across the board, but then I remembered that software vendors don't like letting you use consumer versions of their software on server OSes, meaning the 3-licence pack isn't quite such good value (and there's no way they're getting £600/£1,000 for the server editions, as nice as the central management of all the backups would be with the latter).

    I'm quite happy to manually schedule OS drive images on the server though, leaving me just needing a way to back up data to the NAS, scheduled, catalogued and incrementally - without breaking the bank. Any suggestions?
     
  9. Mister_Tad

    Mister_Tad Will work for nuts Super Moderator

    Now trialling Areca Backup for the data backup from the server to the NAS. It seems pretty competent for basic file backups with incrementals/differentials and compression. It lacks native scheduling, but it will export a job to a batch file, which is easy enough to run from Task Scheduler. It can also do sub-file incremental backups, though I'm not yet clear on how this impacts recoverability. That said, I don't suspect there will be a great deal of change within large files anyway, so it's probably a non-issue.
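    On the de-dupe/retention side, the principle that makes de-duplicated incrementals cheap on space can be shown with a toy content-addressed store in Python - nothing to do with Areca's actual archive format, it just illustrates why mostly-unchanged data across weekly runs costs almost nothing:

```python
# Toy content-addressed store: every file is stored once under its SHA-256
# hash, and each run writes only a small manifest of path -> hash, so
# identical content across weekly snapshots takes no extra space.
import hashlib
import json
from pathlib import Path

def backup(src: Path, store: Path, label: str) -> dict:
    """Snapshot src into store, writing a manifest of path -> content hash."""
    blobs = store / "blobs"
    blobs.mkdir(parents=True, exist_ok=True)
    manifest = {}
    for p in sorted(src.rglob("*")):
        if p.is_file():
            data = p.read_bytes()
            digest = hashlib.sha256(data).hexdigest()
            blob = blobs / digest
            if not blob.exists():         # identical content is stored only once
                blob.write_bytes(data)
            manifest[str(p.relative_to(src))] = digest
    (store / f"{label}.manifest.json").write_text(json.dumps(manifest))
    return manifest
```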

    It's not "my first backup" at first glance, and it's not entirely obvious how to get going, but browsing through the tutorial it doesn't seem too bad to get to grips with.

    I was a bit reluctant to go down the open-source route for something like backup, but I've read a lot of very good feedback on it and it seems the best of the free options. It's either this or paying through the nose for proprietary "server" backup software.

    Makes me feel better that there's nothing proprietary about how backups are catalogued and stored, so I can keep an eye on what it's doing outside of the software before I decide to trust it, and if everything hit the fan there's nothing stopping me from dragging and dropping files from the archives it creates.

    I'll just be using windows server backup for system images - going for weekly with low retention considering there's not really going to be any change once things are up and running.
     
  10. Mister_Tad

    Mister_Tad Will work for nuts Super Moderator

    So upon trying out DrivePool, I discovered that it doesn't play nicely with ReFS - which, if I'd thought about it, I could have figured out. I'm as yet undecided whether this is a deal-breaker for me - I was keen on the benefits of ReFS, and also keen on DrivePool.

    Anyone have any bright ideas for a similar approach to soft-mirroring on two mismatched drives (thus creating a 2TB mirrored area and another 2TB standalone area)? Simultaneous writes to the two would be the preference, as both post-process and real-time syncing could bog things down a bit. (Maybe this train of thought is pushing me back towards Storage Spaces? EDIT: Pretty sure it is. Thanks Votick, I've seen the light, eventually!)
     
    Last edited: 8 Oct 2014
  11. Votick

    Votick My CPU's hot but my core runs cold.

    Excellent, now to just pay the £9,000,000,000 consultancy fee. :) :)
     
