Ask HN: What's your backup setup?

127 points by iansowinski, about 8 years ago
I'm rethinking my backup workflow and I'm curious about other people's setups, both hardware and software. What does your backup setup look like, guys and girls?

91 comments

pYQAJ6Zm about 8 years ago
I rely mostly on Borg backup¹.

1. First, I run it locally on my desktop against a repository I keep on the same drive (/home/backup); then

2. I update, with rsync, a copy of this repository that I keep on a dedicated server with full disk encryption; and, finally,

3. I spin up an EC2 instance, mount an S3 bucket with s3ql, and rsync the copy from the previous step up to the one I keep on this bucket.

This process is (embarrassingly) manual.

The backup repository is encrypted with Borg itself, and if I need to recover something I do it from the local copy. I never mount the repository in remote locations.

¹ https://github.com/borgbackup/borg
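A minimal sketch of that three-step flow as shell commands; the paths, host names, bucket, and s3ql invocation are placeholders and assumptions, not the commenter's actual scripts:

    # 1. Local Borg backup into a repository on the same drive
    borg create --stats /home/backup::desktop-{now:%Y-%m-%d} /home/user

    # 2. Mirror the repository to the dedicated (full-disk-encrypted) server
    rsync -a --delete /home/backup/ backup@dedicated.example.com:/srv/borg-repo/

    # 3. On an EC2 instance: mount the S3 bucket with s3ql, then push the copy up
    mount.s3ql s3://my-backup-bucket /mnt/s3ql
    rsync -a --delete backup@dedicated.example.com:/srv/borg-repo/ /mnt/s3ql/borg-repo/
    umount.s3ql /mnt/s3ql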
zytek about 8 years ago
To each of you with those extensive backup solutions (NAS + cloud sync, second NAS, etc.)...

...do you actually TEST those backups?

This question comes from my experience as a systems engineer who found a critical bug in our MySQL backup solution that prevented it from restoring (inconsistent filesystem). Also, a friend of mine learned the hard way that his Backblaze backup was unrestorable.
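One low-effort way to act on this is a periodic scripted test restore. A sketch for a MySQL dump; the dump location, canary table, and mail address are made-up examples:

    #!/bin/sh
    # Restore the latest dump into a throwaway database and check a canary table.
    set -e
    LATEST=$(ls -t /backups/mysql/*.sql.gz | head -n1)
    mysql -e 'DROP DATABASE IF EXISTS restore_test; CREATE DATABASE restore_test;'
    gunzip -c "$LATEST" | mysql restore_test
    # Fail loudly if the restore is empty or the known table is missing.
    mysql -N -e 'SELECT COUNT(*) FROM restore_test.users;' | grep -qv '^0$' \
      || { echo "restore test FAILED for $LATEST" | mail -s 'backup restore test failed' admin@example.com; exit 1; }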
HeyLaughingBoy about 8 years ago
Wait for disaster, then panic.
chilicuil about 8 years ago
I'm fortunate to only depend on a single platform, Linux in my case, so I rent a 1TB VPS[0] to which I rsync[1] every day. Then, depending on the criticality of the service I'm backing up, I create weekly/daily/monthly snapshots (rdiff-backup). I encrypt sensitive data using symmetric AES-256 (gpg).

[0] https://www.time4vps.eu/

[1] http://javier.io/blog/en/2014/04/08/backups-git-rsync-rdiff-backup.html
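A rough sketch of that kind of daily job; the remote host, paths, and 30-day retention are placeholders, not the commenter's actual configuration:

    #!/bin/sh
    # Daily push to the rented VPS
    rsync -az --delete /home/ backup@vps.example.com:/backup/home/

    # Keep incremental history on the VPS with rdiff-backup and expire old increments
    ssh backup@vps.example.com 'rdiff-backup /backup/home /backup/home-history && rdiff-backup --remove-older-than 30D /backup/home-history'

    # Sensitive files travel only as symmetric AES-256 gpg archives
    tar -cz /home/user/private | gpg --symmetric --cipher-algo AES256 -o /tmp/private.tar.gz.gpg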
git-pull about 8 years ago
I like doing a fresh install of Linux/BSD/macOS every 3-6 months. My tools/flow are opinionated, but outlined below.

For configs / dotfiles: https://pypi.python.org/pypi/dotfiles

I keep my personal config at https://github.com/tony/.dot-config if you want to glance at it or copy/paste from it (MIT-licensed).

Another trick is partitioning. Keep media files in a separate partition. Also great if you dual boot.

rsync + ssh for copying files from other machines: http://troy.jdmz.net/rsync/index.html

I use vcspull (http://vcspull.git-pull.com), an app I wrote, to re-clone my projects to the same familiar directories.

Keep books on Kindle.

Have your own personal self-hosted git with gogs (https://github.com/gogits/gogs). You'll need at least the 10/hr DigitalOcean account though, or else compiling will fail due to memory limitations.

I use DigitalOcean extensively to host old projects on the cheap. It's like what DreamHost used to be in 2006.
drvdevd about 8 years ago
Primary LAN backup:

- 7TB ZFS pool running on Ubuntu Xenial
- hardware: an old laptop with 6 cobbled-together external USB 3.0 drives making up the pool
- each vdev (3 total) in the pool is mirrored
- standard tools on top of that: Time Machine + netatalk, NFS, Samba, SSH+rsync, ZFS send/recv, etc.
- external drives need battery backup (can't recommend running ZFS vdevs without battery backup)
- no ECC RAM

Off-site backup:

- Ubuntu Xenial running in Google Cloud, 10GB root volume, 3.75G RAM, 1 vCPU
- the secondary (backup) disk is currently only 1TB, with a zpool as the filesystem. Easily expandable by adding virtual disks as vdevs (plus I trust their hardware slightly more than my own).
- using ZFS send/recv to selectively back up important datasets and keep cost as low as possible

Secondary LAN backup and restoration testing:

- a separate 8TB disk on another ghetto piece of old x86 hardware, no redundancy
- restored from the off-site backup to get 2-for-1: backup and restoration testing

Encryption:

- everything local uses dm-crypt
- as for Google Cloud, I also use dm-crypt. If I want to conceal the keys from remote memory, I use nbd-server to expose a block device across SSH
towb about 8 years ago
Fun story. I ran "rm -rf ~" by mistake just the other day. A misconfigured program had created a dir named ~ inside my home folder and I was a bit quick to type the command. No harm done, because just last weekend I had set up a cron job to rsync everything daily. I've since upgraded my backup solution to rsnapshot, and I'm still looking out for even better solutions. Phew!
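For reference, the kind of crontab line that saves the day here might look like this (destination path and log file are placeholders):

    # Daily 02:30 mirror of the home directory to a second disk
    30 2 * * * rsync -a --delete /home/user/ /mnt/backup/home/ >> /var/log/home-rsync.log 2>&1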
tetraodonpuffer about 8 years ago
For me it is:

- 2x locally connected backups, usually two identical copies: one on a separate partition of an internal HD, one on a home NAS. Usually once a month or so, more often if I am doing something specific.

- 3x rotating external backups in a bank safety deposit box; every 3-4 months or so I rotate one of the backup sets there.

All disks are encrypted, of course.

I am surprised a lot of people pay per month to back up online when a safety deposit box is usually way cheaper, and you can't beat the transfer speed. A standard bank safety deposit box seems to fit 3 3.5" HDDs perfectly, or 6-7 2.5" HDDs, and that's a lot of TB for not a lot of money.

I always rsync --checksum a second time after backing up, and am starting to think about writing a Python script or something to calculate checksums and save them on the disks so I can check them at any time, but with the implicit redundancy above of having 2x nearline + 3x offsite it should be fine, I would think.
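A sketch of that checksum-manifest idea; the mount point, manifest name, and choice of SHA-256 are arbitrary assumptions:

    #!/bin/sh
    # Write a SHA-256 manifest onto the backup disk itself, so it can be
    # verified later on any machine without the original data present.
    cd /mnt/backup && find . -type f ! -name MANIFEST.sha256 -print0 | xargs -0 sha256sum > MANIFEST.sha256

    # Later, to verify the disk:
    cd /mnt/backup && sha256sum --quiet -c MANIFEST.sha256 || echo "checksum mismatch on /mnt/backup"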
pvdebbe about 8 years ago
My poison is flexbackup:

    [I] app-backup/flexbackup
        Available versions:  1.2.1-r12 ~1.2.1-r13
        Installed versions:  1.2.1-r12 (05:37:21 PM 03/03/2014)
        Homepage:            http://flexbackup.sourceforge.net/
        Description:         Flexible backup script using perl

Pretty old, but some software is like that: able to be finished.

I run a couple of cron jobs on it, doing full backups every Sunday and differentials against the full backup every other day of the week. The backup target is a RAID1-backed disk on a NAS.

Flexbackup essentially produces tarballs, with indexes and the usual add/remove tracking. Compression can naturally be applied. It all relies on the common Unix tools. Just yesterday I updated my 4-year-old configuration to try out new partitions and incremental backups. A minimal example config for flexbackup:

    $set{'pictures'} = "/stor/amy/pictures/";
    $compress = 'gzip';
    $compr_level = '3'; #1-9
    $device = '/srv/harry/pictures-backup/';

Associated crontab:

    30 4 1,15 * * /root/backup-scripts/backup-pictures-full.sh
    30 4 2-14 * * /root/backup-scripts/backup-pictures-incr.sh
    30 4 16-31 * * /root/backup-scripts/backup-pictures-incr.sh

With the scripts essentially saying:

    /usr/bin/flexbackup -set pictures -full \
      -c /root/flex-confs/pictures.conf >> /root/backup.log 2>&1

and

    /usr/bin/flexbackup -set pictures -incremental \
      -c /root/flex-confs/pictures.conf >> /root/backup.log 2>&1

respectively. (They also contain some find(1) invocations to remove older full backups and obsolete incrementals.)
Snortibartfast about 8 years ago
I have two USB-connected hard drives which are switched every week; one is always moved to another location.

The drives are encrypted with LUKS/dm-crypt. The encryption key is a file with random data stored in the /root dir, so the disk encryption is not safe from local attacks. The key is also stored off-site (not in the same location as the off-site disk, of course).

A cron script runs rsnapshot daily, which collects data from the local host and from remote hosts.

Remote host backup goes via ssh, using a passwordless ssh key, with a forced command in authorized_keys that is only allowed to run rsync. The script below must be modified so the rsync command matches the actual command that rsnapshot executes. Also note that the path names can only contain [/a-zA-Z0-9]. It's a bit restrictive, I know, but I tried to lock it down as much as possible. Just edit the regex if needed.

/root/.ssh/authorized_keys:

    from="1.2.3.4",command="/root/bin/rsnapshot_wrapper.sh" ssh-rsa AAAA...

/root/bin/rsnapshot_wrapper.sh:

    #!/bin/sh
    LOG="$HOME/rsnapshot_wrapper.log"
    if echo "$SSH_ORIGINAL_COMMAND" | grep -E >/dev/null \
        '^rsync --server --sender -v*logDtprRxe\.iLsfx --numeric-ids \. [/a-zA-Z0-9]+$' ; then
        $SSH_ORIGINAL_COMMAND
    else
        echo "Invalid command: $SSH_ORIGINAL_COMMAND" >>"$LOG"
    fi
    exit 0
whit about 8 years ago
(1) Whole-disk backup to Time Machine, automatically. (2) Backups every week or so to an external hard drive. (3) Daily backups to Google Nearline via Arq. (4) Manual backups of important documents to Tarsnap.
kohanz about 8 years ago
Professional (code): mostly taken care of by client infrastructure since I'm a freelance developer, but I basically rely on cloud source control (Bitbucket, GitHub, etc.).

Personal (photos, etc.): I don't really trust the cloud, so I have a QNAP NAS in a RAID1 configuration with two 3TB WD Red drives. We upload photos/videos from our phones (and also store important docs/files) there. I replicate this drive every 4-6 months and store the copy in a safe deposit box at our bank (in case of fire). Not perfect, but I think good enough. I haven't "tested" it, but since family photos and videos are the most important part of it, there isn't really much to test (we view pics off of the NAS regularly).
arximboldi about 8 years ago
I have my "home" as a master Syncthing folder that I sync with an RPi3: https://syncthing.net/ I have it set up to keep a few revisions of every file.

Syncthing is not really meant for backup, but I really like that it just happens over the network, in the background, without further intervention. I am clumsy, lazy and not disciplined enough for other backup setups that require action (e.g. connecting the computer to something sometimes, manually triggering backups, etc.).
billpg about 8 years ago
I wonder how the various online backup services would handle a request to delete the backup.

In a ransomware situation, the bad guys might have the keys to a backup service. The existence of that backup would make it pointless to actually pay the ransom, so they have a motivation to do what they can to delete the backups.

I would have no problem opting in to a provision where the service will not delete backups for (say) a year, no matter what.

    "Delete my backups now! I pay you and I demand you obey my request!"
    "No."
AdmiralAsshat about 8 years ago
Local:

- An external for music, pictures, and ROMs; another external for video
- A backup of each external for travelling
- A third backup of both externals onto a single larger external

Off-site:

- CrashPlan
- Google Drive for source code and important documents

All my source code is on multiple laptops and kept backed up through GitHub. I should probably start including the GitHub folder in my CrashPlan as well, just in case my repo ever gets deleted or something.
anotherevan about 8 years ago
My desktop Linux machine (which is always running) doubles as the backup server, backing up itself, a couple of Raspberry Pis (one Kodi, one home automation) and my SO's Windows machine.

Backups go via rsnapshot to an external USB drive that is LUKS/dm-crypt encrypted. Every Wednesday the SO swaps the USB drive with a sibling kept at her office.

I really like the way rsnapshot works, with complete-looking images for each backup but unchanged files hard-linked across images. It makes it super easy to just mount the drive and grab the backup of that file I just corrupted.

For the Windows machine, I'm using Cygwin to run an SSH server and rsync. Before running rsnapshot, the backup script sends a wake-on-LAN to the PC, then SSHes in to run a batch file that sets the go-to-sleep timeout to never and makes a shadow copy of the drive, which is what goes to the backup.

Then rsnapshot does its rsync-over-ssh thing to do all the backups.

Afterwards, SSH again to run a batch file that cleans up the shadow copy and resets the go-to-sleep timeout back to twenty minutes.

Unfortunately I've got some sort of weird problem where it dies while doing the backup of the root folder on the local drive. I've run SpinRite on the drive and blown the dust out of the machine, but no change. Last time I had this problem the power supply was failing under demand, but I've stress tested it and that doesn't seem to be the cause this time... sigh. It's a bit hard to gather diagnostics as the machine is completely locked up when I come in the next morning...
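A rough sketch of that pre/post sequence around rsnapshot; the MAC address, host name, and Windows batch file names are placeholders, and the batch files themselves (shadow copy creation, sleep timeout changes) are assumed rather than shown:

    #!/bin/sh
    # Wake the Windows PC, give it time to boot, run the pre-backup batch file
    # over SSH (Cygwin sshd on the Windows side), back up, then clean up.
    wakeonlan 00:11:22:33:44:55
    sleep 120
    ssh backup@winpc 'cmd /c C:\backup\pre-backup.bat'    # disable sleep, create shadow copy
    rsnapshot daily
    ssh backup@winpc 'cmd /c C:\backup\post-backup.bat'   # remove shadow copy, restore sleep timeout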
systemtest about 8 years ago
Simple setup. Two USB hard drives: one at home, one at work. The one at home is plugged in at all times doing hourly Time Machine backups. The one at work is disconnected and lying in a drawer. Encrypted with HFS.

Every other week I take the home drive to work and take the work drive home to swap duties. I never have the two disks at home; one is always at work, disconnected from power.

This is my personal balance point between comfort, no cloud, and a reliable backup.

Backups are tested by restoring to a new HDD every now and then.
DracheZahn about 8 years ago
For my desktop systems (about 3TB across 3 systems):

1. Cloud replication - All files/docs are stored in one path under VIIVO (an encrypted-folder utility). All encrypted files are replicated to Dropbox / OneDrive paths for cloud replication. Only encrypted data is replicated to the cloud.

2. Cloud backup - The full system is encrypted and backed up to CrashPlan by Code42.

3. On-premise - All user files and folders (encrypted and not encrypted) are backed up to two different storage paths on a NAS-based Apple Time Machine (mirrored drives).

4. Local - Daily and frequently used folders are also replicated with SyncMate to a local USB3 flash drive.

5. Off-site - About once a year, I back up my data files, apps, and critical files to external USB drives and ship them off-site to my parents for storage, just in case. These drives are usually encrypted with VeraCrypt or just Apple encryption.

6. Tax documents, personal documents, scans, and important personal files are copied periodically to a rugged USB flash drive and placed in the home fire safe.

For my servers/array, 10TB (includes my Apple Time Machine archives):

1. On-premises - External USB drives provide daily backups with local Synology backup tools.

2. On-premises - rsync of some data to an external WD NAS.

3. Cloud backup - Cloud backup of some paths with ElephantDrive.

4. Off-site - Monthly backups with USB drive rotation, with drives sent to my parents' home to be stored in a fire safe.
zabana about 8 years ago
Not sure if you're talking about data or an actual dev workflow, but I will share my setup with you.

In terms of data, everything I own is backed up in Google Drive (photos and documents mostly; I don't take tons of pictures and ALL the music I listen to is on SoundCloud).

In terms of dev workflow, it's pretty interesting. My MacBook Air died on me last week, and because I can't afford to get another one (or even a decent PC for that matter) I've fallen back to my Raspberry Pi. The browser is a little slow sometimes, but I have to say I'm quite impressed by how well it performs.

Because it's a bit limited in terms of hardware capabilities, I've bought 2 VPSs from Scaleway which I've provisioned using an Ansible playbook I wrote.

I was up and running and ready to work within minutes.

Now it's a bit inconvenient because I'm used to being mobile and taking my laptop with me everywhere, but it's a perfect backup solution for now. Obviously I don't watch Netflix on it or play video games, but for 35 quid you can't really expect much.

Edit: the playbook I mentioned can be found here if you want to take a look: https://github.com/Zabanaa/night-city
SnowingXIV about 8 years ago
Glad you brought this up. I use a NAS drive as a mapped network drive that's cloud-synced with OneDrive for Business, and I also have that NAS doing Hyper Backup to both Google Drive and a locally plugged-in external HD.

There was a sync problem I had to address, but before that I went to check whether I could download the backup from Google Drive (this is very slow for larger backups) and open it with Hyper Backup Explorer to restore all the files, at least to my computer, so I could provide end users with what they need.

Once the .zip file completed and the many parts were downloaded and extracted, I went to open the backup file with Hyper Backup Explorer. Everything looked good, but of course I needed to test a true restore, so I wanted to see if I could save a PDF and open it.

"Partial file restored" - and guess what, it couldn't open.

That sent me into a panic. Nothing was lost or down, because the cloud sync was the only thing having issues, so everyone could still work and properly access the NAS, but now I'm thinking "great, my backup wasn't a backup, because it's useless."

I'm currently in the midst of trying to figure out what to do now. The external works, but I wanted the off-site Hyper Backup to be my savior in case of a flood/fire or external HD failure.
SippinLean about 8 years ago
Re: CrashPlan: I recently learned that CrashPlan will silently fail to back up most of your data over 1TB. The only fix is allocating it more RAM via a console command. None of this is made known up front; I didn't notice until I tried to restore files that weren't there.

Re: Arq: It used to have issues with missing files. Has anyone restored from an Arq/Google backup recently who can speak to its reliability?
gargravarr about 8 years ago
More seriously, on a personal level, I run Deja Dup on my Mint laptop to a USB disk that's LUKS-encrypted. Of course, that's not enough, so I have a script running on my home DHCP server: when a DHCP lease is granted, if the MAC matches my laptop's ethernet adapter, it runs rsync over SSH against my user folder, doing a one-way sync to the server (onto a RAID1). From there, I have an LTO3 tape drive that I got cheap on eBay, and I dump the folder to tape with tar weekly (cycling through the tapes, of course).

Anything irreplaceable I keep in Dropbox, mirrored to every machine I have access to. If I were to manually delete the folder by accident, I've got 7 days to restore it on the free tier. And if Dropbox itself does a GitLab, chances are very high that one of my machines with a recent sync is powered off, so booting that up without network will get me a reasonably up-to-date folder.

It's a lot of moving parts, but everything is usually nicely in sync.

I recently reinstalled my Mint laptop and restored from the Deja Dup backup, so I'm reasonably confident it would work in a DR scenario.
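If the DHCP server is dnsmasq, the lease-triggered sync could look roughly like this; the MAC, user, and paths are placeholders, and other DHCP servers have equivalent hooks:

    # /etc/dnsmasq.conf: run a script on every lease event
    dhcp-script=/usr/local/bin/lease-hook.sh

    #!/bin/sh
    # /usr/local/bin/lease-hook.sh
    # dnsmasq calls this as: lease-hook.sh <add|old|del> <MAC> <IP> [hostname]
    ACTION="$1"; MAC="$2"; IP="$3"
    LAPTOP_MAC="aa:bb:cc:dd:ee:ff"
    if [ "$ACTION" = "add" ] && [ "$MAC" = "$LAPTOP_MAC" ]; then
        # One-way pull of the laptop's user folder onto the server's RAID1
        rsync -az --delete "user@$IP:/home/user/" /srv/raid1/laptop-home/ &
    fi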
howlett about 8 years ago
I used to have a subscription to CrashPlan, but that wasn't flexible (or cheap) enough when you try to back up multiple machines/phones.

Now I have a Raspberry Pi with an encrypted USB drive attached, where I sync all files from laptops/desktops/phones/TrueCrypt drives (I have an instance of Pydio Cloud running too).

Then, once a week (or once a day, depending on the folder), I sync everything to rsync.net.
mironathetin about 8 years ago
Mac: Carbon Copy Cloner and Time Machine on separate USB disks. I use the system scheduler to wake the machine at night, mount the disks, start both backups, unmount, and sleep the MacBook again. Rock solid; it has run every night for years. Even swapping the hard drive is a matter of 30 minutes to play back the latest CCC clone.

I now have to find a similar backup solution for my Linux-based ThinkPad. I am looking into Mondo Rescue, because it promises to create a bootable image on an external drive (just like Carbon Copy Cloner). For me it still fails, but this is Linux. It needs more time and research.

This is a personal backup of one computer only. I have had bad experiences with centralised backup solutions. In every case you need to reinstall at least the operating system before you can access the backup. I also forgot my password once, because access to the backup is not frequently needed and well-meaning admins constructed crazy password rules. So even though I had a backup, it was not accessible any more.
wkd about 8 years ago
At home I use Dropbox for some files and Resilio Sync for others.

At work we make heavy use of version-controlled configuration management, where we can recreate any machine by just rerunning the Ansible playbook, plus duply backup for databases and other storage.

While duply was trivial to set up, nice to work with, and much more stable than any other solution we were using previously, if I were to do it again with more than a handful of machines I would likely have looked into reversing the flow with a pull-based backup, just to have a better overview, since I don't trust `duply verify` monitoring to catch all possible issues.

Cloud backup is managed by a server fetching the data and then backing it up with duply.

We also run an rsync of all disk image snapshots from one DC to another (and vice versa), but that is more of a disaster recovery in case our regular backups fail or were not properly set up for vital data, since it would take more effort to recover from those backups.
dsl about 8 years ago
I purchased a large safe that has ethernet and power pass-through. Stuck a NAS with RAID 5 inside and use it as a Time Machine target for all of our laptops.

Additionally, everything in the house runs Backblaze for off-site backups.

Once a year I restore a machine from backups to test (usually I'll copy the drive to an external first, just in case).
jfindley about 8 years ago
There's not really a lot of detail in your question, so I've no idea what sort of solution(s) you're interested in. One suggestion, however: if it's a Linux/Unix-based box you're backing up, you're looking for a hosted solution, and you care about security, tarsnap is excellent.
balladeer about 8 years ago
Personal:

CrashPlan. It backs up my personal laptop's user directory. I've excluded a few directories like ~/Library. The speeds are really bad and their Java app sometimes makes me bang my head against the wall and almost always sets my laptop literally on fire. I've thought of moving to Backblaze many times, but their version retention policy just doesn't click for me.

Out of this backed-up data, some is kept in my Dropbox folder (of which some personal/crucial data is encrypted). And everything that goes into CrashPlan is encrypted on my machine. And yes, I've restored from CrashPlan, and once in a while I test some random folders and files to see whether they are actually up there in the cloud or not. I guess I should do a full restore some day (but given their speed and my geographic location it may take weeks or months).

I use SuperDuper! to clone my laptop's user folder to a 256GB portable hard disk (my laptop has 128GB) every 2-3 months or so, and have instructed it not to remove deleted files but to add new ones. I also copy my docs, pics, videos, and music to my 2TB portable hard disk regularly (and I keep testing it).

(Edit: I've recently disabled auto-upload to Google Photos. Now I do it only for those photos that I want to upload from its Mac uploader app.)

Work:

Code goes to our own GitLab setup, the rest of the stuff to my company's Google Drive. Sadly we don't have a holistic backup setup at work. It's a startup. I dropped an email to both the IT and Engineering heads; they replied saying it was an interesting idea and they would look into it. I knew they wouldn't.

Going forward, I want my own BorgBackup or something like it (client-side encryption, de-duplicated, compressed, a fire-and-forget kind of solution) hosted on a small/mid-sized VPS, in place of CrashPlan/Backblaze or alongside these ready-made cloud solutions. Something with a GUI would have been nice though. Something lightweight, minimal, but solid (Backblaze's interface is awesome).
sshagent about 8 years ago
Whatever you end up going with, you have to actually regularly restore the data and simulate a disaster recovery. While it makes sense to have automatic checks in place, IMO it's always worth doing the recovery manually. Prove it all works; it sets expectations and shows up issues.
scipiobarca about 8 years ago
Personal backup - way more complicated than it needs to be!

(1) ChronoSync and ArqBackup are installed on a server. (2) Each client machine has the ChronoSync Agent installed. (3) The Agent backs up specific files and folders (according to a schedule) to the server. (4) ChronoSync on the server then backs up to a second hard drive on the same server. (5) ArqBackup then backs up the files on this second hard drive to Amazon AWS (in encrypted form).

Separately, I have independent Time Machine backups on external hard drives as well. Some of the core client machines also back up to SpiderOak.

I have done minimal restore tests, but part of the reason I back up the way I do is that I expect one or more of the backups to fail when I need to restore.
maturz about 8 years ago
Cloud + https://github.com/duplicati/duplicati - it encrypts and compresses before sending data to the cloud and lets me restore files from specific days if needed.
leonroy about 8 years ago
One honking big Supermicro SC836 chassis with a Supermicro low-power board in it.

Stuck FreeNAS on it and back up everything to it using nightly rsync and ZFS replication where possible. It has 48TB of storage (16x 3TB).

Critical bits get synced to Amazon Cloud Drive (which took an age).

For backing up my ESXi VMware box I use Nakivo - it's an amazing piece of software - never once had an issue with it and I've used it many times to revert a broken virtual machine.

I've had a lot of experience with hardware failing in my IT life. I've been close, but very lucky never to have lost data to a disk failure or corruption. I finally bit the bullet and bought all that kit just for backups. Well worth it.
closeparen about 8 years ago
- Full computer in CrashPlan with a user-specified key.

- Non-sensitive documents I care about on Dropbox.

- Code I care about (have time invested in) with git remotes on GitHub, Bitbucket, or a personal GitLab server.

- For "continuity" I carry personal property insurance that can replace my laptop.

I don't bother with external drives or NAS devices because the scenarios I feel are most likely are burglary followed by natural disaster; I don't want to rely on something in my home to protect something else in my home.

After hard drive crashes I am usually grateful for a clean slate, and at most pluck one or two files out of backup when the need arises.
benjohnson about 8 years ago
Giant 45TB (available) ZFS pool at Hetzner for $350 (depending on exchange rates) per month.

rsync with crontab for Unix things. Cygwin with rsync and the Volume Shadow Copy Service, triggered by Scheduled Tasks, for Windows things.

That works out to roughly 0.007 USD per GB per month.
alkonaut about 8 years ago
A NAS and lots of custom scripts and programs, completely switched off and unused. Too much hassle for something that should just work.

Instead: cloud backup to CrashPlan for pennies, for 10 machines. It has already saved my butt several times.

So my only tip: don't do anything yourself. Doing your own backup is like writing your own crypto. It will bite you.

A reasonable compromise is to use your own backup in addition to a service. However, use them independently - don't back up your backups to the cloud, back up your machines to both places. Otherwise your custom setup can still be the weakest link.
deepaksurti about 8 years ago
I have a Mac Mini, my main work machine, and an MBP, used when I travel. I back up both my Mini and my MBP to LaCie Thunderbolt SSD drives using Carbon Copy Cloner, which kicks off the backup procedure every night.

I also back up my photos and documents to both iCloud and Dropbox.

I don't use iTunes on my MBP; my Mini is connected to another external SSD which serves as my iTunes disk, and that is also backed up.

Whenever I travel, I just sync my MBP to be up to date with my Mac Mini.

I am also looking at using IDrive backup [1], but have not done so yet.

[1] https://www.idrive.com
lowrider130 about 8 years ago
I have a large (24TB) RAID6 at home and back up all my files there. It's large so I have room for all my DVDs, Blu-rays, and developer VMs. I have a smaller (6TB) RAID1 in another state, at my parents' house, for off-site backup of important files. Both are running mdadm and are set up to email me on any events. I have a cron job that runs rsync once a week and emails me the result. Both systems are on a UPS. I have tested to make sure they are working as expected. All my systems run Linux, so I can access everything with sshfs or sftp using ssh keys.
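The monitoring and weekly-sync pieces of a setup like this might look roughly as follows; the addresses, paths, and remote host are placeholders:

    # /etc/mdadm/mdadm.conf: have mdadm's monitor mode mail on array events
    MAILADDR me@example.com

    # crontab: weekly push to the off-site RAID1; MAILTO makes cron email the rsync summary
    MAILTO=me@example.com
    0 3 * * 0 rsync -az --delete --stats /srv/important/ backup@parents-house:/srv/backup/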
feistypharit about 8 years ago
I want total control, so: a Synology NAS box set up with two disks in a mirror, one SSD as cache, and one hot failover. All laptops back up to it. It backs up to Amazon S3 and to a second Synology NAS.
gtf21 about 8 years ago
I use CrashPlan (as does my family) and I keep an archive encryption key so it's encrypted on my side. I've found this fantastic (for example when my sister's laptop died and we needed to retrieve her dissertation). It's quite cheap and has unlimited storage. I don't back up everything on here, only the important stuff.

I also have a Time Machine drive that sits on my desk for quick access / just to have another option (although it is not encrypted, so I might wipe it and find a way to use TM with encryption).
shoover about 8 years ago
Most random docs, todo lists, invoice scans, etc. are in Dropbox or Google Docs.

Home pics, music, videos: CrashPlan Central. I also set up a local CrashPlan archive on a local NAS, but OS X can't keep the IP address resolved.

Work: all projects are in source control. Configs and working trees are backed up a few times per day to JungleDisk. JungleDisk has saved me several times when I accidentally deleted a folder or overwrote a file before checking it in. It's also handy for using the restore function to copy configs to a clean dev machine.
Freezerburnt about 8 years ago
Wow. No one using Bacula? It seems a little cumbersome to get set up, but once I did it, I could more or less forget about it.

I run a mixed bag of Linux, OS X, and Windows machines, and each one of them gets incrementally backed up each night, with a full baseline once a month, to a machine on the home network. Nothing fancy.

Then about once a month, or when I think about it, I copy the backups to an external drive and take it off-site.

Worst-case loss is a month. Seems fine to me.

And yes, I quite often ensure I can restore files - usually by stupidly deleting them at the source.

No one else uses Bacula?
freetonik about 8 years ago
I love Arq, and I described my setup in a blog post recently: https://hello.rakh.im/backup/
Loic about 8 years ago
For the past few years, I have been using a mix of rsync against an in-house and an external server, plus encrypted USB drives[0]. The key used to encrypt the external drives is derived, with a simple algorithm, from the serial number of the drive and a very long string stored in a YubiKey.

I never reuse the drives, just accumulate them.

[0]: https://www.ceondo.com/ecte/2016/09/simple-secure-backup/
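The linked post describes the actual scheme; purely as an illustrative sketch, a passphrase derived from the drive serial plus a long stored secret might look like this (the hash choice, LUKS usage, and prompt are assumptions, not the author's algorithm):

    #!/bin/sh
    # Derive a per-drive passphrase from its serial number and a long secret,
    # then open the LUKS container with it.
    DEV=/dev/sdb
    SERIAL=$(udevadm info --query=property --name="$DEV" | sed -n 's/^ID_SERIAL_SHORT=//p')
    printf 'Enter the long secret (e.g. from the YubiKey static slot): '
    read -r SECRET
    printf '%s:%s' "$SERIAL" "$SECRET" | sha256sum | cut -d' ' -f1 | cryptsetup luksOpen "$DEV" backupdrive --key-file=-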
fimdomeio about 8 years ago
I separate everything by year. The current year gets synced every week via rsync with a Drobo (the Drobo duplicates the data among all its drives). I also have a disk in another location that gets synced with the archive once a year, at Christmas. It was a bit of an investment, but it's pretty cheap to run.

I know it's not perfect. If I delete something without noticing and sync afterwards, it will be lost forever, but I've been running this for the last 10 years and never really had a problem.
nathcd about 8 years ago
I've been thinking of playing with bup [1][2] for personal stuff, so I was hoping I'd see that someone here had tried it. I don't see any mention of it yet, so if anyone has used it and could share any thoughts, I'd love to hear them!

[1] https://github.com/bup/bup

[2] https://bup.github.io/
r3bl about 8 years ago
One local, one external.

In the external case, it's just a Nextcloud instance, constantly syncing the most important files.

In the local one, there's an external hard drive connected to a Raspberry Pi and cron jobs that scp into it.

So, three constant copies of everything of any importance. I "test" the backups regularly because I play files from a backup on a Raspberry Pi connected to my sound system, and I'm constantly downloading files to my phone from Nextcloud.
dsego about 8 years ago
I have an MBP with a 500GB internal SSD, so all my data is in one place.

1) Carbon Copy Cloner > external 1TB drive, manually every few days

2) Arq backup > Amazon Cloud Drive (runs in the background)
LatexNinja about 8 years ago
I run Linux for work and Windows for gaming, between a laptop and a desktop. The files I use frequently I Unison between the machines. For backups I send everything to an encrypted removable HD on my home network using some rsync scripts I wrote. As for the cloud, you can't trust any of them with your data privacy, but I still send some stuff off to Amazon Cloud Drive (encrypted, of course) using rclone.
rollcat about 8 years ago
ZFS or btrfs in most places.

All devices have snapshotting capabilities; I keep hourly, daily, weekly, and monthly snapshots.

Once per day, all devices rsync their /home to the NAS. (I would use ZFS send/receive, but I want more selective backups.)

The NAS also keeps snapshots. A daily snapshot of the most critical data is encrypted and sent off to an offsite server.

I haven't lost a single byte in years (since shortly before implementing this scheme :P).
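A bare-bones version of the per-device side of such a scheme, using cron and plain zfs commands; the dataset names, schedule, and NAS path are placeholders (tools like zfs-auto-snapshot or sanoid would normally handle snapshot rotation and pruning):

    # crontab on each device
    # hourly snapshot of the home dataset (pruning not shown)
    0 * * * * /sbin/zfs snapshot rpool/home@hourly-$(date +\%Y\%m\%d-\%H00)
    # daily one-way rsync of /home to the NAS
    30 1 * * * rsync -az --delete /home/ backup@nas:/tank/backups/$(hostname)/home/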
knz about 8 years ago
https://news.ycombinator.com/item?id=12999934

Recent discussion on the same subject. Rural land in Tennessee, Arq/Google, and CrashPlan were the top three.

Personally I use Google for photos and documents, and have them synced across multiple computers. I also make a copy of it all once a year.
brandonhall about 8 years ago
1TB backup drive which is partitioned. Half is data and the other half is time machine. The data partition is mirrored to Google Drive. Then, I use Arq Backup to mirror time machine backups to Google Cloud Storage. In other words, there is always a true local and remote copy of everything. Very cheap and works well.
source99 about 8 years ago
For my personal dev machine I simply use GitHub and Dropbox. I'm sure there are more complete ways of storing my full system, but I've actually never needed one... knock on wood.

That being said, I can re-create my system from scratch in 3 hours, so if I spend more time than that on backup I think it's a waste.
atmosx about 8 years ago
I have an HP ProLiant MicroServer with 16GB of RAM in the office. It has 4x 2TB disks in a mirrored-vdev ZFS pool (RAID1), running Gentoo.

All my backups go there first. Some of them are then stored in the cloud using tarsnap.

I use cron scripts and rdiff-backup to fetch daily snapshots of my servers (EC2, RPis, etc.).
et-al about 8 years ago
Piggybacking on this: for those of you juggling a few external hard discs, how do you keep track of what's on what? Does anyone use git-annex?

https://git-annex.branchable.com/
blakesterz about 8 years ago
I always love seeing answers to this question!

I'm all about Duplicity: http://duplicity.nongnu.org/

Sometimes I feel like it's a bit complicated, but I've yet to find anything that does the job any better.
luca_ing about 8 years ago
I use rsnapshot to aggregate a bunch of machines onto my NAS.

For a few months now I've been intending O:-) to then save this aggregated backup somewhere on the internet. Not sure yet whether that will be e.g. tarsnap, or a minimal vserver with rsnapshot or rsync yet again.
grantpalin about 8 years ago
* a local unRAID machine as, among other things, a backup destination

* a local Synology DS216+II as a secondary backup of the essential data

* backups for both of the above run with Bvckup

* essentials are also backed up to CrashPlan

* OneDrive for non-private things that I'd like to have easy access to
ksec about 8 years ago
I really wish something easier were on the market. Time Capsule doesn't do iOS devices, and there's no off-site backup.

It would have been perfect if I could back up my iPhone / iPad to Time Capsule and have it back up to iCloud as well.
alex_hitchins about 8 years ago
I've had one very large project, back in the day, that wanted the source on CD along with printed copies of the source code. I'd never thought of a printed page as a backup, but I guess it's 'a' method.
zwerdlds about 8 years ago
FreeNAS via rsync to an old leftover ReadyNAS Duo with mirroring.

The FreeNAS UI makes this super simple from that end, and, despite being pitifully out of date, the ReadyNAS supports rsync to the extent that I need.
untog about 8 years ago
Apple Time Machine. It's pretty great in the "set it and forget it" world of backup solutions. All my actual code also lives on GitHub, so there's always a remote copy.
ksk about 8 years ago
For people using rsync and the like: does anyone have data on the amount of wear caused by reading the entire HDD (modulo OS/libs) over and over again to compare against the backup?
abricot about 8 years ago
My main concern is the family collection of photos, video and scanned papers.

I use a combination of cloud backup (Jottacloud), a local disk mirror, and dumping a yearly batch of BD-Rs in a bank box.
chauhankiran about 8 years ago
This might help: https://news.ycombinator.com/item?id=12999934
overcast about 8 years ago
Time Machine for local nightly backups, in case I need to recover quickly from yesterday. Arq for hourly/daily/monthly off-site backups to Microsoft's Live Drive.
scktt about 8 years ago
I use a simple NAS + external HDD + rsync to Backblaze.

The MacBook uses the NAS for Time Machine backups to an external HDD; the external HDD is backed up with rclone to Backblaze once an hour, every hour, unless a backup is already executing.

Any iPhone backups are rsynced to the NAS/external HDD and then rcloned as well.

iPhone photos are kept in iCloud, including any added to the MacBook's Photos app.

So far it's about $0.75 per month for 200GB on Backblaze, and $1.49 per month for 50GB of iCloud.
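The "once an hour unless a backup is already executing" part maps naturally onto cron plus flock; a sketch, with the remote name, bucket, and paths as placeholders:

    # crontab: hourly rclone sync; flock -n skips the run if the previous one still holds the lock
    0 * * * * flock -n /tmp/rclone-backup.lock rclone sync /Volumes/external b2:my-backup-bucket/external --log-file /var/log/rclone-backup.log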
sreenadh about 8 years ago
I have a rather simple approach to backup: my Dropbox folder is inside my Google Drive folder, so it's basically backed up in two places.

I hope this was about personal backups?
planetjones about 8 years ago
Rclone scheduled tasks to Amazon Cloud Drive, with its unlimited amount of data.

And a 5TB USB drive too, which I occasionally back up to manually, e.g. a Time Machine backup.
kennydude about 8 years ago
I use Nextcloud to put stuff onto a Mac Mini where I've got Backblaze running, which emails me monthly so I can make sure it's working.
bookofjoe about 8 years ago
Last year I tossed my Western Digital external hard drive in the trash. Who needs it when I have multiple clouds (iCloud/Amazon/Google)?
cpr about 8 years ago
Backblaze for all 6 Macs in my family. SuperDuper! imaging nightly for my dev MBP to an external drive.

Not sure I really need Backblaze when imaging nightly.
PawelDecowski about 8 years ago
1. Local

    a) Apple Time Capsule
    b) NAS (2 x 3TB in RAID-1)

2. Off-site

    a) Amazon Glacier
    b) GitHub, Dropbox
drewjaja about 8 years ago
Currently just use time machine to backup my iMac
soulchild37 about 8 years ago
For my personal laptop I use two external hard drives as Time Machine backups.

For my web app I store the mysqldump in Amazon S3 daily.
zie about 8 years ago
tarsnap (http://tarsnap.com/)
TurboHaskal about 8 years ago
tarsnap for /etc and other configuration files.

Dotfiles in a private Bitbucket repository.

For pictures, videos and other stuff, I have a 1TB drive in my desktop and a 1TB USB drive which I normally use with my laptop. From time to time I plug the USB drive into the desktop and sync them with Unison.
etherdoc about 8 years ago
Macs at home get incremental backups to:

a. Arq -> Dropbox

b. an old Time Machine

Also:

a. iPhoto, desktop, docs -> iCloud

b. most non-sensitive documents stored on Dropbox
antoniorosado about 8 years ago
Backblaze on my MacBook. Also have a time capsule at home. Pretty simple setup.
vladimir-y about 8 years ago
Duplicati with uploading to a few different online storage services (clouds).
TarpitCarnivore about 8 years ago
Synology NAS for storing files; Backblaze B2 & CrashPlan for off-site.
KiDD about 8 years ago
21TB FreeNAS zRaid3 + Automated LTO3 Tape Backup + BackBlaze
gaspoweredcat about 8 years ago
My main stuff is backed up both on my Google Drive and on my IBM server on a RAID 5 array. Can't imagine I'll need anything more than that, really.
yegortimoshenko about 8 years ago
Tarballs stored in Amazon Glacier.
kjsingh about 8 years ago
Git on visualstudio.com for versioning, with Dropbox also syncing the git folder.
linsomniac about 8 years ago
I built a small veneer on top of ZFS and rsync that I've been running for well over a decade. It has worked flawlessly, mostly because it is so simple.

A few years ago I got my company to release the code: https://github.com/tummy-dot-com/tummy-backup

I use it almost exclusively with Linux systems, but it should work with anything that rsync does a good job with.

The hardware is mostly commodity rackmount boxes with 8-12 drives running ZFS (zfs+fuse, or ZFSOnLinux more recently). Deduplication takes a shockingly large amount of RAM, so mostly I disable it.

The job of tummy-backup is to schedule backups, prune older backups, and complain if something is wrong. There is also a web interface for creating backups, manually running and managing them, and exploring and recovering files (via a downloaded tar file).

BACKSTORY

I was running a small dedicated hosting business, and we had hundreds of machines to back up. We started off with the old hardlink + rsync trick, but it had two problems: append-only files would cause huge growth (log files, ZODB), and managing creating and deleting the hard links would take tons of time.

We tried BackupPC for a while and liked it, but it still had the problem of append-only files growing, and lots of our backups were taking more than 24 hours to run.

So I took my old rsync+hardlink script, which had proven itself really robust; it lives on in this file:

https://github.com/tummy-dot-com/tummy-backup/blob/master/sbin/zfsharness

I started using it on Nexenta when they had their free release. That was OK, but about once every month or two the boxes would fall over and have to be rebooted. I realized in retrospect this was probably due to not having enough RAM for deduplication, or just not having enough RAM, period. Or maybe bugs in ZFS.

But Nexenta wasn't something our staff had experience with, so I started testing it with fuse+zfs. This also had bugs, but the developers worked with me to find them, and I created stress tests that I would run, sometimes for months, to report problems to them. Eventually, this was pretty reliable.

Now I am running it with ZFSOnLinux, and that has been very stable.

I'd love to try it with HAMMER to get deduplication, but I just haven't had the cycles. btrfs was also on my radar, but at the time I was really looking for an alternative to ZFS, and btrfs had been getting more and more unusable (I ran it for a year on my laptop but then experienced several data corruptions every time I tried it for 2-3 years after that).

Recently I've been playing with borgbackup on my laptop. I was hoping I could use it as an engine to get deduplication, but it really is only designed for single-system use. For a single system it seems good.
kirankn about 8 years ago
I use a Synology NAS box
miguelrochefort about 8 years ago
- Google Drive

- OneDrive

- GitHub

- Gmail
gargravarr about 8 years ago
Step 1: Panic
nkkollaw about 8 years ago
I use a Mac.

I've noticed that the actual files on my laptop are fewer and fewer every year: I use iCloud Photos for my ~50GB of photos, and they are downloaded only when you try to open them. I have my Documents and Desktop files on iCloud (downloaded on demand as well), and I use Google Photos as an Apple Photos backup. All of my more recent projects are on GitHub. I guess I only keep non-vital files on my laptop.

Having said that, I have an old 1TB Time Capsule at home (where I work from), and I let macOS do its incremental backups automatically every hour. In addition, I usually launch a manual backup whenever I make some big or important change.

I transfer my data from the most recent backup whenever I buy a new laptop, and I'm usually ready to go in an hour or so; they work wonderfully.
bbcbasic about 8 years ago
For home I rotate 2 hard disks between 2 locations and back up using EaseUS Todo.

I also put some stuff on Dropbox for a quick backup if I don't want to wait until the next time I do a disk backup. Dropbox + zipped folder =~ private single-user GitHub repo :-)
juiced about 8 years ago
My backup strategy is top-secret. I don't want anybody to know where my files are located and how they are recoverable, especially not everybody on the internet.