During a regular work week I work a couple of days in a co-working space, the rest either from home or at a client site, and I'm tired of carrying around my laptop all the time.

My dream setup would consist of a couple of low-spec workstations, say a bunch of Intel NUCs or the like, one at each site. Then, whenever I leave in the middle of work at one site, I can go to another and pick up from there. However, due to sub-optimal internet connectivity I can't use thin clients with a remote, cloud-hosted desktop. At each work location, I'd need to run my jobs on that machine (either bare metal or virtualised). Is there a way to, e.g., synchronise one virtual machine image across the workstations when I switch locations? I wouldn't need to run two workstations at once. Bonus points if a program or configuration that I install on one machine also shows up on the other workstations. I have no special requirements; any Unix-like OS with a window system is OK.

Are any of you running such a setup? Or have you found a similar solution? I have that nagging feeling I can't see the forest for the trees, given that I haven't kept up with IT setups since 1999. Thanks!
A couple of ideas that don't exactly fit your question but might be relevant to others in a similar position:

- You could buy a more portable laptop that you wouldn't mind carrying around, for example the new Retina MacBook or an MS Surface.

- Even a poor internet connection might still be good enough for command-line work, and you can get a lot done on the command line. I've had great success keeping all my code and data on a central server, and editing code either directly in command-line Emacs via mosh (https://mosh.mit.edu) or in a local Emacs instance using TRAMP (http://www.gnu.org/software/tramp/) to save changes directly to the server. This has the advantage that I can start long-running jobs that persist on the server even as I'm moving between locations. For data science work, IPython Notebook is also very easy to set up for remote web access.
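For reference, a minimal sketch of that workflow, adding tmux for persistence across client machines (hostname, user, and session name are placeholders):

    # Reconnect to a persistent session over a flaky link: mosh survives
    # roaming and dropped connections, tmux keeps long-running jobs alive.
    mosh me@devbox -- tmux new-session -A -s work

    # Or, from a local Emacs, open a remote file via TRAMP:
    #   C-x C-f /ssh:me@devbox:~/project/notes.org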
The biggest issue I see with syncing system images or full file systems around is that you can't easily do a merge: if you arrive at a new location and for some reason the sync with the latest changes isn't complete yet, you probably wouldn't want to wait for it to finish before you can safely work.

For that reason, I would probably go with syncing only home/data dirs, and maybe doing so explicitly, or using something like git-annex. If you want to sync installed packages and system config, maybe force yourself to apply those only with scripts or a config management tool like Puppet or Ansible.

But there are options to avoid online sync: if your system fits in 1 TB or less, you could just carry an external 2.5" SSD (or maybe even just a USB key) around and boot all machines off that. Or even carry an entire NUC; they're still more compact than a laptop.
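The explicit home-directory sync can be as simple as a push you run before leaving a site (hostname and path are placeholders):

    # Mirror the working set to the other site; --delete makes the
    # destination an exact copy, so mind the direction.
    rsync -az --delete ~/work/ other-site:~/work/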
One incredibly simple-minded possibility could be to keep VirtualBox or QEMU VM disk images on a USB HDD, and suspend the VMs to the HDD to move between machines.

Also, since it's better to err on the side of saying something irrelevant than not saying anything at all, check out http://xpra.org/.
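A rough sketch of that suspend-and-carry flow with VirtualBox, assuming the whole VM folder (disks and saved state) lives on the USB drive; the VM name is made up:

    # Suspend the running VM to disk so it can be resumed elsewhere.
    VBoxManage controlvm "devbox" savestate

    # On the next machine, with the same USB drive mounted:
    VBoxManage startvm "devbox"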
Depending on exactly what it is you're doing, maybe keep a master Vagrantfile that describes your dev environment (and every time you install a new app or modify the env, update the Vagrantfile accordingly)? Then on each workstation you use, you can just spin up a dev VM from the Vagrantfile.

That's one way to quickly replicate your env across workstations. Then all you need is git/rsync/btsync plus a few bash scripts to keep your data in sync.
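The day-to-day flow might then look like this (assuming the Vagrantfile lives in a git repo; hostnames and paths are placeholders):

    # On whichever workstation you sit down at:
    git pull              # fetch the latest Vagrantfile / provisioning
    vagrant up            # build or resume the dev VM from it
    vagrant provision     # re-apply provisioning after env changes
    rsync -az other-site:~/data/ ~/data/   # pull the working data over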
I have an X1 Carbon ThinkPad. It's not a problem at all to carry around: very light and thin. I have separate mouse+monitor setups in different places (e.g. office, home), so I just plug it in wherever I am. I keep the monitor and mouse plugs right there on the desk(s), so it takes two seconds to plug them in.

I even have multiple separate power adapters. I have one in my kitchen and another in my bedroom, for instance. They are literally 15 feet apart. Totally worth the $70 or whatever, since I can easily move back and forth multiple times a day.

Carrying around and dealing with a power adapter is 60 to 70% of the hassle of carrying a laptop, and that can mostly be eliminated.

Oh, I also have a separate pair of headphones in each location AND one that always stays in my bookbag.
My company gives employees laptops with Cisco AnyConnect. It basically punches a VPN back to corporate over any HTTPS connection it can get its hands on, which means that it can map all of your network drives, Citrix apps run fine, etc.

By encouraging everyone to work off network drives, everything tends to stay in sync. Other solutions include having a server-located user data folder (desktop, documents, etc., though that slows down for regional sites or mobile data if you load it up too much) and horrors like SharePoint or other remote content storage programs (HP Records Manager, etc.).

For code, I run a Git server on a network share and point Visual Studio at it. My code gets committed to it and I can see it from any device on the network, including via VPN.
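The network-share Git setup boils down to a bare repository on the mapped drive (paths are made up):

    # One-time: create a bare repository on the network share.
    git init --bare /mnt/share/repos/project.git

    # On each client, add it as a remote and push/pull as usual.
    git remote add origin /mnt/share/repos/project.git
    git push -u origin master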
zerotier-one (https://www.zerotier.com/) is useful to create a consistent network between individual nodes in different places. You can then use consistent hostnames to connect to a given node on the zerotier network instead of having to think about where a particular node is connected.

I've tried using BTSync and syncthing (https://syncthing.net/) to automatically keep directories in sync with each other, but there's quite an overhead to real-time change propagation, and it doesn't seem to add much value if you aren't actually using the remote nodes at the same time. Instead I've reverted to using Unison (http://www.cis.upenn.edu/~bcpierce/unison/) when I've finished working in one location and want to push those changes to another node.
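With zerotier up, the end-of-session push might look like this (the hostname and paths are placeholders):

    # One-shot reconciliation against the other node's stable zerotier
    # hostname; -batch applies non-conflicting changes without prompting.
    unison -batch ~/work ssh://home-nuc//home/me/work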
I'm a UI guy and Rails developer, and I own 3 Macs (1 Mini and 2 MBPs). For most of the stuff, I simply use iCloud. For the actual source code I work with, I use Google Drive with Git, and I cd into it no matter which system I'm working on.

Actually, Macs are quite enjoyable in the sense that everything else is taken care of by Apple: the App Store handles all my software installs and purchases between computers, iTunes handles my music collection and ensures all 3 Macs have the same tracks (I use Apple Music with Match), and iCloud makes sure all my docs are available from any device I use (iPad/iPhone/etc.).

As for browsing and bookmarks, I use Chrome.
I work with a Linux machine at home and Windows 7 at the office. Here are some of the bits and pieces I use to keep data synchronised as needed.

LastPass and Xmarks to keep passwords and bookmarks synced across both (and across browsers, as I use both Chrome and Firefox).

Dropbox. In particular, I run Pidgin for my instant messaging (with skype4pidgin) on an encfs-encrypted FUSE file system. (On the Windows side, I use Boxcryptor Classic for encfs.) I also use Dropbox to push repositories back and forth (I use Mercurial for that, but Git would be just as good). If I didn't need Dropbox for sharing some common folders with others, I would consider SpiderOak and then not worry about encfs.

On the work machine, I have a fairly large TrueCrypt file container that holds my Firefox profile and Chrome user data, my Dropbox folder, and any other data I don't want other users of the computer to be able to access when I'm not there.

I can also ssh to my home machine from the office, which is handy if I've forgotten something. I also use it for running rsync and/or unison as appropriate on some data (got to keep that office music collection in sync!). Cygwin on Windows provides the rsync and unison clients (as well as a bash terminal).

I can also ssh to Linux machines at work from home, and use x2go (then rdesktop/VNC from there) to reach office machines when I occasionally need to. I have found this much faster than the alternatives (rdesktop/VNC over VPN, or xpra).

Hope that gives some ideas of things you could try if shuttling a full VM image back and forth does not work for you.
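The encfs piece, for reference (directory names assumed; the ciphertext lives in Dropbox, the cleartext view stays local):

    # Mount a cleartext view of the encrypted directory.
    encfs ~/Dropbox/.encrypted ~/Private    # prompts for the passphrase

    # Unmount when done, before the machine changes hands.
    fusermount -u ~/Private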
I personally can't wait for 1) OSes that can move seamlessly from a phone UI to a full desktop UI depending on the size of the connected screens, and 2) hardware that can handle heavy dev work (multiple VMs, lots of random I/O) in a smartphone form factor and power envelope.

Then sync would no longer be necessary, because all you would ever need to use is that one unified device.

Until then, I use network shares on a cloud server through VPN for media, and Git and Vagrant/Docker for dev stuff.
Welcome to the club! I've been thinking about this for many, many years...

Your best bet is to run your workstation in a VM, put your data files on a mount to the underlying file system, and then run Unison file sync across the two file sets (the VM image(s) and the files).

Putting all your files inside the VM will work as well, but copy times will be longer because the whole thing will be bigger.

With your bandwidth limitations, anything magical is out of the question. Live file sync is an attractive idea, but it's too slow for large file sets: Dropbox, lsyncd, git-annex all suffer when dealing with a lot of files across low bandwidth. Unison, on the other hand, works well with a USB key as the transport mechanism, and only runs when you ask it to, so it doesn't keep transferring files as you change them (which seems like a good idea, but is really a PITA when you're not actually sharing files with other people).

Going a bit fantastical here, but IMHO the gold standard is block-level sync with DRBD, which will keep two or more systems identical at all times; it's too heavyweight for your purposes, though.

More practically, just install the same software on both your workstations and sync your data and dotfiles (i.e. your home directory) with Unison.

This is what I have been doing for over 5 years, and it works really well.
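The USB-key transport looks roughly like this (the mount point is a placeholder):

    # At site A, before leaving: reconcile against the copy on the key.
    unison -batch ~/work /media/usbkey/work

    # At site B: run the exact same command; the key carries the changes.
    unison -batch ~/work /media/usbkey/work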
Git. Also, occasionally doing work with tmux on a Linode. I really like the VM-disk-image-on-a-USB-drive idea that some people are suggesting, though. And again, if you used tmux or screen, it would make resuming work easier.
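E.g. (host and session name made up):

    # One hop straight into a persistent session on the remote box;
    # new-session -A attaches if the session already exists.
    ssh -t me@linode tmux new-session -A -s work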
What exactly do you want to sync? Is there any data, or is it only your environment?

Almost all the ideas so far involve moving the whole thing with you (USB sticks with the system or a VM), or working remotely (X / mosh / similar).

There's another possibility if you don't need any data (or only static-ish data) to be migrated: use systems that can be updated/rebuilt in sync. Set up a Salt/Chef/Puppet server and make sure your local machines sync to it periodically. Whenever you need some modification, you implement it on the server and push. (See the sketch after this list.)

Advanced version: implement it on top of something like NixOS for consistent builds.

Pros: if anything fails, just rebuild it (even remotely, if you can set up the satellite servers to boot over the network); you can't forget your physical media; you can update all machines at once; you can store all configs in a repo with full history; if you need per-site specialisations, templates are great for that.

Cons: if you need to sync any large chunks of data, this stops being a good idea.
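A pull-based sketch of the same idea with Ansible (the repo URL is a placeholder):

    # Cron entry on each workstation: ansible-pull clones/updates the
    # config repo and applies its local.yml playbook to this machine.
    */30 * * * * ansible-pull -U https://example.com/config.git local.yml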
I use a MacBook Air and an iMac 5K at work. I keep these heavily synced up with cloud-based solutions like GDrive and Evernote, plus VPN access to the Hadoop clusters I work with.

Config-wise, I just built a shell script that I ran on both machines to install the same packages and libs I use for work.

For my startup, we use something like this: https://github.com/thoughtbot/laptop (modified to our specs, of course) to keep our machines all synced up. I'm thinking of using Docker or Vagrant to consistently provision machines with regular updates, though; that still has to be explored.
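Such a script is essentially a re-runnable list of installs; a minimal Homebrew-flavoured sketch (the package list is made up):

    #!/bin/sh
    # Safe to re-run: only install packages that aren't present yet.
    for pkg in git tmux vim node; do
        brew list "$pkg" >/dev/null 2>&1 || brew install "$pkg"
    done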
I use Unison file sync: https://www.cis.upenn.edu/~bcpierce/unison/

Does exactly what it says on the tin!
All of my machines are Linux, so I like to use Ansible for this. I can sync my dotfile git repos, configure the OS how I need it, automate syncing vim plugins, and get all of the PPAs I need. This doesn't quite solve the problem of half-finished work on one machine being available on another machine (unless git counts), but at least your systems are now set up identically everywhere. One option is to work remoted into a VM somewhere in the cloud very close to you (to minimize latency); I've heard of people doing that.
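For the dotfile half, one common pattern is a git repo plus GNU Stow (layout assumed: ~/dotfiles/<package>/ mirrors $HOME):

    # Pull the latest dotfiles and (re-)symlink them into place;
    # re-running stow is harmless.
    cd ~/dotfiles && git pull
    stow -t ~ bash vim tmux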
I'm a manager who sometimes does dev work, so I have a lot of different data to sync. Here are my solutions:
1. Personal: OneDrive for documents, Yandex.Mail + IMAP sync for mail on devices, GitHub for code, OneNote for quick notes on my cell phone, and Confluence on Atlassian Cloud. For some time I also used Mindmeister.com for storing mind maps online (unfortunately they don't have an app for Windows).
2. Corporate: MS Exchange and lots of services behind VPN (Git, Confluence, SharePoint, to name a few).
I sync between several devices:
1. Corporate desktop workstation (the most powerful beast for working with code, Windows)
2. Corporate laptop (Windows)
3. Personal tablet (Windows)
4. Personal cell phone (Windows)
Most of the syncing (except Git) happens automatically, so I don't worry about pushing changes to a server; I just open on my phone the document or email that I recently edited on my tablet or laptop.
The trickiest part is the development environment setup (IDE settings, app/web server configs, build system configs, etc.). I've unified the setup of all my devices, so it is easy to transfer with scripting (I'm using PowerShell). I've also made simplicity of dev environment setup a team goal at work, so everyone can pick up the code on a bare machine and be ready to debug in 10-15 minutes max. Most of this is achieved by following conventions that are either standard or recommended by the software vendor, and by using the right tool rather than a generic solution.
Although I'm using Windows, half of my team works the same way on Macs. I'm pretty sure this is also possible on Linux.
USB keys are known to be too unreliable in the long run, so my advice is a big STAY AWAY from them; but I can testify that an external box, connected via a USB2/USB3 cord, plus an SSD (Intel 530 series in my case) can do wonders as a portable, full GNU/Linux system (Kubuntu in my case). It boots to desktop in under a minute even over USB2 (USB3 is a lot faster), and is very usable despite the USB bottleneck (*way* better than any USB stick). 120 GB, even after installing a full distro, leaves you with plenty of space to carry along your favorite music and virtual hard drive images, so you can bring your Windows virtual machine(s) intact and already configured. You can then use TimeShift (or manual rsync cron jobs) to back up your full system to local drives/USB keys, and sudo apt-get upgrade the system whenever you get a reliable internet connection.

Minor issues I have experienced with this setup:

1. The biggest issue of all: due to the peculiarity of USB power design, most SSDs might experience failures when the machine is powered off. This is a known issue with USB keys too! Believe me, booting a system every time not knowing whether it's going to end in the dreaded fsck mess with thousands of errors really is a nightmare. Going with the Intel 530 series SSDs (a known solution) fixed the issue, as they have larger capacitors; Samsung SSDs might be reliable too.

2. USB power: on most laptops, a single USB power cord is unable to power the external drive; you might have to use the popular Y (double connector) USB cable.

3. Stay with open source video drivers. Swapping nVidia <-> AMD drivers in particular is a recipe for disaster.

4. Audio hardware devices will stack up, possibly resetting their settings when you return to a previous machine.

5. You might sometimes have issues with UEFI when you first boot your system on a new machine.

6. VirtualBox VMs sometimes have issues when the system unexpectedly swaps the eth0 <-> eth1 interface names, but this is easily fixed in the network configuration before booting them. Also, swapping between AMD and Intel processors is risky and can unexpectedly hang your Windows system on boot. BTW, don't forget to carry along on your external drive every hard disk image the VMs might use, or they won't boot. Of course this applies to Vagrant boxes too, but it's less relevant, as you can destroy and recreate a box any time.
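The manual rsync cron job can be a one-liner (paths assumed):

    # Nightly mirror of the portable system's home onto whatever local
    # backup drive is present at this site.
    0 2 * * * rsync -a --delete /home/me/ /mnt/backup/home/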
I have an MBP, a desktop (Xubuntu), a netbook I rarely use (Xubuntu), and an Android phone, plus personal webspace (just used for a simple WordPress and ownCloud + mail accounts). My setup is pretty consumer-grade/not very sophisticated. I recently removed my last Windows VM, but will probably create a new one soonish, because there's the occasional Windows-only thing I need to do. That VM is just kept up to date and has antivirus etc., but no syncing with the other machines (shared folder with the host).

- GitHub/GitLab

- ownCloud on said personal webspace for data, especially important for Android (plus a Photobucket account for images for random funny forum posts)

- Zotero for Firefox (with said ownCloud) for my academic work. This is worth its weight in gold. I prefer to work with LaTeX, but you are often forced to use Word/LO, and the plugins are amazing (instant citation/bibliography; snap-switch to a different format).

- Thunderbird (and Outlook on OS X; meh, but needed/convenient for work, would prefer 100% TB)

- Firefox Sync (not really using it anymore, but probably should)

- KeePass to store my passwords across systems

The overlap between the machines isn't huge, and I mostly just keep them up to date. The stuff I use on multiple machines is on the same version if possible, but not automated in any way.
My personal backup strategy is pretty horrible (I just randomly put important stuff on external storage); I need to up my game there.
I'm not even keeping my configuration files etc. on GitHub as of now (planned for the near future).
tl;dr: need better backups, need scripted OS sync.

Note: I don't use my phone all that much for stuff I can do on other machines (I don't browse on mobile a lot). It's mostly an IM/call platform with occasional data use.
I've been working on customized EC2 instances, DO droplets, and Nitrous (https://pro.nitrous.io) since 2012 and haven't looked back. There is the issue of connectivity, but I'm unproductive without an internet connection anyway, so it has worked well for me.

I use tmux and then connect to the session from work, home, etc., and I set up an SSH config (http://nerderati.com/2011/03/17/simplify-your-life-with-an-ssh-config-file/) so I have shortcuts to all of my remote machines. App environments are built with Docker, snapshotted, and pushed to Docker Hub.
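The SSH config trick, for anyone who hasn't seen it (host alias, IP, and user are made up):

    # Append once to ~/.ssh/config, then "ssh dev" works from anywhere.
    cat >> ~/.ssh/config <<'EOF'
    Host dev
        HostName 203.0.113.10
        User me
        ServerAliveInterval 60
    EOF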
If you can use Microsoft, and provided you have USB 3 ports on your workstations, Windows To Go could be a good fit.

https://en.wikipedia.org/wiki/Windows_To_Go
You could always just use your phone as your desktop.

http://www.andromiumos.com/

http://thenextweb.com/microsoft/2015/04/29/windows-10-will-let-you-use-your-phone-as-a-full-computer-sort-of/

http://www.gizmodo.co.uk/2012/02/who-needs-a-pc-when-you-can-just-hook-up-your-android-phone-to-a-monitor-and-keyboard/

Ubuntu Edge (no longer available)
I keep my dotfiles in git, and I was setting up my new laptop the other day, so I figured I'd do it right. I added an Ansible file to provision the laptop from scratch. It installs all my packages, tools, preferences, etc., and it also goes and gets my repos from a server and puts them in the right places.

It's idempotent, so whenever I need to make a config change, I just change that file and run "provision", and everything is the same across machines. I'm not sure if this answers all of your questions, but it's good enough for work (personal data can go on something like Dropbox or Syncthing).
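The "provision" command can be a thin wrapper (the playbook name is assumed):

    #!/bin/sh
    # Apply the provisioning playbook to this machine only; safe to
    # re-run, since Ansible tasks are idempotent.
    ansible-playbook -i localhost, -c local provision.yml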
Google Drive to sync documents, normal files, etc.

Git for code.

Then I have VMs on EC2 for specific tasks.

> tired of carrying a laptop

I solved this with a lighter-weight laptop. You can usually leave your multiple displays wherever you need them, and plug the laptop in.

Edit: looking back, I'm very invested in Google's ecosystem of "synced" things: Google Apps. It's almost to a '90s-Microsoft extent, where I will actively avoid and discourage incompatible formats or platforms. At the very least, you can use any of Google's things with an OK computer and an internet connection. That leaves me wondering what will "obsolete" the "Google office".
The key ones for me that let me leave the laptop at home:

- Outlook: MS Exchange does the syncing of email/calendar/tasks (I use Nine as my mobile mail client).

- GDrive: general file storage/accessibility.

- Chrome browser: syncs our tabs/history. I use Tab Cloud to save sessions across Chrome instances too.

- OneNote: I use this for meeting notes and brainstorming.

- Google Keep: I use this like sticky notes that are accessible across devices.

With this mix I can easily move between my laptops/mobile devices as needed.
I've tried a few approaches, which worked quite well:

- Linux on a DigitalOcean droplet through ssh/mosh + tmux (usage of CLI/TUI software and a stable network connection is required)

- a full Linux system on a USB 3.0 flash drive (a reliable backup configuration is required)

Now I use similar Linux desktops on each desktop/laptop with:

- notes synced using unison

- mail stored on a mail server and accessed through an IMAP client (mutt)

- dotfiles managed with a git repository

- projects managed with various VCSes

- sensitive data managed with pass/gpg
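The pass part, for reference (the GPG key ID and entry names are placeholders):

    # One-time store setup against your GPG key, then add/read entries;
    # the store itself is a git-friendly directory of encrypted files.
    pass init ABCD1234
    pass insert work/vpn     # prompts for the secret
    pass -c work/vpn         # copies it to the clipboard briefly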
I got tired of carrying a laptop around, so I got a smaller laptop.

You sound like you would do well with that idea, but maybe it takes the form of an external (e.g. USB) storage device that you store virtual machines on, *but don't forget about backups*. Perhaps a script could snapshot and rsync diffs between your storage device and your machine, so that each machine's local storage can also serve as your backup.
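A sketch of that snapshot-plus-diffs idea, using rsync's hard-link snapshots (paths assumed):

    # Unchanged files are hard-linked against the previous snapshot,
    # so each run only costs the diffs.
    today=$(date +%F)
    rsync -a --delete --link-dest=/mnt/usb/snapshots/latest \
        ~/vms/ "/mnt/usb/snapshots/$today/"
    ln -sfn "/mnt/usb/snapshots/$today" /mnt/usb/snapshots/latest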
I tried running a home server with the data, running a home server and forwarding X from there over ssh, limiting my needs to the CLI (tmux, irssi, vim, etc.) and running everything on a remote server, using built-in sync for various apps (like notes)...

None of it ever worked well, so I bought a nice leather messenger bag and learned to like carrying my laptop around and using the small (13") screen most of the time.
For my notes I use private Gists or Evernote.
For projects I use Bitbucket or GitHub.
For syncing program configurations I use a script I wrote that symlinks files from a Google Drive folder into place on the local drive.
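That kind of script boils down to something like this (file list and folder name are made up):

    #!/bin/sh
    # Keep the real config files in Google Drive; symlink them into $HOME
    # on every machine, so edits propagate via the Drive sync client.
    SRC="$HOME/Google Drive/configs"
    for f in .vimrc .gitconfig .tmux.conf; do
        ln -sfn "$SRC/$f" "$HOME/$f"
    done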
Chrome does a decent job of syncing all the web-related stuff for me.

Not perfect, but it's OK.
You can get around the internet limitation by using a USB stick with a live Linux environment and booting from that. The downside is that if you lose the stick, you're boned.
On Windows, I've done very well with GoodSync. It tries very hard to keep things synchronised nicely, including when the connection between clients is patchy.
Unison. Know where your data is. Don't sync OSes; instead, keep their setup in scripts. You can get faster sync, if you can say which side is newer, by using btrfs snapshots sent over the network.
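The btrfs variant, assuming read-only snapshots exist on both ends (names and paths made up):

    # Incremental send: only the delta between two read-only snapshots
    # crosses the network.
    btrfs subvolume snapshot -r /home /home/.snap-new
    btrfs send -p /home/.snap-old /home/.snap-new | \
        ssh other-site btrfs receive /home/.snaps/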