OSX is an OS where you run VMs for Linux and Windows.
I keep seeing people say how great and fast it is to do things on their Mac, while running Windows in a VM for a few apps: mail clients, office suites, some dev tools, and many Java apps (because those are often not so nice on OSX).

Then I see them with a Linux VM for all the dev tools, the hacking tools, and so forth, because hacking those up to work on OSX is always a pain, and even using something like brew is often a pain (plus a delay).

Well, I run Linux natively, and of course I get things done faster than the people running Linux in a VM. I also run Linux VMs under Linux when necessary, with KSM, and it's a lot faster and more memory-efficient than running OSX natively on the same hardware.

I install stuff in about <5s, which is less time than it takes a VM to boot or resume on OSX, and in about 5-10 min from scratch when the app _supports_ OSX (ages if it doesn't but has more or less compliant sources).
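For anyone who hasn't used it, KSM is just a sysfs knob on the host. A minimal sketch, assuming a kernel built with KSM support (qemu/KVM guests then get their identical memory pages deduplicated automatically):

    # enable Kernel Samepage Merging on the host
    echo 1 | sudo tee /sys/kernel/mm/ksm/run

    # see how many pages are currently being shared across VMs
    cat /sys/kernel/mm/ksm/pages_sharing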
I wonder if the author has considered checking out http://vagrantup.com/ , unless it does a bit too much for what they're after.
There are some useful tips in that article.

Using qemu in no-GUI mode is pretty cool. I've used VirtualBox, VMware Fusion, and Parallels on OSX before, but I hate having to deal with the GUI window (and having the system reserve resources for it).

Since OSX is Unix under the hood, it shouldn't be too difficult for a developer to use either OSX or Linux as their development machine; it's not like the knowledge gap between Windows and OSX/*nix machines. So other than wanting to mess around with Linux, the article never really explains why the author doesn't use OSX, beyond some unsupported 'flaky' and 'impossible' complaints.
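In case it helps anyone, a headless qemu setup looks roughly like this. This is only a sketch: the disk image path, memory size, and forwarded port are placeholders, and the exact flags vary a bit between qemu versions:

    # boot the VM with no display, detach from the terminal,
    # and forward host port 2222 to the guest's SSH port
    qemu-system-x86_64 -m 1024 -hda ~/vms/dev.img \
        -display none -daemonize \
        -net nic -net user,hostfwd=tcp::2222-:22

    # then work in it over SSH as if it were a remote box
    ssh -p 2222 devuser@localhost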
There is a problem with doing things this way. I want my development server to be always on, always available. It does things in the background, and sometimes I use it when I'm away from the office.

Yes, you can keep your desktop machine on all the time, but that's a huge power suck. Shut down your desktop when you're not using it. Most development servers just don't need to be that beefy anyway.

A better approach, if you're absolutely committed to virtualization, is to just spin up an instance on Rackspace or Amazon. You'll pay about 10 bucks a month, which is less than you'd pay to keep your desktop running 24/7. On top of that, if your desktop dies, you won't lose your data.
I currently use a similar environment but with VirtualBox. Does qemu offer any advantages over it? I was considering recreating my VM so I may as well try qemu.
I recently got into an endless back-and-forth with a guy on reddit because I said that, as a web developer, I require a Mac (instead of a Windows machine). When asked why, I said that it's a hell of a lot easier to mimic testing/dev/production environments on a Mac than on Windows, and that the only reason to choose a Mac over a Linux distro is the Adobe dependence. His stance was that it isn't important to muddle up your machine trying to replicate the dev/staging/production machines when you can simply develop on a remote box via some sort of ssh/file mounting. To which I replied that you don't always have access to remote dev machines, etc.

What is your take on this issue? I was actually surprised that people want to develop on remote machines. It reminds me of "FTP to see a change".
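For what it's worth, the ssh/file-mounting workflow he was describing is probably something like sshfs. A rough sketch, with the host name and paths made up for illustration:

    # mount a directory from the remote dev box locally over SSH
    sshfs devuser@devbox.example.com:/srv/app ~/remote-app -o reconnect

    # ...edit with your local tools, run things on the remote box over ssh...

    # unmount when done (fusermount on Linux, plain umount on OSX)
    fusermount -u ~/remote-app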
Unless you're on an ancient laptop, I don't see why you wouldn't run a Linux VM if you're developing web apps.

What is the advantage of trying to make OS X your development environment and then deploying to a Linux host?

Unless your laptop has less than 4 GB of RAM, why not? Even with 2 GB, running a Linux guest OS isn't that system-intensive.
I also "deploy" to my local Linux VM for when I want to test a multi-server set up or try something new without messing up my main computer. Otherwise, I code and test almost exclusively on OSX. Haven't run into any situations yet where I wished I was coding in Linux as well. Linux doesn't have Photoshop or iTunes.
I'm following the instructions in the post on my MacBook Pro (2010); however, I get 'Booting from DVD/CD... 180MB medium detected. Boot failed: Could not read from CDROM (code 0005) No bootable device.' Does anyone have any idea why this is not working? I am using the netinstall Arch Linux ISO. Many thanks!