It often happens that you're on a project that must be accessed over the 'net even for development. (Case in point: PayPal integration. Facebook apps. Anything where a proprietary SaaS calls back to your URLs.)<p>What's your or your team's preferred method to write code then? (The one you do most often, if you use more than one.)
1) Net-accessible dev and staging machines. Net-accessible staging servers are fairly easy to set up if you have a repeatable way of doing deployments. Security can be a mite tricky. Net-accessible dev boxes are modestly more difficult than staging if you do them right. (The simplest way is to use reverse SSH tunnelling from a net-accessible box under your control to your dev box, which is a single trivially Googleable command. The reason this takes thought is that giving the world access to an app server on your machine is potentially a Very Bad Idea.)<p>If the idea of exposing HTTP on your laptop does not fill you with mortal dread, Twilio has an open-source tool called localtunnel which makes your laptop web-accessible in seconds.<p>2) Source control for <i>every</i> change, and deploy scripts to get it to each environment safely and repeatably. DVCSes handle lots of little changes fairly elegantly, and you can squash a range of them after you e.g. figure out the magic incantation to get a foreign API working right. (Turning thirty one-character commits and ten deploy tags into a single "Foreign API now works" commit.)<p>I have done cowboy coding in PuTTY in the past. There is no excuse for it.
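For the curious, the reverse tunnel is roughly this (hostname, ports, and username are placeholders, not a prescription), and the localtunnel route is even shorter (it was distributed as a Ruby gem at the time; the first run also wants your SSH public key, if memory serves):

    # expose the app listening on local port 3000 as port 8080 on a box you control
    ssh -N -R 8080:localhost:3000 you@tunnelbox.example.com
    # for that remote port to be reachable from the whole internet (not just from
    # the remote box itself), its sshd_config needs: GatewayPorts yes

    # or the localtunnel route:
    gem install localtunnel
    localtunnel 3000    # prints a public URL that forwards to your local port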
I used to work solely on a remote dev box, using ExpanDrive to edit locally. I would have an SSH session open to run commands/debug/shell.<p>I've recently started moving towards local Vagrant instances. I don't like dealing with local library installs on my Mac, and Vagrant makes it stupid easy to repeat my dev setup. It also gives me a good starting point for provisioning servers (using Puppet), though I haven't fully consolidated the local dev and server Puppet manifests yet.
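For anyone who hasn't tried it, the basic loop is only a few commands (the box name and URL are the stock example, not my actual setup):

    vagrant box add lucid32 http://files.vagrantup.com/lucid32.box
    vagrant init lucid32   # writes a Vagrantfile you then point at your Puppet manifests
    vagrant up             # boots the VM and runs provisioning
    vagrant ssh            # shell in; the project directory is shared at /vagrant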
Well, having done a Facebook app: step one, set the URL to <a href="http://localhost:1234" rel="nofollow">http://localhost:1234</a> or something on Facebook, and you're away for local dev work. Unless you are working on something different to what I did, it just inspects the source URL.<p>Try using web-service mocking software and learn how to work your hosts file. SoapUI comes to mind... or whack together manual services on test boxes.<p>If you capture an incoming response, maybe using Fiddler or something (if you're using Visual Studio it's a breeze), then just set up a basic web service that returns the data as you captured it, no processing needed at all. Or even just set up some proper routing. There really isn't an excuse; cowboy work will bite you, maybe not today, but later.<p>I've seen too many huge f/k-ups, and was responsible for a €50,000 mistake myself, mucking about on a live server with no testing environment.<p>My favourite for this is virtualisation, BTW: taking server snapshots of production, dumping them in a sandbox where I have mocked third-party setups, and hacking away. It makes releasing, testing, everything a breeze. Big time cost up front, but it pays for itself tenfold on the first smooth release.
BCVI: <a href="http://sshmenu.sourceforge.net/articles/bcvi/" rel="nofollow">http://sshmenu.sourceforge.net/articles/bcvi/</a><p>It's like a combination of "I SSH into a server, and work there using VIM/EMACS/etc." and "I use a local VIM/EMACS/MC/etc to open files remotely over SSH/FTP.".<p>You SSH in and type "vi foo.py" in the remote shell, but the file opens in your local Vim. When you save it's automatically transferred across.
None of the above. Google App Engine does it all for me with the click of a 'deploy' button. Sysadmin work is so 80s.<p>To be honest, since GAE all I do now is open Notepad2 and code, nothing else, then deploy and presto. I remember the old days when having Apache, FileZilla, SVN, and a plethora of sys tools was the order of the day.<p>Coding is more fun when it is pure coding and nothing but coding. Thanks, GAE.
I want to get good enough at Vim to just do all my coding in it, but for now using an SFTP bookmark + Gedit in Ubuntu is convenient and feels like I'm editing the files locally. I still have an SSH console open for gitting / grepping etc.<p>Trying to do the same sort of thing with MacFUSE in OS X was an absolute nightmare though (slowwww), but perhaps I was holding it wrong.
I do all development on remote servers because I like to code from many different machines. Setting up a LAMP stack on all my potential dev boxes would be painful and cumbersome, and might interfere with other functions I intend those machines to perform.<p>This is very easy to do with EC2. A single micro instance is free for a year with a new EC2 account. These make great dev environments where you can set up the exact toolset and environment you need. This also makes for convenient deployment if you are ultimately deploying to EC2.<p>For coding, I prefer Coda if I'm on a Mac and Aptana on a Windows box, both chosen precisely for their excellent SFTP support -- I find mounting remote systems as local drives to be very laggy, since loading and saving files tends to be a locking operation for most editors, meaning the IDE freezes up for the 1-2 seconds it takes to upload.
I use VIM for cowboy coding.
I don't need to do this as much as I used to, so these days I normally use either a mounted SSH filesystem or "sharing" through a DVCS.<p>A few years ago I worked at a place where they gave us a local Windows box and a remote headless Linux box where the code ran (this was before virtualization was ubiquitous). I used Cygwin on Windows and forwarded X11 sessions running on the remote Linux box to my Windows box over SSH.<p>Windows w/PuTTY: <a href="http://tldp.org/HOWTO/XDMCP-HOWTO/ssh.html" rel="nofollow">http://tldp.org/HOWTO/XDMCP-HOWTO/ssh.html</a>
OSX: <a href="http://dyhr.com/2009/09/05/how-to-enable-x11-forwarding-with-ssh-on-mac-os-x-leopard/" rel="nofollow">http://dyhr.com/2009/09/05/how-to-enable-x11-forwarding-with...</a><p>It let me run eclipse and other GUI apps on the linux box, but have them mostly feel like native apps (well, native X11 apps) on the windows box.
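On the Mac/Linux side it boils down to one flag (hostname is a placeholder; PuTTY has the equivalent "Enable X11 forwarding" checkbox):

    ssh -X dev@linuxbox.example.com
    eclipse &    # runs on the remote box, but the window shows up on your local X server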
Over the years I think I've done all of these. They're all a little painful. Being in Australia means that latency is an issue, I'm sure fast DSL in California is different.<p>If you're using SSH, an editor and terminal that supports GPM mouse over SSH (eg: iTerm2 on OS X and Vim on the backend) makes life easier.<p>It's rarely necessary. For Facebook in particular, 95% of the integration happens at the web browser level via signed URLs and/or Flash - Facebook's servers rarely talk directly to your servers or vice versa. Just run a web server locally and put 127.0.0.1 in your hosts file.
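i.e. something like this, with whatever domain the third party expects (the domain here is illustrative):

    # /etc/hosts  (or C:\Windows\System32\drivers\etc\hosts on Windows)
    127.0.0.1    myapp.example.com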
Facebook lets you use localhost as the canvas page; no idea if they let you use local network addresses as well. So that's one problem solved :)<p>For debugging callbacks such as those sent by PayPal, I start by writing a small script that logs the GET or POST request, then write a simple test page that POSTs to the local script I'm developing.<p>At any rate, if the only way to test the majority of your code is via a POST, there's something wrong with the way you're writing it; you should be able to test the objects you've created independently, for a start :)
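If you just want to see the raw callback before writing the logging script, netcat is a quick-and-dirty option (the port is arbitrary, and some netcat variants want -l -p instead of -l):

    nc -l 8080    # then aim your test POST (or the sandbox simulator) at http://yourhost:8080/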
I use a cooperating set of Mac apps: connect to the server using Cyberduck, browse to a file and double-click to open it in TextMate, edit the file and type command-s to save, peripheral glance to the corner of the screen where Growl puts up a notification that the transfer completed successfully, then type command-shift-r to tell TextMate to tell the currently open web browser to reload its front-most window to see if the change had the desired effect.<p>A number of Mac file transfer clients and text editors support this type of integration.
We do remote coding mostly for the pairing and code reviews: two or more people connect to the same server, start tmux (or screen, which we used in the past), and edit with vim from inside tmux.<p>To integrate Facebook and PayPal, remote coding is usually not required; it is simple to configure a DynDNS account and port forwarding on the router and continue working on your own local development machine. Any hassle of configuration pays off pretty fast thanks to the increased pace of development and lower overhead.
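The sharing part is trivial if both people log in as the same user (the session name is arbitrary):

    tmux new -s pair       # first person starts the session
    tmux attach -t pair    # everyone else attaches to the same one
    # with separate accounts you'd share a socket instead, e.g.
    #   tmux -S /tmp/pair new -s pair   (plus suitable permissions on /tmp/pair)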
Since I got an iMac at work, I mount the remote file system over SSHFS and it works quite well.<p>Before that I used ZDE 5.5 for PHP dev and opened the files remotely via SFTP. That was also not bad, though ZDE 5.5 is really buggy.<p>What I miss is an editor that supports opening files via SFTP, does it fast (like ZDE), is not that buggy (unlike ZDE), and supports autocomplete.
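The mount itself is a one-liner (host and paths are placeholders):

    sshfs dev@devbox.example.com:/var/www/myapp ~/myapp -o reconnect
    # unmount with: fusermount -u ~/myapp  (Linux)   or   umount ~/myapp  (OS X)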
For the case of Facebook app development, there are few great substitutes for an SSH tunnel routing traffic back to your local dev environment. All the local dev tools you're used to, including TextMate if that's how you roll, with a little bit of extra latency.<p>You can roll your own, but I'm happy to use Tunnlr.
I vote for the custom rsync script, as it's really easy and I get to keep working locally:<p><a href="http://www.exratione.com/2011/08/use-rsync-scripts-for-painless-deployment-during-development.php" rel="nofollow">http://www.exratione.com/2011/08/use-rsync-scripts-for-painl...</a>
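The heart of such a script is usually a single rsync call along these lines (host, paths, and excludes are illustrative, not the ones from the linked article):

    rsync -avz --delete \
        --exclude '.git' --exclude 'config/local.*' \
        ./ deploy@staging.example.com:/var/www/myapp/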
Push to a Git server. We have a branch called 'live', and we run a script on the servers that pulls down the deploy branch. We also have a branch called 'development' for our dev boxes.
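In other words, roughly (paths and remote names are placeholders):

    git push origin live    # from a dev machine
    # and the script on each server does something like:
    cd /var/www/myapp && git fetch origin && git checkout live && git reset --hard origin/live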
I generally use (a) my local Emacs install in tramp-mode, (b) Fetch—it has a feature that lets you edit a remote file using your default text editor, or (c) TextWrangler's SFTP mode.
vim + git + git repo hosting service + fabric<p>Make changes locally, merge into the remote environment's branch, then run a Fabric command that pushes the change up to the git hosting server, SSHes into the remote server based on the current branch, pulls the changes from the repo host, and runs a few other bits remotely (migrations etc.) depending on what has changed.<p>I use codebasehq.com btw; it's been ace, but I don't see a great need to use their deployment service.
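The local end of that loop looks roughly like this (branch and task names are illustrative; the fabfile defines what "deploy" actually does):

    git checkout staging && git merge master
    fab deploy    # pushes, SSHes to the right host, pulls, runs migrations, etc.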
SFTP to the server with Coda, which puts your remote files in a sidebar; FTP + the files you're working on live in a single window (though you can break them out into new windows if you want). It also has SVN support baked in, but I've only used Versions for that.<p>I've been meaning to try TextMate, but I love the built-in FTP in Coda.
I SSH into a server, but I learned about using a local Vim instance to edit remote files yesterday (on HN, of all places) and will be doing it that way from now on. No more need to spread my fine-tuned vimrc to every server I use!
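For reference, the trick is Vim's built-in netrw support (host and path are placeholders):

    vim scp://dev@devbox.example.com//var/www/myapp/index.php
    # note the double slash for an absolute remote path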
I've shot myself in the foot enough times with "trivial" or even "one character" edits that everything goes through the normal source control + deploy process.
I import the remote filesystem into Plan 9 and edit it there.<p>Which might be u9fs via ssh | ftp | drawterm | cpu | sshnet & AoE | import | sshnet & ftp | import /net & AoE | import /net & ftp<p>The possibilities are numerous