In our environment we have a development server in a datacenter that runs Git, and each of our developers has their own development space on it. Basically, every programmer has access to the dev server and maintains their own Git repo there, and we use SFTP to transfer files back and forth for the actual code editing. We have dev URLs set up in Apache pointing to each developer's site folder on this server.
With phpED, I see the intended approach is to download the complete remote site so that a local copy is maintained on the workstation. This was fine (albeit slow the first time) up until I did a Git checkout of a different branch on the dev server. Now the only next step I can see is a smart sync in phpED to bring the workstation's local copy up to date. That takes a really long time, since phpED has to scan nearly 10,000 files to determine what needs to be downloaded, plus it eats up more time deciding what to do with any issues it finds.
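For reference, something like the sketch below, run against the repo on the dev server, lists just the files that differ between the two branches, i.e. the only things that really need to come down after the checkout (a minimal sketch in Python; the repo path and branch names are hypothetical, and it assumes git is available on the server):

    import subprocess

    def changed_files(repo_path, old_branch, new_branch):
        """Return the paths that differ between two branches of the given repo."""
        result = subprocess.run(
            ["git", "-C", repo_path, "diff", "--name-only",
             old_branch + ".." + new_branch],
            capture_output=True, text=True, check=True,
        )
        return [line for line in result.stdout.splitlines() if line]

    if __name__ == "__main__":
        # Hypothetical repo path and branch names, for illustration only.
        for path in changed_files("/var/www/dev/mysite", "main", "feature-branch"):
            print(path)

That list is typically a tiny fraction of the files phpED rescans, which is what makes the full smart sync feel so wasteful.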
I'm guessing there is a better workflow for this, as we can't be the only group of programmers operating this way. I understand some programmers like to run local web servers, databases, etc. directly on the workstation, but in our company we prefer to keep as little as possible on the workstation and store as much as possible on remote servers in the datacenter.
Any ideas on how to handle this? For now I have reverted to using only the Explorer tab so that I can work directly off the server files. This works OK, but I lose the ability to group projects into workspaces, etc.
Help.
