Why are MAMP virtual hosts working on my laptop but not my new iMac?

I have two Macs (Macbook Air & iMac) and want to use Dropbox as the source for local website files and their corresponding databases.
I set up the databases following these instructions (text version here), moved the website files to Dropbox, and synced the contents on both devices.
I've ensured that the correct files are in place as per the virtual host instructions for MAMP, such as:
added the domains to my hosts file at /etc/hosts
uncommented the line in httpd.conf that includes httpd-vhosts.conf
added the ServerName and DocumentRoot like I'm supposed to (even changing out my user name in the path) - a minimal example of what I mean is below
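For reference, the pieces I'm describing look roughly like this (the hostname and path are placeholders, not my real ones):

# /etc/hosts
127.0.0.1   mysite.test

# MAMP's httpd-vhosts.conf (conf/apache/extra/httpd-vhosts.conf)
# use *:8888 instead of *:80 if MAMP's Apache is on its default port
<VirtualHost *:80>
    ServerName mysite.test
    DocumentRoot "/Users/username/Dropbox/Sites/mysite"
</VirtualHost>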
My laptop (the original computer where the sites were developed) works fine with the migrated databases and site files, but when I start up MAMP and go to any dev virtual host URL on my iMac, I'm met with a MAMP favicon and a blank screen with no content whatsoever.
How can I get my desktop to play nicely and access the databases correctly without error?
Thanks for any help addressing this issue.

After manually deleting the MAMP folder from the Applications folder, I reinstalled a fresh copy of MAMP, made sure enough time had passed for Dropbox to finish syncing, and removed any conflicting files.
It seems that leaving a significant delay between firing up MAMP on the two machines helps prevent the database conflicts that must have been causing the issue.

Related

Plesk Umlauts in Paths not working

After migrating my WordPress website from an old shared server to a new AWS EC2 machine running Plesk, almost everything works fine. The only problem I'm facing is that older image uploads with umlauts in the name (which worked totally fine on the old shared server) are not working on the new machine.
e.g. /uploads/2018/Würth-1.jpg
/uploads/2018/Wu%CC%88rth-1.jpg:1 Failed to load resource: the server
As mentioned, it worked on the older server. I know that in general those characters should be avoided, and I mostly do avoid them, but in this case it's an older upload that still has one.
Is there a server setting that would make those work?
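For what it's worth, the %CC%88 in that failing URL is the decomposed (NFD) spelling of ü (a plain u followed by a combining diaeresis), while the same character can also be stored precomposed (NFC), which URL-encodes as %C3%BC; if the file name on the new server's disk uses one form and the requested URL the other, the web server won't find the file. A quick way to see the two byte sequences (purely illustrative):

# precomposed (NFC): ü is the single code point U+00FC
printf 'W\xc3\xbcrth-1.jpg' | xxd
# decomposed (NFD): u followed by combining diaeresis U+0308, which URL-encodes to %CC%88
printf 'Wu\xcc\x88rth-1.jpg' | xxd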

MAMP Pro running multiple hosts through to xip.io only resolving to one host

I'm using MAMP Pro v3.5 for local development. I have multiple dev sites running successfully without a problem. When it came to testing a site on my phone, I used the out-of-the-box 'Name resolution [x] via Xip.io (LAN only)' option under the hosts tab. The first host I turned that on for was fine and worked very well (using an address like www.siteone.dev.192.168.0.10.xip.io).
The problem came though when I attempted to setup a second dev site on Xip.io using the above method. Now using a different URL (using address like www.sitetwo.dev.192.168.0.10.xip.io) for the second dev site, no matter what I try, I get the first host that was setup rather than the expected second.
Is this a bug with MAMP Pro, or is it just not capable of this? I've tried turning off the first host I set up with MAMP Pro, but it still shows up as the site being served at the second xip.io address I set up.
I would really appreciate it if one of the MAMP people could respond and confirm whether this is expected behaviour. An extended Google search didn't turn up anything.
Thanks
Brendan
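For context, when no ServerName or ServerAlias matches the requested hostname, Apache falls back to the first virtual host defined for that address and port, which is exactly the symptom described. Hand-written vhosts for the two xip.io names might look like this (a sketch only, not MAMP Pro's actual generated configuration):

<VirtualHost *:80>
    ServerName www.siteone.dev
    # also match the xip.io form of the name
    ServerAlias www.siteone.dev.*.xip.io
    DocumentRoot "/Users/username/Sites/siteone"
</VirtualHost>

<VirtualHost *:80>
    ServerName www.sitetwo.dev
    ServerAlias www.sitetwo.dev.*.xip.io
    DocumentRoot "/Users/username/Sites/sitetwo"
</VirtualHost>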

Access Google Drive locally on Chromebook via Crouton

I am using a Samsung Chromebook with the Crouton chroot environment (https://github.com/dnschneid/crouton). This has revolutionized my view of how practical a Chromebook can be for developer-type work. I love it.
But now I want to synchronize files between my various PCs and laptops. Using git is certainly an option, but it requires me to manually check in my work. What if I forget? I have been spoiled lately using either Dropbox or Google Drive to automagically keep my files all nicely synced up. The problem now with Crouton on my Chromebook is that I don't see any obvious way to have a project folder synced using Google Drive. I assume Drive would be the easier route since it's a Google product. But if Dropbox can be made to work, that would be awesome too.
Has anyone looked into this and found a workable solution?
Although I haven't attempted to get it working yet, this project allows you to mount Google Drive to your Linux file system:
https://github.com/dsoprea/GDriveFS
You can access the locally synced Drive folder (as used by the rest of ChromeOS) from within a chroot at this directory:
/var/host/media/fuse/drivefs-[unique ID]/root/
Note that the unique ID is different on each machine (or possibly for each Google account?) - you will need to find this yourself.
This can also be accessed from the ChromeOS shell here:
/media/fuse/drivefs-[unique ID]/root/
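For example, you can find the unique ID by simply listing the fuse directory from inside the chroot, and if there is only one Drive mount a glob gets you straight to the synced files:

# inside the chroot
ls /var/host/media/fuse/
ls /var/host/media/fuse/drivefs-*/root/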
Dropbox works fine for me within a crouton chroot.
% sudo apt-get install nautilus-dropbox
see http://www.liberiangeek.net/2012/04/install-dropbox-in-ubuntu-12-04-precise-pangolin/ for a full description.
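If I remember correctly, that package also installs a dropbox command-line tool, so you can finish the setup and check syncing from a terminal inside the chroot without Nautilus:

% dropbox start -i
% dropbox status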

Connecting Coda to local Wordpress install hosted with Mamp Pro

I have been using Coda and the regular version of MAMP for local development for a long time without getting into this permission mess. I recently upgraded to MAMP Pro and set it up with virtual hosts. I have a site example.com with its root path set to /Users/john/Sites/example. I have set the owner and group to www in MAMP Pro.
The moment I got all this configured I started having problems with Coda. It keeps asking me for a username and password to edit the local files at /Users/john/Sites/example. I guess I have to enable FTP on my Mac and then add a site in Coda to stop it asking for credentials for every single file, but I have no idea how to get this working. I am using Lion 10.7.2.
Additionally, I have set up my /etc/hosts file to point example.com to 127.0.0.1.
UPDATE: Though the accepted answer by #mini does not directly answer this question, it is still an elegant solution with seamless integration with Coda.
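As a side note, those password prompts are usually just Coda hitting files owned by the www user. One thing to try (the path and username come from the question; the ownership scheme itself is only a suggestion) is to keep the files owned by your own account and give the web server group write access instead:

% sudo chown -R john:www /Users/john/Sites/example
% sudo chmod -R g+w /Users/john/Sites/example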
Consider using DesktopServer instead (along with Coda 2). Unlike MAMP, it lets you work on template/theme files directly with WYSIWYG preview, share over the LAN for mobile device testing (with WordPress, not just HTML sites), enables AirPreview to work with WordPress, and supports copying and import/export to live sites, etc. Setup is easy, as it manages your vhosts, database, and project files in about three mouse clicks:
http://www.youtube.com/watch?v=Pw9-F8etBPY

What strategy do you use to sync your code when working from home

At my work I currently have my development environment inside a Virtual Machine. When I need to do work from home I copy my VM and any databases I need onto a laptop drive sized external USB drive. After about 10 minutes of copying I put the drive in my pocket and head home, copy back the VM and databases onto my personal computer and I'm ready to work. I follow the same steps to take the work back with me.
So if I count the total amount of time I spend waiting around for files to finish copying in order to take work home and bring it back again, it comes to around 40 minutes! I do have a VPN connection to work from home (provided the internet is up at both sites) and a decent internet speed (8mbits down/?up), but I find Remote Desktop into my work machine laggy enough that I'd rather work on my VM directly.
So, looking at what other options I have or how I could improve my existing approach, I'm interested in what strategy you use or recommend for working from home and keeping your code/environment in sync.
EDIT: I'd prefer an option where I don't have to commit my changes into version control before I leave work - as I like to make meaningful, descriptive comments in my commits, committing would take longer than just copying my VM onto a portable drive! lol Also I'd prefer a solution where my dev environment stays in sync too. Having said that, I'm still very interested in your own solutions even if they don't solve my problem as well as I'd like. :)
A distributed/decentralized version control system will suit your needs: Git, Bazaar, Mercurial, darcs... you have plenty of alternatives.
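For example, with Git you can create a bare repository somewhere both machines can reach (an office server over the VPN, a home NAS, even the USB drive itself; the path below is a placeholder) and sync through it:

# one-off: create the shared bare repository
git init --bare /path/to/shared/myproject.git

# on each machine: add it as a remote and push/pull through it
git remote add sync /path/to/shared/myproject.git
git push sync master
git pull sync master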
Use a version control software like SVN, SourceOffSite, etc. You just have to check-in all your changes and get the latest changes when you want to sync.
Or you can use Windows Live Sync -> https://sync.live.com/foldersharetolivesync.aspx
Hasn't anyone recommended rsync? An rsync client sends only the differences between files, bringing the remote copy up to date. For the smallest file transfer it's probably the best idea.
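A typical invocation, assuming SSH access to the home machine (hostname and paths are placeholders):

# mirror the working copy, transferring only the changed parts
rsync -avz --delete ~/work/myproject/ me@home-box:work/myproject/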
I simply use an external portable notebook drive and do all my work on that. All my PCs have it set to the same drive letter, so there's no copying anything. I've not attempted to run VMs this way, but I don't see any reason it shouldn't simply work.
I use Dropbox.
We use Citrix and then I do a remote desktop connection to my PC at work. It is not the fastest solution in the world, but it does eliminate the problem of keeping two or more workstations up-to-date.
Here is a solution I use.
Set up a VPN between the office network and the laptop.
Install VisualSVN Server.
Load all projects into source control.
When at the office I check out a project, work on it, and then check it in. When at home or around the world I connect to the office via VPN, check out my project, do my thing, then check it in. Via the VPN connection I can also RDP to my dev boxes and/or servers.
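The day-to-day commands are just the usual Subversion round trip (the repository URL is a placeholder for whatever VisualSVN Server exposes over the VPN):

svn checkout https://office-server/svn/myproject/trunk myproject
# ...edit, build, test...
svn update
svn commit -m "describe the change"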
Hope this helps. Good luck!
I either connect remotely to the office SVN, or VPN in and remote desktop to my dev or desktop machine and carry on working. It's very rare that I sync any files, but when I do it's usually with Dropbox (although you can't really do that with large files).
Write a program that synchronizes all your data over the internet and then shuts down your computer. At the end of the day you launch it and go home, and when you get home all the data is already there.
We work with a distributed team, so it is vital everyone has easy and secure code repository access. For this, we use SVN over ssl/https. It works great, reliably and secure.
Depending on the VM software you are using, why don't you set up two virtual disks: keep your user profile/dev files on one disk and the OS and other programs that rarely change on the other?
That way you probably only have to copy the larger disk image when you've installed something new, and day to day you end up copying a single, smaller virtual disk containing your work.
Just set up an SVN server at home, forward the port on your router, and get on with your life. rsync is also a good, fast solution; just remember to use it over SSH.
I had a similar problem, but fortunately we had a source control server (TFS) configured, so I worked only from the local virtual machines stored on my external drive and then checked the required files into TFS as and when required.
You haven't specified the OS and virtualization system, but if you're working with VM images that can be mounted, e.g. Xen on Linux, then you could mount the image and sync it via rsync.
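Roughly like this for a raw, unpartitioned image (paths and hostnames are placeholders; a partitioned image would need a mount offset or kpartx):

# loop-mount the guest's disk image on the host
sudo mount -o loop /vms/devbox.img /mnt/devbox
# sync just the working files out of it
rsync -av /mnt/devbox/home/me/projects/ me@home-box:projects/
sudo umount /mnt/devbox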
I connect to the office network, download the latest version from SVN, and use the dev MySQL server, so I am just like another computer on the office network.
I imagine that most of the time spent copying involves the database. Is that right? If so, can you not simply connect to your work DB from home using your VPN connection?
You would still copy your source files (or use a source code control system as others have suggested), but this would only take a fraction of the time.
If all you need is the virtual machine from your work computer, then you could mount a remote share (using NFS or SMB) where your virtual machine files are stored and run the virtual machine from there. This should be faster than using remote desktop.
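Something along these lines, assuming an NFS export on the office file server (names and paths are placeholders):

# mount the share that holds the VM images
sudo mount -t nfs office-fileserver:/export/vms /mnt/vms
# then point your VM software at an image on the share, e.g. /mnt/vms/devbox.vmdk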
I also use DropBox, and that is key because it is important to keep it simple.
It is generally better if you can have some type of remote desktop ability, because this will allow you to use a standard workstation configuration, and it will allow for consistent connection to network resources (database server, business servers like workflow, etc).
Working offline, in my opinion, is ok for certain tasks, but overall there are obstacles for systems which connect to other resources (unless you plan to move those resources to your home box).
It was a problem for me too. So, the company bought me a laptop, and I do my work on it, at home or anywhere else.
I have a setup where a folder on one machine is synced to a folder on another machine. Any changes to the contents on one machine are also made on the other machine within a minute.
So you could sync the top-level folder of your work files and have it sync to your home machine. What I like about this is that syncing is completely transparent: as far as the user experience goes, I'm simply using the file system, with no external app to interact with.
I use Live Sync from Microsoft for this. You'll need to create a Windows Live ID to use this system. It works for Windows and Macs.
Dropbox and Microsoft's Live Sync are good options that have already been mentioned. My personal favorite is Live Mesh, also from Microsoft. The one great feature that puts it above the other two, in my mind, is the ability to specify which folders get synched on which computers, and where the folders are located. So, for example, I synch my Visual Studio 2005/Projects folder between my work machine and my dev box at home, and I synch Visual Studio 2008/Projects between my side gig VM and my home dev box.
I have a MacBook with all my dev software on it; when I go to work, I start it in FireWire target disk mode and plug it into my work Mac Pro with the fast processor, LAN connection, big monitor, etc. This way I never have to leave my user folder, but I have access to all the software and hardware available at work.
Why don't you just use version control? A DVCS?
Here is a tutorial on DVCS for Windows users (very simple):
http://codicesoftware.blogspot.com/2010/03/distributed-development-for-windows.html
Some ideas:
Use network storage (with SSD cache if speed is a concern), either for your code or to host your VM.
Separate data and OS into two virtual disks in your VM.
Google drive, Onedrive, Dropbox etc.
If you use Visual Studio (Code), try the Live Share extension.
Dockerize your environment. Alternatively, I keep a bash script for all the setup I did, so I could almost one-click reinstall my dev environment anywhere.
Use a second version control repository covering your whole work directory. Commit and push everything before switching environments, then pull and hard-reset to that commit on the other machine (see the sketch below).
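A minimal sketch of that last idea with Git (the branch and remote names are placeholders):

# before leaving: snapshot everything, even half-finished work
git add -A
git commit -m "WIP: syncing environments"
git push origin wip-sync

# on the other machine: make the working tree identical to that snapshot
git fetch origin
git checkout wip-sync
git reset --hard origin/wip-sync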