LAMP - Jenkins - Git setup for deployment

I've got a LAMP stack, running a web application at /var/www/html/. Right now, I have the source code hosted on BitBucket, and whenever there is an update, I'm simply doing a git pull in the /var/www/html/ directory to update the code. This isn't really ideal, and I want to change it.
I tried setting up Jenkins with the Git plugin to do this automatically, but I'm running into permission problems when deploying directly to the /var/www/html directory. I've got the directory group-writable by the www-data group, and have added the user jenkins to that group, but to no avail.
What's the best way for me to have Jenkins run and deploy the code to that directory?

Do you also have the /var and /var/www directories (at least) readable by that group? I had similar problems until I made permission changes (I used setfacl) on all of these directories.
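A rough sketch of those ACL changes, assuming the paths and group from the question (adjust to your own layout):

sudo setfacl -m g:www-data:rx /var /var/www              # let the group traverse/read the parent dirs
sudo setfacl -R -m g:www-data:rwx /var/www/html          # read/write on the web root itself
sudo setfacl -R -d -m g:www-data:rwx /var/www/html       # default ACL so newly created files inherit it
getfacl /var/www/html                                    # verify the result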

Related

github deleted files on computer

So I'm trying to learn how to use GitHub... now my computer's out of whack.
I fooled around with a folder - c:/documents/class/lab1part1 - and typed git init for that folder. Couldn't upload, made some mistakes.
I changed to the folder c:/documents/class/lab2part3 and ran git init. Made more mistakes.
I went back to c:/documents/class/ and was able to successfully upload (git init, git add, etc.).
I went to github.com and all folders and files were there... except for the folders (with files in them) lab1part1 and lab2part3.
So, I deleted that repository and started all over again...
Googled, one site said to use the command "rm -rf $HOME/.git" to undo the git init on my folder.
I typed that in git bash... and things went to hell.
This was done on a macbook pro that dual boots OSX and Windows 10 (clearly this occurred on the windows partition).
Right now I'm unable to access the start menu, the programs that were in the task bar are gone, and I'm unable to connect to a network or access any of my files.
What happened, and how do I fix this - I cannot afford to lose my labs!
Putting aside the rm -rf step:
your lab1part1 and lab2part3 were not on GitHub because you created a git repo in their parent folder c:/documents/class/: when you pushed that repo, the two subfolders were recorded as nested git repos, i.e. gitlinks (a special entry in the index)
as long as those folders are on your local machine, you will be able to add, commit and push them once again (but you will need to push them to two separate GitHub repos)
So save your folders elsewhere (a backup), re-install Windows, and try again.
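For reference, you can check from the parent repo whether a subfolder was recorded as a gitlink rather than as regular files: gitlink entries show up in the index with mode 160000. A quick check (folder names taken from the question) would be:

cd /c/documents/class
git ls-files --stage lab1part1 lab2part3
# a gitlink entry looks like: 160000 <commit-sha> 0  lab1part1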

Git push to remote works but no changes are visible

I am currently facing a little problem with git when pushing to my remote server.
Everything works fine on the local server: it commits successfully, and so does the push to my remote repo. git show master and git show remote-repo/master both show the committed modifications, so everything should normally work.
Still, when accessing my website, nothing has changed. The files simply haven't changed. The remote repo is a bare one; my app is built with Flask and Gunicorn, and I use supervisor to manage it. I tried changing the git remote path to a new directory containing a copy of my files, restarting supervisor and gunicorn, and restarting my server, but nothing changed. What is strange is that the first push worked. I'd bet on a problem of directory structure or path, but everything is the same, except that my files are stored in one more directory level than locally. I tried copying the files into the upper directory, but it did not change anything.
I am kinda lost now and would be eternally grateful for any help or clues about this problem.
Thanks !!
I'm still a little confused by your setup. You develop locally and push to a server. Does a separate server hold the bare repo, or does the server hosting the website hold it?
When you push to a bare repo, that doesn't mean any working tree has changed, since a bare repo has no working tree at all. If this is the case, you have to follow these steps (a sketch follows the list):
1. push from local to server that contains bare
2. pull from the remote bare location on the production server (the one that contains the website)
3. restart apache or whatever you have
Hope this helps. Please let me know if I am on the right track.
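A rough sketch of those steps, with hostnames and paths assumed purely for illustration:

# 1. on your local machine: push to the server holding the bare repo
git push origin master

# 2. on the production server (the one serving the website): pull from the bare repo
cd /var/www/mysite                        # assumed deployment directory
git pull /path/to/bare/repo.git master

# 3. restart the web/app server
sudo supervisorctl restart myapp          # or apache2/gunicorn, depending on your stack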
OK, so I've got it working finally! To do so, I initialized another bare repo and created a separate folder for the website files. Then, in the bare repo's hooks/ folder, I added a post-receive shell script that checks the files out into the website folder whenever a push is received (a sketch follows). I edited the supervisor and nginx configs, reloaded and restarted everything, and it's okay now!
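A minimal sketch of such a hook, with paths assumed for illustration; save it as hooks/post-receive inside the bare repo and make it executable (chmod +x):

#!/bin/sh
# check the pushed branch out into the website directory (hypothetical path)
GIT_WORK_TREE=/var/www/mysite git checkout -f master
# optionally restart the app afterwards, e.g.:
# supervisorctl restart myapp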

Jenkins: FTP / SSH deployment, including deletion and moving of files

I was wondering how to get my web-projects deployed using ftp and/or ssh.
We currently have a self-made deployment system which is able to handle this, but I want to switch to Jenkins.
I know there are publishing plugins and they work well when it comes to uploading build artifacts. But they can't delete or move files.
Do you have any hints, tips or ideas regarding my problem?
The Publish Over SSH plugin enables you to send commands to the remote server over ssh. This works very well; we also move/delete some files before deploying the new version, and have had no problems whatsoever with this approach. A sketch of such a command sequence follows.
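The commands you enter in the plugin's remote exec step are ordinary shell, so a hedged example (all paths assumed) of moving the previous release aside before the new artifacts are uploaded might be:

# runs on the remote host via the plugin's exec step; paths are hypothetical
rm -rf /var/www/myapp/previous
mv /var/www/myapp/current /var/www/myapp/previous
mkdir -p /var/www/myapp/current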
The easiest way to handle deleting and moving items is to delete everything on the server before you deploy a new release using one of the 'Publish over' extensions. I'd say that really is the only way to know the deployed version is the one you want. If you want more versioning-system style behaviour, you either need to use a versioning system, or maybe rsync, which will cover part of it.
If your demands are very specific you could develop your own convention to mark deletions and have them be performed by a separate script (like you would for database changes using Liquibase or something like that).
By the way: I would recommend not automatically updating your live sites after every build using the 'Publish over ...' extensions. In cases where we really do want a live site updated automatically, we rely on the Promoted Builds Plugin to keep it nearly fully automated while adding a little safety.
I came up with a simple solution to remove deleted files and upload changes to a remote FTP server as a build action in Jenkins, using a simple lftp mirror script. See the lftp manual page for details.
In short, you create a config file ~/.netrc in your jenkins user's home directory and populate it with your FTP credentials:
machine ftp.remote-host.com
login mySuperSweetUsername
password mySuperSweetPassword
Create an lftp script deploy.lftp and drop it in the root of your git repo:
set ftp:list-options -a
set cmd:fail-exit true
open ftp.remote-host.com
mirror --reverse --verbose --delete --exclude .git/ --exclude deploy.lftp --ignore-time --recursion=always
Then add an "Exec Shell" build action to execute lftp on the script.
lftp -f deploy.lftp
The lftp script will
mirror: copy all changed files
reverse: push local files to the remote host. A regular mirror pulls from the remote host to local.
verbose: dump all the notes about what files were copied where to the build log
delete: remove remote files no longer present in the git repo
exclude: don't publish .git directory or the deploy.lftp script.
ignore-time: don't decide what to publish based on file timestamps. Without this, in my case, all files got published, since a fresh clone of the git repo updated every file's timestamps. It still works quite well though; even files modified by adding a single space were identified as different and uploaded.
recursion: analyze every file rather than relying on folder timestamps to decide whether any files in them might have been modified. This isn't technically necessary since we're ignoring timestamps, but I have it in here anyway.
I wrote an article explaining how I keep FTP in sync with Git for a WordPress site I could only access via FTP. The article explains how to sync from FTP to Git, and then how to use Jenkins to build and deploy back to FTP. This approach isn't perfect, but it works: it only uploads changed files, and it deletes files from the host that have been removed from the git repo (and vice versa).

Can I chgrp a directory through Subclipse?

I develop on one machine in Eclipse. When I commit the files to the server, the owner is set to 'svn' and the group is set to 'daemon' (neither of which are me). I'm trying out a framework. It requires one of its directories to be writable by Apache. Apache is group 'nobody'. I'd like to chgrp nobody /path/to/directory but I can't do it directly since svn owns the files.
Is there a way in Eclipse (Subclipse module) that I can send a chgrp command?
I'm confused by your question. When you commit files, they should be going to the repository. As long as the files within the repository are writable by all the svn user accounts, that part of it is fine.
Now if you are going to deploy these to a server for test/production/whatever, you want to do an svn checkout or export. Typically you log in as whatever FTP user you would use with an ftp client (so that you know Apache can read files owned by that user:group), and then issue the export or checkout command; it performs that operation as the user who issued it.
In short, you should never have a situation where anything in your checkout/export is owned by anyone other than the user you checked out/exported with. So you should be able to ssh in (or perhaps use ftp/sftp) to change the permissions/user/group of the files you've deployed. But this has nothing to do with Eclipse, or even SVN.
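As a rough sketch of that last step, using the path and group from the question (the user and host are hypothetical, and you may need sudo if you don't own the files):

ssh deployuser@yourserver.example.com
chgrp -R nobody /path/to/directory     # the group Apache runs as, per the question
chmod -R g+w /path/to/directory        # make it group-writable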

Mercurial - Could not lock working directory

We have a web-app, that we're deploying to a remote Ubuntu server.
The app is stored on BitBucket, and we also have Fabric scripts we're using to automatically deploy the app.
On the server, we have the files for the app in /var/www/name_of_site, this folder being a Mercurial repository. The files are owned by the user www-data, group www-data, and are group-writable.
When I attempt to log in to the server and do an "hg add" inside the repository, I get:
adding fabfiles/fabfile.py
abort: could not lock working directory of /var/www/site_name: Permission denied
I tried adding myself to the www-data group, and it still gives that error message. I'm able to create folders/files inside /var/www/site_name just fine.
Have I set things up incorrectly here? Should the permissions be different?
Cheers,
Victor
Check if you can add/remove files inside the .hg directory – tonfa Oct 27 at 10:27
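Following up on that comment: Mercurial takes the working-directory lock by creating .hg/wlock, so the .hg directory itself (not just the working files) must be writable by your user or group. A rough sketch of what to check and fix, using the path and group from the question (sudo assumed to be available):

ls -ld /var/www/site_name/.hg                 # is it group www-data and group-writable?
sudo chgrp -R www-data /var/www/site_name/.hg
sudo chmod -R g+w /var/www/site_name/.hg
# note: after adding yourself to a group, log out and back in
# (or run newgrp www-data) for the new membership to take effect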