I'm new to GitHub Actions.
I'm using GitHub Actions to automate deployment from my repository to my SFTP server. Everything works fine, but the action takes too much time to run, around 20 minutes. My repo contains directories like system, application, and dist, and those are files I don't want to re-upload.
Doing a little research, I found that certain paths can be ignored with "paths-ignore", but for some reason it's not working. This is my file.
I want to ignore all the content inside application/cache, application/config, application/core, etc., or the whole folders "application/cache", "application/config", etc.
What am I doing wrong? Is it possible to do that?
First, I have a suggestion:
Don't put all these files inside your repository; Git is for versioning code, not for storage.
paths-ignore is a setting on the on: event. It means: when you push changes to your repository, except changes only to the paths listed in paths-ignore, start the workflow.
There is no way to cut the workflow time dramatically, because every job spins up a fresh machine that clones your whole Git repository and then runs the workflow. But you can read more about SFTP-Deploy-Action and use the local-path parameter to upload just what you want; maybe your workflow will drop to 10 minutes.
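For example, here is a minimal sketch of such a workflow. The branch name and ignored paths are assumptions based on your description, and the deploy step is only a placeholder; check the SFTP-Deploy-Action README for its exact inputs.

on:
  push:
    branches:
      - master
    paths-ignore:
      - 'application/cache/**'
      - 'application/config/**'
      - 'application/core/**'

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # Hypothetical deploy step: point the action's local-path parameter at the
      # directory you actually want uploaded instead of the whole repository.

Note that paths-ignore only decides whether the workflow runs at all; it does not reduce what gets uploaded once it runs, which is why local-path is the bigger win here.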
Can we configure our GitHub Actions YAML file so that whenever we commit to our repository, it automatically prepares the empty package.xml file with the committed components and deployment to the target org begins?
I know how it's done when we have a package.xml file with the components already in it, but here the package.xml file should start out empty; whenever we commit our changes to the repository, the YAML should automatically prepare the components to be deployed based on the commits, put them into the empty package.xml file, and then finally deploy to the org.
For the triggering event you'll need to define "on" in the YAML. You can start with what's in https://github.com/trailheadapps/lwc-recipes/blob/main/.github/workflows/ci.yml: run on any commit/pull request to the main branch unless it's just a readme change, and allow manual triggering too.
on:
  workflow_dispatch:
  push:
    branches:
      - main
    paths-ignore:
      - 'sfdx-project.json'
      - 'README.md'
As for actual commands...
What's your Github repository's format? Old school metadata api format (with package.xml, Account.object containing dozens of fields, listviews, validation rules) or new source tracking format (Account is a folder, every single field gets its own small xml file, most important directory is probably "force-app/main/default")?
You should be able to call sfdx force:source:convert -d mdapi in your github action to create a temp directory called "mdapi". If you're making a managed package read up about the "-n" option. It will contain your changes but converted from source to mdapi format.
There are things it will not do that a hand-crafted package.xml would (description, post install class) - but again, these tend to matter when you make managed packages; for normal usage you should be fine.
After convert try sfdx force:mdapi:deploy -d mdapi -l RunLocalTests -w -1 -c (metadata format deploy, which directory, which tests, wait as long as it's needed, just validate, don't really deploy)
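As a rough sketch, those two commands could sit in a single workflow step like this (assuming the sfdx CLI is installed on the runner and the target org was authenticated in an earlier step; the step name is just an example):

- name: Convert and validate deployment
  run: |
    sfdx force:source:convert -d mdapi
    sfdx force:mdapi:deploy -d mdapi -l RunLocalTests -w -1 -c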
There are sfdx plugins to make it smarter (deploy a delta between 2 commits, not a full project every time). https://wiki.sfxd.org/books/sfdc-tools/page/notable-sfdx-plugins-and-resources
I am using TeamCity to build my Unity3d projects. When I select a branch in a custom build, or when a build is triggered from a branch other than master, git performs a clean and removes my Library folder. I need to persist this folder because it is a cache that takes a huge amount of time to build. When I stay on master, everything is fine and the cache is reused. How can I do this? I want this folder to be shared between my branches.
I tried creating multiple VCS roots, but that copies my repo for every branch. I also disabled all the "clean" options I found in the settings. But nothing helps.
You could try one of these:
In the VCS Root settings you could set Clean Policy to Never. This controls whether and when TeamCity runs git clean in the working directory. The default value is "On branch change", which I guess is your case. But it means you have to clean build artifacts out of your working directory manually. For more information see here
You could use Unity Accelerator
You could back up your Library folder at the end of every build and restore it at the beginning of the next one, as in the sketch below
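A rough sketch of that third option as command-line build steps (the cache path is a placeholder; adjust it to your agent):

# At the start of the build: restore the cached Library folder if one exists
CACHE_DIR=/opt/teamcity-cache/unity-library
if [ -d "$CACHE_DIR" ]; then
  mkdir -p Library
  cp -a "$CACHE_DIR/." Library/
fi

# ... Unity build runs here ...

# At the end of the build: save the Library folder back to the cache
mkdir -p "$CACHE_DIR"
cp -a Library/. "$CACHE_DIR/"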
To set expectations, I'm new to build tooling. We're currently using a hosted agent but we're open to other options.
We've got a local application that kicks off a build using the VSTS API. The hosted build tasks involve the Get sources step, which clones a GitHub repo to the file system in VSO. The next steps need to copy over a large number of files (upwards of about 10,000), build the solution, and run the tests.
The problem is that the cloned GitHub repo is on the file system in Visual Studio Online, while my 10,000 input files are on a local machine. Copying them for every build seems like a bit much, especially since we plan on doing CI and may have many builds kicked off per day.
What is the best way to move the input files into the cloned repo so that we can build it? Should we be using a hosted agent for this, or is it best to do this on our local system? I've looked in the VSO docs but haven't found an answer there. I'm not sure if I'm asking the right questions here.
There are several ways to handle the situation; follow the one that is closest to yours.
Option 1. Add the large files to the GitHub repo
If the local files are only related to the code in the GitHub repo, you should add them to the same repo so that all the required files are cloned in the Get sources step; then you can build directly, without a copy-files step.
Option 2. Manage the large files in another Git repo, and add that repo as a submodule of the GitHub repo
If the local large files are also used by other code, you can manage them in a separate repo and add it as a submodule of the GitHub repo with git submodule add <URL for the separate repo>, as in the sketch below. Then, in your VSTS build definition, select Checkout submodules in the Get sources step, and the large files can be used directly when you build the GitHub code.
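A minimal sketch of that setup (the repo URL and folder name are placeholders):

# Run inside the working copy of the GitHub repo
git submodule add https://example.com/your-org/large-files.git large-files
git commit -m "Add large files repo as a submodule"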
Option 3. Use a private agent on your local machine
If you don't want to add the large files to the GitHub repo or to a separate Git repo for some reason, you can use a private agent instead. But the build time may not improve much, because the only change is the difference between copying local files to the server and copying them on the same local machine.
I'm working on a simple project with other people. They use Eclipse to build it, but I don't like Eclipse and wrote a makefile and some batch/bash scripts to do the job for me.
I want to keep track of changes I make to these files, but I don't want others to see them in the main repo (at least not on the default branch; it would be okay to have my own). I could make a subrepo, but I don't want to type the folder each time I build something (besides, keeping the makefile NOT in the root would be a bit awkward).
What are my options?
Use the MQ extension:
Add these needed (personal, local) files in MQ patch(es)
Work locally with the patch applied, and unapply the patch before you push
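A rough sketch of that workflow with Mercurial Queues (the patch and file names are placeholders, and the mq extension has to be enabled in your .hgrc):

hg qnew local-build-files          # start a patch for your personal files
hg add Makefile build.sh build.bat
hg qrefresh                        # record those files in the patch
# ... work and build with the patch applied ...
hg qpop -a                         # unapply the patch before pushing
hg push
hg qpush -a                        # re-apply it and keep working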
I was wondering how to get my web-projects deployed using ftp and/or ssh.
We currently have a self-made deployment system which is able to handle this, but I want to switch to Jenkins.
I know there are publishing plugins and they work well when it comes to uploading build artifacts. But they can't delete or move files.
Do you have any hints, tips or ideas regarding my problem?
The Publish Over SSH plugin enables you to send commands over SSH to the remote server. This works very well; we also move/delete some files before deploying the new version, and we've had no problems whatsoever with this approach.
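As a rough illustration, the kind of commands we send over SSH before the transfer looks something like this (the paths are placeholders, not anything the plugin sets up for you):

# Hypothetical pre-deploy cleanup run on the remote host
rm -rf /var/www/mysite.old
mv /var/www/mysite /var/www/mysite.old
mkdir -p /var/www/mysite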
The easiest way to handle deleting and moving items is to delete everything on the server before you deploy a new release using one of the 'Publish over' extensions. I'd say that really is the only way to know the deployed version is the one you want. If you want more versioning-system-style behavior, you either need to use a versioning system, or maybe rsync, which will cover part of it.
If your demands are very specific you could develop your own convention to mark deletions and have them be performed by a separate script (like you would for database changes using Liquibase or something like that).
By the way: I would recommend not automatically updating your live sites after every build using the 'publish over ...' extensions. When we really want a live site to be updated automatically, we rely on the Promoted Builds Plugin to keep it nearly fully automated while adding a little safety.
I came up with a simple solution to remove deleted files and upload changes to a remote FTP server as a build action in Jenkins using a simple lftp mirror script. Lftp Manual Page
In short, you create a config file ~/.netrc in your Jenkins user's home directory and populate it with your FTP credentials.
machine ftp.remote-host.com
login mySuperSweetUsername
password mySuperSweetPassword
Create an lftp script deploy.lftp and drop it in the root of your Git repo
set ftp:list-options -a
set cmd:fail-exit true
open ftp.remote-host.com
mirror --reverse --verbose --delete --exclude .git/ --exclude deploy.lftp --ignore-time --recursion=always
Then add an "Exec Shell" build action to execute lftp on the script.
lftp -f deploy.lftp
The lftp script will
mirror: copy all changed files
reverse: push local files to a remote host. a regular mirror pulls from remote host to local.
verbose: dump all the notes about what files were copied where to the build log
delete: remove remote files no longer present in the git repo
exclude: don't publish .git directory or the deploy.lftp script.
ignore-time: don't decide what to publish based on file timestamps. Without this, in my case, all files got published, since a fresh clone of the git repo updated the file timestamps. It still works quite well though; even files modified by adding a single space were identified as different and uploaded.
recursion: will analyze every file rather than depending on folders to determine if any files in them were possibly modified. This isn't technically necessary since we're ignoring time stamps but I have it in here anyway.
I wrote an article explaining how I keep FTP in sync with Git for a WordPress site I could only access via FTP. The article explains how to sync from FTP to Git, then how to use Jenkins to build and deploy back to FTP. This approach isn't perfect, but it works: it only uploads changed files, and it deletes files off the host that have been removed from the git repo (and vice versa).